Configuration Options
This page explains how to configure Mito Data Copilot to use your own AI API keys instead of the Mito server.
By default, Mito uses our server to send AI requests to the model provider. If instead you want to use your own AI API keys, you can set the following configuration options.
Mito supports the following AI model providers through environment variables:

Set OPENAI_API_KEY to your OpenAI API key. When using OpenAI, Mito will automatically use gpt-4.1.

Set CLAUDE_MODEL to the Claude model you want to use (e.g., "claude-3-7-sonnet-latest").
Set CLAUDE_API_KEY to your Anthropic API key.

Set GEMINI_MODEL to the Gemini model you want to use (e.g., "gemini-2.0-flash").
Set GEMINI_API_KEY to your Google API key.

Set OLLAMA_MODEL to the Ollama model you want to use.
Set OLLAMA_BASE_URL to your Ollama server URL (e.g., "http://localhost:11434/v1").

Set AZURE_OPENAI_API_KEY to your Azure OpenAI API key.
Set AZURE_OPENAI_API_VERSION to the API version you want to use.
Set AZURE_OPENAI_ENDPOINT to your Azure OpenAI endpoint URL.
Set AZURE_OPENAI_MODEL to the name of your deployed model.
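For example, the Azure OpenAI provider is configured with all four of its variables. A minimal sketch, with placeholder values you should replace with your own resource details:

```bash
# Placeholder values for illustration only
export AZURE_OPENAI_API_KEY="your-azure-openai-api-key"
export AZURE_OPENAI_API_VERSION="2024-02-01"   # use the API version your resource supports
export AZURE_OPENAI_ENDPOINT="https://your-resource-name.openai.azure.com/"
export AZURE_OPENAI_MODEL="your-deployment-name"
```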
Important: Environment variables must be set before launching JupyterLab, as they are read when the Mito server extension initializes during startup.
Set environment variables at the system level before starting JupyterLab:
On Windows:
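A minimal sketch using Command Prompt, with a placeholder key (substitute the variables for your provider):

```
REM Set the key for the current Command Prompt session, then launch JupyterLab
set OPENAI_API_KEY=sk-your-openai-api-key
jupyter lab
```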
On macOS/Linux:
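A minimal sketch using bash or zsh, again with a placeholder key:

```bash
# Set the key for the current shell session, then launch JupyterLab
export OPENAI_API_KEY="sk-your-openai-api-key"
jupyter lab
```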
Create a .env file in your Jupyter config directory:
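A sketch of what the file might contain; the key and model name are placeholders, and you only need the variables for the provider you use:

```
# .env
CLAUDE_API_KEY=your-anthropic-api-key
CLAUDE_MODEL=claude-3-7-sonnet-latest
```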
Create or modify your jupyter_server_config.py file to load these variables on startup:
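A minimal sketch, assuming you want to set the variables directly in the config file; Jupyter Server runs this file during startup, before server extensions initialize, so values assigned here are visible to the Mito extension. The values below are placeholders:

```python
# jupyter_server_config.py
import os

# Replace the placeholder values with your own keys and model names
os.environ["OPENAI_API_KEY"] = "sk-your-openai-api-key"

# Or, for another provider, e.g. Gemini:
# os.environ["GEMINI_API_KEY"] = "your-google-api-key"
# os.environ["GEMINI_MODEL"] = "gemini-2.0-flash"
```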
Add the environment variables to your shell's configuration file for permanent setup:
On Windows:
Add environment variables through System Properties > Environment Variables.
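Alternatively, a sketch using setx from a Command Prompt, which persists a user-level variable (it takes effect in newly opened terminals, not the current one):

```
setx OPENAI_API_KEY "sk-your-openai-api-key"
```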
On macOS/Linux:
Add to your .bashrc, .zshrc, or equivalent:
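For example (placeholder key):

```bash
# ~/.bashrc or ~/.zshrc
export OPENAI_API_KEY="sk-your-openai-api-key"
```

Then open a new terminal, or reload the file with source ~/.bashrc (or source ~/.zshrc), before launching JupyterLab.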
Remember that when using external AI providers:
Private data in dataframe names, column headers, or the first five rows of data might be shared with the AI provider
To maximize data protection, Mito Enterprise users can connect to a self-hosted model
If you are a Mito Enterprise user, you can configure Mito to use an Azure OpenAI endpoint instead. If you have questions about Mito Enterprise, please contact us.