Configuration
All Otto configuration lives in a single `.env` file at the project root. The interactive wizard (`./otto configure`) handles the most common settings, but you can edit the file directly for full control.
After changing any values in .env, restart Otto for them to take effect:
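For a typical Docker-based setup this looks something like the following (the `restart` subcommand on the wrapper script is an assumption; adjust to however you run Otto):

```shell
./otto restart
# or, with Docker Compose directly:
docker compose restart
```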
LLM Provider
Otto requires exactly one LLM provider. You choose it during initial setup, but you can switch at any time.
Google Gemini (Recommended)
Gemini offers strong performance at a competitive price. Get an API key at aistudio.google.com/apikey.
The `GEMINI_MODEL` default is `gemini-3.1-pro-preview-customtools`. You can use any model available through the Gemini API.
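A minimal `.env` fragment for Gemini might look like this (the API key variable name is an assumption; confirm it against `backend/.env.example`):

```bash
GEMINI_API_KEY=your-key-here   # assumed variable name
GEMINI_MODEL=gemini-3.1-pro-preview-customtools
```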
OpenAI
Get an API key at platform.openai.com/api-keys.
The `OPENAI_MODEL` default is `gpt-5-mini`. Set `OPEN_AI_BASE` if you need to point at an OpenAI-compatible proxy:
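For example (the API key variable name and proxy URL are illustrative assumptions):

```bash
OPENAI_API_KEY=sk-...          # assumed variable name
OPENAI_MODEL=gpt-5-mini
# Optional: route requests through an OpenAI-compatible proxy
OPEN_AI_BASE=https://llm-proxy.example.com/v1
```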
Ollama (Local, Free)
Run an LLM on your own hardware with no API costs. Install Ollama from ollama.com, then pull a model:
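For example (the model name is just an illustration; pick any model from the Ollama library):

```shell
ollama pull llama3.1:8b
```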
Configure Otto to use it:
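A sketch of the relevant `.env` fragment, assuming a provider-selector variable and a model variable (both hypothetical names; `11434` is Ollama's default port):

```bash
LLM_PROVIDER=ollama                               # assumed selector variable
OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3.1:8b                          # assumed; match the model you pulled
```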
The `OLLAMA_BASE_URL` uses `host.docker.internal` so the Docker containers can reach Ollama running on your host machine. If Ollama runs elsewhere on your network, use that machine's IP instead.
Note: Local model quality varies significantly. For production use, Gemini or OpenAI is recommended.
Embeddings
Otto automatically uses the embedding model that matches your LLM provider. You can override this if needed:
| Variable | Default | Description |
|---|---|---|
| `EMBEDDING_MODEL` | `models/gemini-embedding-001` (Gemini) or `text-embedding-3-small` (OpenAI) | Embedding model for vector memory |
| `OLLAMA_EMBED_MODEL` | `nomic-embed-text:latest` | Embedding model when using Ollama |
Web Search (Tavily)
Tavily gives Otto the ability to search the web for current information. The free tier includes 1,000 searches per month. Get a key at tavily.com.
When configured, web search is available as an MCP tool that Otto can invoke during task execution.
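A minimal `.env` fragment (the variable name `TAVILY_API_KEY` is an assumption based on Tavily's conventions; confirm it against `backend/.env.example`):

```bash
TAVILY_API_KEY=tvly-your-key-here
```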
Slack Integration
Connect Otto to your Slack workspace so it can read channels, send messages, and receive tasks via Slack.
Setup
- Create a Slack App at api.slack.com/apps
- Under OAuth & Permissions, add these bot token scopes:
  - `channels:history`
  - `channels:read`
  - `chat:write`
  - `users:read`
  - `users:read.email`
  - `files:read`
- Install the app to your workspace and copy the Bot User OAuth Token
Environment Variables
| Variable | Required | Description |
|---|---|---|
| `SLACK_MCP_XOXB_TOKEN` | Yes | Bot User OAuth Token (starts with `xoxb-`) |
| `SLACK_SIGNING_SECRET` | Yes | App signing secret for verifying Slack requests |
| `SLACK_BOT_USER_ID` | No | Bot user ID, used to detect self-mentions |
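Putting it together, a `.env` fragment with placeholder values:

```bash
SLACK_MCP_XOXB_TOKEN=xoxb-your-bot-token
SLACK_SIGNING_SECRET=your-signing-secret
SLACK_BOT_USER_ID=U0123456789   # optional
```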
Email Integration
Otto can send emails via SMTP and monitor an inbox via IMAP.
Sending Email (SMTP)
For Gmail, use an App Password rather than your account password.
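A typical Gmail fragment, using Google's standard SMTP-over-SSL host and port with placeholder credentials:

```bash
SMTP_HOST=smtp.gmail.com
SMTP_PORT=465
SMTP_USER=you@gmail.com
SMTP_PASS=your-app-password
```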
Receiving Email (IMAP)
When IMAP is configured, Otto polls the inbox at a regular interval and can process incoming messages.
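For Gmail, the IMAP side looks like this (Google's standard IMAP-over-SSL host and port; `IMAP_CHECK` keeps the default 60-second poll):

```bash
IMAP_HOST=imap.gmail.com
IMAP_PORT=993
IMAP_CHECK=60
```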
| Variable | Default | Description |
|---|---|---|
| `SMTP_HOST` | -- | SMTP server hostname |
| `SMTP_PORT` | 465 | SMTP server port |
| `SMTP_USER` | -- | Email address / username |
| `SMTP_PASS` | -- | Email password or app password |
| `IMAP_HOST` | -- | IMAP server hostname |
| `IMAP_PORT` | 993 | IMAP server port |
| `IMAP_CHECK` | 60 | Poll interval in seconds |
Storage
By default, Otto stores files on the local filesystem. For cloud deployments, you can use Google Cloud Storage or Google Drive.
Google Cloud Storage
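A sketch of a GCS setup using the variables from the table below (bucket name and key path are placeholders):

```bash
STORAGE_BACKEND=gcs
GCS_BUCKET_NAME=my-otto-files
GOOGLE_SERVICE_ACCOUNT_FILE=/secrets/otto-sa.json
```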
Google Drive
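A sketch of a Google Drive setup, passing the service account credentials inline as a JSON string (folder ID and credentials are placeholders):

```bash
STORAGE_BACKEND=google_drive
GOOGLE_DRIVE_FOLDER_ID=your-folder-id
GOOGLE_SERVICE_ACCOUNT_JSON='{"type": "service_account", "project_id": "my-project"}'
```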
| Variable | Default | Description |
|---|---|---|
| `STORAGE_BACKEND` | `local` | `local`, `gcs`, or `google_drive` |
| `LOCAL_STORAGE_PATH` | `/tmp/otto-files` | Path for local file storage |
| `GCS_BUCKET_NAME` | -- | GCS bucket name |
| `GOOGLE_DRIVE_FOLDER_ID` | -- | Google Drive root folder ID |
| `GOOGLE_SERVICE_ACCOUNT_JSON` | -- | Service account credentials as JSON string |
| `GOOGLE_SERVICE_ACCOUNT_FILE` | -- | Path to service account JSON key file |
| `STORAGE_QUOTA_GB` | 0 (unlimited) | Maximum storage in GB |
Agent Customization
Adjust Otto's personality and behavior without modifying code.
| Variable | Default | Description |
|---|---|---|
| `SPECIALIZATION` | General Assistance | Agent persona / focus area (e.g., "security analyst", "marketing assistant") |
| `ADDITIONAL_INSTRUCTIONS` | -- | Extra text appended to the agent's system prompt |
| `OTTO_PERSONA_PATH` | -- | Path to a custom persona file (`SOUL.md`) |
| `OTTO_PROJECTS_DIR` | -- | Directory containing project-specific persona files |
Skills System
Otto loads skills from the skills/ directory. These settings control how skills are discovered and matched to incoming tasks.
| Variable | Default | Description |
|---|---|---|
| `SKILLS_REFRESH_INTERVAL` | 60 | Seconds between skill directory rescans |
| `OTTO_SKILL_THRESHOLD` | 0.35 | Minimum similarity score for a skill to match (0.0 - 1.0) |
| `OTTO_MAX_SKILLS` | 3 | Maximum number of skills loaded per task |
| `OTTO_SCRIPT_TIMEOUT` | 120 | Timeout in seconds for skill script execution |
Performance Tuning
These settings control how the agent processes tool calls and manages large outputs.
| Variable | Default | Description |
|---|---|---|
| `MAX_PARALLEL_TOOL_CALLS` | 5 | Maximum concurrent tool calls per agent step |
| `MEDIUM_RESULT_THRESHOLD` | 4000 | Character count that triggers medium-length summarization |
| `LARGE_RESULT_THRESHOLD` | 12000 | Character count that triggers aggressive summarization |
| `ENABLE_INTENT_PREPROCESSING` | false | Experimental: semantic intent matching for tool selection |
Memory
Otto maintains a vector memory store for cross-session knowledge. These settings control conversation compaction, which keeps context windows manageable during long-running tasks.
| Variable | Default | Description |
|---|---|---|
| `OTTO_MEMORY_DB` | `<LOCAL_STORAGE_PATH>/otto_memory.db` | Path to the vector memory SQLite database |
| `COMPACTION_THRESHOLD` | 12 | Message count before conversation compaction triggers |
| `MAX_RECENT_MESSAGES` | 8 | Number of recent messages to keep uncompacted |
LangSmith Tracing
LangSmith provides visibility into Otto's LLM calls, tool usage, and agent execution traces. Useful for debugging and optimizing agent behavior.
| Variable | Default | Description |
|---|---|---|
| `LANGCHAIN_API_KEY` | -- | LangSmith API key |
| `LANGCHAIN_TRACING_V2` | false | Enable tracing |
| `LANGCHAIN_PROJECT` | default | Project name for grouping traces |
| `LANGCHAIN_ENDPOINT` | `https://api.smith.langchain.com` | LangSmith API endpoint |
Otto syncs `LANGCHAIN_*` and `LANGSMITH_*` variants on startup, so you only need to set one set of variables.
Export traces for offline analysis:
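The exact export mechanism isn't specified here; as one option, the LangSmith Python SDK can pull runs for offline analysis. A sketch, assuming the `langsmith` package is installed and `LANGCHAIN_API_KEY` is set in the environment:

```python
import json
from langsmith import Client

client = Client()  # reads LANGCHAIN_API_KEY / LANGCHAIN_ENDPOINT from the environment

# Fetch top-level runs for the project and dump a minimal record of each
runs = client.list_runs(project_name="default", is_root=True)
with open("traces.jsonl", "w") as f:
    for run in runs:
        f.write(json.dumps({"id": str(run.id), "name": run.name}) + "\n")
```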
Database
Otto uses SQLite by default with no configuration needed. For production deployments with higher concurrency, you can point at PostgreSQL.
| Variable | Default | Description |
|---|---|---|
| `DATABASE_URL` | `sqlite:///./otto.db` | Database connection string |
| `REDIS_URL` | `redis://localhost:6379/0` | Redis connection URL |
| `REDIS_PASSWORD` | -- | Redis authentication password (recommended for production) |
| `REDIS_TYPE` | auto-detected | `standard` or `upstash` (auto-detected from URL) |
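To switch to PostgreSQL, point `DATABASE_URL` at your server using a standard connection string (host, credentials, and database name below are placeholders):

```bash
DATABASE_URL=postgresql://otto:password@db-host:5432/otto
```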
Upstash Redis
For serverless or cloud deployments, Otto supports Upstash Redis as an alternative to a self-managed Redis instance:
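A sketch, using Upstash's TLS endpoint (`rediss://`) with placeholder credentials:

```bash
REDIS_URL=rediss://default:your-password@your-instance.upstash.io:6379
REDIS_TYPE=upstash   # optional; auto-detected from the URL
```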
Deployment and Networking
These settings control how Otto is exposed and how cross-origin requests are handled.
| Variable | Default | Description |
|---|---|---|
| `DEPLOYMENT_MODE` | `self-hosted` | `self-hosted` or `hosted` (controls auth middleware) |
| `DEPLOYMENT_TYPE` | `cloud_frontend` | `cloud_frontend` or `onprem_frontend` (controls CORS) |
| `FRONTEND_URL` | `https://otto.yourcompany.com` | Frontend URL, used for CORS and Slack links |
| `BACKEND_API_URL` | -- | Backend URL (logged on startup for debugging) |
| `ALLOWED_ORIGINS` | -- | Comma-separated CORS origins (fallback) |
| `CLOUDFLARE_TUNNEL_TOKEN` | -- | Enable remote access via Cloudflare Tunnel |
Docker Image Tags
Pin specific versions when you need reproducible deployments:
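The variable names below are hypothetical illustrations; check `backend/.env.example` for the actual tag variables:

```bash
# Hypothetical names for illustration -- confirm against backend/.env.example
OTTO_BACKEND_TAG=v1.4.2
OTTO_FRONTEND_TAG=v1.4.2
```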
Both default to `latest`.
MCP (Model Context Protocol)
Otto can use any MCP-compatible tool server. Tavily web search is pre-configured; additional servers can be added via JSON configuration.
| Variable | Default | Description |
|---|---|---|
| `MCP_CONFIG_JSON` | -- | Full MCP server configuration as a JSON string |
| `MCP_SERVERS` | -- | File path to `mcp-config.json` (alternative to JSON string) |
Example MCP configuration for Tavily:
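A sketch using the common `mcpServers` JSON shape; the server name, package, and exact schema Otto expects are assumptions, so verify against Otto's MCP documentation:

```json
{
  "mcpServers": {
    "tavily": {
      "command": "npx",
      "args": ["-y", "tavily-mcp"],
      "env": { "TAVILY_API_KEY": "tvly-your-key-here" }
    }
  }
}
```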
Additional MCP servers can also be configured dynamically at runtime through the Otto API and are persisted in Redis.
All Environment Variables
For the complete list of every environment variable Otto supports, see the Environment Variables Reference or backend/.env.example in the repository. The settings on this page cover all variables that affect day-to-day operation. Variables not listed here are either Docker/infrastructure internals or multi-tenant hosting options that do not apply to self-hosted deployments.