Configuration - djvolz/coda-code-assistant GitHub Wiki
Configuration
Coda uses a flexible configuration system based on TOML files with support for multiple configuration sources and environment variable overrides.
Quick Setup
# Copy the example configuration
cp examples/config.example.toml ~/.config/coda/config.toml
# Edit with your settings
nano ~/.config/coda/config.toml
# Set required environment variables
export OCI_COMPARTMENT_ID="ocid1.compartment.oc1..example"
Configuration Sources
Coda loads configuration from multiple sources in priority order (highest to lowest):
- Environment variables (highest priority)
- Project-specific: .coda/config.toml in the current directory
- User-specific: ~/.config/coda/config.toml
- System-wide: /etc/coda/config.toml (Unix-like systems)
Settings from higher priority sources override those from lower priority sources.
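The precedence rules above amount to a layered merge, where each higher-priority source overwrites lower-priority values key by key. A minimal sketch of that semantics (this is an illustration, not Coda's actual loader):

```python
import os

# Layers from lowest to highest priority: system-wide file, user file,
# project file, then environment variables. Later layers win, key by key.
def merge_config(*layers):
    merged = {}
    for layer in layers:
        merged.update(layer)
    return merged

system_cfg = {"default_provider": "oci_genai", "debug": False}
user_cfg = {"debug": True}                    # overrides the system default
project_cfg = {"default_provider": "ollama"}  # overrides the user/system value
env_cfg = {k[len("CODA_"):].lower(): v        # CODA_* variables win over all files
           for k, v in os.environ.items() if k.startswith("CODA_")}

config = merge_config(system_cfg, user_cfg, project_cfg, env_cfg)
```

With these layers, `config["default_provider"]` comes from the project file and `config["debug"]` from the user file, matching the priority order listed above.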
Configuration Format
Coda uses TOML format for configuration files. Here's a complete example:
# Default provider to use
default_provider = "oci_genai"
# Generation parameters
temperature = 0.7
max_tokens = 2000
# Debug mode
debug = false
# Provider configurations
[providers.oci_genai]
compartment_id = "ocid1.compartment.oc1..example"
config_profile = "DEFAULT"
config_file_location = "~/.oci/config"
[providers.litellm]
api_base = "http://localhost:8000"
default_model = "gpt-4"
[providers.ollama]
host = "http://localhost:11434"
default_model = "llama3"
# Session settings
[session]
history_file = "~/.local/share/coda/history"
max_history = 1000
autosave = true
# UI settings
[ui]
theme = "default" # Options: default, dark, light, minimal, vibrant, monokai_dark, monokai_light, dracula_dark, dracula_light, gruvbox_dark, gruvbox_light
show_model_info = true
show_token_usage = false
show_thinking_indicator = true
# Tool settings
[tools]
enabled = true
filesystem_enabled = true
git_enabled = true
web_enabled = true
system_enabled = true
require_confirmation = false # Ask before executing potentially dangerous operations
# Agent settings
[agent]
enabled = true
max_iterations = 10 # Maximum tool calls per conversation turn
show_tool_calls = true # Display tool usage in conversation
Core Settings
Basic Configuration
Setting | Type | Default | Description
---|---|---|---
default_provider | string | "oci_genai" | Default AI provider to use
debug | boolean | false | Enable debug mode with verbose logging
temperature | float | 0.7 | Generation temperature (0-2)
max_tokens | integer | 2000 | Maximum tokens to generate
Provider Configuration
OCI GenAI Provider
[providers.oci_genai]
compartment_id = "ocid1.compartment.oc1..example" # Required
config_profile = "DEFAULT" # Optional
config_file_location = "~/.oci/config" # Optional
- compartment_id: (Required) Your OCI compartment OCID
- config_profile: OCI config profile name (default: "DEFAULT")
- config_file_location: Path to the OCI config file (default: "~/.oci/config")
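For reference, the file at config_file_location follows the OCI SDK's INI-style profile format. A typical profile looks like this (all values below are placeholders):

```ini
[DEFAULT]
user=ocid1.user.oc1..example
fingerprint=12:34:56:78:9a:bc:de:f0:12:34:56:78:9a:bc:de:f0
key_file=~/.oci/oci_api_key.pem
tenancy=ocid1.tenancy.oc1..example
region=us-ashburn-1
```

The config_profile setting selects which bracketed section of this file is used.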
LiteLLM Provider
[providers.litellm]
api_base = "http://localhost:8000"
default_model = "gpt-4"
api_key = "your-api-key" # Optional, can use env vars
- api_base: Base URL of the LiteLLM proxy server
- default_model: Default model to use
- api_key: API key (prefer environment variables)
Ollama Provider
[providers.ollama]
host = "http://localhost:11434"
default_model = "llama3"
timeout = 30
- host: Ollama server host and port
- default_model: Default model to use
- timeout: Request timeout in seconds
Session Configuration
[session]
history_file = "~/.local/share/coda/history"
max_history = 1000
autosave = true
database_path = "~/.cache/coda/sessions.db"
- history_file: Command history file location
- max_history: Maximum number of command history entries
- autosave: Enable automatic session saving
- database_path: Session database location
UI Configuration
[ui]
theme = "default"
show_model_info = true
show_token_usage = false
show_thinking_indicator = true
prompt_style = "default"
- theme: UI color theme. Available themes:
  - default - Default balanced theme
  - dark - Dark mode optimized for low light
  - light - Light theme for bright environments
  - minimal - Minimal colors for focused work
  - vibrant - High contrast with vibrant colors
  - monokai_dark - Monokai color scheme, dark variant
  - monokai_light - Monokai color scheme, light variant
  - dracula_dark - Dracula theme, dark variant
  - dracula_light - Dracula theme, light variant
  - gruvbox_dark - Gruvbox retro groove, dark variant
  - gruvbox_light - Gruvbox retro groove, light variant
- show_model_info: Display the current model in the prompt
- show_token_usage: Show token usage statistics
- show_thinking_indicator: Display an animated indicator while the AI is processing
- prompt_style: Prompt appearance style
Environment Variables
Environment variables override configuration file settings:
Variable | Description | Example
---|---|---
CODA_DEFAULT_PROVIDER | Default provider | export CODA_DEFAULT_PROVIDER=ollama
CODA_DEBUG | Enable debug mode | export CODA_DEBUG=true
CODA_TEMPERATURE | Generation temperature | export CODA_TEMPERATURE=0.5
CODA_MAX_TOKENS | Maximum tokens | export CODA_MAX_TOKENS=4000
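Since environment variables are always strings, boolean and numeric settings must be coerced before they can override file values. A sketch of how such overrides might be applied (ENV_MAP and apply_env_overrides are illustrative names, not Coda's API):

```python
import os

# Hypothetical mapping from CODA_* variables to (setting name, coercion).
ENV_MAP = {
    "CODA_DEFAULT_PROVIDER": ("default_provider", str),
    "CODA_DEBUG": ("debug", lambda s: s.lower() in ("1", "true", "yes")),
    "CODA_TEMPERATURE": ("temperature", float),
    "CODA_MAX_TOKENS": ("max_tokens", int),
}

def apply_env_overrides(config, environ=os.environ):
    """Overwrite config entries with coerced values from the environment."""
    for var, (key, cast) in ENV_MAP.items():
        if var in environ:
            config[key] = cast(environ[var])
    return config

cfg = apply_env_overrides({"temperature": 0.7},
                          {"CODA_TEMPERATURE": "0.5", "CODA_DEBUG": "true"})
```

Here the file value of 0.7 is replaced by the environment's 0.5, and CODA_DEBUG="true" becomes a proper boolean.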
Provider-Specific Environment Variables
OCI GenAI:
export OCI_COMPARTMENT_ID="ocid1.compartment.oc1..example"
export OCI_CONFIG_PROFILE="MY_PROFILE"
export OCI_CONFIG_FILE="~/my-oci-config"
LiteLLM:
export LITELLM_API_BASE="http://localhost:8000"
export LITELLM_API_KEY="your-api-key"
Ollama:
export OLLAMA_HOST="http://localhost:11434"
Configuration Examples
Personal Development Setup
default_provider = "ollama"
[providers.ollama]
host = "http://localhost:11434"
default_model = "codellama"
[session]
autosave = true
max_history = 2000
[ui]
show_model_info = true
show_token_usage = true
Team/Corporate Setup
default_provider = "oci_genai"
[providers.oci_genai]
compartment_id = "ocid1.compartment.oc1..corporate"
config_profile = "TEAM_PROFILE"
[providers.litellm]
api_base = "https://internal-llm-proxy.company.com"
[session]
autosave = true
database_path = "/shared/coda/sessions.db"
[ui]
theme = "minimal"
show_token_usage = true
Project-Specific Configuration
Create .coda/config.toml in your project directory:
# Use a specific provider for this project
default_provider = "oci_genai"
# Project-specific generation settings
temperature = 0.3 # More deterministic for code
max_tokens = 4000 # Longer responses for complex code
[providers.oci_genai]
compartment_id = "ocid1.compartment.oc1..project_specific"
[session]
autosave = true
# Project sessions in project directory
database_path = ".coda/sessions.db"
Setup Guides
First-Time Setup
1. Create the configuration directory:
   mkdir -p ~/.config/coda
2. Copy the example configuration:
   cp examples/config.example.toml ~/.config/coda/config.toml
3. Edit it with your settings:
   nano ~/.config/coda/config.toml
4. Set the required environment variables:
   echo 'export OCI_COMPARTMENT_ID="your-compartment-id"' >> ~/.bashrc
   source ~/.bashrc
Validating Configuration
Test your configuration:
# Check if configuration loads correctly
coda --version
# Test with debug mode
CODA_DEBUG=true coda "Hello, world!"
# List available models
coda
> /model
Troubleshooting
Common Issues
Configuration not found:
- Check the file path: ~/.config/coda/config.toml
- Verify file permissions are readable
- Use absolute paths if relative paths fail
Provider authentication fails:
- Verify environment variables are set correctly
- Check OCI config file and profile
- Test credentials with OCI CLI tools
Permission errors:
- Ensure write permissions for session database directory
- Check XDG directory permissions
- Use chmod to fix permission issues
Debug Configuration
Enable debug mode to troubleshoot:
CODA_DEBUG=true coda
This will show:
- Configuration loading process
- Provider initialization
- API requests and responses
- Session operations
Advanced Configuration
Custom Provider Setup
You can configure multiple provider instances:
[providers.oci_genai_dev]
compartment_id = "ocid1.compartment.oc1..dev"
config_profile = "DEV"
[providers.oci_genai_prod]
compartment_id = "ocid1.compartment.oc1..prod"
config_profile = "PROD"
Configuration Validation
Coda validates configuration on startup:
- Required fields for enabled providers
- Valid TOML syntax
- File permissions and accessibility
- Environment variable format
Related Documentation
- Getting-Started - Basic setup and installation
- OCI-GenAI-Integration - OCI-specific configuration
- Session-Management - Session storage configuration
- Troubleshooting - Configuration troubleshooting