# Configuration
Talon uses a JSON configuration file to manage AI providers, models, permissions, and communication channels. This guide covers all configuration options and how to set them up.
## Config File Location

The talon.json configuration file is stored in your application data directory. The exact location depends on your operating system:

- **macOS**: `~/Library/Application Support/com.talon.app/talon.json`
- **Windows**: `%APPDATA%\com.talon.app\talon.json`
- **Linux**: `~/.config/com.talon.app/talon.json`
## Configuration Structure

The talon.json file follows this basic structure:

```json
{
  "models": {
    "providers": [
      // Provider configurations
    ],
    "default_model": "provider/model"
  },
  "default_temperature": 0.7,
  "permission_mode": "allow",
  "channels": {
    "slack": { "accounts": {} },
    "discord": { "accounts": {} }
  }
}
```

## AI Providers

Configure one or more AI providers under the models.providers array. Each provider requires a name and API key, with optional base URL configuration.
### Provider Configuration

Each provider object accepts the following fields:

- `name` (required): Provider identifier (e.g., `openai`, `anthropic`, `groq`)
- `api_key` (required): Your API key for the provider
- `base_url` (optional): Custom endpoint URL (useful for self-hosted or proxy setups)
- `model` (optional): Default model for this provider
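Putting these fields together, a minimal provider entry might look like this (the API key shown is a placeholder):

```json
{
  "name": "openai",
  "api_key": "sk-...",
  "model": "gpt-4"
}
```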
### Supported Providers

Talon supports the following AI providers:
- OpenAI - GPT-4, GPT-4 Turbo, GPT-3.5 Turbo
- Anthropic - Claude 3 Opus, Claude 3 Sonnet, Claude 3 Haiku
- Google Gemini - Gemini Pro and other variants
- Groq - Fast inference with Mixtral and Llama
- Together - Open-source models and fine-tuned variants
- Ollama - Local LLM inference
- OpenAI-Compatible - Any API implementing the OpenAI API specification
### Example: Multiple Providers

```json
{
  "models": {
    "providers": [
      {
        "name": "openai",
        "api_key": "sk-...",
        "model": "gpt-4"
      },
      {
        "name": "anthropic",
        "api_key": "sk-ant-...",
        "model": "claude-3-opus-20240229"
      },
      {
        "name": "groq",
        "api_key": "gsk_...",
        "model": "mixtral-8x7b-32768"
      }
    ],
    "default_model": "anthropic/claude-3-opus-20240229"
  }
}
```

## Setting the Default Model

Specify the default model using the models.default_model field in the format provider/model:

```json
{
  "models": {
    "default_model": "anthropic/claude-3-opus-20240229"
  }
}
```

This model is used for all requests unless explicitly overridden in a channel or command configuration.
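The provider/model format splits on the first slash, so model IDs that themselves contain slashes stay intact. A standalone Python sketch of that convention (Talon's actual parsing is not shown in this guide):

```python
def split_default_model(default_model: str) -> tuple[str, str]:
    """Split a "provider/model" string into its two parts.

    Only the first "/" separates provider from model, since some model
    IDs (e.g. certain Together models) contain slashes of their own.
    """
    provider, _, model = default_model.partition("/")
    if not provider or not model:
        raise ValueError(f"expected 'provider/model', got {default_model!r}")
    return provider, model

print(split_default_model("anthropic/claude-3-opus-20240229"))
```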
## Local Models with Ollama

For self-hosted models using Ollama, configure a custom base URL:

```json
{
  "models": {
    "providers": [
      {
        "name": "ollama",
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama"
      }
    ],
    "default_model": "ollama/mistral"
  }
}
```

## Custom OpenAI-Compatible APIs

If you're using a proxy, gateway, or custom API that implements the OpenAI specification:

```json
{
  "models": {
    "providers": [
      {
        "name": "custom-api",
        "api_key": "your-api-key",
        "base_url": "https://your-api.example.com/v1"
      }
    ]
  }
}
```

## Default Temperature

The default_temperature setting controls the randomness and creativity of AI responses across all models. This setting applies globally unless overridden per-channel or per-command.
### Temperature Range

- 0.0: Deterministic, focused, and consistent responses (best for precise tasks)
- 0.7: Balanced (the default; good for general use)
- 2.0: Maximum randomness and creativity (best for brainstorming)

### Example Configuration

```json
{
  "default_temperature": 0.5
}
```

## Permission Modes

Permission modes control how Talon handles requests and determine whether actions execute automatically or require approval.
### Available Modes

- plan: Talon shows a plan of what it will do and waits for confirmation before executing
- ask: Talon asks before taking each action
- allow: Talon executes actions automatically (default)
- bypass: Talon bypasses all safety checks and permission prompts

### Configuration

```json
{
  "permission_mode": "allow"
}
```

### Permission Mode Behaviors
| Mode | Behavior |
|---|---|
| plan | Shows overall plan, requires approval before any execution |
| ask | Requests confirmation before each individual action |
| allow | Executes automatically, no prompts (default) |
| bypass | Skips all checks and confirmations |
## Channels

Talon can integrate with multiple communication platforms. Channels are configured under the channels object, organized by type (e.g., slack, discord). Each channel type has its own required fields and authentication methods.

### Channel Structure

```json
{
  "channels": {
    "slack": {
      "accounts": {
        "workspace-name": {
          // Slack-specific configuration
        }
      }
    },
    "discord": {
      "accounts": {
        "server-name": {
          // Discord-specific configuration
        }
      }
    }
  }
}
```

### Configuring a Channel

Each channel account configuration varies by platform. For detailed setup instructions specific to each channel type, refer to the Channels documentation.
### Example: Slack Integration

```json
{
  "channels": {
    "slack": {
      "accounts": {
        "my-workspace": {
          "bot_token": "xoxb-...",
          "signing_secret": "..."
        }
      }
    }
  }
}
```

## Complete Example Configuration

Here's a complete example with multiple providers, custom settings, and channel configuration:

```json
{
  "models": {
    "providers": [
      {
        "name": "anthropic",
        "api_key": "sk-ant-...",
        "model": "claude-3-opus-20240229"
      },
      {
        "name": "openai",
        "api_key": "sk-...",
        "model": "gpt-4"
      },
      {
        "name": "ollama",
        "base_url": "http://localhost:11434/v1",
        "api_key": "ollama"
      }
    ],
    "default_model": "anthropic/claude-3-opus-20240229"
  },
  "default_temperature": 0.7,
  "permission_mode": "allow",
  "channels": {
    "slack": {
      "accounts": {
        "engineering-team": {
          "bot_token": "xoxb-...",
          "signing_secret": "..."
        }
      }
    },
    "discord": {
      "accounts": {
        "dev-server": {
          "bot_token": "..."
        }
      }
    }
  }
}
```
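After editing talon.json it can help to sanity-check the file before restarting Talon. The following standalone Python sketch is not part of Talon; it only enforces the constraints described in this guide (required provider fields, the provider/model format, the 0.0–2.0 temperature range, and the four permission modes):

```python
import json

# Permission modes documented in this guide.
PERMISSION_MODES = {"plan", "ask", "allow", "bypass"}

def check_config(raw: str) -> list[str]:
    """Return a list of human-readable problems found in a talon.json string.

    Note: the json module rejects // comments, so remove any placeholder
    comments from the examples above before checking a real file.
    """
    problems = []
    cfg = json.loads(raw)  # raises ValueError on malformed JSON

    providers = cfg.get("models", {}).get("providers", [])
    names = {p.get("name") for p in providers}
    for p in providers:
        if not p.get("name") or not p.get("api_key"):
            problems.append(f"provider missing name or api_key: {p}")

    default = cfg.get("models", {}).get("default_model", "")
    provider, _, model = default.partition("/")
    if not (provider and model):
        problems.append(f"default_model is not 'provider/model': {default!r}")
    elif provider not in names:
        problems.append(f"default_model references unknown provider {provider!r}")

    temp = cfg.get("default_temperature", 0.7)
    if not 0.0 <= temp <= 2.0:
        problems.append(f"default_temperature out of range [0.0, 2.0]: {temp}")

    mode = cfg.get("permission_mode", "allow")
    if mode not in PERMISSION_MODES:
        problems.append(f"unknown permission_mode: {mode!r}")

    return problems

example = """
{
  "models": {
    "providers": [{"name": "ollama", "base_url": "http://localhost:11434/v1",
                   "api_key": "ollama"}],
    "default_model": "ollama/mistral"
  },
  "default_temperature": 0.7,
  "permission_mode": "allow"
}
"""
print(check_config(example))  # → []
```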
## Next Steps

- Channel Setup: Learn how to configure Slack, Discord, and other channels
- Models & Providers: Explore available AI models and their capabilities
- Security: Review security best practices for storing API keys