# OpenRouter Setup
This guide walks you through setting up OpenRouter as an LLM provider in Crucible.
## What is OpenRouter?

OpenRouter is a meta-provider that gives you access to 100+ language models through a single API key. Instead of managing separate accounts with OpenAI, Anthropic, Google, Meta, and others, you configure one provider and choose any model at runtime.
Models are referenced in `provider/model` format: for example, `openai/gpt-4o`, `anthropic/claude-3.5-sonnet`, or `meta-llama/llama-3.1-405b`.
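Because the two-part IDs are plain strings, you can pull the provider and model apart with standard POSIX parameter expansion. This is only an illustration of the ID structure, not something Crucible requires you to do:

```shell
# Split a model ID of the form provider/model using parameter expansion.
model_id="anthropic/claude-3.5-sonnet"
provider="${model_id%%/*}"   # everything before the first slash
model="${model_id#*/}"       # everything after the first slash
echo "$provider"             # anthropic
echo "$model"                # claude-3.5-sonnet
```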
## Prerequisites

- Crucible CLI installed
- An OpenRouter account with an API key
## Step 1: Get an API Key

1. Visit openrouter.ai and create an account
2. Go to Keys in your dashboard
3. Click Create Key
4. Copy the key (it starts with `sk-or-`)
## Step 2: Set the Environment Variable

```shell
export OPENROUTER_API_KEY="sk-or-v1-xxxxxxxxxxxx"
```

Add this to your shell profile (`~/.bashrc`, `~/.zshrc`, etc.) to persist it across sessions.
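Before going further, it can help to sanity-check the key. The sketch below checks for the `sk-or-` prefix mentioned above; `check_openrouter_key` is a hypothetical helper name, not part of Crucible or OpenRouter:

```shell
# Sketch: verify the key is non-empty and carries the expected sk-or- prefix.
check_openrouter_key() {
  key="${1:-}"
  if [ -z "$key" ]; then
    echo "missing"              # variable unset or empty
  elif [ "${key#sk-or-}" = "$key" ]; then
    echo "bad-prefix"           # stripping sk-or- changed nothing, so it's absent
  else
    echo "ok"
  fi
}

check_openrouter_key "$OPENROUTER_API_KEY"
```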
## Step 3: Configure the Provider

Add the following to your `crucible.toml` or `llm_providers.toml`:

```toml
[llm]
default = "openrouter"

[llm.providers.openrouter]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "openai/gpt-4o"
temperature = 0.7
max_tokens = 4096
```

The `{env:OPENROUTER_API_KEY}` syntax reads the key from your environment variable at runtime, keeping secrets out of config files.
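To make the `{env:...}` behavior concrete, here is a rough shell illustration of the kind of substitution a config loader performs; the `sed` one-liner is illustrative only, and Crucible's actual expansion logic may differ:

```shell
# Illustrative only: expand a {env:OPENROUTER_API_KEY} placeholder the way a
# config loader might, substituting the value from the environment.
OPENROUTER_API_KEY="sk-or-v1-demo"    # demo value for this illustration
line='api_key = "{env:OPENROUTER_API_KEY}"'
expanded=$(printf '%s' "$line" | sed "s|{env:OPENROUTER_API_KEY}|$OPENROUTER_API_KEY|")
echo "$expanded"                      # api_key = "sk-or-v1-demo"
```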
## Configuration

### Provider Type Aliases

Crucible accepts several aliases for the provider type:

```toml
type = "openrouter"  # preferred
type = "open_router" # also works
type = "open-router" # also works
```

### Model Format

OpenRouter uses the `provider/model` format. Some popular choices:
| Model | ID | Notes |
|---|---|---|
| GPT-4o | `openai/gpt-4o` | Fast, capable, good default |
| GPT-4o Mini | `openai/gpt-4o-mini` | Cheaper, still capable |
| Claude 3.5 Sonnet | `anthropic/claude-3.5-sonnet` | Strong reasoning |
| Claude 3 Haiku | `anthropic/claude-3-haiku` | Fast and cheap |
| Llama 3.1 405B | `meta-llama/llama-3.1-405b-instruct` | Open weights, very large |
| Gemini Pro 1.5 | `google/gemini-pro-1.5` | Google's flagship |
Browse the full model list at openrouter.ai/models.
## Multiple Configurations

You can define multiple OpenRouter instances with different default models:

```toml
[llm.providers.or-fast]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "openai/gpt-4o-mini"
temperature = 0.3

[llm.providers.or-smart]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "anthropic/claude-3.5-sonnet"
temperature = 0.7
max_tokens = 8192
```

## Start a Chat
Section titled “Start a Chat”# Use default OpenRouter providercru chat
# Specify provider explicitlycru chat --provider openrouterSwitch Models at Runtime
Inside a chat session, use the `:model` command to switch models without restarting:

```
:model openai/gpt-4o-mini
```

## Use with Named Providers
If you defined named providers (such as `or-fast` and `or-smart` above):
```shell
cru chat --provider or-fast
cru chat --provider or-smart
```

## Troubleshooting
### "Missing API key"

Ensure `OPENROUTER_API_KEY` is set in your environment:

```shell
echo $OPENROUTER_API_KEY
```

If it prints nothing, set the variable and restart your terminal.
### "Model not found"

Check the model ID format. OpenRouter requires the `provider/model` format:

- ✅ `openai/gpt-4o`
- ❌ `gpt-4o`

Browse available models at openrouter.ai/models.
### Rate limiting

OpenRouter applies per-model rate limits based on the upstream provider. If you hit a limit, try a different model or wait briefly before retrying.
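If you want to retry automatically, a simple exponential backoff loop works. This is generic shell, not a built-in Crucible feature, and `run_request` is a hypothetical stand-in for your real command:

```shell
# Sketch: exponential backoff schedule. run_request always fails here so the
# full schedule is visible; in practice, replace it with your actual call.
run_request() { return 1; }

attempt=0
max_attempts=4
delay=1
schedule=""
while [ "$attempt" -lt "$max_attempts" ]; do
  if run_request; then
    break                       # success: stop retrying
  fi
  attempt=$((attempt + 1))
  schedule="$schedule $delay"   # in real use: sleep "$delay"
  delay=$((delay * 2))
done
echo "backoff waits (s):$schedule"
```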
## Billing

OpenRouter charges per token based on the upstream model's pricing. Check your usage at openrouter.ai/activity.