OpenRouter Setup

This guide walks you through setting up OpenRouter as an LLM provider in Crucible.

OpenRouter is a meta-provider that gives you access to 100+ language models through a single API key. Instead of managing separate accounts with OpenAI, Anthropic, Google, Meta, and others, you configure one provider and choose any model at runtime.

Models are referenced in provider/model format — for example, openai/gpt-4o, anthropic/claude-3.5-sonnet, or meta-llama/llama-3.1-405b.

Prerequisites:

  • Crucible CLI installed
  • An OpenRouter account with an API key

To create an API key:

  1. Visit openrouter.ai and create an account
  2. Go to Keys in your dashboard
  3. Click Create Key
  4. Copy the key (it starts with sk-or-)
Then export the key in your shell:
export OPENROUTER_API_KEY="sk-or-v1-xxxxxxxxxxxx"

Add this to your shell profile (~/.bashrc, ~/.zshrc, etc.) to persist across sessions.
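Appending the export line to your profile can be scripted; a minimal sketch that avoids duplicate entries (it assumes a POSIX shell and ~/.bashrc as the profile file; use ~/.zshrc on zsh, and substitute your real key):

```shell
# Append the export to the shell profile only if it is not already there
PROFILE="$HOME/.bashrc"
LINE='export OPENROUTER_API_KEY="sk-or-v1-xxxxxxxxxxxx"'
grep -qxF "$LINE" "$PROFILE" 2>/dev/null || echo "$LINE" >> "$PROFILE"
```

Running this more than once is safe: grep -qxF matches the exact line, so the export is only appended the first time.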

Add to your crucible.toml or llm_providers.toml:

[llm]
default = "openrouter"
[llm.providers.openrouter]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "openai/gpt-4o"
temperature = 0.7
max_tokens = 4096

The {env:OPENROUTER_API_KEY} syntax reads the key from your environment variable at runtime, keeping secrets out of config files.
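To see what this substitution amounts to, here is an illustration in plain shell of expanding a {env:NAME} placeholder (this mimics the behavior for demonstration only; it is not Crucible's actual implementation, and the demo key value is made up):

```shell
# Illustration only: expand a single {env:NAME} placeholder in a config line
expand_env_placeholder() {
  line=$1
  name=${line#*\{env:}          # strip everything up to and including "{env:"
  name=${name%%\}*}             # strip everything from the first "}" onward
  eval "value=\${$name:-}"      # look up the named environment variable
  printf '%s\n' "${line%%\{env:*}${value}${line#*\}}"
}

export OPENROUTER_API_KEY="sk-or-demo"   # demo value, not a real key
expand_env_placeholder 'api_key = "{env:OPENROUTER_API_KEY}"'
# prints: api_key = "sk-or-demo"
```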

Crucible accepts several aliases for the provider type:

type = "openrouter" # preferred
type = "open_router" # also works
type = "open-router" # also works

OpenRouter uses provider/model format. Some popular choices:

Model             | ID                                 | Notes
GPT-4o            | openai/gpt-4o                      | Fast, capable, good default
GPT-4o Mini       | openai/gpt-4o-mini                 | Cheaper, still capable
Claude 3.5 Sonnet | anthropic/claude-3.5-sonnet        | Strong reasoning
Claude 3 Haiku    | anthropic/claude-3-haiku           | Fast and cheap
Llama 3.1 405B    | meta-llama/llama-3.1-405b-instruct | Open-source, large
Gemini Pro        | google/gemini-pro-1.5              | Google’s flagship

Browse the full model list at openrouter.ai/models.

You can define multiple OpenRouter instances with different default models:

[llm.providers.or-fast]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "openai/gpt-4o-mini"
temperature = 0.3
[llm.providers.or-smart]
type = "openrouter"
api_key = "{env:OPENROUTER_API_KEY}"
default_model = "anthropic/claude-3.5-sonnet"
temperature = 0.7
max_tokens = 8192
Start a chat from the terminal:
# Use default OpenRouter provider
cru chat
# Specify provider explicitly
cru chat --provider openrouter

Inside a chat session, use the :model command to switch models without restarting:

:model openai/gpt-4o-mini

If you defined named providers (like or-fast and or-smart above):

cru chat --provider or-fast
cru chat --provider or-smart

If you get an authentication error, ensure OPENROUTER_API_KEY is set in your environment:

echo $OPENROUTER_API_KEY

If it prints nothing, set the variable and reload your shell profile (or open a new terminal).
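This check can be scripted; the helper below is hypothetical (not part of Crucible) and verifies that the variable is set and carries the expected sk-or- prefix:

```shell
# Hypothetical helper (not part of Crucible): sanity-check the API key
check_openrouter_key() {
  if [ -z "${OPENROUTER_API_KEY:-}" ]; then
    echo "OPENROUTER_API_KEY is not set"
    return 1
  fi
  case "$OPENROUTER_API_KEY" in
    sk-or-*) echo "Key is set and starts with sk-or-" ;;
    *)       echo "Key is set but does not start with sk-or-"
             return 1 ;;
  esac
}

check_openrouter_key || echo "Set the key before running cru"
```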

If a model is not found, check the model ID format. OpenRouter requires provider/model format:

  • openai/gpt-4o (correct)
  • gpt-4o (incorrect: missing the provider prefix)

Browse available models at openrouter.ai/models.
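The provider/model check is simple to script; a minimal sketch (the helper name is made up for illustration):

```shell
# Hypothetical check: a valid OpenRouter model ID is provider/model,
# i.e. a non-empty provider, a slash, and a non-empty model name
valid_model_id() {
  case "$1" in
    ?*/?*) return 0 ;;
    *)     return 1 ;;
  esac
}

valid_model_id "openai/gpt-4o" && echo "openai/gpt-4o: ok"
valid_model_id "gpt-4o"        || echo "gpt-4o: missing provider prefix"
```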

OpenRouter applies per-model rate limits based on the upstream provider. If you hit limits, try a different model or wait briefly.
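For transient rate-limit failures, one common approach is retrying with exponential backoff; a generic shell sketch (the wrapper is illustrative, not a Crucible feature):

```shell
# Retry a command up to 5 times, doubling the wait between attempts
retry_with_backoff() {
  attempt=1
  delay=1
  while [ "$attempt" -le 5 ]; do
    "$@" && return 0
    echo "attempt $attempt failed; retrying in ${delay}s" >&2
    sleep "$delay"
    delay=$((delay * 2))
    attempt=$((attempt + 1))
  done
  echo "giving up after 5 attempts" >&2
  return 1
}

# Example: wrap an invocation (replace 'true' with your real command)
retry_with_backoff true && echo "succeeded"
```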

OpenRouter charges per-token based on the upstream model’s pricing. Check your usage at openrouter.ai/activity.