OpenRouter
Configure free-claude-code to use OpenRouter. Access hundreds of models including free tier options.
OpenRouter provides access to hundreds of models from multiple providers through a single API. Many models have a free tier; others are pay-per-use.
Get an API key
- Visit openrouter.ai/keys
- Create an account or sign in
- Generate a new API key
- Copy the key (it starts with `sk-or-`)
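A quick sanity check on the copied key can catch paste errors early. This sketch only validates the `sk-or-` prefix from the steps above; anything beyond the prefix is an assumption about the key format:

```python
# Sanity-check a freshly copied OpenRouter key.
# Only the "sk-or-" prefix is documented here; the rest of the
# format is an assumption, so this is a paste check, not validation.
def looks_like_openrouter_key(key: str) -> bool:
    key = key.strip()
    return key.startswith("sk-or-") and len(key) > len("sk-or-")

print(looks_like_openrouter_key("sk-or-abc123"))   # True
print(looks_like_openrouter_key("sk-ant-abc123"))  # False
```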
Configure the proxy
Edit your `.env` file:

```
OPENROUTER_API_KEY="sk-or-your-key-here"
MODEL_OPUS="open_router/deepseek/deepseek-r1-0528:free"
MODEL_SONNET="open_router/openai/gpt-oss-120b:free"
MODEL_HAIKU="open_router/stepfun/step-3.5-flash:free"
MODEL="open_router/stepfun/step-3.5-flash:free"
```
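Before starting the proxy, it can help to confirm the `.env` entries parsed the way you expect. A minimal sketch (real loaders such as python-dotenv handle comments, quoting, and exports more robustly):

```python
# Minimal .env reader: enough for the KEY="value" lines in this guide.
def parse_env(text: str) -> dict[str, str]:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

sample = '''
OPENROUTER_API_KEY="sk-or-your-key-here"
MODEL="open_router/stepfun/step-3.5-flash:free"
'''

env = parse_env(sample)
# Variables this guide relies on; fail fast if either is missing.
missing = [k for k in ("OPENROUTER_API_KEY", "MODEL") if k not in env]
print(missing)  # []
```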
Model format
OpenRouter models use the format:

```
open_router/organization/model-name
```

Add the `:free` suffix for free tier models:

```
open_router/deepseek/deepseek-r1-0528:free
```
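The identifier format above can be split mechanically. This helper is an illustration of the format, not part of free-claude-code:

```python
# Split a proxy model identifier into its parts.
# Format from this guide: open_router/organization/model-name[:free]
def parse_model_id(model_id: str) -> dict:
    prefix, org, rest = model_id.split("/", 2)
    if prefix != "open_router":
        raise ValueError(f"not an OpenRouter identifier: {model_id}")
    name, _, variant = rest.partition(":")
    return {"org": org, "model": name, "free": variant == "free"}

print(parse_model_id("open_router/deepseek/deepseek-r1-0528:free"))
# {'org': 'deepseek', 'model': 'deepseek-r1-0528', 'free': True}
```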
Free tier models
Popular free options as of the latest release:
| Model | Identifier | Strengths |
|---|---|---|
| DeepSeek R1 | open_router/deepseek/deepseek-r1-0528:free | Reasoning, math, coding |
| GPT-OSS 120B | open_router/openai/gpt-oss-120b:free | General knowledge |
| Step-3.5 Flash | open_router/stepfun/step-3.5-flash:free | Fast responses |
| Trinity Large | open_router/arcee-ai/trinity-large-preview:free | Coding tasks |
Browse all free models at openrouter.ai/collections/free-models.
Per-model routing
Mix free and paid models for different Claude tiers:
```
# Use free models for lighter tasks
MODEL_HAIKU="open_router/stepfun/step-3.5-flash:free"

# Use paid models for heavy lifting
MODEL_OPUS="open_router/anthropic/claude-3-opus"
MODEL_SONNET="open_router/anthropic/claude-3.5-sonnet"

# Fallback to free
MODEL="open_router/stepfun/step-3.5-flash:free"
```
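The routing above implies a simple lookup: each Claude tier resolves to its `MODEL_<TIER>` entry, falling back to `MODEL`. The variable names match this guide; the lookup logic itself is an assumption about how the proxy behaves, sketched here:

```python
# Resolve a Claude tier to the configured model, falling back to MODEL.
# Variable names come from this guide; the lookup itself is an
# assumption about the proxy's routing, not its actual code.
def resolve_model(tier: str, env: dict[str, str]) -> str:
    return env.get(f"MODEL_{tier.upper()}") or env["MODEL"]

env = {
    "MODEL_HAIKU": "open_router/stepfun/step-3.5-flash:free",
    "MODEL_OPUS": "open_router/anthropic/claude-3-opus",
    "MODEL": "open_router/stepfun/step-3.5-flash:free",
}

print(resolve_model("opus", env))    # the paid Opus route
print(resolve_model("sonnet", env))  # no MODEL_SONNET set: falls back to MODEL
```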
Rate limits and quotas
Free tier rate limits vary by model:
- Some models allow 10 requests per minute
- Others allow 20 requests per minute
- Limits reset on different schedules
Check OpenRouter’s dashboard for current quota usage.
Configure proxy-side limits:
```
PROVIDER_RATE_LIMIT=15
PROVIDER_RATE_WINDOW=60
PROVIDER_MAX_CONCURRENCY=2
```
Proxy configuration
If you need to route OpenRouter requests through a proxy:
```
OPENROUTER_PROXY="http://username:password@host:port"
```
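For reference, this is how an outbound HTTP proxy is typically wired up in a Python client; the address below is hypothetical, and free-claude-code may use a different HTTP stack internally:

```python
import urllib.request

# Hypothetical proxy address, for illustration only.
proxy = "http://user:pass@proxy.example.com:8080"

# Route both HTTP and HTTPS traffic through the proxy.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": proxy, "https": proxy})
)
# opener.open("https://openrouter.ai/api/v1/models")  # would go via the proxy
```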
Troubleshooting
“Insufficient credits” errors: You are using a paid model without adding funds. Switch to a :free suffix model or add credits.
“Model not available” errors: The specific model may be temporarily offline. Check openrouter.ai/models for status.
Slower than NVIDIA NIM: Free tier models on OpenRouter may have higher latency. For fast responses, consider NVIDIA NIM’s free tier or OpenRouter’s paid tier.
When to use OpenRouter
Use OpenRouter when:
- You need model variety for different tasks
- You want to experiment with cutting-edge models
- NVIDIA NIM doesn’t have the specific model you need
- You are willing to pay for premium models occasionally
Consider alternatives when:
- You need guaranteed low latency (NVIDIA NIM free tier is often faster)
- You want completely offline operation (use LM Studio or llama.cpp)
- You want predictable costs (NVIDIA NIM free tier has hard limits)