OpenRouter Proxy

Use any OpenRouter model with Claude Code CLI

OpenRouter Proxy is a local API bridge that enables Claude Code to work with any model available on OpenRouter, including Qwen, Llama, Mistral, GPT-4, and more.


How It Works

Claude Code CLI ──▶ Local Proxy (localhost:8787) ──▶ OpenRouter API
(Anthropic format)      (format conversion)          (OpenAI format)

The proxy translates between Anthropic’s API format (used by Claude Code) and OpenAI’s format (used by OpenRouter), enabling seamless compatibility.
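For example, a chat request sent to the proxy in Anthropic's Messages format is rewritten into an OpenAI-style chat completion before it reaches OpenRouter. A minimal sketch of such a request, assuming the proxy mirrors Anthropic's /v1/messages endpoint on its default port:

# Anthropic-format request against the local proxy (the endpoint path is an
# assumption based on Anthropic's Messages API; adjust if the proxy differs)
curl http://localhost:8787/v1/messages \
  -H "content-type: application/json" \
  -d '{
    "model": "qwen/qwen-2.5-coder-32b-instruct",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'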


Features

| Feature | Description |
| --- | --- |
| Multi-Model Support | Use any OpenRouter model with Claude Code |
| Full Tool Support | Agent capabilities and MCP tools work seamlessly |
| Streaming | Real-time streaming responses |
| Auto-Retry | Automatic retry with exponential backoff |
| Model Fallback | Switch to a backup model on rate limits |
| Usage Dashboard | Real-time statistics at /dashboard |

Quick Start

1. Clone the repository

git clone https://github.com/prabhuvikas/openrouter-proxy.git
cd openrouter-proxy

2. Configure your API key

cp .env.example .env
# Edit .env and add your OpenRouter API key
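
The variable name below is a guess at what .env.example defines; check the file itself for the authoritative name:

# Sketch of .env — OPENROUTER_API_KEY is an assumed variable name
OPENROUTER_API_KEY=sk-or-...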

3. Start the proxy

docker compose up -d
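
To confirm the container is running before launching Claude Code:

# Check container status and follow the proxy logs
docker compose ps
docker compose logs -f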

4. Launch Claude Code

Windows:

.\launch-claude-openrouter.ps1

Mac/Linux:

./launch-claude-openrouter.sh
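
Both scripts point Claude Code at the local proxy rather than Anthropic's API. A rough manual equivalent for Mac/Linux, assuming the script only overrides the base URL (it may also set other variables):

# Point Claude Code at the proxy, then start it
export ANTHROPIC_BASE_URL=http://localhost:8787
claude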


Popular Models

| Model | Best For | Cost |
| --- | --- | --- |
| qwen/qwen-2.5-coder-32b-instruct | Coding tasks | Free |
| anthropic/claude-3.5-sonnet | General purpose | Paid |
| meta-llama/llama-3.1-70b-instruct | Open source | Free |
| google/gemini-pro | Versatile | Free tier |
| mistral/mistral-large | European AI | Paid |

View all models →
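
To make Claude Code use one of these models, the proxy's default model is presumably chosen in .env; the variable name below is hypothetical, so confirm it against .env.example:

# Hypothetical setting — check .env.example for the real variable name
DEFAULT_MODEL=qwen/qwen-2.5-coder-32b-instruct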


Usage Dashboard

Monitor your proxy usage in real time at /dashboard: track requests, tokens, errors, and per-model statistics.
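
With the proxy at its default address, open the dashboard in a browser:

# Dashboard URL (host and port from the default proxy address above)
open http://localhost:8787/dashboard       # macOS
xdg-open http://localhost:8787/dashboard   # Linux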


Requirements

- Docker and Docker Compose
- An OpenRouter API key
- Claude Code CLI

Built with Claude Code