ARouter’s provider proxy forwards requests directly to any supported provider’s native API, using your ARouter API key for authentication. You don’t need individual API keys for each provider.
## Endpoint

```
POST /{provider}/{path}
GET  /{provider}/{path}
```

`{provider}` is the provider slug (see Supported Providers below) and `{path}` is the provider’s native API path.
## When to Use the Provider Proxy

| Use case | Recommended approach |
|---|---|
| Standard LLM chat completions | `/v1/chat/completions` with `provider/model` format |
| Provider-specific features (e.g., Anthropic Batches, Gemini cached content) | `/{provider}/{path}` proxy |
| Native streaming format | `/{provider}/{path}` proxy |
| Provider-specific parameters not in the ARouter schema | `/{provider}/{path}` proxy |
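The two call styles differ only in how the URL is built. A minimal sketch of the distinction, assuming `https://api.arouter.ai` as the base URL (the helper names here are illustrative, not part of any ARouter SDK):

```python
# Illustrative sketch of the two request styles against ARouter.
ROUTER_BASE = "https://api.arouter.ai"

def unified_url() -> str:
    """Standard chat completions: one endpoint; the model field picks the provider."""
    return f"{ROUTER_BASE}/v1/chat/completions"

def proxy_url(provider: str, path: str) -> str:
    """Provider proxy: the provider slug plus the provider's native API path."""
    return f"{ROUTER_BASE}/{provider}/{path.lstrip('/')}"

print(unified_url())                           # https://api.arouter.ai/v1/chat/completions
print(proxy_url("anthropic", "/v1/messages"))  # https://api.arouter.ai/anthropic/v1/messages
```

With the unified endpoint the provider is selected by the `provider/model` value in the request body; with the proxy it is fixed by the URL itself.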
## Supported Providers

| Provider | Slug | Base URL |
|---|---|---|
| OpenAI | openai | https://api.openai.com |
| Anthropic | anthropic | https://api.anthropic.com |
| Google Gemini | google | https://generativelanguage.googleapis.com |
| DeepSeek | deepseek | https://api.deepseek.com |
| xAI | xai | https://api.x.ai |
| Kimi (Moonshot) | kimi | https://api.moonshot.cn |
| MiniMax | minimax | https://api.minimax.chat |
| Mistral | mistral | https://api.mistral.ai |
| Groq | groq | https://api.groq.com |
| Cohere | cohere | https://api.cohere.com |
| NVIDIA | nvidia | https://integrate.api.nvidia.com |
| Dashscope (Alibaba) | dashscope | https://dashscope.aliyuncs.com |
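To make the mapping concrete, here is a sketch of what the proxy does with an incoming path, mirroring the slugs and base URLs in the table above. This all happens server-side; the function and dict names are illustrative only:

```python
# Illustrative only: shows how a proxied path maps onto a provider's
# native URL. ARouter performs this resolution server-side.
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com",
    "anthropic": "https://api.anthropic.com",
    "google": "https://generativelanguage.googleapis.com",
    "deepseek": "https://api.deepseek.com",
    "xai": "https://api.x.ai",
    "kimi": "https://api.moonshot.cn",
    "minimax": "https://api.minimax.chat",
    "mistral": "https://api.mistral.ai",
    "groq": "https://api.groq.com",
    "cohere": "https://api.cohere.com",
    "nvidia": "https://integrate.api.nvidia.com",
    "dashscope": "https://dashscope.aliyuncs.com",
}

def resolve_upstream(proxy_path: str) -> str:
    """Split '/{provider}/{path}' and rebuild the provider's native URL."""
    provider, _, rest = proxy_path.lstrip("/").partition("/")
    base = PROVIDER_BASE_URLS[provider]  # KeyError -> unsupported provider slug
    return f"{base}/{rest}"

print(resolve_upstream("/google/v1beta/models"))
# https://generativelanguage.googleapis.com/v1beta/models
```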
## Examples

### OpenAI — Chat Completions

```bash
curl https://api.arouter.ai/openai/v1/chat/completions \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
### Anthropic — Messages

```bash
curl https://api.arouter.ai/anthropic/v1/messages \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4.6",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
### Anthropic — Message Batches

```bash
# Create a batch
curl https://api.arouter.ai/anthropic/v1/messages/batches \
  -X POST \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "requests": [
      {
        "custom_id": "request-1",
        "params": {
          "model": "claude-sonnet-4.6",
          "max_tokens": 1024,
          "messages": [{"role": "user", "content": "Hello!"}]
        }
      }
    ]
  }'

# Check batch status
curl https://api.arouter.ai/anthropic/v1/messages/batches/msgbatch_xxx \
  -H "Authorization: Bearer lr_live_xxxx"
```
### Google Gemini — Generate Content

```bash
curl "https://api.arouter.ai/google/v1beta/models/gemini-2.5-flash:generateContent" \
  -X POST \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "contents": [
      {"role": "user", "parts": [{"text": "Hello!"}]}
    ]
  }'
```
### Google Gemini — List Models

```bash
curl "https://api.arouter.ai/google/v1beta/models" \
  -H "Authorization: Bearer lr_live_xxxx"
```
### DeepSeek — Chat Completions

```bash
curl https://api.arouter.ai/deepseek/v1/chat/completions \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-v3.2",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
## Notes

- **Authentication:** ARouter injects the provider’s API key automatically. You only need your ARouter key.
- **Native format:** requests and responses are passed through as-is — ARouter does not transform them.
- **Usage tracking:** token usage is still recorded against your ARouter account.
- **Path passthrough:** everything after `/{provider}/` is forwarded to the provider unchanged.
- **Anthropic:** the `anthropic-version` header is injected automatically; you don’t need to include it.