Why Partner with ARouter
- Instant developer reach — Developers using ARouter can access your models immediately without signing up for a separate account or SDK
- Zero-friction switching — Developers can try your model by changing a single string (model: "yourprovider/your-model")
- Traffic growth — Your models appear in the ARouter model catalog and are accessible via the unified /v1/models endpoint
- Transparent pricing — You set your prices; ARouter reflects them accurately to end users
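The "zero-friction switching" point can be sketched in Python. The request shape follows the OpenAI Chat Completions format; both model IDs below are placeholders, not real catalog entries:

```python
# Sketch of zero-friction switching: the only change between providers
# is the model string. Request shape follows the OpenAI Chat Completions
# format; both model IDs are placeholders.
def chat_request(model: str, prompt: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

req_a = chat_request("yourprovider/your-model", "Hello!")
req_b = chat_request("otherprovider/other-model", "Hello!")  # one-string switch

# Everything except the model string is identical.
assert {k: v for k, v in req_a.items() if k != "model"} == \
       {k: v for k, v in req_b.items() if k != "model"}
```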
Integration Requirements
To integrate with ARouter, your API must provide:
1. OpenAI-Compatible Endpoints
ARouter routes requests using the OpenAI API format, so your endpoint should accept standard OpenAI-style request bodies.
2. Model IDs
Each model you expose should have a unique, stable model ID. ARouter uses the provider/model convention (for example, yourprovider/your-model).
3. Usage Reporting
Your response must include the usage object with accurate token counts.
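Taken together, the three requirements amount to: accept an OpenAI-format request that names a provider/model ID, and return a response whose usage object adds up. A hypothetical sketch (IDs, content, and token counts are made up):

```python
# Hypothetical request/response pair in the OpenAI Chat Completions shape.
request = {
    "model": "yourprovider/your-model",  # stable provider/model ID
    "messages": [{"role": "user", "content": "Hello!"}],
}

response = {
    "id": "chatcmpl-abc123",             # made-up ID
    "model": "yourprovider/your-model",
    "choices": [{
        "index": 0,
        "message": {"role": "assistant", "content": "Hi there!"},
        "finish_reason": "stop",
    }],
    "usage": {                           # required, with accurate counts
        "prompt_tokens": 9,
        "completion_tokens": 4,
        "total_tokens": 13,
    },
}

def usage_is_consistent(resp: dict) -> bool:
    """True when the usage object is present and the totals add up."""
    u = resp.get("usage", {})
    keys = {"prompt_tokens", "completion_tokens", "total_tokens"}
    return keys <= u.keys() and \
        u["total_tokens"] == u["prompt_tokens"] + u["completion_tokens"]
```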
What ARouter Handles
Once integrated, ARouter handles the following transparently for your users:
| Responsibility | ARouter | Your API |
|---|---|---|
| API key management | Routes requests with injected credentials | Validates credentials |
| Provider health monitoring | Circuit breaker + key pool health checks | Responds to requests |
| Rate limiting | Tracks per-key and per-user limits | Enforces your rate limits |
| Usage recording | Records tokens and cost | Returns usage in response |
| Streaming | Passes through SSE stream | Produces SSE stream |
| Error normalization | Standardizes error codes | Returns native errors |
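The streaming row above assumes your API produces a standard SSE stream that ARouter passes through unchanged. A minimal sketch of OpenAI-style streaming chunks (the content values are made up):

```python
import json

# Sketch of the SSE chunks an OpenAI-compatible provider emits and ARouter
# passes through unchanged. Chunk shape follows the OpenAI streaming format;
# content values are made up.
def sse_events(deltas):
    """Format a sequence of streaming text deltas as SSE lines."""
    for text in deltas:
        chunk = {
            "object": "chat.completion.chunk",
            "choices": [{"index": 0, "delta": {"content": text}}],
        }
        yield f"data: {json.dumps(chunk)}\n\n"
    yield "data: [DONE]\n\n"  # OpenAI-style stream terminator

events = list(sse_events(["Hel", "lo"]))
```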
Model Metadata
ARouter exposes rich model metadata to developers via the /v1/models endpoint. For each model you provide, you can supply metadata such as pricing, context length, and supported parameters.
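As an illustration only, a single model's metadata entry might look like the following. The field names and pricing units are assumptions, not ARouter's actual schema; only pricing, context length, and supported parameters are called out in this document:

```python
# Hypothetical metadata entry for one model. Field names and units are
# illustrative assumptions, not ARouter's actual schema.
model_metadata = {
    "id": "yourprovider/your-model",  # provider/model convention
    "context_length": 128_000,
    "pricing": {                      # you set these; ARouter reflects them
        "prompt": "0.50",             # assumed unit: USD per million input tokens
        "completion": "1.50",         # assumed unit: USD per million output tokens
    },
    "supported_parameters": ["temperature", "top_p", "max_tokens", "stream"],
}
```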
Supported Endpoint Types
| Endpoint Type | Description |
|---|---|
| OpenAI Chat Completions | Standard chat with streaming |
| OpenAI Embeddings | Text embeddings via /v1/embeddings |
| Anthropic Messages | Native Anthropic message format |
| Gemini GenerateContent | Native Gemini format |
| Custom endpoints | Via the Provider Proxy (/{yourprovider}/{path}) |
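To make the table concrete, here is a sketch of how each endpoint type might map to a request path. Only /v1/embeddings and the /{yourprovider}/{path} proxy pattern appear in this document; the other paths are the well-known defaults of each format, listed for illustration:

```python
# Illustrative mapping from endpoint type to request path. Only the
# embeddings and proxy paths come from this document; the rest are each
# format's conventional default paths.
ENDPOINT_PATHS = {
    "openai_chat": "/v1/chat/completions",
    "openai_embeddings": "/v1/embeddings",
    "anthropic_messages": "/v1/messages",
    "gemini_generate": "/v1beta/models/{model}:generateContent",
}

def path_for(endpoint_type: str, provider: str = "", custom_path: str = "") -> str:
    """Return the request path for an endpoint type.

    Custom endpoints go through the Provider Proxy: /{yourprovider}/{path}.
    """
    if endpoint_type == "custom":
        return f"/{provider}/{custom_path}"
    return ENDPOINT_PATHS[endpoint_type]
```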
Provider Proxy
Developers can also send requests directly to your API through ARouter’s provider proxy, bypassing the model-routing layer entirely, using the /{yourprovider}/{path} path format.
Getting Started
To integrate your models with ARouter:
- Contact us at providers@arouter.ai with your API documentation and model catalog
- Provide credentials — API keys or OAuth credentials for ARouter to use
- Review model metadata — Confirm model IDs, pricing, context lengths, and supported parameters
- Test integration — ARouter team validates routing and response format compatibility
- Go live — Your models appear in the ARouter catalog and are accessible immediately
Data Handling
ARouter respects your data policies:
- Request/response data is not stored by default — see Data Collection
- You can specify whether your endpoints support Zero Data Retention (ZDR)
- Developers can opt into or out of data collection per request