ARouter gives you a single API key and a single endpoint for the major LLM providers. If you already use an OpenAI-compatible client, migrating usually means changing only the base_url, the api_key, and optionally the app-attribution headers.
1. Get an API Key
Sign up in the ARouter console and create an API key.
Your key looks like lr_live_xxxxxxxxxxxx.
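Rather than hardcoding the key in source, it is common to load it from the environment. A minimal sketch — the variable name AROUTER_API_KEY is our own convention, not something the platform mandates:

```python
import os

def load_api_key(var: str = "AROUTER_API_KEY") -> str:
    """Read the ARouter key from the environment instead of hardcoding it."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; export your lr_live_... key first")
    return key
```

Pass the result wherever the examples below use the literal "lr_live_xxxx".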
2. Install the ARouter SDK
The official @arouter/sdk works in any Node.js or TypeScript project and supports npm, yarn, and pnpm.
```typescript
import { ARouter } from "@arouter/sdk";

const client = new ARouter({
  apiKey: "lr_live_xxxx",
  baseURL: "https://api.arouter.ai",
});

const response = await client.chatCompletion({
  model: "openai/gpt-5.4",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```
See the Node.js / TypeScript SDK guide for streaming, key management, and x402 payment examples.
3. Use an Existing SDK
Already using the OpenAI, Anthropic, or Go SDK? Just change the base_url and api_key.
Python (OpenAI)

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.arouter.ai/v1",
    api_key="lr_live_xxxx",
    default_headers={
        "HTTP-Referer": "https://myapp.com",  # optional
        "X-Title": "My AI App",  # optional
    },
)

response = client.chat.completions.create(
    model="openai/gpt-5.4",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
Node.js (OpenAI)

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.arouter.ai/v1",
  apiKey: "lr_live_xxxx",
  defaultHeaders: {
    "HTTP-Referer": "https://myapp.com", // optional
    "X-Title": "My AI App", // optional
  },
});

const response = await client.chat.completions.create({
  model: "openai/gpt-5.4",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.choices[0].message.content);
```
Python (Anthropic)

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.arouter.ai",
    api_key="lr_live_xxxx",
)

message = client.messages.create(
    model="claude-sonnet-4.6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)
print(message.content[0].text)
```
Go

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/arouter-ai/arouter-go"
)

func main() {
	client := arouter.NewClient("lr_live_xxxx",
		arouter.WithBaseURL("https://api.arouter.ai/v1"),
		arouter.WithHeader("HTTP-Referer", "https://myapp.com"), // optional
		arouter.WithHeader("X-Title", "My AI App"),              // optional
	)

	resp, err := client.CreateChatCompletion(context.Background(), arouter.ChatCompletionRequest{
		Model: "openai/gpt-5.4",
		Messages: []arouter.Message{
			{Role: "user", Content: "Hello!"},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```
cURL

```shell
curl https://api.arouter.ai/v1/chat/completions \
  -H "Authorization: Bearer lr_live_xxxx" \
  -H "HTTP-Referer: https://myapp.com" \
  -H "X-Title: My AI App" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-5.4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
fetch

```typescript
const response = await fetch('https://api.arouter.ai/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer lr_live_xxxx',
    'Content-Type': 'application/json',
    'HTTP-Referer': 'https://myapp.com', // optional: referrer attribution
    'X-Title': 'My AI App', // optional: display name
  },
  body: JSON.stringify({
    model: 'openai/gpt-5.4',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```
HTTP-Referer and X-Title are optional. Include them if you want ARouter console analytics to attribute requests to a specific app or workflow.
4. Call the API Directly
If you'd rather not install an SDK, any HTTP client can call ARouter:
```python
import requests

response = requests.post(
    "https://api.arouter.ai/v1/chat/completions",
    headers={
        "Authorization": "Bearer lr_live_xxxx",
        "HTTP-Referer": "https://myapp.com",  # optional
        "X-Title": "My AI App",  # optional
    },
    json={  # requests serializes this and sets Content-Type: application/json
        "model": "openai/gpt-5.4",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(response.json()["choices"][0]["message"]["content"])
```
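The response body follows the OpenAI chat-completion shape, so extracting the assistant text is the same regardless of which provider served the request. A minimal parser sketch against a sample payload:

```python
def extract_reply(payload: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style chat completion body."""
    return payload["choices"][0]["message"]["content"]

# Sample payload in the documented response shape.
sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}
print(extract_reply(sample))  # Hello! How can I help?
```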
5. Switch Between Providers
With ARouter, switching providers is just a change to the model string:
```python
# OpenAI
response = client.chat.completions.create(model="openai/gpt-5.4", ...)

# Anthropic (through the OpenAI SDK!)
response = client.chat.completions.create(model="anthropic/claude-sonnet-4.6", ...)

# Google Gemini
response = client.chat.completions.create(model="google/gemini-2.5-flash", ...)

# DeepSeek
response = client.chat.completions.create(model="deepseek/deepseek-v3.2", ...)
```
If you omit the provider prefix (e.g. just "gpt-5.4"), ARouter defaults to OpenAI.
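If you prefer fully-qualified model names in your own logs and configs, that default is easy to mirror client-side. A sketch of the documented rule — the helper name is ours:

```python
def qualify_model(model: str, default_provider: str = "openai") -> str:
    """Apply ARouter's documented default: bare model names resolve to OpenAI."""
    return model if "/" in model else f"{default_provider}/{model}"

print(qualify_model("gpt-5.4"))                      # openai/gpt-5.4
print(qualify_model("anthropic/claude-sonnet-4.6"))  # anthropic/claude-sonnet-4.6
```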
Next Steps
Model Routing
Understand the provider/model format and multi-model routing
Structured Outputs
Force the model to return schema-valid JSON