Calling the API

Base URL

Base URL for all API requests:

```
http://ok-link.com:10320/v1/
```

Self-hosted deployment: if you run your own instance, replace ok-link.com:10320 with your server's address and port (default 127.0.0.1:8180).

Authentication

Every API request must carry an API Key in the HTTP headers:

```
Authorization: Bearer sk-xxxxxxxxxxxx
```

API Keys are created on the "API Key Management" page. The platform accepts both the OpenAI and the Anthropic protocol formats, using the same Key for both.

Calling via the OpenAI Protocol

Chat Completions

Compatible with the OpenAI chat/completions endpoint.

```bash
POST /v1/chat/completions
Authorization: Bearer sk-xxxxxxxxxxxx
Content-Type: application/json
```

Non-streaming request example:

```json
{
  "model": "deepseek-chat",
  "messages": [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ],
  "temperature": 0.7,
  "max_tokens": 1024
}
```

Streaming request:

```json
{
  "model": "deepseek-chat",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}
```

The streaming response format matches OpenAI's, using Server-Sent Events (SSE).
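In the OpenAI SSE format, each event arrives as a `data:` line carrying a JSON chunk, and the stream ends with a literal `[DONE]` marker. A minimal stdlib-only parsing sketch (the sample payload below is illustrative):

```python
import json


def parse_sse_line(line: str):
    """Extract the JSON payload from one SSE 'data:' line.

    Returns None for non-data lines (blank lines, comments) and for
    the final [DONE] marker that terminates the stream.
    """
    if not line.startswith("data:"):
        return None
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None
    return json.loads(payload)


# Each parsed chunk carries incremental text in choices[0].delta.content.
chunk = parse_sse_line('data: {"choices":[{"delta":{"content":"Hi"}}]}')
print(chunk["choices"][0]["delta"]["content"])  # → Hi
```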

Embeddings

```bash
POST /v1/embeddings
Authorization: Bearer sk-xxxxxxxxxxxx
Content-Type: application/json
```

```json
{
  "model": "text-embedding-3-small",
  "input": "The text to embed"
}
```
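A sketch of calling this endpoint through the `openai` client (the same library used in the code examples below), plus a cosine-similarity helper for comparing the returned vectors. The base URL and key are placeholders, and the network call is wrapped in a function so the helper stands on its own:

```python
import math


def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def embed(texts):
    """Call POST /v1/embeddings; returns one vector per input text.

    The input field accepts either a single string or a list of strings.
    """
    from openai import OpenAI

    client = OpenAI(
        base_url="http://ok-link.com:10320/v1",
        api_key="sk-xxxxxxxxxxxx",
    )
    resp = client.embeddings.create(
        model="text-embedding-3-small",
        input=texts,
    )
    return [d.embedding for d in resp.data]


# Example usage (requires a reachable deployment):
# vecs = embed(["apple", "orange"])
# print(cosine(vecs[0], vecs[1]))
```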

Model List

Retrieve the models the current account has subscribed to:

```bash
GET /v1/models
Authorization: Bearer sk-xxxxxxxxxxxx
```
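A stdlib-only sketch of the same call from Python. The URL and key are placeholders, and an OpenAI-style `{"data": [...]}` response shape is assumed:

```python
import json
import urllib.request


def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build an authenticated GET request for the model list."""
    return urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )


def fetch_model_ids(base_url: str, api_key: str):
    """GET /v1/models and return the ids of the subscribed models."""
    req = build_models_request(base_url, api_key)
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]


# Example usage (requires a reachable deployment):
# print(fetch_model_ids("http://ok-link.com:10320/v1", "sk-xxxxxxxxxxxx"))
```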

Calling via the Anthropic Protocol

Chat Completions

Compatible with the Anthropic Messages API at the /v1/messages endpoint.

```bash
POST /v1/messages
Authorization: Bearer sk-xxxxxxxxxxxx
Content-Type: application/json
```

Non-streaming request example:

```json
{
  "model": "claude-sonnet",
  "max_tokens": 1024,
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}
```

Note: max_tokens is a required parameter in the Anthropic protocol.

Streaming request:

```json
{
  "model": "claude-sonnet",
  "max_tokens": 1024,
  "messages": [
    {"role": "user", "content": "Hello!"}
  ],
  "stream": true
}
```

The streaming response format matches Anthropic's official API: a multi-event SSE stream (message_start / content_block_delta / message_delta / message_stop).
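When consuming the raw SSE stream directly (rather than through the `anthropic` client shown below), incremental text lives in `content_block_delta` events whose delta type is `text_delta`; the other event types carry metadata. A small dispatcher sketch over already-decoded event dicts:

```python
def extract_text(event: dict) -> str:
    """Return the incremental text carried by an Anthropic-style SSE event.

    Only content_block_delta events with a text_delta payload contain text;
    message_start / message_delta / message_stop events yield "".
    """
    if event.get("type") == "content_block_delta":
        delta = event.get("delta", {})
        if delta.get("type") == "text_delta":
            return delta.get("text", "")
    return ""


# Illustrative event sequence for one short response:
events = [
    {"type": "message_start"},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hel"}},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "lo"}},
    {"type": "message_stop"},
]
print("".join(extract_text(e) for e in events))  # → Hello
```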

Using a System Prompt

```json
{
  "model": "claude-sonnet",
  "max_tokens": 1024,
  "system": "You are a helpful assistant.",
  "messages": [
    {"role": "user", "content": "Explain AI in simple terms."}
  ]
}
```

Code Examples

cURL

OpenAI protocol (non-streaming):

```bash
curl http://ok-link.com:10320/v1/chat/completions \
  -H "Authorization: Bearer sk-xxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

OpenAI protocol (streaming):

```bash
curl http://ok-link.com:10320/v1/chat/completions \
  -H "Authorization: Bearer sk-xxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```

Anthropic protocol (non-streaming):

```bash
curl http://ok-link.com:10320/v1/messages \
  -H "Authorization: Bearer sk-xxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Anthropic protocol (streaming):

```bash
curl http://ok-link.com:10320/v1/messages \
  -H "Authorization: Bearer sk-xxxxxxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
```

Python (openai library)

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://ok-link.com:10320/v1",
    api_key="sk-xxxxxxxxxxxx"
)

# OpenAI protocol - non-streaming
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.choices[0].message.content)

# OpenAI protocol - streaming
stream = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True
)
for chunk in stream:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```

Python (anthropic library)

```python
import anthropic

client = anthropic.Anthropic(
    base_url="http://ok-link.com:10320",
    api_key="sk-xxxxxxxxxxxx"
)

# Anthropic protocol - non-streaming
message = client.messages.create(
    model="claude-sonnet",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(message.content[0].text)

# Anthropic protocol - streaming
stream = client.messages.create(
    model="claude-sonnet",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a poem"}],
    stream=True
)
for chunk in stream:
    if chunk.type == "content_block_delta" and chunk.delta.type == "text_delta":
        print(chunk.delta.text, end="", flush=True)
```

JavaScript (fetch)

```javascript
// OpenAI protocol
const openaiResp = await fetch('http://ok-link.com:10320/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer sk-xxxxxxxxxxxx',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'deepseek-chat',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});
const openaiData = await openaiResp.json();
console.log(openaiData.choices[0].message.content);

// Anthropic protocol
const anthropicResp = await fetch('http://ok-link.com:10320/v1/messages', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer sk-xxxxxxxxxxxx',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'claude-sonnet',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
});
const anthropicData = await anthropicResp.json();
console.log(anthropicData.content[0].text);
```

Error Codes

| Status Code | Meaning | How to Handle |
|---|---|---|
| 200 | Success | - |
| 400 | Invalid request parameters | Check the request body format |
| 401 | Authentication failed | Verify the API Key is valid and not disabled |
| 402 | Insufficient balance | Top up Tokens for that model |
| 403 | Model not subscribed | Subscribe to the model in the model marketplace |
| 404 | Model not found | Verify the model name is correct |
| 429 | Rate limited | Reduce request frequency and retry with backoff |
| 502 | Upstream unreachable | Check the provider's service status or contact operations |
| 503 | Model/service unavailable | Retry later or contact an administrator |
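For 429 responses (and transient 502/503 errors), the standard pattern is exponential backoff with jitter. A minimal sketch, independent of any particular HTTP client; `send` is assumed to be any callable returning a `(status, body)` pair:

```python
import random
import time


def backoff_delay(attempt: int, base: float = 0.5, cap: float = 30.0) -> float:
    """Exponential backoff with full jitter: window doubles per attempt, capped."""
    return random.uniform(0, min(cap, base * (2 ** attempt)))


def call_with_retry(send, max_attempts: int = 5):
    """Call send(), retrying on 429/502/503 with backoff between attempts."""
    status, body = send()
    for attempt in range(max_attempts - 1):
        if status not in (429, 502, 503):
            break
        time.sleep(backoff_delay(attempt))
        status, body = send()
    return status, body


# Simulated transport: two 429s, then success.
calls = iter([(429, ""), (429, ""), (200, "ok")])
print(call_with_retry(lambda: next(calls)))  # → (200, 'ok')
```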

科华词元平台 (Kehua Token Platform) - Unified AI Proxy Management Platform