LongCat API Open Platform API Documentation

Overview

The LongCat API Open Platform provides AI model proxy services for the LongCat-Flash-Chat model and is compatible with both the OpenAI and Anthropic API formats. This documentation follows standard API format conventions.

Base URL

Production endpoint: https://api.longcat.chat

Authentication

All API requests must be authenticated with an API key in the Authorization header:

Authorization: Bearer YOUR_API_KEY
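
For reference, the same header can also be set explicitly with a generic HTTP client. The sketch below uses Python's requests library against the Chat Completions endpoint documented further down; the payload is only an illustration:

import requests

API_KEY = "YOUR_API_KEY"  # replace with your actual API key

resp = requests.post(
    "https://api.longcat.chat/openai/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",  # required on every request
        "Content-Type": "application/json",
    },
    json={
        "model": "LongCat-Flash-Chat",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
print(resp.status_code, resp.json())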

Endpoints

Chat Completions

POST /openai/v1/chat/completions

Creates a chat completion using the OpenAI-compatible format.

Request Headers

  • Authorization: Bearer YOUR_API_KEY (required)
  • Content-Type: application/json

Request Body

Field | Type | Required | Description
model | string | Yes | Model identifier (only LongCat-Flash-Chat is supported)
messages | array | Yes | Array of message objects; only text input is allowed
stream | boolean | No | Whether to stream the response (default: false)
max_tokens | integer | No | Maximum number of tokens to generate (default: 1024)
temperature | number | No | Sampling temperature, in the range 0 to 1
top_p | number | No | Nucleus sampling parameter

Message Object

Field | Type | Required | Description
role | string | Yes | Role of the message author. Must be one of:
  • system - sets the assistant's behavior and context
  • user - a message from the human user
  • assistant - a message from the AI assistant (used for conversation history)
content | string | Yes | The message content, as a plain text string

Request Example

{
  "model": "LongCat-Flash-Chat",
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful assistant."
    },
    {
      "role": "user",
      "content": "Hello, how are you?"
    }
  ],
  "stream": false,
  "max_tokens": 150,
  "temperature": 0.7
}

Response (non-streaming)

{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "created": 1677652288,
  "model": "LongCat-Flash-Chat",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Hello! I'm doing well, thank you for asking. How can I help you today?"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 20,
    "completion_tokens": 15,
    "total_tokens": 35
  }
}

Response (streaming)

When stream: true, the response is returned as Server-Sent Events (SSE):

Content-Type: text/event-stream

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"LongCat-Flash-Chat","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"LongCat-Flash-Chat","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"LongCat-Flash-Chat","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: [DONE]
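
A minimal sketch of consuming this stream with the OpenAI Python SDK (openai >= 1.0), assuming the client is configured as in the SDK example later in this document; the SDK parses each data: chunk and exposes the incremental content:

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.longcat.chat/openai/v1",
)

# stream=True makes the SDK yield chat.completion.chunk objects as they arrive
stream = client.chat.completions.create(
    model="LongCat-Flash-Chat",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # role-only and final chunks carry no content
        print(delta, end="", flush=True)
print()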

Anthropic Messages

POST /anthropic/v1/messages

Creates a message using Anthropic's Claude API format.

Request Headers

  • Authorization: Bearer YOUR_API_KEY (required)
  • Content-Type: application/json

Request Body

Field | Type | Required | Description
model | string | Yes | Claude model name
messages | array | Yes | Array of message objects
max_tokens | integer | No | Maximum number of tokens to generate
stream | boolean | No | Whether to stream the response (default: false)
temperature | number | No | Sampling temperature, in the range 0 to 1
top_p | number | No | Nucleus sampling parameter
system | string | No | System message used to set context

Message Object

Field | Type | Required | Description
role | string | Yes | Role of the message author. Must be one of:
  • user - a message from the human user
  • assistant - a message from the Claude assistant (used for conversation history)
  Note: system messages are passed separately via the system parameter
content | string | Yes | The message content; only plain text strings are supported

Request Example

{
  "model": "LongCat-Flash-Chat",
  "max_tokens": 1000,
  "messages": [
    {
      "role": "user",
      "content": "Hello, LongCat"
    }
  ],
  "stream": false,
  "temperature": 0.7
}

Response (non-streaming)

{
  "id": "msg_123",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "text",
      "text": "Hello! How can I help you today?"
    }
  ],
  "model": "LongCat-Flash-Chat",
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": {
    "input_tokens": 10,
    "output_tokens": 8
  }
}

Response (streaming)

When stream: true, the response follows Anthropic's SSE format:

Content-Type: text/event-stream

event: message_start
data: {"type": "message_start", "message": {"id": "msg_123", "type": "message", "role": "assistant", "content": [], "model": "LongCat-Flash-Chat", "stop_reason": null, "stop_sequence": null, "usage": {"input_tokens": 10, "output_tokens": 0}}}

event: content_block_start
data: {"type": "content_block_start", "index": 0, "content_block": {"type": "text", "text": ""}}

event: content_block_delta
data: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "Hello"}}

event: content_block_delta
data: {"type": "content_block_delta", "index": 0, "delta": {"type": "text_delta", "text": "!"}}

event: content_block_stop
data: {"type": "content_block_stop", "index": 0}

event: message_delta
data: {"type": "message_delta", "delta": {"stop_reason": "end_turn", "stop_sequence": null}, "usage": {"output_tokens": 8}}

event: message_stop
data: {"type": "message_stop"}

Error Responses

The API uses conventional HTTP response codes to indicate the success or failure of a request:

HTTP Status Codes

Status Code | Status Name | Description
200 | OK | The request succeeded
400 | Bad Request | Invalid request parameters or malformed JSON
401 | Unauthorized | The API key is invalid or missing
403 | Forbidden | The API key lacks permission to access the requested resource
429 | Too Many Requests | Rate limit exceeded
500 | Internal Server Error | The server encountered an unexpected condition
502 | Bad Gateway | Invalid response from an upstream server
503 | Service Unavailable | The server is temporarily unavailable

Error Response Format

All errors return a JSON object with the following structure:

{
  "error": {
    "message": "人类可读的错误描述",
    "type": "error_type_identifier",
    "code": "specific_error_code",
    "param": "parameter_name_if_applicable"
  }
}
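
As an illustrative sketch, a client can branch on the HTTP status and read these fields from the JSON body (the empty messages array below is only an assumed way to trigger a validation error):

import requests

resp = requests.post(
    "https://api.longcat.chat/openai/v1/chat/completions",
    headers={
        "Authorization": "Bearer your-api-key",
        "Content-Type": "application/json",
    },
    json={"model": "LongCat-Flash-Chat", "messages": []},  # assumed-invalid payload
)

if resp.status_code != 200:
    err = resp.json()["error"]
    # type and code distinguish auth, validation, rate-limit and server errors
    print(f"{resp.status_code} {err['type']}/{err['code']}: {err['message']}")
    if "param" in err:
        print("Offending parameter:", err["param"])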

Error Types and Codes

Error Type | Error Code | HTTP Status | Description
authentication_error | invalid_api_key | 401 | The provided API key is invalid
authentication_error | missing_api_key | 401 | No API key was provided
permission_error | insufficient_quota | 403 | The API key's quota is insufficient
invalid_request_error | invalid_parameter | 400 | A parameter value is invalid
invalid_request_error | missing_parameter | 400 | A required parameter is missing
invalid_request_error | invalid_json | 400 | The JSON is malformed
rate_limit_error | rate_limit_exceeded | 429 | Too many requests in a short period
server_error | internal_error | 500 | Internal server error

Error Response Examples

Invalid API Key

{
  "error": {
    "message": "Invalid API key provided",
    "type": "authentication_error",
    "code": "invalid_api_key"
  }
}

Missing Required Parameter

{
  "error": {
    "message": "Missing required parameter: 'messages'",
    "type": "invalid_request_error",
    "code": "missing_parameter",
    "param": "messages"
  }
}

Rate Limit Exceeded

{
  "error": {
    "message": "Rate limit exceeded. Please try again in 60 seconds",
    "type": "rate_limit_error",
    "code": "rate_limit_exceeded"
  }
}

Rate Limits

Rate limits are enforced per API key. If you exceed the limit, you will receive a 429 status code.
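
A common client-side pattern, shown here as a sketch rather than a platform feature, is to retry requests that return 429 with exponential backoff:

import time
import requests

def post_with_retry(url, headers, payload, max_retries=5):
    """POST with exponential backoff on 429; other responses are returned as-is."""
    delay = 1.0
    for _ in range(max_retries):
        resp = requests.post(url, headers=headers, json=payload)
        if resp.status_code != 429:
            return resp
        time.sleep(delay)  # wait before retrying: 1s, 2s, 4s, ...
        delay *= 2
    return resp  # still rate-limited after max_retries attempts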

SDK Compatibility

This API is designed to be compatible with:

  • The OpenAI Python SDK (for the /openai/ endpoints)
  • The Anthropic Python SDK (for the /anthropic/ endpoints)
  • Any HTTP client that supports the corresponding API format

Examples

Using the OpenAI Python SDK

from openai import OpenAI

# Configure the client for the LongCat-Flash-Chat API
# (the SDK appends /chat/completions to base_url, giving /openai/v1/chat/completions)
client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.longcat.chat/openai/v1",
)

response = client.chat.completions.create(
    model="LongCat-Flash-Chat",
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)
print(response.choices[0].message.content)

Using the Anthropic Python SDK

import anthropic

# Configure the client for the LongCat-Flash-Chat API
# (the SDK appends /v1/messages to base_url, giving /anthropic/v1/messages)
client = anthropic.Anthropic(
    api_key="your-api-key",
    base_url="https://api.longcat.chat/anthropic",
    # The platform authenticates via the Authorization header rather than x-api-key
    default_headers={
        "Authorization": "Bearer your-api-key",
    },
)

message = client.messages.create(
    model="LongCat-Flash-Chat",
    max_tokens=150,
    messages=[
        {"role": "user", "content": "Hello, LongCat!"}
    ]
)
print(message.content[0].text)

Using cURL

# OpenAI-style request
curl -X POST https://api.longcat.chat/openai/v1/chat/completions \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "LongCat-Flash-Chat",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": false
  }'

# Anthropic-style request
curl -X POST https://api.longcat.chat/anthropic/v1/messages \
  -H "Authorization: Bearer your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "LongCat-Flash-Chat",
    "max_tokens": 1000,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

📋 Need help? See our FAQ for frequently asked questions and troubleshooting guidance.

Last updated: 2025/9/5 20:22
Contributors: zhuqi09