📦 Multi-Language SDK Integration Examples
The TOENK API is fully compatible with the OpenAI API, so the official OpenAI SDKs for all major languages work out of the box. The examples below cover client initialization, sending chat requests, handling responses, and error handling.
🐍 Python SDK
Uses the openai Python package. Install: pip install openai
Python (openai library)
```python
from openai import OpenAI
from openai import APIError, RateLimitError, APIConnectionError
import time

# Initialize the client
client = OpenAI(
    api_key="your-api-key-here",
    base_url="https://toenk-api.com/v1"
)

# Send a chat request
def chat(user_input, retries=3):
    try:
        response = client.chat.completions.create(
            model="deepseek-chat",  # Alternatives: gpt-4o, deepseek-v4-flash, etc.
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": user_input}
            ],
            max_tokens=1000,
            temperature=0.7
        )
        return response.choices[0].message.content
    except RateLimitError:
        if retries <= 0:
            print("Rate limited, retries exhausted")
            return None
        print("Rate limited, waiting before retry...")
        time.sleep(10)
        return chat(user_input, retries - 1)  # Retry with a bounded count
    except APIConnectionError as e:
        print(f"Connection error: {e}")
        return None
    except APIError as e:
        print(f"API error: {e}")
        return None
    except Exception as e:
        print(f"Unexpected error: {e}")
        return None

# Call it
result = chat("Introduce the TOENK API platform")
print(result)
```
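The retry above waits a fixed 10 seconds between attempts; for production use, exponential backoff with jitter is a common alternative. A minimal standard-library sketch, where `send` is a hypothetical placeholder for the actual `client.chat.completions.create(...)` call:

```python
import time
import random

def with_backoff(send, max_retries=5, base_delay=1.0):
    """Retry `send` with exponential backoff and jitter.

    `send` is any zero-argument callable that raises on a
    retryable failure (e.g. a rate-limit error) and returns
    the response on success.
    """
    for attempt in range(max_retries):
        try:
            return send()
        except Exception:
            if attempt == max_retries - 1:
                raise  # Out of retries: surface the last error
            # Delay doubles with each attempt, plus random jitter
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
```

You would then wrap the create call, for example: `with_backoff(lambda: client.chat.completions.create(...))`. In a real application, catch only retryable exceptions such as `RateLimitError` rather than bare `Exception`.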
🟢 Node.js SDK
Uses the openai npm package. Install: npm install openai
Node.js (openai npm package)
```javascript
const OpenAI = require('openai');

// Initialize the client
const client = new OpenAI({
  apiKey: "your-api-key-here",
  baseURL: "https://toenk-api.com/v1"
});

// Streaming chat
async function streamChat() {
  try {
    const stream = await client.chat.completions.create({
      model: "deepseek-chat",
      messages: [
        { role: "system", content: "You are a helpful assistant." },
        { role: "user", content: "Tell a story about AI" }
      ],
      stream: true,
      max_tokens: 2000
    });
    for await (const chunk of stream) {
      const content = chunk.choices[0]?.delta?.content || '';
      process.stdout.write(content);
    }
  } catch (error) {
    if (error.status === 429) {
      console.error('Rate limited, please retry later');
    } else if (error.status === 401) {
      console.error('Invalid API key');
    } else {
      console.error('Error:', error.message);
    }
  }
}

streamChat();

// Non-streaming call
async function simpleChat() {
  const response = await client.chat.completions.create({
    model: "deepseek-chat",
    messages: [{ role: "user", content: "Hello" }]
  });
  console.log(response.choices[0].message.content);
}
```
🔵 Go SDK
Uses the github.com/sashabaranov/go-openai package. Install: go get github.com/sashabaranov/go-openai
Go (go-openai library)
```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Initialize the client
	config := openai.DefaultConfig("your-api-key-here")
	config.BaseURL = "https://toenk-api.com/v1"
	client := openai.NewClientWithConfig(config)

	// Send a chat request
	resp, err := client.CreateChatCompletion(
		context.Background(),
		openai.ChatCompletionRequest{
			Model: "deepseek-chat",
			Messages: []openai.ChatCompletionMessage{
				{
					Role:    openai.ChatMessageRoleSystem,
					Content: "You are a helpful assistant.",
				},
				{
					Role:    openai.ChatMessageRoleUser,
					Content: "Introduce the TOENK API",
				},
			},
			MaxTokens:   1000,
			Temperature: 0.7,
		},
	)
	if err != nil {
		log.Fatalf("Request failed: %v", err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```
☕ Java SDK
Uses the Azure OpenAI SDK (compatible with the OpenAI API). Maven dependency: com.azure:azure-ai-openai:1.0.0-beta.9
Java (Azure OpenAI SDK)
```java
import com.azure.ai.openai.OpenAIClient;
import com.azure.ai.openai.OpenAIClientBuilder;
import com.azure.ai.openai.models.*;
import com.azure.core.credential.AzureKeyCredential;
import java.util.Arrays;

public class ToenkExample {
    public static void main(String[] args) {
        // Initialize the client
        OpenAIClient client = new OpenAIClientBuilder()
            .endpoint("https://toenk-api.com/v1")
            .credential(new AzureKeyCredential("your-api-key-here"))
            .buildClient();

        // Build the messages
        ChatCompletionsOptions options = new ChatCompletionsOptions(
            Arrays.asList(
                new ChatRequestSystemMessage("You are a helpful assistant."),
                new ChatRequestUserMessage("Introduce the TOENK API")
            )
        );
        options.setModel("deepseek-chat");
        options.setMaxTokens(1000);
        options.setTemperature(0.7);

        // Send the request
        ChatCompletions chatCompletions = client.getChatCompletions(
            "deepseek-chat",
            options
        );

        // Handle the response
        for (ChatChoice choice : chatCompletions.getChoices()) {
            System.out.println(choice.getMessage().getContent());
        }
    }
}
```
🌐 cURL Command Line
No SDK installation needed: test the API directly with cURL.
cURL basic request
```shell
curl https://toenk-api.com/v1/chat/completions \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Introduce the TOENK API in one sentence"}
    ],
    "max_tokens": 200,
    "temperature": 0.7
  }'
```
cURL streaming request
```shell
curl https://toenk-api.com/v1/chat/completions \
  -H "Authorization: Bearer your-api-key-here" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Tell me a story"}],
    "stream": true
  }'
```
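With `"stream": true` the raw HTTP response is a Server-Sent Events stream: each chunk arrives as a line beginning with `data: ` carrying a JSON delta, and the stream ends with `data: [DONE]`. A minimal sketch of collecting the text from such a body, assuming the standard OpenAI streaming chunk shape (`choices[0].delta.content`):

```python
import json

def extract_stream_text(sse_body):
    """Collect the assistant text from a raw SSE response body."""
    parts = []
    for line in sse_body.splitlines():
        line = line.strip()
        if not line.startswith("data: "):
            continue  # Skip blank lines between events
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # End-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content", ""))
    return "".join(parts)
```

The SDK examples above handle this parsing for you; this is only useful when consuming the stream from raw HTTP.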
cURL: list available models
```shell
curl https://toenk-api.com/v1/models \
  -H "Authorization: Bearer your-api-key-here"
```
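The same endpoint can be called from Python without any SDK, using only the standard library. A sketch that builds the authenticated GET request (the response field names `data` and `id` assume the standard OpenAI models-list shape):

```python
import json
import urllib.request

def build_models_request(api_key, base_url="https://toenk-api.com/v1"):
    """Build the GET /models request; sending it is left to the caller."""
    return urllib.request.Request(
        f"{base_url}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# To actually send it (requires network access):
# with urllib.request.urlopen(build_models_request("your-api-key-here")) as resp:
#     models = json.load(resp)
#     print([m["id"] for m in models["data"]])
```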