Chapter 3 of 8
Chapter 3: LLM API Compatibility
Designing API endpoints to support popular API standards
OpenAI API Format
{
  "model": "llama-2-7b",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "temperature": 0.7
}
Claude API Format
{
  "model": "llama-2-7b",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "max_tokens": 1024
}
Benefits of API Compatibility
- Easy Migration - Clients can switch models without code changes
- Tool Ecosystem - Use existing SDKs and libraries
- Developer Experience - Familiar interface for users
Implementation Pattern
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str
    messages: list[dict]
    temperature: float = 0.7

@app.post("/v1/chat/completions")
async def chat_completions(request: ChatRequest):
    # Convert to internal format
    # Process with serving framework
    # Return OpenAI-compatible response
    ...
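The pattern above leaves the response body unspecified. To be OpenAI-compatible, the endpoint must return a JSON object with a choices list containing the assistant message. A minimal sketch of building one (the id prefix and omitted fields such as usage token counts make this illustrative, not a full implementation of the spec):

```python
import time
import uuid

def openai_response(model: str, text: str) -> dict:
    """Build a minimal OpenAI-style chat completion response.

    Sketch only: real responses also include a "usage" object with
    token counts; the id prefix here follows common convention.
    """
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }
        ],
    }
```

Returning this dict from the FastAPI handler lets existing OpenAI SDKs parse the reply without modification.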