
Chapter 3: LLM API Compatibility

Designing API endpoints that support popular API standards

OpenAI API Format

{
  "model": "llama-2-7b",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "temperature": 0.7
}
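A server that accepts this request shape must also return the `chat.completion` response shape that OpenAI clients parse. A minimal sketch of that body (the helper name and zeroed usage counts are illustrative, not part of any library):

```python
import time
import uuid

def make_openai_response(model: str, text: str) -> dict:
    """Build a minimal OpenAI-style chat.completion response body."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": text},
            "finish_reason": "stop",
        }],
        # Real servers report actual token counts here
        "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0},
    }

resp = make_openai_response("llama-2-7b", "Hi there!")
print(resp["choices"][0]["message"]["content"])  # Hi there!
```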

Claude API Format

{
  "model": "llama-2-7b",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "max_tokens": 1024
}
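The two formats differ in small but important ways: Anthropic's Messages API requires `max_tokens` and carries the system prompt in a top-level `system` field rather than as a message. A hedged converter sketch (`claude_to_openai` is a hypothetical helper, not a library function):

```python
def claude_to_openai(req: dict) -> dict:
    """Map an Anthropic-style request dict onto the OpenAI chat format."""
    messages = list(req["messages"])
    # Anthropic puts the system prompt in a top-level field;
    # OpenAI expects it as the first message
    if "system" in req:
        messages = [{"role": "system", "content": req["system"]}] + messages
    out = {"model": req["model"], "messages": messages}
    # max_tokens is required by Anthropic, optional for OpenAI
    if "max_tokens" in req:
        out["max_tokens"] = req["max_tokens"]
    if "temperature" in req:
        out["temperature"] = req["temperature"]
    return out
```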

Benefits of API Compatibility

  1. Easy Migration - clients can switch backends without code changes
  2. Tool Ecosystem - existing SDKs and libraries work out of the box
  3. Developer Experience - a familiar interface for users

Implementation Pattern

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Message(BaseModel):
    role: str
    content: str

class ChatRequest(BaseModel):
    model: str
    messages: list[Message]
    temperature: float = 1.0

@app.post("/v1/chat/completions")
async def chat_completions(request: ChatRequest):
    # Convert to internal format
    # Process with serving framework
    # Return an OpenAI-compatible response
    ...
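The "convert to internal format" step can be sketched without the web framework as a pure function that routes both dialects (OpenAI's `/v1/chat/completions` and Anthropic's `/v1/messages`) to one internal request shape. The internal field names, such as `max_new_tokens`, are assumptions for illustration:

```python
def normalize(path: str, body: dict) -> dict:
    """Map either API dialect onto one hypothetical internal format."""
    if path == "/v1/chat/completions":            # OpenAI-style request
        msgs = body["messages"]
        max_new = body.get("max_tokens", 256)     # optional in OpenAI format
    elif path == "/v1/messages":                  # Anthropic-style request
        msgs = body["messages"]
        if "system" in body:                      # system prompt is top-level
            msgs = [{"role": "system", "content": body["system"]}] + msgs
        max_new = body["max_tokens"]              # required in Anthropic format
    else:
        raise ValueError(f"unknown endpoint: {path}")
    return {"model": body["model"], "messages": msgs,
            "max_new_tokens": max_new}
```

With this shape, each compatibility endpoint stays a thin adapter and the serving framework only ever sees the internal format.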