# OpenAI-Compatible API

Compile Labs is fully compatible with the OpenAI API, so you can use any supported model through our unified, OpenAI-compatible interface.
## Chat Completions

```bash
curl https://api.compilelabs.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-5",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ],
    "temperature": 0.7,
    "max_tokens": 100
  }'
```
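A successful call returns the standard OpenAI chat completion envelope. The sketch below shows the expected shape; the field values (IDs, token counts, reply text) are illustrative, not captured output:

```json
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "openai/gpt-5",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello! I'm doing well."},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 13, "completion_tokens": 7, "total_tokens": 20}
}
```

The generated text lives at `choices[0].message.content`, and `usage` reports token counts for billing and context management.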
## Streaming Responses

```bash
curl https://api.compilelabs.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "openai/gpt-5",
    "messages": [
      {"role": "user", "content": "Tell me a story"}
    ],
    "stream": true
  }'
```
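With `"stream": true`, an OpenAI-compatible endpoint emits server-sent events: each `data:` line carries a JSON chunk with a partial `delta`, and the stream ends with `data: [DONE]`. A minimal sketch of parsing such lines, using hard-coded sample chunks rather than a live response:

```python
import json

# Sample SSE lines in the standard OpenAI streaming format
# (chunk contents here are illustrative, not captured output).
raw_lines = [
    'data: {"choices": [{"delta": {"content": "Once"}}]}',
    'data: {"choices": [{"delta": {"content": " upon"}}]}',
    'data: [DONE]',
]

def extract_deltas(lines):
    """Yield content fragments from raw SSE 'data:' lines."""
    for line in lines:
        payload = line.removeprefix("data: ").strip()
        if payload == "[DONE]":  # sentinel marking end of stream
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

print("".join(extract_deltas(raw_lines)))  # Once upon
```

In practice the SDK (shown below) handles this parsing for you; the sketch only illustrates what travels over the wire.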
## Using the OpenAI Python SDK

```python
import openai

client = openai.OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.compilelabs.com/v1",
)

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {"role": "user", "content": "Explain quantum computing"}
    ],
)

print(response.choices[0].message.content)
```
## Streaming with the Python SDK

```python
import openai

client = openai.OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.compilelabs.com/v1",
)

stream = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {"role": "user", "content": "Tell me a story"}
    ],
    stream=True,
)

for chunk in stream:
    # Some chunks (e.g. the final one) may carry no choices or content
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
## System Prompts

```python
response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[
        {"role": "system", "content": "You are a helpful coding assistant."},
        {"role": "user", "content": "Write a Python function to calculate Fibonacci numbers"},
    ],
    temperature=0.7,
    max_tokens=500,
)
```
## Multi-Turn Conversations

```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
    {"role": "user", "content": "What is its population?"},
]

response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=messages,
)

print(response.choices[0].message.content)
```
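Because the API is stateless, context is carried by appending each reply to the message list before the next turn. One way to wrap that pattern is a small helper; here `send` is any callable mapping a message list to reply text (in real use, a wrapper around `client.chat.completions.create`), and the demo uses a stub so the flow is visible without an API call:

```python
def ask(messages, send, user_text):
    """Append a user turn, get a reply via `send`, record it, return it.

    In real use, `send` might be:
        lambda msgs: client.chat.completions.create(
            model="openai/gpt-5", messages=msgs
        ).choices[0].message.content
    """
    messages.append({"role": "user", "content": user_text})
    reply = send(messages)
    messages.append({"role": "assistant", "content": reply})
    return reply

# Demo with a stubbed `send` -- no network call involved.
history = [{"role": "system", "content": "You are a helpful assistant."}]
stub = lambda msgs: f"(reply to: {msgs[-1]['content']})"
print(ask(history, stub, "What is the capital of France?"))
print(len(history))  # 3: system + user + assistant
```

Each call grows `history` by two entries (the user turn and the assistant turn), so follow-up questions like "What is its population?" resolve against the full conversation.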
## Using Different Models

Compile Labs supports multiple OpenAI-compatible models:

```python
# OpenAI models
gpt_response = client.chat.completions.create(
    model="openai/gpt-5",
    messages=[{"role": "user", "content": "Hello!"}],
)

# Moonshot AI models
kimi_response = client.chat.completions.create(
    model="moonshotai/kimi-k2-0905",
    messages=[{"role": "user", "content": "Hello!"}],
)

# DeepSeek models
deepseek_response = client.chat.completions.create(
    model="deepseek/deepseek-v3",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
## List Available Models

```bash
curl "https://api.compilelabs.com/v1/models?type=text_to_text" \
  -H "Authorization: Bearer YOUR_API_KEY"
```
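Assuming the endpoint follows the standard OpenAI list envelope (an `object: "list"` wrapper around a `data` array of model entries), extracting model IDs is straightforward. The payload below is illustrative; the live response carries the current catalogue, and with the SDK you can fetch it via `client.models.list()`:

```python
import json

# Illustrative payload in the standard OpenAI list envelope --
# not an actual response from the API.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "openai/gpt-5", "object": "model"},
    {"id": "deepseek/deepseek-v3", "object": "model"}
  ]
}
""")

model_ids = [m["id"] for m in sample["data"]]
print(model_ids)  # ['openai/gpt-5', 'deepseek/deepseek-v3']
```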
## SDK Compatibility

Compile Labs is compatible with:

- the OpenAI SDK (the `openai` Python package)
- any OpenAI-compatible library (LlamaIndex, LangChain, etc.)
Use your existing OpenAI SDK code with Compile Labs: just change the `base_url` to `https://api.compilelabs.com/v1` and you're ready to go!
## Next Steps