OpenAI SDK
The simplest integration path. Change `base_url` to Cachecore's gateway and your existing OpenAI code gets L1 + L2 caching with no other changes.
Python
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.cachecore.it/v1",
    api_key="cc_live_xxxxx.eyJ...",
)

response = client.chat.completions.create(
    model="gpt-5.4-mini",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarise this contract in 3 bullet points."},
    ],
)
```
Node.js / TypeScript
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.cachecore.it/v1",
  apiKey: process.env.CACHECORE_TOKEN,
});

const response = await client.chat.completions.create({
  model: "gpt-5.4-mini",
  messages: [{ role: "user", content: "Classify this support ticket." }],
});
```
Environment variable pattern
Store your token in the environment, not in code:
```bash
export CACHECORE_TOKEN=cc_live_xxxxx.eyJ...
```

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.cachecore.it/v1",
    api_key=os.environ["CACHECORE_TOKEN"],
)
```
Supported models
Cachecore proxies all model names transparently. The `model` parameter is forwarded to OpenAI as-is:
| Model | Pricing (input / cached input / output) |
|-------|----------------------------------------|
| gpt-5.4 | $2.50 / $0.25 / $15.00 per 1M tokens |
| gpt-5.4-mini | $0.75 / $0.075 / $4.50 per 1M tokens |
| gpt-5.4-nano | $0.20 / $0.02 / $1.25 per 1M tokens |
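A quick back-of-the-envelope calculation shows what the cached-input discount in the table is worth per request. The helper below hard-codes the gpt-5.4-mini prices from the table; it is illustrative only, not part of any SDK:

```python
# Prices per 1M tokens for gpt-5.4-mini, taken from the pricing table:
# input $0.75, cached input $0.075, output $4.50.
PRICING = {"gpt-5.4-mini": {"input": 0.75, "cached_input": 0.075, "output": 4.50}}

def estimate_cost(model: str, input_tokens: int, cached_tokens: int, output_tokens: int) -> float:
    """Estimated dollar cost; cached_tokens is the cached subset of input_tokens."""
    p = PRICING[model]
    uncached = input_tokens - cached_tokens
    return (
        uncached * p["input"]
        + cached_tokens * p["cached_input"]
        + output_tokens * p["output"]
    ) / 1_000_000

# A 10k-token prompt where 8k tokens hit the cache, plus a 500-token reply:
cost = estimate_cost("gpt-5.4-mini", 10_000, 8_000, 500)  # → 0.00435
```

With no cache hits the same request costs $0.00975, so an 80% cached prompt cuts the input portion of the bill by 72%.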
Cachecore's caching layer operates independently of OpenAI's native prompt caching. Both can be active simultaneously.
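Because the two layers are independent, a single response can hit either, both, or neither. With the Python SDK you can read HTTP response headers via `client.chat.completions.with_raw_response.create(...)` and get the parsed body with `.parse()`. The helper below is a sketch of how you might summarise both signals: `usage.prompt_tokens_details.cached_tokens` is the field OpenAI uses to report native prompt-cache hits, while the `X-Cachecore-Cache` header name is an assumption for illustration, not documented API:

```python
def cache_status(headers, cached_tokens):
    """Summarise which caching layers served a response.

    headers:       HTTP response headers, e.g. raw.headers from
                   client.chat.completions.with_raw_response.create(...)
    cached_tokens: response.usage.prompt_tokens_details.cached_tokens,
                   OpenAI's native prompt-cache counter.

    The X-Cachecore-Cache header name is hypothetical.
    """
    return {
        "gateway_hit": headers.get("X-Cachecore-Cache", "miss").lower() == "hit",
        "openai_cached_tokens": cached_tokens or 0,
    }

# A response served from the gateway cache, with no native OpenAI cache hit:
status = cache_status({"X-Cachecore-Cache": "HIT"}, 0)
```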
Limitations of the base URL swap
The base URL swap gives you caching automatically, but the advanced features below require Cachecore's Python client:
- Tenant isolation for multi-tenant SaaS (namespace scoping per end-user)
- Dependency invalidation (tag responses with data deps, invalidate when data changes)
- Cache bypass for specific requests (skip cache for write operations)
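If the gateway honoured per-request control headers, the OpenAI SDKs could pass them through via the `extra_headers` parameter, which the SDKs support on every call. This is purely a sketch: the `X-Cachecore-Bypass` and `X-Cachecore-Namespace` header names are assumptions, not documented Cachecore API, and the Python client is the supported way to use these features:

```python
def cachecore_headers(bypass=False, namespace=None):
    """Build per-request gateway headers.

    Both header names below are hypothetical -- check Cachecore's
    Python client documentation for the real control surface.
    """
    headers = {}
    if bypass:
        headers["X-Cachecore-Bypass"] = "true"  # assumed header: skip the cache
    if namespace:
        headers["X-Cachecore-Namespace"] = namespace  # assumed header: tenant scoping
    return headers

# Passed through the OpenAI SDK's per-request extra_headers parameter:
# client.chat.completions.create(..., extra_headers=cachecore_headers(bypass=True))
```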
Supported endpoints
Cachecore proxies `/v1/chat/completions`. All parameters, response formats, and streaming are passed through unchanged.
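Streaming through the gateway works exactly as it does against OpenAI directly: pass `stream=True` and iterate the chunks. Assembling the streamed deltas is the same either way; a minimal sketch, using plain dicts that mirror the OpenAI chat-completions streaming chunk shape so it stays self-contained:

```python
def assemble_stream(chunks):
    """Concatenate the content deltas from a chat-completions stream.

    Each chunk follows the OpenAI streaming shape: the text lives at
    choices[0].delta.content, and role/finish chunks carry no content.
    """
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)

# With the real SDK the chunks come from:
# stream = client.chat.completions.create(model="gpt-5.4-mini",
#                                         messages=[...], stream=True)
text = assemble_stream([
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo"}}]},
])  # → "Hello"
```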