Continue + OpenCompress
Add one field to your config and save 40-60% on every Continue request.
Works with OpenAI, Anthropic, Google, and OpenRouter keys.
Open your Continue configuration file and add a model entry that points at the OpenCompress endpoint: set provider to "openai" and apiBase to the OpenCompress URL.
config.yaml (recommended)

```yaml
# ~/.continue/config.yaml
models:
  - name: "GPT-4o via OpenCompress"
    provider: openai
    model: gpt-4o
    apiBase: "https://www.opencompress.ai/api/v1"
    apiKey: "your-key-here"
  - name: "Claude Sonnet via OpenCompress"
    provider: openai
    model: anthropic/claude-sonnet-4.6
    apiBase: "https://www.opencompress.ai/api/v1"
    apiKey: "your-key-here"
```

config.json (legacy)
```json
// ~/.continue/config.json (deprecated, but still supported)
{
  "models": [
    {
      "name": "GPT-4o via OpenCompress",
      "provider": "openai",
      "model": "gpt-4o",
      "apiBase": "https://www.opencompress.ai/api/v1",
      "apiKey": "your-key-here"
    }
  ]
}
```

Note: OpenAI (sk-proj-...), Anthropic (sk-ant-...), Google (AIza...), and OpenRouter (sk-or-...) keys all work as the apiKey. We auto-detect the provider.
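The prefix-based detection described above can be sketched roughly like this. This is an illustration only, not the actual server logic; the prefixes are taken from the note above:

```shell
#!/bin/sh
# Illustrative only: guess the upstream provider from an API key's prefix.
# The real OpenCompress detection logic may differ.
detect_provider() {
  case "$1" in
    sk-or-*)  echo "openrouter" ;;   # OpenRouter keys
    sk-ant-*) echo "anthropic" ;;    # Anthropic keys
    AIza*)    echo "google" ;;       # Google keys
    sk-*)     echo "openai" ;;       # sk-proj-... and legacy sk-... keys
    *)        echo "unknown" ;;
  esac
}

detect_provider "sk-ant-api03-example"   # prints "anthropic"
```

Note that the more specific prefixes (sk-or-, sk-ant-) must be matched before the generic sk-* pattern.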
Restart Continue (or reload VS Code) and select the model. All requests are compressed automatically — same models, same experience, 40-60% lower token cost. Free tier: 5 req/min.
Note: Check response headers for X-OpenCompress-Tokens-Saved to see compression in action.
Test from the terminal to confirm compression is working:
```shell
curl https://www.opencompress.ai/api/v1/chat/completions \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]}'
```

The free tier is limited to 5 req/min and has no stats. Sign up for a usage dashboard, 120 req/min, and $10 in free credit.
Go to Dashboard →

FAQ
Common questions about Continue + OpenCompress.
Should I use config.yaml or config.json?
config.yaml is recommended — config.json is deprecated but still supported. Both work identically with OpenCompress.
Do you store my API key?
No. Your key is forwarded directly to the upstream provider in memory and never persisted.
Does tab completion work?
Yes. Continue's tab autocomplete also uses the configured model provider, so compression applies to all interactions.
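If you want a dedicated autocomplete model, Continue's config.yaml lets you assign roles per model. A sketch, assuming the current schema (the model choice here is illustrative):

```yaml
models:
  - name: "Fast autocomplete via OpenCompress"
    provider: openai
    model: gpt-4o-mini   # illustrative; any supported model works
    apiBase: "https://www.opencompress.ai/api/v1"
    apiKey: "your-key-here"
    roles:
      - autocomplete
```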
Can I use Anthropic models?
Yes. Set provider to "openai" and use your sk-ant-* key. We auto-translate between OpenAI and Anthropic API formats.
Start Saving
Every Continue request gets cheaper. One config change, zero code modifications.