# Migration Guide
Switching to AIOrouter from any OpenAI-compatible API means changing just the base URL and API key. No new SDKs, no refactoring, no breaking changes.
## From OpenAI

```python
# Before (OpenAI)
from openai import OpenAI

client = OpenAI(api_key="sk-...")

# After (AIOrouter)
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aiorouter.ca/v1",
    api_key="aiorouter_..."
)
```
```javascript
// Before (OpenAI)
import OpenAI from 'openai';

const client = new OpenAI({ apiKey: 'sk-...' });

// After (AIOrouter)
import OpenAI from 'openai';

const client = new OpenAI({
  baseURL: 'https://api.aiorouter.ca/v1',
  apiKey: 'aiorouter_...'
});
```
## From GitHub Copilot (via API)
If you're using Copilot's API features, switching to AIOrouter gives you:
- Full API access (not IDE-locked)
- 7 models instead of one
- Predictable monthly billing instead of usage-based
- Canada-resident infrastructure for PIPEDA compliance
```python
# Instead of Copilot's limited API,
# use the full OpenAI-compatible API:
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aiorouter.ca/v1",
    api_key="aiorouter_..."
)
```
## From OpenRouter

```python
from openai import OpenAI

# Before (OpenRouter)
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-..."
)

# After (AIOrouter)
client = OpenAI(
    base_url="https://api.aiorouter.ca/v1",
    api_key="aiorouter_..."
)
```
Key differences:
- Billing: Monthly subscription vs per-token
- Data residency: Canada (Montreal) vs US
- PIPEDA: Compliant vs not applicable
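Model identifiers also differ: OpenRouter uses `vendor/model` IDs, while AIOrouter uses bare catalog names such as `deepseek-v4-pro`. If you migrate gradually, a small translation shim keeps old call sites working. The mapping entries below are illustrative assumptions, not an official list — check the Model Catalog for the real names:

```python
# Illustrative mapping only: the OpenRouter-style IDs and AIOrouter names
# here are assumptions — consult the Model Catalog for actual values.
MODEL_MAP = {
    "deepseek/deepseek-chat": "deepseek-v4-pro",
}

def translate_model(openrouter_id: str) -> str:
    """Return the AIOrouter catalog name, or the input unchanged if unmapped."""
    return MODEL_MAP.get(openrouter_id, openrouter_id)
```

Unmapped names pass through untouched, so the shim is safe to drop in before the mapping is complete.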
## From Any OpenAI-Compatible API

If your current provider uses the OpenAI API format:

- Change the `base_url` to `https://api.aiorouter.ca/v1`
- Change the `api_key` to your AIOrouter key
- Update the `model` name to one of our models
That's it. All existing code, prompts, and parameters continue to work.
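A quick way to confirm the switch is one request against the `/chat/completions` endpoint. Here is a stdlib-only sketch; the `AIOROUTER_API_KEY` environment variable name and the `deepseek-v4-pro` model are assumptions taken from the examples in this guide, so substitute your own key and catalog choice:

```python
import json
import os
import urllib.request

BASE_URL = "https://api.aiorouter.ca/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for the OpenAI-compatible chat completions endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            # AIOROUTER_API_KEY is an assumed variable name — use your own.
            "Authorization": f"Bearer {os.environ.get('AIOROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("deepseek-v4-pro", "Say hello")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

If the response comes back with a `choices` array, the migration is done.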
## Environment Variables

Recommended approach for CI/CD and production:

```bash
# .env
OPENAI_BASE_URL=https://api.aiorouter.ca/v1
OPENAI_API_KEY=aiorouter_your_key_here
```

```python
import os
from openai import OpenAI

client = OpenAI(
    base_url=os.getenv("OPENAI_BASE_URL", "https://api.aiorouter.ca/v1"),
    api_key=os.getenv("OPENAI_API_KEY")
)
```
## Framework Compatibility

### LangChain

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="deepseek-v4-pro",
    openai_api_key="aiorouter_...",
    openai_api_base="https://api.aiorouter.ca/v1"
)
```
### LlamaIndex

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    model="deepseek-v4-pro",
    api_key="aiorouter_...",
    api_base="https://api.aiorouter.ca/v1"
)
```
## Need Help?
- API Reference — all endpoints
- Model Catalog — choose the right model
- support@aiorouter.ca — email support