Model Routing
8 articles about model routing.
How to Choose Which Model for Each Task in AI Agents
Tiered model routing sounds smart but adds complexity. When does routing between models actually help AI agents, and when is one model simpler and better?
Multi-LLM Agent Routing - Using Different Models for Different Subtasks
How AI agents route between multiple LLMs - using Claude for orchestration, smaller models for classification, and specialized models for code generation.
When Cheaper AI Models Are Good Enough for Daily Development
Sonnet handles Python wrappers and routine coding just fine. Opus shines for architecture decisions. How to route AI model usage by task complexity and save money.
Tips for Secondary Models - When to Use Haiku vs Opus in AI Agents
Choosing the right model tier for different AI agent tasks saves money without sacrificing quality. Learn when to use cheap models like Haiku and when to reach for Opus.
Wonder Behind a Load Balancer - Routing Models by Task Complexity
Load balancing between AI models by task complexity cuts costs without sacrificing quality. Route simple tasks to cheap models and complex tasks to capable models.
How to Cut AI Agent Costs 50-70% with Model Routing
Route simple tasks to local Ollama models, complex ones to Claude. Combine that with aggressive state summarization and context pruning to keep token usage down.
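The pattern these articles describe can be sketched as a small dispatcher. Everything here is illustrative: the model identifiers (`ollama/llama3`, `claude-sonnet`) and the keyword-based complexity heuristic are assumptions, not any article's actual implementation.

```python
def estimate_complexity(task: str) -> int:
    """Crude heuristic: longer prompts and certain keywords imply harder tasks."""
    hard_keywords = ("architecture", "refactor", "design", "debug", "migrate")
    score = len(task) // 200  # very long prompts lean complex
    score += sum(2 for kw in hard_keywords if kw in task.lower())
    return score

def route_model(task: str, threshold: int = 2) -> str:
    """Route simple tasks to a cheap local model, complex ones to a capable hosted one."""
    if estimate_complexity(task) >= threshold:
        return "claude-sonnet"   # hypothetical hosted model id
    return "ollama/llama3"       # hypothetical local model id

print(route_model("Rename this variable"))
# -> ollama/llama3
print(route_model("Design the architecture for a multi-tenant billing service"))
# -> claude-sonnet
```

A real router would sit in front of the API clients and log each decision, so the threshold can be tuned against observed quality and cost.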
Optimizing 23 AI Agent Cron Jobs from $14/Day to $3/Day
Practical cost reduction for AI agent cron jobs - how we cut daily spend from $14 to $3 by optimizing prompts, routing models, and batching tasks.
Using Opus as Orchestrator, Delegating to Sonnet and Haiku
The real win of using Opus as an orchestrator that delegates to Sonnet and Haiku is not cost savings - it is context window management. Opus burns through its context window fast.