How to Choose Which Model for Each Task in AI Agents

Fazm Team · 2 min read

The idea sounds clean: route simple tasks to a cheap model, complex tasks to an expensive one. In practice, most teams that try tiered model routing end up reverting to a single model. Here is why - and when routing actually makes sense.

The Routing Problem

To route tasks to different models, you need a classifier that decides task complexity. That classifier is itself an LLM call - or a brittle heuristic. You are adding latency, another failure point, and a new thing to maintain. If the classifier gets it wrong, a simple task hits the expensive model anyway, or a complex task gets a bad answer from the cheap one.

The irony is that deciding which model to use often requires the same reasoning capability as doing the task itself.
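The pattern in question looks roughly like this sketch. The model names are hypothetical, and the classifier is stubbed with the kind of brittle heuristic the text describes; in a real system it would be another LLM call, adding its own latency and failure modes:

```python
MODELS = {"simple": "cheap-model", "complex": "expensive-model"}

def classify_complexity(task: str) -> str:
    """Stand-in for the extra classification step.

    A word-count threshold is exactly the kind of brittle heuristic that
    misroutes tasks; replacing it with an LLM call adds latency and a new
    failure point instead.
    """
    return "complex" if len(task.split()) > 50 else "simple"

def route(task: str) -> str:
    # If classify_complexity is wrong, a simple task hits the expensive
    # model anyway, or a complex task gets a bad answer from the cheap one.
    return MODELS[classify_complexity(task)]
```

Every request now pays for the classification step before any real work happens, which is the maintenance and latency cost the single-model approach avoids.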

When One Model Wins

If your agent handles a narrow domain - desktop automation, code review, email triage - one well-chosen model is usually simpler and more reliable. The consistency alone is worth it. You don't have to debug why the agent behaved differently on Tuesday because the router sent a task to a different model.

For Fazm, we found that a single model with good prompting handles the full range of desktop tasks. The latency savings from routing didn't justify the complexity.

When Routing Makes Sense

Routing works when you have clearly separable task types with very different requirements. Screen element identification versus multi-step planning is a clean split. If you're processing thousands of requests per hour and cost matters at scale, routing can cut bills by 50-70%.
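The savings arithmetic is straightforward. With illustrative numbers only (assume 70% of requests are simple and the cheap model costs a tenth as much per request as the expensive one), routing cuts cost by about 63%, in line with the 50-70% range:

```python
# Illustrative numbers, not real pricing.
simple_share = 0.7   # fraction of requests the cheap model can handle
cheap_cost = 0.1     # cheap model's per-request cost, relative to expensive = 1.0

single_model_cost = 1.0
routed_cost = simple_share * cheap_cost + (1 - simple_share) * 1.0  # 0.37

savings = 1 - routed_cost / single_model_cost
print(f"{savings:.0%}")  # → 63%
```

The savings scale with both the share of simple requests and the price gap between tiers, which is why routing only pays off at meaningful volume.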

The key is having task boundaries that a simple rule can identify - no classifier needed. If you need an LLM to decide which LLM to use, you've already lost.
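When boundaries are that clean, the router collapses to a lookup. A minimal sketch, assuming hypothetical model names and that the task type is known upfront (for example, from which code path or endpoint produced the request), so no classifier is involved:

```python
# Task type is determined by the calling code, not inferred from content.
ROUTES = {
    "identify_element": "small-vision-model",   # screen element identification
    "plan_steps": "large-reasoning-model",      # multi-step planning
}

def pick_model(task_type: str) -> str:
    # Unknown task types fall back to the capable model rather than guessing.
    return ROUTES.get(task_type, "large-reasoning-model")
```

The deciding rule here costs nothing and cannot misclassify, because the split comes from the system's structure rather than from the request's content.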

Start Simple

Use one model. Measure where it's overkill. Only add routing when you have data showing clear cost or latency wins on specific task categories.

Fazm is an open source macOS AI agent, available on GitHub.
