
Ebbinghaus Decay Curves for AI Agent Memory - Beyond Vector Similarity

Fazm Team · 2 min read

Tags: memory · ai-agent · ebbinghaus · decay · vector-similarity · forgetting


Most AI agent memory systems work the same way: store everything as embeddings, run vector similarity when you need context, and return the top results. It works until it does not. The problem is that vector similarity treats a three-month-old preference exactly the same as one from yesterday.

Why Vector Similarity Falls Short

Vector search answers one question: "What stored memory is most semantically similar to this query?" It does not answer: "Is this memory still relevant?" or "Has the user changed their mind since then?"

An agent that remembers your old coffee order with the same weight as your current one will keep suggesting the wrong thing. Similarity is not relevance. Time matters.

The Ebbinghaus Approach

Hermann Ebbinghaus showed that human memory decays on a predictable curve. Newly learned information fades quickly unless it gets reinforced through repetition. AI agent memory can follow the same pattern.
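Ebbinghaus's forgetting curve is commonly modeled as exponential decay, R = e^(-t/S), where t is time since the memory was last reinforced and S is the memory's stability. A minimal sketch (the 24-hour baseline stability is an assumption for illustration):

```python
import math

def retention(t_hours: float, strength: float) -> float:
    """Ebbinghaus forgetting curve: R = e^(-t / S).

    t_hours: hours since the memory was last reinforced.
    strength: S, the memory's stability in hours; grows with reinforcement.
    """
    return math.exp(-t_hours / strength)

# A fresh memory with baseline stability S = 24h, checked one day later:
retention(24, 24)  # e^-1 ≈ 0.37
```

Reinforcement works by increasing S, which flattens the curve: the same elapsed time produces a higher retention score for a well-rehearsed memory.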

Here is how it works in practice:

  1. Every memory gets a decay score that decreases over time
  2. Accessing a memory resets its decay - frequently used facts stay strong
  3. Memories that fall below a threshold get archived - not deleted, but deprioritized
  4. Reinforcement through use naturally surfaces what matters most

The Hard Part Is Knowing What to Forget

The real challenge is not implementing the decay function. It is handling edge cases. Some memories are accessed rarely but are critically important - your API keys, your deployment preferences, your security policies. These need explicit protection from decay.

A practical approach is to tag memories with categories. Preferences decay. Facts decay. But policies and safety constraints get pinned permanently.
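One way to implement that tagging, assuming a simple category enum (the category names and the pinned set are illustrative, not a fixed schema):

```python
from enum import Enum

class Category(Enum):
    PREFERENCE = "preference"   # decays normally
    FACT = "fact"               # decays normally
    POLICY = "policy"           # pinned: never archived
    SAFETY = "safety"           # pinned: never archived

PINNED = {Category.POLICY, Category.SAFETY}

def effective_score(category: Category, decay_score: float) -> float:
    """Pinned categories are exempt from decay; everything else keeps its score."""
    return 1.0 if category in PINNED else decay_score
```

With this in place, an API key tagged POLICY survives months of zero accesses, while a stale preference with the same access pattern drops below the archive threshold.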

Why This Matters for Desktop Agents

A desktop agent that runs daily accumulates thousands of observations. Without decay, its context window fills with noise. With Ebbinghaus-style decay, the agent naturally maintains a lean, relevant memory - spending tokens on what actually matters right now.

The best memory system is not the one that remembers everything. It is the one that remembers the right things.



Fazm is an open source macOS AI agent, available on GitHub.
