Why Ebbinghaus Decay Curves Beat Flat Vector Stores for Agent Memory

Fazm Team · 3 min read


Most AI agent memory systems take the simplest possible approach: store everything as vector embeddings and retrieve by similarity. It works for demos. It fails in production. The Ebbinghaus decay curve - borrowed from cognitive psychology - offers a fundamentally better model for how AI agents should remember and forget.

The Problem with Flat Vector Stores

A vector store treats every memory as equally valid forever. Your coffee preference from six months ago has the same retrieval weight as one from yesterday. Context from a project you finished long ago competes with your current work for attention in the context window.

As the store grows, retrieval quality degrades. More memories mean more noise in similarity search results. The agent starts surfacing irrelevant context that dilutes the useful information.

How Ebbinghaus Decay Works

Hermann Ebbinghaus discovered that human memory follows a predictable decay curve. New information fades rapidly unless it gets reinforced through repetition. The same principle maps beautifully to agent memory:

  • Fresh memories start strong - recently stored information gets high retrieval priority
  • Unused memories decay - information that the agent never accesses gradually loses priority
  • Accessed memories get reinforced - every time a memory is retrieved and useful, its strength resets
  • Decayed memories archive, not delete - old memories move to cold storage, available if explicitly searched
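The cycle above can be sketched with Ebbinghaus's classic retention formula, R = e^(-t/S), where t is time since last access and S is a stability parameter that grows with reinforcement. A minimal illustration (the function name and time units are ours, not any particular library's API):

```python
import math

def retention(hours_since_access: float, stability: float) -> float:
    """Ebbinghaus retention R = e^(-t/S).

    t: hours since the memory was last accessed.
    S: stability; grows each time the memory is retrieved and proves useful,
       which flattens the curve so the memory fades more slowly.
    """
    return math.exp(-hours_since_access / stability)

# A just-accessed memory (t = 0) is at full retention (1.0); after one
# stability period it has decayed to e^-1, roughly 0.37.
fresh = retention(0.0, 24.0)
aged = retention(24.0, 24.0)
```

Reinforcement then amounts to resetting t to zero and increasing S, so frequently used memories sit high on the curve while untouched ones slide toward the archive threshold.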

Why This Matters in Practice

An agent with decay-based memory naturally adapts to your changing preferences and projects. When you switch from a React project to a Rust project, the React-specific memories gradually fade from active retrieval while Rust memories build up. No manual cleanup needed.

The decay curve also solves the context window budget problem. Instead of cramming everything into the available tokens, the agent naturally prioritizes recent, frequently-accessed information. The memory system itself does the filtering.
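One way to picture that filtering: rank candidate memories by their combined score and greedily fill the token budget. This is a sketch under our own assumptions (dict-shaped memories with hypothetical `score` and `tokens` fields), not Fazm's actual selection logic:

```python
def fill_context(memories: list[dict], token_budget: int) -> list[dict]:
    """Greedily pack the highest-scoring memories into a token budget.

    Each memory dict carries a combined relevance score (similarity x
    strength) and an estimated token count. Low-scoring or stale
    memories simply never make the cut.
    """
    chosen, used = [], 0
    for memory in sorted(memories, key=lambda m: m["score"], reverse=True):
        if used + memory["tokens"] <= token_budget:
            chosen.append(memory)
            used += memory["tokens"]
    return chosen
```

Because decayed memories carry low scores, they fall out of the packed context automatically; no explicit cleanup pass is needed.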

Implementation Details

The practical implementation adds a last_accessed timestamp and a strength score to each memory. A background process periodically recalculates strength using the Ebbinghaus formula. Retrieval combines vector similarity with strength scoring, so relevant AND recent memories win over relevant but stale ones.
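A minimal sketch of that design, assuming each memory is a record with `last_accessed` (epoch seconds) and `strength` fields; the field names, half-life constant, and strength cap are illustrative, not Fazm's internals:

```python
import math
import time

HALF_LIFE_HOURS = 72.0  # hypothetical decay constant

def decayed_strength(memory: dict, now: float) -> float:
    """Exponentially decay stored strength by time since last access."""
    hours = (now - memory["last_accessed"]) / 3600.0
    return memory["strength"] * math.exp(-hours * math.log(2) / HALF_LIFE_HOURS)

def rank(memories: list[dict], similarities: list[float], now: float) -> list:
    """Combine vector similarity with decayed strength, best first,
    so relevant AND recent memories beat relevant but stale ones."""
    return sorted(
        zip(memories, similarities),
        key=lambda pair: pair[1] * decayed_strength(pair[0], now),
        reverse=True,
    )

def reinforce(memory: dict, now: float) -> None:
    """A useful retrieval resets the clock and bumps strength (capped)."""
    memory["last_accessed"] = now
    memory["strength"] = min(memory["strength"] + 1.0, 10.0)
```

With equal similarity scores, a memory accessed an hour ago outranks one untouched for a month, which is exactly the behavior the decay curve is meant to produce.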

This is not theoretical. Memory systems with decay curves consistently outperform flat vector stores in long-running agent deployments where context quality matters more than total recall.

Fazm is an open source macOS AI agent, available on GitHub.
