Memory that forgets, like humans do.
A production-grade memory system for AI agents with intelligent forgetting.
Built on cognitive science principles: decay, rehearsal, consolidation, and importance-based retention.
Problem • Solution • Architecture • Quick Start • Docs • Contributing
Current agent memory solutions accumulate memories forever. This causes:
| Issue | Impact |
|---|---|
| 🗑️ Context Pollution | Irrelevant old memories dilute retrieval quality |
| 💸 Cost Explosion | Vector stores grow unbounded |
| ⏰ Temporal Confusion | No distinction between recent and ancient context |
| 📦 No Consolidation | Raw events never become structured knowledge |
Humans don't work this way. We forget. And that's a feature, not a bug.
Cognitive Memory implements a biologically-inspired memory architecture:
```
┌─────────────────────────────────────────────────────────────────┐
│                     HOW HUMAN MEMORY WORKS                      │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  📥 Experience ──▶ 🧠 Working Memory ──▶ 💾 Long-term Memory    │
│                          │                        │             │
│                          ▼                        │             │
│                  ┌───────────────┐                │             │
│                  │ Consolidation │                │             │
│                  │ (during sleep)│                │             │
│                  └───────┬───────┘                │             │
│                          │                        │             │
│            ▼             ▼                                      │
│     ┌─────────────┐   ┌─────────────┐                           │
│     │    Decay    │   │  Semantic   │                           │
│     │ (forgetting)│   │    Facts    │                           │
│     └─────────────┘   └─────────────┘                           │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

| Feature | Description |
|---|---|
| ⏳ Decay | Memories weaken over time (exponential decay) |
| 🔄 Rehearsal | Accessing a memory strengthens it |
| ⭐ Importance | Multi-factor scoring determines retention priority |
| 🔀 Consolidation | Weak episodic memories become semantic facts |
| 🗑️ Forgetting | Below-threshold memories are pruned |
| Tier | Purpose | Retention | Storage |
|---|---|---|---|
| Working | Current conversation | Session | In-memory |
| Episodic | Past interactions | Days-weeks | Vector DB |
| Semantic | Extracted facts | Months-years | Knowledge Graph |
| Procedural | Skills, patterns | Permanent | PostgreSQL |
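The four tiers above can be modeled as a small policy table. This is an illustrative sketch, not the library's actual data model; the `Tier` and `TierPolicy` names and the concrete retention values are assumptions chosen to match the table:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Tier(Enum):
    WORKING = "working"        # current conversation, held in-memory
    EPISODIC = "episodic"      # past interactions, vector DB
    SEMANTIC = "semantic"      # extracted facts, knowledge graph
    PROCEDURAL = "procedural"  # skills and patterns, PostgreSQL


@dataclass
class TierPolicy:
    tier: Tier
    retention_days: Optional[float]  # None means permanent


# Hypothetical retention policies mirroring the table
# (0 = session-scoped, None = never pruned)
POLICIES = {
    Tier.WORKING: TierPolicy(Tier.WORKING, retention_days=0),
    Tier.EPISODIC: TierPolicy(Tier.EPISODIC, retention_days=30),
    Tier.SEMANTIC: TierPolicy(Tier.SEMANTIC, retention_days=365),
    Tier.PROCEDURAL: TierPolicy(Tier.PROCEDURAL, retention_days=None),
}
```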
```
┌─────────────────────────────────────────────────────────────────┐
│                         MEMORY SYSTEM                           │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   ┌─────────────┐    ┌─────────────┐    ┌─────────────┐         │
│   │   WORKING   │    │  EPISODIC   │    │  SEMANTIC   │         │
│   │   MEMORY    │    │   MEMORY    │    │   MEMORY    │         │
│   │  (context)  │    │  (events)   │    │  (facts)    │         │
│   └──────┬──────┘    └──────┬──────┘    └──────┬──────┘         │
│          │                  │                  │                │
│          ▼                  ▼                  ▼                │
│   ┌─────────────────────────────────────────────────────────┐   │
│   │                   RETRIEVAL ENGINE                      │   │
│   │           (decay-aware scoring + MMR diversity)         │   │
│   └─────────────────────────────────────────────────────────┘   │
│                             │                                   │
│                             ▼                                   │
│   ┌─────────────────────────────────────────────────────────┐   │
│   │                 CONSOLIDATION ENGINE                    │   │
│   │      (clustering → summarization → fact extraction)     │   │
│   └─────────────────────────────────────────────────────────┘   │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```

```bash
pip install cognitive-memory
```

```python
from cognitive_memory import MemoryManager

# Initialize
memory = MemoryManager()

# Remember something
memory.remember(
    content="User prefers dark mode and vim keybindings",
    source="conversation",
)

# Recall relevant memories (decay-aware)
results = memory.recall(
    query="What are the user's preferences?",
    k=5,
)

# Build context for LLM
context = memory.get_context(
    query="Help configure their editor",
    max_tokens=4000,
)
```

```python
from cognitive_memory.integrations import CognitiveCheckpointer

checkpointer = CognitiveCheckpointer()
graph = builder.compile(checkpointer=checkpointer)
```

```python
from cognitive_memory.integrations import CognitiveMemory

memory = CognitiveMemory()
chain = ConversationChain(llm=llm, memory=memory)
```

Every memory has a strength that decays exponentially:
```
S(t) = S₀ × e^(-λ × Δt)
```

| Variable | Meaning |
|---|---|
| S(t) | Strength at time t |
| S₀ | Initial strength |
| λ | Decay rate |
| Δt | Time elapsed |
Rehearsal Effect: When you retrieve a memory, its strength is boosted—just like how recalling something helps you remember it.
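Decay and rehearsal can be sketched in a few lines. This is a minimal illustration of the formula above, not the library's implementation; the `boost` value and the cap at 1.0 are assumptions:

```python
import math


def strength(s0: float, decay_rate: float, dt_days: float) -> float:
    """S(t) = S₀ × e^(-λ × Δt): strength decays exponentially with time."""
    return s0 * math.exp(-decay_rate * dt_days)


def rehearse(s: float, boost: float = 0.3) -> float:
    """Retrieval strengthens a memory (rehearsal effect), capped at 1.0."""
    return min(1.0, s + boost)


# A memory with initial strength 1.0 and decay rate λ = 0.1/day:
s = strength(1.0, 0.1, 7)  # after one week, s ≈ 0.50
s = rehearse(s)            # a retrieval boosts it back to ≈ 0.80
```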
| Factor | Weight | Description |
|---|---|---|
| Access frequency | 25% | How often retrieved |
| Recency | 20% | How recently accessed |
| Emotional salience | 15% | Strong reactions |
| Surprise | 10% | Unexpected information |
| Explicit markers | 20% | User said "remember this" |
| Entity relevance | 10% | Contains important entities |
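The importance score is a weighted sum of these factors. A minimal sketch, assuming each factor is normalized to [0, 1] before weighting (the factor key names here are hypothetical):

```python
# Weights taken from the table above; they sum to 1.0
WEIGHTS = {
    "access_frequency": 0.25,
    "recency": 0.20,
    "emotional_salience": 0.15,
    "surprise": 0.10,
    "explicit_marker": 0.20,
    "entity_relevance": 0.10,
}


def importance(factors: dict) -> float:
    """Weighted sum of factor scores; missing factors count as 0."""
    return sum(w * factors.get(name, 0.0) for name, w in WEIGHTS.items())


# A memory the user explicitly flagged, accessed recently but rarely:
score = importance({
    "recency": 0.9,
    "explicit_marker": 1.0,
    "access_frequency": 0.2,
})
# 0.20 × 0.9 + 0.20 × 1.0 + 0.25 × 0.2 = 0.43
```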
Periodically, weak episodic memories are:
- Clustered by semantic similarity
- Summarized into semantic facts
- Stored in the knowledge graph
- Pruned from episodic storage
This mimics how human memory consolidates during sleep.
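The consolidation loop above can be sketched as a pipeline. This is an illustrative greedy version with toy cosine similarity; the real engine would use embedding-model vectors and an LLM for the `summarize` step, and the threshold is an assumption:

```python
def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)


def cluster(memories, threshold=0.8):
    """Greedy single-pass clustering by embedding similarity."""
    clusters = []
    for mem in memories:
        for c in clusters:
            if cosine(mem["embedding"], c[0]["embedding"]) >= threshold:
                c.append(mem)
                break
        else:
            clusters.append([mem])
    return clusters


def consolidate(memories, summarize):
    """Weak episodic memories -> one semantic fact per cluster.

    In practice `summarize` would be an LLM call; callers would then
    store the facts in the knowledge graph and prune the originals
    from episodic storage.
    """
    return [summarize(c) for c in cluster(memories)]
```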
| Feature | Mem0 | Zep | MemGPT | Cognitive Memory |
|---|---|---|---|---|
| Decay function | ❌ | ❌ | ❌ | ✅ |
| Importance scoring | Basic | Basic | ❌ | ✅ Multi-factor |
| Auto consolidation | ❌ | ❌ | Manual | ✅ |
| Principled forgetting | ❌ | ❌ | ❌ | ✅ |
| Multi-tier | ❌ | Partial | ✅ | ✅ 4 tiers |
| Knowledge graph | ❌ | ❌ | ❌ | ✅ |
| LangGraph native | ❌ | ❌ | ❌ | ✅ |
| Document | Description |
|---|---|
| Architecture | System design and components |
| Algorithms | Decay, importance, consolidation |
| Configuration | All configuration options |
| Integrations | LangGraph, LangChain setup |
| Deployment | Production deployment guide |
| Benchmarks | Performance measurements |
🚧 In Active Development
This project is under active development. The core architecture is designed, and implementation is in progress.
- Core memory models and configuration
- Decay and importance engines
- Retrieval with decay-aware scoring
- Consolidation engine
- LangGraph integration
- LangChain integration
- REST API
- Benchmarks and evaluation
- v0.1.0 release
Want to contribute? Check out CONTRIBUTING.md or join the Discussions.
We welcome contributions! See CONTRIBUTING.md for guidelines.
```bash
# Clone and setup
git clone https://github.com/NP-compete/cognitive-memory.git
cd cognitive-memory
pip install -e ".[dev]"

# Run tests
make test

# Run linters
make lint
```

If you use Cognitive Memory in your research, please cite:
```bibtex
@software{cognitive_memory,
  author = {Dutta, Soham},
  title = {Cognitive Memory: Memory that forgets, like humans do},
  url = {https://github.com/NP-compete/cognitive-memory},
  year = {2026}
}
```

MIT License - see LICENSE for details.
- Inspired by cognitive science research on human memory
- Built for the LangGraph and LangChain ecosystems
- Vector storage: Qdrant, Pinecone
- Knowledge graph: Neo4j
Memory that forgets, so your agents remember what matters.