Memory Layer
A visualization of how AI memory works across different time scales. This interactive diagram demonstrates the hierarchical structure of AI memory and how information flows between different memory systems.
Layered Memory Architectures for Large Language Models
Towards Cognitively Inspired and Enhanced Natural Language Processing
Traditional Context Window (Current)
Current LLMs use a simple sliding window, like viewing the world through a narrow keyhole: as new information arrives, older information is lost entirely. A minimal sketch of this behavior follows the list below.
Fixed window size: 2048 tokens
- ⚠️ Only sees last 2048 tokens
- ⚠️ Equal weight to all tokens
- ⚠️ Important context gets pushed out
- ⚠️ No memory persistence
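
To make the failure mode concrete, here is a minimal sketch of a fixed-size sliding window, assuming tokens are plain strings and using the 2048-token limit shown above. The class and method names are illustrative, not part of any real LLM API.

```python
from collections import deque


class SlidingContextWindow:
    """Minimal sketch of a fixed-size sliding context window."""

    def __init__(self, max_tokens: int = 2048):
        # deque(maxlen=...) silently drops the oldest tokens once full,
        # mirroring how older context is completely lost.
        self.window = deque(maxlen=max_tokens)

    def append(self, tokens: list[str]) -> None:
        """Add new tokens; anything pushed past the limit is gone for good."""
        self.window.extend(tokens)

    def context(self) -> list[str]:
        """Every surviving token is treated with equal weight."""
        return list(self.window)
```

Calling `append` repeatedly eventually evicts the earliest conversation turns regardless of their importance, which is exactly the "important context gets pushed out" problem listed above.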
Current Conversation Focus: general
Under this focus, all memory types are weighted equally (see the sketch below).
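
One hedged reading of "weighted equally" in code, assuming retrieval draws on the four memory types listed in the next section; the focus name and the weighting scheme are illustrative assumptions, not something the exhibit specifies.

```python
MEMORY_TYPES = ["context_window", "short_term", "medium_term", "long_term"]


def memory_weights(focus: str = "general") -> dict[str, float]:
    """Return a retrieval weight per memory type for a given conversation focus.

    Only the "general" focus is described by the exhibit: every memory type
    receives the same weight. Other foci would presumably shift weight toward
    particular layers, but that behavior is not specified here.
    """
    if focus == "general":
        return {m: 1.0 / len(MEMORY_TYPES) for m in MEMORY_TYPES}
    raise ValueError(f"Weighting for focus {focus!r} is not described by the exhibit.")
```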
Memory Types
- Context Window
- Short-term Memory
- Medium-term Memory
- Long-term Memory
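
The hierarchy above can be read as a cascade of stores with increasing capacity and retention. The sketch below is one possible reading; the placeholder capacities and the eviction-cascade rule are assumptions rather than anything the exhibit defines.

```python
from dataclasses import dataclass, field


@dataclass
class MemoryLayer:
    """One tier of the hierarchy; the capacity values used below are placeholders."""
    name: str
    capacity: int
    items: list[str] = field(default_factory=list)

    def add(self, item: str) -> str | None:
        """Store an item and return whatever gets evicted, so it can flow deeper."""
        self.items.append(item)
        if len(self.items) > self.capacity:
            return self.items.pop(0)
        return None


class LayeredMemory:
    """Sketch of information flowing from the context window toward long-term memory."""

    def __init__(self) -> None:
        self.layers = [
            MemoryLayer("context_window", capacity=4),
            MemoryLayer("short_term", capacity=16),
            MemoryLayer("medium_term", capacity=64),
            MemoryLayer("long_term", capacity=256),
        ]

    def observe(self, item: str) -> None:
        """New information enters the context window; evictions cascade downward."""
        carry: str | None = item
        for layer in self.layers:
            if carry is None:
                break
            carry = layer.add(carry)
```

Unlike the plain sliding window, nothing is discarded outright until it has passed through every layer, which is one way memory persistence can be layered on top of a fixed context window.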