Memory Layer

A visualization of how AI memory works across different time scales. This interactive diagram demonstrates the hierarchical structure of AI memory and how information flows between different memory systems.

Academic Exhibit: Layered Memory Architectures for LLMs

Layered Memory Architectures for Large Language Models

Towards Cognitively Inspired and Enhanced Natural Language Processing

Layered Memory: A Hierarchical Approach

Current LLMs use a simple sliding window mechanism

Our approach: structured nebula of information layers

Distinct temporal scales and access patterns
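The idea of layers with distinct temporal scales can be sketched in a few lines. This is an illustrative sketch only, not the exhibit's implementation: the layer names match the diagram, but the `MemoryLayer` class and the TTL values are assumptions chosen to show the access pattern.

```python
class MemoryLayer:
    """One layer of a hierarchical memory; items expire on that layer's time scale."""

    def __init__(self, name, ttl_seconds):
        self.name = name
        self.ttl = ttl_seconds
        self.items = []  # list of (timestamp, content) pairs

    def store(self, content, now):
        self.items.append((now, content))

    def recall(self, now):
        # Only items still within this layer's temporal scale are accessible.
        return [c for (t, c) in self.items if now - t <= self.ttl]


# Illustrative time scales (assumptions) for the four layers in the diagram.
layers = [
    MemoryLayer("context", ttl_seconds=60),              # the immediate window
    MemoryLayer("short-term", ttl_seconds=3600),         # within a session
    MemoryLayer("medium-term", ttl_seconds=86400),       # across a day
    MemoryLayer("long-term", ttl_seconds=float("inf")),  # persistent
]
```

Because each layer expires independently, a fact can vanish from the context layer while remaining recallable from the long-term layer, which is the key difference from a single sliding window.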

Significance of Layered Memory

Enhanced context retention

Improved reasoning capabilities

Dynamic information flow

Future Implications

Paradigm shift in AI memory

Cognitive architecture evolution

Enhanced human-AI interaction

Traditional Context Window (Current)

Current LLMs use a simple sliding window, like viewing the world through a narrow keyhole. As new information comes in, old information is completely lost.

Fixed window size: 2048 tokens
  • ⚠️ Only sees last 2048 tokens
  • ⚠️ Equal weight to all tokens
  • ⚠️ Important context gets pushed out
  • ⚠️ No memory persistence
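The keyhole behavior described above is easy to demonstrate: a fixed-capacity buffer silently evicts the oldest tokens as new ones arrive. A minimal sketch using a bounded deque (the window size of 4 is chosen only to keep the example readable):

```python
from collections import deque

def sliding_window(tokens, window_size=2048):
    """Keep only the most recent `window_size` tokens; older ones are lost."""
    window = deque(maxlen=window_size)
    for tok in tokens:
        window.append(tok)  # past capacity, this silently evicts the oldest token
    return list(window)

# With a window of 4, the first two tokens are pushed out and unrecoverable.
print(sliding_window(["a", "b", "c", "d", "e", "f"], window_size=4))
# ['c', 'd', 'e', 'f']
```

Nothing distinguishes an important early token from a filler token: both are dropped purely by position, which is the failure mode the layered approach is meant to address.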
Current conversation focus: general (all memory types weighted equally)
Memory Types
Context Window
Short-term Memory
Medium-term Memory
Long-term Memory


Context Distribution (default, all weights equal):
  • long-term: 25%
  • medium-term: 25%
  • short-term: 25%
  • context: 25%
