Conversation context + knowledge retrieval. Auto-trimming, persistence, and vector search. Zero dependencies.
$ npm i agentic-memory
Send messages and watch auto-trimming in action: set the budget to 500 tokens, then flood it.
Manages the messages array so you don't have to.
Sliding-window mode drops the oldest pairs; summarize mode compresses old context instead. You choose.
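The sliding-window idea can be sketched in a few lines. This is an illustrative implementation, not the package's actual internals: the `Message` shape, the `approxTokens` heuristic, and the `slidingWindowTrim` name are all assumptions for the sketch.

```typescript
type Message = { role: "user" | "assistant"; content: string };

// Rough budget heuristic for the sketch: ~4 characters per token.
const approxTokens = (m: Message): number => Math.ceil(m.content.length / 4);

// Sliding-window trim: drop the oldest user/assistant pair until the
// history fits the token budget (always keep at least the latest pair).
function slidingWindowTrim(messages: Message[], maxTokens: number): Message[] {
  const kept = [...messages];
  let total = kept.reduce((sum, m) => sum + approxTokens(m), 0);
  while (total > maxTokens && kept.length > 2) {
    const pair = kept.splice(0, 2); // oldest user+assistant pair
    total -= pair.reduce((sum, m) => sum + approxTokens(m), 0);
  }
  return kept;
}
```

Summarize mode would replace the dropped pairs with a single compressed summary message instead of discarding them outright.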
CJK-aware token counter. No tiktoken dependency (~40 MB). ~90% accurate, good enough for budget management.
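A CJK-aware estimator can be approximated like this. The character ranges and ratios below are illustrative assumptions, not the library's actual constants: CJK characters run close to one token each in common BPE vocabularies, while English averages about four characters per token.

```typescript
// Matches hiragana, katakana, CJK ideographs, and halfwidth kana.
const CJK_RE = /[\u3040-\u30FF\u3400-\u9FFF\uF900-\uFAFF\uFF66-\uFF9D]/g;

// Count CJK characters as ~1 token each; everything else at ~4 chars/token.
function estimateTokens(text: string): number {
  const cjk = (text.match(CJK_RE) ?? []).length;
  const rest = text.length - cjk;
  return cjk + Math.ceil(rest / 4);
}
```

A heuristic like this trades exactness for zero dependencies; for budget enforcement (not billing), being within ~10% is enough.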
localStorage, file, or custom adapter. Conversations survive page reloads and server restarts.
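A custom adapter only has to cover load/save/remove semantics. The interface below is an assumed shape for illustration, not the package's published contract:

```typescript
// Assumed adapter contract: three methods over string payloads.
interface StorageAdapter {
  load(id: string): string | null;
  save(id: string, data: string): void;
  remove(id: string): void;
}

// In-memory adapter: handy for tests; a real one would wrap localStorage
// or the filesystem behind the same three methods.
class MemoryAdapter implements StorageAdapter {
  private store = new Map<string, string>();
  load(id: string): string | null { return this.store.get(id) ?? null; }
  save(id: string, data: string): void { this.store.set(id, data); }
  remove(id: string): void { this.store.delete(id); }
}
```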
Branch a conversation to explore different paths. Like git branch for chat context.
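Conceptually, a fork is a copy of the history up to the branch point, so edits to one branch never leak into the other. A minimal sketch (the `Msg` type and `forkHistory` name are hypothetical):

```typescript
type Msg = { role: string; content: string };

// Copy each message so the branch diverges from the fork point on;
// the original conversation is untouched by edits to the branch.
function forkHistory(history: Msg[]): Msg[] {
  return history.map((m) => ({ ...m }));
}
```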
Serialize full state to JSON. Move conversations between clients, back them up, or analyze them.
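Because state is plain JSON, a round trip is just stringify and parse. The `Exported` shape here is an illustrative assumption; the package's actual schema may differ:

```typescript
// Assumed export shape: a conversation ID plus its message history.
type Exported = {
  id: string;
  messages: { role: string; content: string }[];
};

const exportState = (state: Exported): string => JSON.stringify(state);
const importState = (json: string): Exported => JSON.parse(json) as Exported;
```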
createManager() handles multiple chats. Get by ID, list, delete. Each with its own storage.
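The manager pattern reduces to a map of conversations keyed by ID. `createManager()` is the package's named entry point, but the internals and method signatures below are a sketch, not the real implementation:

```typescript
type Chat = { id: string; messages: { role: string; content: string }[] };

// Minimal multi-chat manager: get-or-create by ID, list, delete.
function createManager() {
  const chats = new Map<string, Chat>();
  return {
    // Return the chat under this ID, creating an empty one if needed.
    get(id: string): Chat {
      let chat = chats.get(id);
      if (!chat) {
        chat = { id, messages: [] };
        chats.set(id, chat);
      }
      return chat;
    },
    list(): string[] { return [...chats.keys()]; },
    delete(id: string): boolean { return chats.delete(id); },
  };
}
```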