CHRONICLE stores, retrieves, and evolves agent memory across sessions. Hybrid search. Encrypted. Self-hostable. Drop-in compatible with LangChain, CrewAI, and OpenAI.
Three operations. Infinite possibilities. CHRONICLE handles the entire memory lifecycle so your agents don’t have to.
Agents write memories as structured events — decisions, observations, user context, outcomes. CHRONICLE timestamps, tags, and encrypts every entry automatically.
Hybrid vector + keyword retrieval returns the most relevant memories in under 50ms. Semantic similarity finds context even when exact terms differ.
Memories consolidate over time: frequently accessed and highly salient entries strengthen, while stale memories decay gracefully, keeping context sharp.
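The strengthen-and-decay behavior can be pictured with a small scoring function: accesses reinforce a memory with diminishing returns, while age applies exponential decay. This is an illustrative sketch only, not CHRONICLE's actual consolidation algorithm; the function name and half-life parameter are assumptions.

```python
import math
import time

def memory_strength(base_salience: float, access_count: int,
                    last_access_ts: float, now: float,
                    half_life_days: float = 30.0) -> float:
    """Toy consolidation score: accesses strengthen, time decays.

    Illustrative only -- not CHRONICLE's actual algorithm.
    """
    age_days = (now - last_access_ts) / 86_400
    decay = 0.5 ** (age_days / half_life_days)       # exponential decay
    reinforcement = 1.0 + math.log1p(access_count)   # diminishing returns
    return base_salience * reinforcement * decay

now = time.time()
fresh = memory_strength(0.8, access_count=10, last_access_ts=now, now=now)
stale = memory_strength(0.8, access_count=10,
                        last_access_ts=now - 90 * 86_400, now=now)
assert fresh > stale  # stale memories rank lower
```

With a 30-day half-life, a memory untouched for 90 days keeps only one eighth of its decay factor, so recently used context wins ties.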
CHRONICLE models a human-inspired memory architecture, so agents recall context the way minds do, not the way rows come back from a SQL query.
Ordered sequences of events with timestamps, actors, and outcomes. Gives agents autobiographical recall — every session, decision, and interaction persisted with full context.
Facts, relationships, and concepts organized in a queryable knowledge graph. Agents accumulate durable beliefs about users, domains, and the world — surviving session boundaries.
Skill memories encoding successful action sequences, tool-use patterns, and step-by-step workflows. Agents learn from what worked and refine their capabilities over time.
Active context window management for in-progress reasoning. CHRONICLE tracks which memories are in active use, ensuring agents never lose their current thread across tool calls.
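The four memory types above can be pictured as one record schema with a type discriminator. This is a minimal illustrative data model; the field names are assumptions, not CHRONICLE's actual storage format.

```python
from dataclasses import dataclass, field
from typing import Literal

MemoryType = Literal["episodic", "semantic", "procedural", "working"]

@dataclass
class Memory:
    # Minimal illustrative schema -- field names are assumptions,
    # not CHRONICLE's storage format.
    id: str
    type: MemoryType
    content: str
    tags: list[str] = field(default_factory=list)
    timestamp: float = 0.0

episode = Memory(id="mem_001", type="episodic",
                 content="Resolved ticket via refund", timestamp=1700000000.0)
belief = Memory(id="mem_002", type="semantic",
                content="User prefers concise responses", tags=["preferences"])
assert episode.type == "episodic" and belief.tags == ["preferences"]
```

A single schema with a `type` field is what lets one search call span episodic and semantic memories at once, as in the `types=["semantic", "episodic"]` filter shown below.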
One pip install. Works with every major AI framework out of the box — LangChain, CrewAI, OpenAI Assistants, and raw HTTP for custom stacks.
from chronicle import ChronicleMemory

# Initialize with your agent's identity
memory = ChronicleMemory(
    api_key="chron_sk_...",
    agent_id="agent-prod-1",
)

# Store a memory (async-safe)
memory.store(
    "User prefers concise responses",
    type="semantic",
    tags=["preferences", "style"],
)

# Hybrid search — vector + keyword
results = memory.search(
    "user communication style",
    limit=5,
    types=["semantic", "episodic"],
)
{
  "memories": [
    {
      "id": "mem_8f3a2d1c",
      "content": "User prefers concise responses",
      "type": "semantic",
      "score": 0.97
    }
  ],
  "relevance_scores": [0.97, 0.84, 0.71],
  "total": 3
}
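A common pattern is to filter the returned memories by score before injecting them into a prompt. This sketch assumes a response of the shape shown above; the second entry and its ID are invented for illustration.

```python
# Hypothetical response dict matching the JSON shape above.
# The second memory entry is invented for illustration.
response = {
    "memories": [
        {"id": "mem_8f3a2d1c", "content": "User prefers concise responses",
         "type": "semantic", "score": 0.97},
        {"id": "mem_1b9c44e0", "content": "Session ended after refund issued",
         "type": "episodic", "score": 0.84},
    ],
    "total": 3,
}

# Keep only high-confidence matches before adding them to the prompt.
THRESHOLD = 0.9
relevant = [m["content"] for m in response["memories"]
            if m["score"] >= THRESHOLD]
assert relevant == ["User prefers concise responses"]
```

Thresholding keeps prompts small: only memories the retriever is confident about consume context-window tokens.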
Every feature your AI memory layer needs — without the complexity of rolling your own.
Combines dense vector embeddings with BM25 keyword matching. Get the best of semantic similarity and exact-term recall in a single ranked result set.
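One standard way to merge a dense-vector ranking with a BM25 ranking into a single list is reciprocal rank fusion. The sketch below illustrates the idea only; it is not claimed to be CHRONICLE's actual fusion method, and the memory IDs are invented.

```python
def reciprocal_rank_fusion(*rankings: list[str], k: int = 60) -> list[str]:
    """Merge ranked ID lists: each list contributes 1 / (k + rank)."""
    scores: dict[str, float] = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits  = ["mem_a", "mem_b", "mem_c"]   # dense-embedding ranking
keyword_hits = ["mem_b", "mem_d", "mem_a"]   # BM25 ranking
fused = reciprocal_rank_fusion(vector_hits, keyword_hits)
assert fused[0] == "mem_b"  # ranked well in both lists, so it wins
```

Rank-based fusion sidesteps the problem that cosine similarities and BM25 scores live on incomparable scales: only each document's position in each list matters.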
Every memory is encrypted at rest using AES-256-GCM and in transit with TLS 1.3. Your agent's context never leaves your control unencrypted.
Memories survive restarts, redeploys, and crashes. Agents resume exactly where they left off — no cold-start amnesia, no lost context.
Each agent operates in a private, cryptographically separated memory namespace. No cross-contamination between agents, tenants, or environments.
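One way to realize cryptographic separation is to derive a distinct encryption key per agent namespace from a master secret, so one agent's key can never decrypt another's data. This HMAC-based sketch is illustrative only; it is an assumption, not CHRONICLE's actual key schedule.

```python
import hashlib
import hmac

def derive_namespace_key(master_key: bytes, agent_id: str) -> bytes:
    """Derive a per-agent subkey from a master secret.

    Simplified HKDF-style derivation -- illustrative, not
    CHRONICLE's actual key schedule.
    """
    return hmac.new(master_key, agent_id.encode(), hashlib.sha256).digest()

master = b"\x00" * 32  # placeholder master secret for the example
key_a = derive_namespace_key(master, "agent-prod-1")
key_b = derive_namespace_key(master, "agent-prod-2")
assert key_a != key_b   # distinct namespaces get distinct keys
assert len(key_a) == 32  # SHA-256 output
```

Because derivation is one-way, compromising a single agent's key reveals nothing about the master secret or any sibling namespace.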
Important and frequently accessed memories strengthen over time. CHRONICLE's consolidation engine ensures the most relevant context always surfaces first.
Full data sovereignty with a single Docker image. Run on your own infrastructure — cloud, on-prem, or air-gapped. No vendor lock-in, no egress fees.
All plans include AES-256 encryption, hybrid search, and drop-in framework adapters. No seat fees. No surprises.
Start persisting memory today; no credit card required.
For teams shipping production agents.
For high-throughput agent systems.
For regulated and large-scale deployments.
Deploy in minutes. Self-host on your own infrastructure or use our managed cloud. No configuration required.