Store, Observe, Learn
The Agent Memory Stack
Think of it as Supabase for agent memory.
Scale from local demos to production without rebuilding context infrastructure — a three-layer memory stack that covers everything your agents need to remember.
How It Works
The capabilities that power production AI agents — store context, observe behavior, learn from experience, and monitor everything.
Store and retrieve agent messages in any LLM format — OpenAI, Anthropic, Gemini.
from acontext import AcontextClient
client = AcontextClient(api_key="sk-ac-...")
# Create a session
session = client.sessions.create()
# Store messages (any provider format)
client.sessions.store_message(
    session.id,
    blob={"role": "user", "content": "Hello!"}
)
# Retrieve in any format: openai, anthropic, gemini
messages = client.sessions.get_messages(
    session.id, format="anthropic"
)
)
Platform Capabilities
The production-grade infrastructure your agents need — short-term memory, mid-term state, long-term skills, and more.
Context Engineering
Edit, compress, and summarize context on-the-fly — token_limit, middle_out, and session summary strategies keep your agents efficient without modifying stored messages.
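To make the idea concrete, here is a minimal sketch (not the Acontext API, which applies these strategies server-side without touching stored messages) of how a middle_out strategy can work: keep the earliest and latest messages and drop from the middle until a token budget is met.

```python
# Concept sketch of a "middle_out" compression strategy (illustrative only;
# this is a standalone function, not a call into the Acontext SDK).

def middle_out(messages, max_tokens, count_tokens=lambda m: len(m["content"]) // 4):
    """Keep messages from both ends, dropping the middle to fit the budget."""
    if sum(count_tokens(m) for m in messages) <= max_tokens:
        return messages
    head, tail = [], []
    budget = max_tokens
    i, j = 0, len(messages) - 1
    take_head = True
    while i <= j:
        m = messages[i] if take_head else messages[j]
        cost = count_tokens(m)
        if cost > budget:
            break  # budget exhausted; everything between i and j is dropped
        budget -= cost
        if take_head:
            head.append(m)
            i += 1
        else:
            tail.append(m)
            j -= 1
        take_head = not take_head
    return head + tail[::-1]
```

The default token counter here is a rough chars/4 heuristic; a real implementation would use the model's tokenizer.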
Multimodal Short-term Memory
Unified, persistent storage for all agent data — messages, files, and skills — eliminating fragmented backends (DB, S3, Redis).
Artifact Disk
Filesystem-like workspace to store and share multi-modal outputs (.md, code, reports), ready for multi-agent collaboration.
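As a rough mental model (the `ArtifactDisk` class below is hypothetical, not the Acontext SDK), a path-addressed workspace that multiple agents can read and write looks like this:

```python
# Minimal in-memory sketch of a path-addressed artifact workspace
# (illustrative only; the class and method names here are invented).

class ArtifactDisk:
    def __init__(self):
        self._files = {}  # path -> bytes

    def write(self, path, data):
        """Store text or bytes under a filesystem-like path."""
        self._files[path] = data if isinstance(data, bytes) else data.encode()

    def read(self, path):
        return self._files[path]

    def ls(self, prefix=""):
        """List stored paths under a prefix, like a directory listing."""
        return sorted(p for p in self._files if p.startswith(prefix))

disk = ArtifactDisk()
disk.write("reports/summary.md", "# Findings\n")
disk.write("code/plot.py", "print('hi')")
```

One agent can write a report under `reports/` and another can list and read it, which is the collaboration pattern the Artifact Disk enables.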
Background Observer
Automatically extracts tasks from agent conversations and tracks their status in real-time — from pending to running to success or failure.
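The task lifecycle the observer tracks can be sketched as a small state machine (a standalone illustration, not Acontext internals):

```python
# Sketch of the tracked task lifecycle: pending -> running -> success | failure.
from enum import Enum

class TaskStatus(Enum):
    PENDING = "pending"
    RUNNING = "running"
    SUCCESS = "success"
    FAILURE = "failure"

# Legal transitions; SUCCESS and FAILURE are terminal states.
ALLOWED = {
    TaskStatus.PENDING: {TaskStatus.RUNNING},
    TaskStatus.RUNNING: {TaskStatus.SUCCESS, TaskStatus.FAILURE},
}

def advance(current, nxt):
    """Move a task to its next status, rejecting illegal transitions."""
    if nxt not in ALLOWED.get(current, set()):
        raise ValueError(f"illegal transition {current.value} -> {nxt.value}")
    return nxt
```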
Self-Learning
Attach sessions to a Learning Space and Acontext automatically distills successful task outcomes into skills — agents improve with every run without manual curation.
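Conceptually, the distillation step filters attached sessions for successful task outcomes and keeps their steps as reusable skills. A minimal sketch of that idea (not the Acontext API) under the assumption that each outcome records a task name, a status, and the steps taken:

```python
# Concept sketch: distill successful task outcomes into a skill library
# (illustrative only; the outcome schema here is an assumption).

def distill_skills(task_outcomes):
    """Keep one skill entry per task name, taken only from successful runs."""
    skills = {}
    for outcome in task_outcomes:
        if outcome["status"] == "success":
            # First successful run wins; failures contribute nothing.
            skills.setdefault(outcome["task"], outcome["steps"])
    return skills
```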
SDKs & Integrations
Ready to use with OpenAI, Anthropic, LangGraph, Agno, and other popular agent frameworks.
Self-Host in Seconds
Run the full Acontext stack on your own infrastructure with a single command.
curl -fsSL https://install.acontext.io | sh