Open Source · Early Access

Memory that
persists

Persistent memory for AI coding assistants. Ranked durable context, startup session ingestion, and provider-native hydration let Claude, Codex, and Gemini continue work instead of starting cold.

$ curl -fsSL https://...contynu/.../install.sh | sh

AI coding tools forget everything between sessions

You spend an hour with Claude building a feature. You switch to Codex to iterate. Codex has no idea what you just decided. You switch to Gemini for a review. Gemini starts from scratch.

Every model handoff is a cold start. Every context switch loses work. Every session begins with "let me re-explain the entire project."

Contynu fixes this. Models write their own memories via MCP tools — and the next model reads them automatically.

How Contynu gives AI tools persistent memory

Cross-Model Memory Transfer

Memory flows seamlessly between Claude, Codex, and Gemini. Each model receives context in its optimal format — XML for Claude, AGENTS.md-first continuation for Codex, and structured text for Gemini. Switch models without losing a single fact.
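Provider-native hydration can be pictured as a small render step over the same memories. The sketch below is illustrative only; the function name, memory fields, and exact markup are assumptions, not Contynu's actual output:

```python
# Hypothetical per-provider rendering; names and markup are
# illustrative assumptions, not Contynu's real API.
def render_context(memories: list[dict], provider: str) -> str:
    """Render the same memories in each provider's preferred format."""
    if provider == "claude":
        # XML wrapper for Claude
        items = "\n".join(
            f'  <memory kind="{m["kind"]}">{m["text"]}</memory>' for m in memories
        )
        return f"<memories>\n{items}\n</memories>"
    if provider == "codex":
        # AGENTS.md-first Markdown for Codex
        lines = [f"- **{m['kind']}**: {m['text']}" for m in memories]
        return "## Continuation context\n" + "\n".join(lines)
    # Structured plain text for Gemini
    return "\n".join(f"[{m['kind'].upper()}] {m['text']}" for m in memories)

mems = [{"kind": "decision", "text": "Use HMAC-SHA256 for tokens"}]
```

The point is that one memory store feeds three different context shapes, so no model pays a translation tax.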

🧠

Model-Driven Memory

AI models write their own memories via MCP tools — including write, update, delete, search, and Dream Phase consolidation. The model decides what's worth remembering. No transcript heuristics, no fake summaries.

🔍

Invisible Continuity

Contynu ranks memories against the latest prompt, carries forward a working set, and ingests missing Claude/Codex/Gemini session files at startup. The next model sees what matters now, not a memory dump.

Scoped Memory System

Memories have three scopes: user (follows you everywhere), project (this codebase only), and session (ephemeral). Six kinds: facts, decisions, constraints, todos, user facts, and project knowledge.
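As a rough mental model, a memory record combines one scope, one kind, the text itself, and an importance weight. The field names below mirror the description above, but the concrete schema is an assumption:

```python
# Illustrative model of a memory record; the real schema is not
# documented here, so field names are assumptions.
from dataclasses import dataclass
from typing import Literal

Scope = Literal["user", "project", "session"]
Kind = Literal["fact", "decision", "constraint", "todo",
               "user_fact", "project_knowledge"]

@dataclass
class Memory:
    kind: Kind          # category of knowledge
    scope: Scope        # how far the memory travels
    text: str           # the memory content itself
    importance: float   # 0.0-1.0 ranking weight

m = Memory(kind="decision", scope="project",
           text="Use HMAC-SHA256 for tokens", importance=0.9)
```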

🐙

OpenClaw Plugin

Give OpenClaw agents permanent memory. It records prompts, writes project knowledge, checkpoints before compaction, writes back to MEMORY.md, and keeps the full Contynu MCP tool surface available. Addresses issues #5429, #25947, and more.

🔒

Local-First & Zero Config

All data stays on your machine. No cloud, no accounts, no fees. Replace claude with contynu claude. Auto-detection, auto-registration, auto-hydration, and startup release checks are built in.

Get started in 30 seconds

One install. One command prefix. All your AI coding tools remember everything.

Install — Linux / macOS
$ curl -fsSL https://github.com/alentra-dev/contynu/releases/latest/download/install.sh | sh
Install — Windows (PowerShell)
PS> irm https://github.com/alentra-dev/contynu/releases/latest/download/install.ps1 | iex

Prebuilt binaries for Linux, macOS, and Windows — view all downloads

Use with any AI coding assistant
# Instead of running your AI tool directly...
$ contynu claude     # wraps Claude Code with persistent memory
$ contynu codex      # wraps Codex CLI — picks up where Claude left off
$ contynu gemini     # wraps Gemini CLI — has full context from both
Models write and search memories via MCP
# Models actively curate their own memory:
> write_memory(kind="decision", text="Use HMAC-SHA256 for tokens", importance=0.9)
 Memory saved with scope: project

# And recall it in the next session:
> search_memory(query="authentication")
 Found: "Use HMAC-SHA256 for tokens" (decision, importance: 0.9)
Startup updates and continuity checks
# On launch, Contynu checks for a newer binary for this OS/arch:
> A newer Contynu release is available for this environment: v0.x.y
> [m] manual update  [a] auto update now  [s] skip for this launch

# It also ingests missing external session memory before the next turn.

OpenClaw agents forget. Contynu makes them remember.

The contynu-openclaw plugin gives every OpenClaw agent permanent memory that survives compaction, model switches, and session boundaries. It records prompts after each turn, checkpoints before compaction, writes ranked facts back to MEMORY.md, and uses the bundled contynu mcp-server for direct memory access.

The problem

OpenClaw's context compaction is lossy — critical decisions, safety constraints, and project context get silently destroyed. Users have lost days of work (#5429). The dreaming system crashes (#61951). There's no session memory between restarts (#39885).

The fix

Contynu checkpoints before compaction fires, writes importance-ranked facts to MEMORY.md, and gives agents the full MCP tool surface for write, search, update, delete, prompt recording, and Dream Phase consolidation. The plugin runs Contynu subprocesses non-interactively, so normal OpenClaw hooks stay silent and deterministic.
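A pre-compaction checkpoint can be sketched as: sort facts by importance, keep the top entries, and write them to MEMORY.md. The file layout and cutoff below are assumptions, not Contynu's actual format:

```python
# Hedged sketch of a pre-compaction checkpoint. The MEMORY.md
# layout and top_n cutoff are illustrative assumptions.
from pathlib import Path

def checkpoint(facts: list[dict], path: str = "MEMORY.md", top_n: int = 20) -> None:
    # Rank by importance so the most durable facts survive compaction
    ranked = sorted(facts, key=lambda f: f["importance"], reverse=True)[:top_n]
    lines = ["# MEMORY", ""]
    lines += [f"- ({f['importance']:.1f}) [{f['kind']}] {f['text']}" for f in ranked]
    Path(path).write_text("\n".join(lines) + "\n")

checkpoint([
    {"kind": "decision", "text": "Use HMAC-SHA256 for tokens", "importance": 0.9},
    {"kind": "todo", "text": "Add token rotation", "importance": 0.4},
])
```

Because the checkpoint fires before compaction, the lossy step can only destroy what was already persisted.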

Setup (one time)
$ contynu openclaw setup
$ openclaw plugins install contynu-openclaw --dangerously-force-unsafe-install
$ openclaw plugins enable contynu-openclaw

Frequently asked questions

How does Contynu give AI models persistent memory?

Contynu provides MCP tools that let AI models write, update, search, and consolidate memories directly. When you start a new session, Contynu first ingests missing external session state, then delivers a ranked packet focused on current goals, recent changes, constraints, and durable context. The model is a conscious participant in its own memory management.
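The ranking step can be pictured as blending prompt relevance with stored importance. The word-overlap scoring below is a deliberately toy stand-in; Contynu's real scoring is not public, so this shows only the shape of the idea:

```python
# Toy sketch of prompt-relevance ranking using word overlap.
# Contynu's actual scoring is not public; this is only the shape.
def rank(memories: list[dict], prompt: str, k: int = 5) -> list[dict]:
    words = set(prompt.lower().split())
    def score(m: dict) -> float:
        # Blend relevance to the current prompt with the stored weight
        overlap = len(words & set(m["text"].lower().split()))
        return overlap + m["importance"]
    return sorted(memories, key=score, reverse=True)[:k]

mems = [
    {"text": "Use HMAC-SHA256 for tokens", "importance": 0.9},
    {"text": "Prefer tabs in Makefiles", "importance": 0.2},
]
top = rank(mems, "rotate the auth tokens")
```

A prompt about tokens surfaces the token decision first, even though both memories are valid.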

Which AI coding assistants does Contynu support?

Contynu works with Claude Code, OpenAI Codex CLI, Google Gemini CLI, and OpenClaw agents. It renders context in the optimal format for each model: XML for Claude, AGENTS.md-first Markdown for Codex, and structured text for Gemini.

How is this different from CLAUDE.md or memory files?

Memory files are static text that models can't update during a session. Contynu provides live MCP tools plus Dream Phase consolidation, working-set carry-forward, and startup ingestion of missing external session files. It behaves more like continuous state than a static memory note.

How do updates work?

On interactive launches, Contynu checks GitHub Releases for a newer binary matching the OS and architecture of the running environment. If one exists, it offers an exact manual-update command or an auto-update path using the release installer. The MCP server subcommand skips the prompt because its stdio must stay clean.

Is my data sent to the cloud?

No. Contynu is entirely local-first. All memories, prompts, and session data are stored in a SQLite database on your machine. There are no cloud services, accounts, telemetry, or external dependencies. Your data never leaves your filesystem.
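Because the store is plain SQLite, the whole memory system is inspectable with standard tools. The schema below is an illustration of what a memory row might contain, not Contynu's real table layout:

```python
# Illustrative local store; table and column names are assumptions,
# not Contynu's actual schema.
import sqlite3

conn = sqlite3.connect(":memory:")  # on disk this would live under your project or home dir
conn.execute("""
    CREATE TABLE IF NOT EXISTS memories (
        id INTEGER PRIMARY KEY,
        kind TEXT NOT NULL,
        scope TEXT NOT NULL CHECK (scope IN ('user', 'project', 'session')),
        text TEXT NOT NULL,
        importance REAL DEFAULT 0.5
    )
""")
conn.execute(
    "INSERT INTO memories (kind, scope, text, importance) VALUES (?, ?, ?, ?)",
    ("decision", "project", "Use HMAC-SHA256 for tokens", 0.9),
)
row = conn.execute("SELECT text FROM memories WHERE scope = 'project'").fetchone()
```

Anything you can do with `sqlite3` on the command line, you can do to your own memory database.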

Is Contynu free and open source?

Yes. Contynu is open source under the Mozilla Public License 2.0. No fees, no subscriptions, no usage limits. The full source code is available on GitHub.

Early Access

Stop re-explaining your project

Your AI coding tools should remember what happened. Contynu makes sure they do.

Install takes 30 seconds and works immediately with Claude, Codex, Gemini, and OpenClaw. We want your feedback.