Tutorial

Give Claude Persistent Memory with Knol MCP Server

Step-by-step guide to setting up Knol as an MCP server for Claude Desktop. Your AI assistant will remember users, preferences, and context across every session.

Why Claude Needs Persistent Memory

Every time you start a new Claude conversation, you start from scratch. Claude doesn't remember your name, your projects, your coding preferences, or the debugging session you had yesterday. You have to re-explain context every single time.

Knol's MCP server fixes this. Once connected, Claude can store and retrieve memories across sessions — facts, preferences, project context, and relationships. It's like giving Claude a brain that persists.

Setting Up Knol (60 Seconds)

First, get Knol running locally:

git clone https://github.com/aiknol/knol.git
cd knol
docker compose up -d

That's it. Knol is now running on localhost:3000 with PostgreSQL, vector search, knowledge graphs, and the full context engineering stack.
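If the stack doesn't come up cleanly, the standard Docker Compose commands are the quickest way to check (service names come from the repo's docker-compose.yml, so check that file if yours differ):

```shell
# Confirm the containers are running and healthy
docker compose ps

# Tail recent logs if something looks off
docker compose logs --tail 50
```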

Connecting to Claude Desktop

Open your Claude Desktop MCP configuration:

# macOS
code ~/Library/Application\ Support/Claude/claude_desktop_config.json

# Windows
code %APPDATA%/Claude/claude_desktop_config.json

Add the Knol MCP server:

{
  "mcpServers": {
    "knol-memory": {
      "command": "npx",
      "args": ["@aiknol/knol-mcp-server"],
      "env": {
        "KNOL_API_URL": "http://localhost:3000",
        "KNOL_API_KEY": "your-api-key"
      }
    }
  }
}

Restart Claude Desktop. You should see the MCP tools icon appear, indicating Knol is connected.

What Claude Can Do Now

With Knol connected, Claude has access to four memory tools:

**knol_store_memory** — Claude can save important facts from your conversation. "User prefers TypeScript over JavaScript" or "Working on a React dashboard for project Atlas."

**knol_search_memory** — Before answering, Claude can search past memories. "What do I know about this user's tech stack?" returns relevant context from previous sessions.

**knol_get_user_context** — Pull a complete summary of everything known about the current user. Preferences, projects, recent interactions, and relationships.

**knol_graph_query** — Traverse the knowledge graph. "What projects is this user connected to?" or "Who else works on project Atlas?"
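Under the hood, these are standard MCP tool calls. A `tools/call` request over JSON-RPC looks roughly like this (the argument field names here are assumptions; the actual schema comes from the server's own tool definitions):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "knol_store_memory",
    "arguments": {
      "content": "User prefers TypeScript over JavaScript"
    }
  }
}
```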

Real Example

Session 1 (Monday):

You: I'm building a dashboard with Next.js and Tailwind. The backend is Rust with Axum.
Claude: [stores: user tech stack = Next.js, Tailwind, Rust, Axum]

Session 2 (Wednesday):

You: Can you help me add a new API endpoint?
Claude: [searches memory, finds Rust/Axum preference]
Claude: Sure! Since you are using Axum, here is the endpoint...

No re-explanation needed. Claude already knows your stack.
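To make the store-then-search flow concrete, here is a toy sketch of the kind of retrieval `knol_search_memory` performs. Knol itself uses vector search over embeddings; plain keyword overlap is used here only to illustrate the idea, and all names are hypothetical:

```typescript
// Tokenize text into a set of lowercase word tokens.
function tokens(s: string): Set<string> {
  return new Set(s.toLowerCase().match(/[a-z0-9]+/g) ?? []);
}

// Toy in-memory store: keyword overlap stands in for real vector search.
class ToyMemoryStore {
  private memories: string[] = [];

  store(text: string): void {
    this.memories.push(text);
  }

  search(query: string, topK = 3): string[] {
    const q = tokens(query);
    return this.memories
      .map((text) => ({
        text,
        // Score = number of query tokens that appear in the memory.
        score: [...tokens(text)].filter((t) => q.has(t)).length,
      }))
      .filter((m) => m.score > 0)
      .sort((a, b) => b.score - a.score)
      .slice(0, topK)
      .map((m) => m.text);
  }
}

// Session 1: Claude stores facts.
const store = new ToyMemoryStore();
store.store("User tech stack: Next.js, Tailwind, Rust, Axum");
store.store("User prefers TypeScript over JavaScript");

// Session 2: a search for the new request surfaces the stack memory,
// matching on "rust" and "axum".
console.log(store.search("add an API endpoint to the Rust Axum backend"));
```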

Beyond Claude Desktop

The same MCP server works with Cursor, Windsurf, and any MCP-compatible tool. Your memory is shared across all of them — context from a Claude conversation is available when you are coding in Cursor.
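For Cursor, for example, the same server entry goes in `~/.cursor/mcp.json` (the path Cursor documents for global MCP configuration at the time of writing); the `mcpServers` schema is identical:

```json
{
  "mcpServers": {
    "knol-memory": {
      "command": "npx",
      "args": ["@aiknol/knol-mcp-server"],
      "env": {
        "KNOL_API_URL": "http://localhost:3000",
        "KNOL_API_KEY": "your-api-key"
      }
    }
  }
}
```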

Privacy and Self-Hosting

All memory stays on your machine: Knol runs locally, so your data never leaves your infrastructure. For teams, deploy Knol on your own servers with multi-tenant isolation, encryption at rest, and full audit logging.

The MCP server is open-source. Star us on GitHub to follow development.