Integrations

One Brain.
Every Tool.

Cerebrumma connects to every major AI coding tool via the Model Context Protocol. Set it up once — your Brain travels with you everywhere.

Available MCP tools

read_memory: Fetch all episodic, semantic, and personal entries
search_memory: Keyword search across all memory layers
get_protocols: Return all procedural skills and rules
add_entry: Write a new memory entry from inside the AI session
get_status: Show entry counts across every layer
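Once connected, a client invokes these tools over JSON-RPC using MCP's standard tools/call method. A minimal sketch of a search_memory request on the wire (the "query" argument name is an assumption; the envelope shape follows the MCP spec):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_memory",
    "arguments": { "query": "postgres connection pooling" }
  }
}
```

Your editor's MCP client sends requests like this for you; you never write them by hand.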
Claude Code
Full support

The deepest integration. Cerebrumma auto-configures Claude Code during install.

Setup

  1. Run the installer — Claude Code is auto-configured if already installed.
  2. If you installed Claude Code after Cerebrumma, run the command below.
  3. Restart Claude Code.
  4. Type /mcp to confirm — you should see Cerebrumma · ✔ connected · 5 tools.

The installer uses claude mcp add which writes to ~/.claude.json. This is different from ~/.claude/settings.json — make sure you use the command, not manual JSON editing.
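After the command runs, ~/.claude.json should contain an mcpServers entry along these lines (a sketch of the expected shape, not verbatim installer output):

```json
{
  "mcpServers": {
    "cerebrumma": {
      "type": "stdio",
      "command": "~/.cerebrumma/run-mcp.sh",
      "args": []
    }
  }
}
```

If the entry is missing after a restart, re-run the claude mcp add command rather than pasting this JSON by hand.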

Config file

~/.claude.json (via claude mcp add)

Config

bash
claude mcp add cerebrumma -s user ~/.cerebrumma/run-mcp.sh
GitHub Copilot
Full support

Connect Cerebrumma to GitHub Copilot in VS Code via the MCP servers config.

Setup

  1. Create .vscode/mcp.json in your project root (or open user config via MCP: Open User Configuration).
  2. Add the config below. Note: Copilot uses "servers" not "mcpServers".
  3. Reload VS Code.
  4. Copilot Chat and the cloud agent will have access to all 5 Brain tools.

GitHub Copilot uses the "servers" key — not "mcpServers" like most other tools. The "type": "stdio" field is required. Copilot's cloud agent supports Tools only (not Resources).

Config file

.vscode/mcp.json (workspace) or user MCP config

Config

json
{
  "servers": {
    "cerebrumma": {
      "type": "stdio",
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
Cursor
Full support

Connect Cerebrumma to Cursor's AI via MCP in one config file.

Setup

  1. Create or open .cursor/mcp.json in your project root.
  2. Add the config below.
  3. Restart Cursor.
  4. Cursor's AI can now read and write to your Brain.

For global config across all projects, use ~/.cursor/mcp.json instead.

Config file

.cursor/mcp.json

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
Cline
Full support

Cline (VS Code extension) has first-class MCP support with a dedicated config file.

Setup

  1. Open the Cline extension in VS Code.
  2. Click the MCP servers icon in the Cline toolbar, or edit the config file directly.
  3. Add the cerebrumma server entry below.
  4. Tools are invoked automatically in Cline's agent task loop — no manual trigger needed.

Cline maintains its own MCP config file separate from VS Code's .vscode/mcp.json. The alwaysAllow field can be added to skip per-call approval prompts for specific tools.

Config file

~/.cline/data/settings/cline_mcp_settings.json

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
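Building on the alwaysAllow note above, the same entry with per-call approval skipped for the read-only tools might look like this (the tool selection here is illustrative, not a recommendation):

```json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"],
      "alwaysAllow": ["read_memory", "search_memory", "get_protocols", "get_status"]
    }
  }
}
```

Leaving add_entry off the list keeps a confirmation prompt in front of writes to your Brain.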
Windsurf
Full support

Wire your Brain into Windsurf's Cascade AI via the MCP config.

Setup

  1. Open Windsurf → Settings → MCP (or edit the config file directly).
  2. Add the cerebrumma server entry below.
  3. Restart Windsurf.
  4. Cascade AI will automatically use your Brain during sessions.

You can also configure it via Windsurf → Preferences → MCP Servers.

Config file

~/.codeium/windsurf/mcp_config.json

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
Continue.dev
Full support

Add Cerebrumma to Continue.dev's config — works in Agent mode across VS Code and JetBrains.

Setup

  1. Open ~/.continue/config.yaml (or create it).
  2. Add the mcpServers block below.
  3. Switch to Agent mode in Continue — MCP tools only work in Agent mode, not Chat.
  4. Your Brain is available to the agent on every session.

MCP tools are only available in Continue's Agent mode. They do not function in Chat or Autocomplete modes.

Config file

~/.continue/config.yaml

Config

yaml
# Add to ~/.continue/config.yaml
mcpServers:
  - name: cerebrumma
    command: uv
    args:
      - run
      - --project
      - ~/.cerebrumma/mcp_server
      - cerebrumma-mcp
Zed
Full support

Zed calls MCP servers "context servers" — add Cerebrumma to your global Zed settings.

Setup

  1. Open ~/.config/zed/settings.json.
  2. Add the context_servers block below. Note: Zed uses context_servers, not mcpServers.
  3. Reload Zed.
  4. Cerebrumma's Brain tools are available to Zed's AI assistant.

Zed uses the non-standard context_servers key — not mcpServers. The source: "custom" field is required for manually-added servers.

Config file

~/.config/zed/settings.json

Config

json
{
  "context_servers": {
    "cerebrumma": {
      "source": "custom",
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
Amazon Q Developer
Full support

Amazon Q Developer supports MCP in both the CLI and VS Code / JetBrains plugins.

Setup

  1. Create or open ~/.aws/amazonq/default.json.
  2. Add the cerebrumma server entry below.
  3. Restart Amazon Q Developer.
  4. For project-level config, use .amazonq/default.json in the repo root instead.

Enterprise (Pro tier) admins can allowlist or disable MCP org-wide. Workspace-level config (.amazonq/default.json) takes precedence over global config.

Config file

~/.aws/amazonq/default.json

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
Roo Code
Full support

Roo Code (VS Code fork of Cline) supports per-project MCP config you can commit to your repo.

Setup

  1. Create .roo/mcp.json in your project root.
  2. Add the cerebrumma server entry below.
  3. Commit it to version control to share Brain context with your team.
  4. For global config, use the global mcp_settings.json via Roo Code preferences.

The .roo/mcp.json file can be committed to share MCP config across a team — useful for a shared project Brain.

Config file

.roo/mcp.json

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}
OpenAI Codex
Context injection

Use Cerebrumma's dream cycle to inject curated Brain context into Codex sessions via AGENTS.md.

Setup

  1. Run cerebrum dream to generate a distilled summary of your Brain.
  2. Save the output as AGENTS.md (or codex.md) in your project root.
  3. Codex reads this file as context at the start of every session.
  4. Re-run after sessions to keep the context fresh.

Native MCP support for Codex CLI is on our roadmap. Context injection via AGENTS.md works today.

Config file

AGENTS.md or codex.md

Config

bash
# Generate a Brain snapshot for Codex
cerebrum dream > AGENTS.md
Aider
Context injection

Aider doesn't support MCP natively — inject Cerebrumma context via the read: convention in your Aider config.

Setup

  1. Run cerebrum dream to generate a Brain snapshot and save it as BRAIN.md.
  2. Add the read: block below to your .aider.conf.yml.
  3. Aider will load BRAIN.md as read-only context on every session.
  4. Regenerate BRAIN.md periodically by re-running cerebrum dream.

Aider does not have a native MCP client. This file-export approach gives you the same context injection benefit without MCP.

Config file

~/.aider.conf.yml or .aider.conf.yml

Config

bash
# 1. Export a Brain snapshot
cerebrum dream > BRAIN.md

yaml
# 2. Add to ~/.aider.conf.yml or .aider.conf.yml in project
read:
  - BRAIN.md
Google Antigravity
Full support

Connect Cerebrumma to Antigravity's AI via the standard MCP protocol.

Setup

  1. Open Antigravity settings and navigate to the MCP / tools section.
  2. Add a new MCP server with the config below.
  3. Restart Antigravity.
  4. Your Brain context is injected automatically into every AI session.

Exact settings path may vary by Antigravity version. Look for 'MCP Servers' or 'Tools' in preferences.

Config file

Antigravity MCP settings

Config

json
{
  "mcpServers": {
    "cerebrumma": {
      "command": "uv",
      "args": ["run", "--project", "~/.cerebrumma/mcp_server", "cerebrumma-mcp"]
    }
  }
}

Don't have Cerebrumma yet?

Install in one command. Runs locally.

curl -fsSL https://get.cerebrumma.com | sh