MCP Setup Guide

This guide walks you through setting up the LumenFlow MCP server with your AI coding assistant.

MCP (Model Context Protocol) is an open standard for AI-to-tool communication. Instead of relying on file-based instructions, MCP provides a programmatic interface for AI assistants to interact with development tools.
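Concretely, each tool invocation travels as a JSON-RPC 2.0 message between the client and the server process. A minimal sketch of the request shape an AI client sends (the `wu_status` arguments here are illustrative, not a guaranteed schema):

```typescript
// Shape of an MCP tools/call request (JSON-RPC 2.0). The tool arguments
// are illustrative, not an official LumenFlow schema.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "wu_status", arguments: { id: "WU-123" } },
};

// The client serializes this to the server process (typically over stdio).
console.log(JSON.stringify(request));
```

Because the arguments are structured data rather than free text, the server can validate them against the tool's input schema before doing anything.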

Key benefits:

  • Structured Tools: tools with typed inputs and outputs, not just text parsing
  • Resource Access: direct access to LumenFlow data via URI patterns
  • Consistent Interface: the same tools work across different AI clients
  • Safety Maintained: all operations respect LumenFlow workflow rules

Before setting up MCP:

  1. LumenFlow initialized in your project

    pnpm exec lumenflow --client <your-client>
  2. An AI client with MCP support

    • Claude Code 0.2.x+
    • Cursor (MCP-enabled)
    • Any MCP-compatible client

Installation

  1. Install the MCP package (if not already included)

    pnpm add -D @lumenflow/mcp
  2. Verify the binary is available

    npx @lumenflow/mcp --help

Claude Code supports MCP servers via configuration files.

Option 1: Project-level configuration

Create or edit .claude/mcp.json:

{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"],
      "env": {
        "LUMENFLOW_PROJECT_ROOT": "${workspaceFolder}"
      }
    }
  }
}

Option 2: Global configuration

Add to your Claude Code settings (~/.config/claude/mcp.json on Linux, ~/Library/Application Support/claude/mcp.json on macOS):

{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"]
    }
  }
}
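Before restarting the client, it can help to sanity-check that the config is valid JSON and actually contains the server entry. A quick illustrative check (the config string is inlined here; in practice you would read it from the file):

```typescript
// Sanity-check an MCP config: valid JSON with a "lumenflow" server entry.
// Inlined for illustration; in practice read .claude/mcp.json from disk.
const raw = `{
  "mcpServers": {
    "lumenflow": { "command": "npx", "args": ["@lumenflow/mcp"] }
  }
}`;

const config = JSON.parse(raw);
const server = config.mcpServers?.["lumenflow"];
console.log(server ? `${server.command} ${server.args.join(" ")}` : "missing entry");
// → npx @lumenflow/mcp
```

A malformed file fails at `JSON.parse` with a position, which is usually faster to act on than a silent "no tools appeared" in the client.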

Verify the connection:

In Claude Code, the LumenFlow tools should appear in the available tools list:

  • context_get
  • wu_list
  • wu_status
  • wu_create
  • wu_claim
  • wu_done
  • gates_run
  • file_read
  • git_status
  • plan_create
  • signal_cleanup
  • wu_proto

The full MCP surface is documented in /reference/mcp and currently includes 98 tools (90 normalized public-CLI parity tools + 8 MCP-only extras).

Once configured, your AI assistant can use LumenFlow tools directly.

User: Create a WU for adding authentication

AI uses: wu_create {
  "lane": "Framework: Core",
  "title": "Add user authentication",
  "description": "Context: No auth. Problem: Users cannot log in. Solution: Add auth.",
  "acceptance": ["Users can register", "Users can log in"],
  "code_paths": ["src/auth/"],
  "exposure": "backend-only"
}

AI: Created WU-123. Would you like me to claim it?

User: Yes, claim it

AI uses: wu_claim {
  "id": "WU-123",
  "lane": "Framework: Core"
}

AI: WU-123 claimed. Worktree created at worktrees/framework-core-wu-123.

The AI can check current context at any time:

AI uses: context_get {}

Result: {
  "location": { "type": "worktree", ... },
  "wu": { "id": "WU-123", "status": "in_progress" }
}
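The tool arguments in the conversation above can be captured as a type. This interface is inferred from that one example call; the field names and the `exposure` value are assumptions, not an official schema:

```typescript
// Argument shape for wu_create, inferred from the example call above.
// Not an official schema; field names are assumptions from one sample.
interface WuCreateArgs {
  lane: string;
  title: string;
  description: string;
  acceptance: string[];
  code_paths: string[];
  exposure: string;
}

const args: WuCreateArgs = {
  lane: "Framework: Core",
  title: "Add user authentication",
  description: "Context: No auth. Problem: Users cannot log in. Solution: Add auth.",
  acceptance: ["Users can register", "Users can log in"],
  code_paths: ["src/auth/"],
  exposure: "backend-only",
};

console.log(args.acceptance.length);
// → 2
```

Typed argument shapes like this are what "structured tools" buys you over text parsing: the client can reject a malformed call before it ever reaches the server.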

AI assistants can read LumenFlow data via resources:

Read resource: lumenflow://backlog
Read resource: lumenflow://wu/WU-123
Read resource: lumenflow://context
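The resource URIs above follow a simple template pattern, with `{placeholder}` segments for parameterized resources like `lumenflow://wu/{id}`. A sketch of how a client might match a URI against such a template (this helper is illustrative, not part of @lumenflow/mcp):

```typescript
// Match a resource URI against a {placeholder} template and extract the
// parameters. Illustrative helper, not part of @lumenflow/mcp.
function matchResource(
  template: string,
  uri: string,
): Record<string, string> | null {
  // Escape regex metacharacters, then turn {name} into a named capture group.
  const escaped = template.replace(/[.*+?^$()|[\]\\]/g, "\\$&");
  const pattern = new RegExp(
    "^" + escaped.replace(/\{(\w+)\}/g, "(?<$1>[^/]+)") + "$",
  );
  const m = uri.match(pattern);
  return m ? { ...(m.groups ?? {}) } : null;
}

console.log(JSON.stringify(matchResource("lumenflow://wu/{id}", "lumenflow://wu/WU-123")));
// → {"id":"WU-123"}
console.log(matchResource("lumenflow://backlog", "lumenflow://context"));
// → null
```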
How MCP compares with the other integration approaches:

  Approach     Best For                      When to Use
  MCP          Programmatic AI interaction   AI clients with MCP support
  CLI          Human operators, scripts      Terminal workflows, CI/CD
  File-based   Universal AI compatibility    Any AI that can read markdown

Troubleshooting

Check: Is the package installed?

pnpm list @lumenflow/mcp

Check: Can you run it directly?

npx @lumenflow/mcp --help

Check: Is MCP enabled in your AI client?

Check: Is the configuration file in the correct location?

Check: Are there any errors in the AI client logs?

Check: Is LUMENFLOW_PROJECT_ROOT set correctly?

Fix: Use explicit path or ensure the AI client is opened in the project directory.

Check: Does the user running the AI client have access to the project?

Check: Are all LumenFlow files readable?

Security

The MCP server:

  • Runs with the permissions of the user starting the AI client
  • Only accesses files within the project root
  • Respects all LumenFlow workflow rules (worktree discipline, gates)
  • Cannot bypass safety hooks or constraints

For custom integrations:

import { createMcpServer } from '@lumenflow/mcp';

const server = createMcpServer({
  projectRoot: '/path/to/project',
  logLevel: 'debug',
});

// List tools
console.log(server.listTools());

// List resources
console.log(server.listResources());
console.log(server.listResourceTemplates());

// Start the server
await server.start();

Gates can take a while to run: the default timeout for gates_run is 10 minutes. This timeout is set in the tool implementation and cannot be overridden via MCP.