This guide walks you through setting up the LumenFlow MCP server with your AI coding assistant.
MCP (Model Context Protocol) is an open standard for AI-to-tool communication. Instead of relying on file-based instructions, MCP provides a programmatic interface for AI assistants to interact with development tools.
- **Structured Tools**: Tools with typed inputs and outputs, not just text parsing
- **Resource Access**: Direct access to LumenFlow data via URI patterns
- **Consistent Interface**: Same tools work across different AI clients
- **Safety Maintained**: All operations respect LumenFlow workflow rules
Before setting up MCP, you need:

- LumenFlow initialized in your project
- An AI client with MCP support
Install the MCP package (if not already included)
Verify the binary is available
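The install and smoke test might look like the following. The package name matches the `npx` arguments used in the client configurations later in this guide; the exact install flags depend on your project setup:

```shell
# Install the MCP server package as a dev dependency.
npm install --save-dev @lumenflow/mcp

# Verify the binary resolves and starts; stop it with Ctrl+C.
# (An MCP stdio server waits for a client on stdin, so no output is expected.)
npx @lumenflow/mcp
```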
Claude Code supports MCP servers via configuration files.
Option 1: Project-level configuration
Create or edit .claude/mcp.json:
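A minimal entry might look like this, assuming the standard `mcpServers` layout used by MCP client configs; the server name `lumenflow` is just a label:

```json
{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"]
    }
  }
}
```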
Option 2: Global configuration
Add to your Claude Code settings (~/.config/claude/mcp.json on Linux, ~/Library/Application Support/claude/mcp.json on macOS):
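The global file uses the same shape as a project-level MCP config; a sketch, with `lumenflow` as an illustrative server name:

```json
{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"]
    }
  }
}
```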
Verify the connection:
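If your client can list its configured servers, confirm registration there. With the Claude Code CLI, for example:

```shell
# List configured MCP servers and their connection status.
claude mcp list
```

The `lumenflow` server should appear as connected.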
In Claude Code, the LumenFlow tools should appear in the available tools list:
- `context_get`
- `wu_list`
- `wu_status`
- `wu_create`
- `wu_claim`
- `wu_done`
- `gates_run`
- `file_read`
- `git_status`
- `plan_create`
- `signal_cleanup`
- `wu_proto`

The full MCP surface is documented in /reference/mcp and currently includes 98 tools (90 normalized public-CLI parity tools + 8 MCP-only extras).
Add to your Cursor MCP configuration:
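Assuming Cursor's MCP config lives at `.cursor/mcp.json` in your project (the location may vary by version), the entry follows the same `mcpServers` shape:

```json
{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"]
    }
  }
}
```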
For any MCP-compatible client, configure:
| Setting | Value |
|---|---|
| Command | `npx` |
| Arguments | `["@lumenflow/mcp"]` |
Environment variables:
| Variable | Description | Default |
|---|---|---|
| `LUMENFLOW_PROJECT_ROOT` | Project root directory | `process.cwd()` |
| `LUMENFLOW_MCP_LOG_LEVEL` | Log level | `info` |
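Most MCP client configs accept an `env` object per server entry; a sketch of passing both variables (paths and the `lumenflow` name are illustrative):

```json
{
  "mcpServers": {
    "lumenflow": {
      "command": "npx",
      "args": ["@lumenflow/mcp"],
      "env": {
        "LUMENFLOW_PROJECT_ROOT": "/path/to/project",
        "LUMENFLOW_MCP_LOG_LEVEL": "debug"
      }
    }
  }
}
```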
Once configured, your AI assistant can use LumenFlow tools directly.
The AI can check the current context at any time, for example via the `context_get` tool.
AI assistants can also read LumenFlow data directly through MCP resources.
| Approach | Best For | When to Use |
|---|---|---|
| MCP | Programmatic AI interaction | AI clients with MCP support |
| CLI | Human operators, scripts | Terminal workflows, CI/CD |
| File-based | Universal AI compatibility | Any AI that can read markdown |
Check: Is the package installed?
Check: Can you run it directly?
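Two quick diagnostics cover both checks. Running the server by hand is especially useful because startup errors that the AI client swallows get printed to your terminal:

```shell
# Confirm the package is present in the dependency tree.
npm ls @lumenflow/mcp

# Try starting the server directly; stop it with Ctrl+C.
npx @lumenflow/mcp
```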
Check: Is MCP enabled in your AI client?
Check: Is the configuration file in the correct location?
Check: Are there any errors in the AI client logs?
Check: Is LUMENFLOW_PROJECT_ROOT set correctly?
Fix: Use an explicit path, or ensure the AI client is opened in the project directory.
Check: Does the user running the AI client have access to the project?
Check: Are all LumenFlow files readable?
The MCP server:
For custom integrations:
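As a sketch of what a custom integration exchanges with the server: MCP is JSON-RPC 2.0 (typically over stdio), where a client first sends an `initialize` handshake and then invokes tools via `tools/call`. The snippet below only constructs those messages; the tool name `context_get` comes from the list above, the protocol version string is illustrative, and in practice you would use the official MCP SDK rather than hand-rolling the wire format:

```typescript
// Minimal JSON-RPC 2.0 message builders for an MCP client sketch.
type JsonRpcRequest = {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
};

let nextId = 1;

function request(method: string, params?: Record<string, unknown>): JsonRpcRequest {
  return { jsonrpc: "2.0", id: nextId++, method, params };
}

// First message every MCP client sends: the initialize handshake.
// The protocolVersion value here is illustrative.
const init = request("initialize", {
  protocolVersion: "2025-03-26",
  capabilities: {},
  clientInfo: { name: "custom-client", version: "0.1.0" },
});

// Invoke a LumenFlow tool by name with structured arguments.
const call = request("tools/call", {
  name: "context_get",
  arguments: {},
});

console.log(JSON.stringify(init));
console.log(JSON.stringify(call));
```

Each message would be written to the server's stdin as a newline-delimited JSON line (or framed however your transport requires), with responses matched back to requests by `id`.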
Gate checks can take a while to run; the default timeout for `gates_run` is 10 minutes.
This timeout is set in the tool implementation and cannot be overridden via MCP.