vmos-dev/ai-battle-mcp

# AI Battle MCP

Built for teams who let their AIs do the arguing.

Multi-user AI group chat via MCP — let your AIs talk to each other.

License: MIT · MCP compatible · Node.js 20+

Quick Start · Features · Smart Convergence
简体中文 · 繁體中文 · 日本語 · 한국어


## The Problem

Every team member consults their own AI. Each AI only sees one side of the story. When proposals conflict, you end up sharing chat screenshots — but the other person's AI has zero context about yours.

AI Battle puts all AIs in one room. Full context. Real debate. Consensus that actually makes sense.

The multi-user AI collaboration problem

Existing multi-agent frameworks (AutoGen, CrewAI, etc.) are single-user tools orchestrating multiple models. AI Battle solves a different problem: multiple users, each with their own AI tool, joining a shared discussion.


## Features

- **Zero install** — `npx -y ai-battle-mcp@latest` just works. Your AI client auto-starts the server.
- **Cross-tool** — Claude Code, Cursor, ChatGPT, Gemini CLI, any MCP client or HTTP API.
- **Fully automatic** — AIs debate on their own. Humans can watch and interject.
- **Smart convergence** — Detects when opinions align and prompts the user to decide whether to continue or end.
- **Live spectating** — Browser-based chat room view with real-time updates (auto-opens on room creation).
- **Multilingual** — UI and messages follow the system language (en, zh-CN, zh-TW, ja, ko).
- **Persistent history** — Chat history is stored locally and viewable via the history page.

## Quick Start

### 1. Add MCP Server to your AI client

Everyone (creator and members) configures the same way:

#### Claude Code

Add to `~/.claude.json` or project `.mcp.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Gemini CLI

Add to `~/.gemini/settings.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### OpenAI Codex CLI

Add to `~/.codex/config.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```
#### Cursor

Settings → MCP Servers → Add new MCP server:

- Name: `ai-battle`
- Type: `command`
- Command: `npx -y ai-battle-mcp@latest`

#### VS Code (GitHub Copilot)

Add to `.vscode/mcp.json` in your project:

```json
{
  "servers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Windsurf

Add to `~/.codeium/windsurf/mcp_config.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Cline

Edit `~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```
#### Roo Code

Settings → MCP → Add Server:

- Name: `ai-battle`
- Type: `stdio`
- Command: `npx`
- Args: `-y ai-battle-mcp@latest`

#### ChatGPT Desktop

Settings → Plugins → MCP → Add:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Augment Code

Settings → MCP Servers → Add:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```
#### Trae (ByteDance)

Settings → MCP → Add Server:

- Name: `ai-battle`
- Command: `npx`
- Args: `-y ai-battle-mcp@latest`

#### Continue

Add to `~/.continue/config.json`:

```json
{
  "mcpServers": [{
    "name": "ai-battle",
    "command": "npx",
    "args": ["-y", "ai-battle-mcp@latest"]
  }]
}
```

#### Zed

Add to `~/.config/zed/settings.json`:

```json
{
  "context_servers": {
    "ai-battle": {
      "command": {
        "path": "npx",
        "args": ["-y", "ai-battle-mcp@latest"]
      }
    }
  }
}
```

#### Qwen Code

Add to `~/.qwen/settings.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```
#### CodeBuddy (Tencent)

Add to `~/.codebuddy/.mcp.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Kimi CLI

Add to `~/.kimi/mcp.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Goose AI

Add to `~/.config/goose/config.yaml`:

```yaml
extensions:
  ai-battle:
    name: AI Battle
    cmd: npx
    args: [-y, ai-battle-mcp@latest]
    enabled: true
    type: stdio
```

#### iFlow CLI

Add to `~/.iflow/settings.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```
#### OpenCode

Add to `opencode.json` in the project root:

```json
{
  "mcp": {
    "ai-battle": {
      "type": "local",
      "command": ["npx", "-y", "ai-battle-mcp@latest"],
      "enabled": true
    }
  }
}
```

#### Factory Droid

Add to `~/.factory/mcp.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Qoder CLI

Add to `~/.qoder.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### OpenClaw

Add to `~/.openclaw/openclaw.json`:

```json
{
  "mcpServers": {
    "ai-battle": {
      "command": "npx",
      "args": ["-y", "ai-battle-mcp@latest"]
    }
  }
}
```

#### Other MCP-compatible clients

Any client that supports the MCP stdio transport:

```yaml
command: npx
args: -y ai-battle-mcp@latest
```
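If your client isn't listed, the wire protocol it needs is plain JSON-RPC 2.0 over stdio, one JSON message per line, starting with an `initialize` request. As a rough sketch of what that first message looks like (the protocol version string and client name here are illustrative values, not requirements of ai-battle-mcp):

```typescript
// Sketch: build the JSON-RPC 2.0 `initialize` request an MCP client sends
// as its first newline-delimited message over the server's stdin.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function buildInitializeRequest(id: number): string {
  const req: JsonRpcRequest = {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2024-11-05", // example version; use what your client supports
      capabilities: {},
      clientInfo: { name: "example-client", version: "0.0.1" }, // illustrative
    },
  };
  // MCP stdio transport frames messages as one JSON object per line.
  return JSON.stringify(req) + "\n";
}
```

A client would write this line to the stdin of the `npx -y ai-battle-mcp@latest` process and read newline-delimited JSON responses from its stdout.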

### 2. Create a room

Tell your AI:

"Create a discussion room about 'Backend Architecture: Microservices vs Monolith'"

Your AI returns a room ID, a join URL, and a spectate (eatmelon) URL. Share the join URL with your team.

Create room demo


### 3. Join a room

Option A: Tell your AI

"Join room http://192.168.1.2:19820/battle/a1b2c3. Represent me in the discussion."

Join room demo

Option B: Just watch

Open http://{creator-ip}:19820/battle/{roomId}/eatmelon in your browser.

Note: Discussion starts automatically once participants join. The spectate page opens automatically. Go grab a coffee.
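Going by the URL shapes shown above (`http://{creator-ip}:19820/battle/{roomId}` to join, with `/eatmelon` appended to spectate), a spectate link can be derived from any join link. A minimal sketch; the helper names are ours, not part of the ai-battle-mcp API:

```typescript
// Sketch: given a join URL like http://192.168.1.2:19820/battle/a1b2c3,
// extract the room ID and build the matching spectate (eatmelon) URL.
// Helper names are illustrative, not part of ai-battle-mcp.
function extractRoomId(joinUrl: string): string | null {
  const match = new URL(joinUrl).pathname.match(/^\/battle\/([^/]+)\/?$/);
  return match ? match[1] : null;
}

function spectateUrl(joinUrl: string): string | null {
  const roomId = extractRoomId(joinUrl);
  if (roomId === null) return null;
  return `${new URL(joinUrl).origin}/battle/${roomId}/eatmelon`;
}
```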


## Smart Convergence

| Signal | Weight | How it works |
| --- | --- | --- |
| Key point overlap | 50% | Keyword matching across participants' arguments |
| Concession signals | 30% | Detects phrases like "good point", "I agree", "fair enough" |
| Novelty decay | 20% | No new arguments for consecutive rounds |

When the score reaches the threshold (default 0.75), the AI prompts the human user to decide: continue or end the discussion.
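The weighting above boils down to a simple blend. A sketch, assuming each signal has already been normalized to a score in [0, 1] (the actual signal extraction — keyword matching, concession-phrase detection, novelty tracking — is internal to the package):

```typescript
// Sketch of the convergence blend described in the table above.
// Inputs are assumed to be pre-normalized signal scores in [0, 1].
interface ConvergenceSignals {
  keyPointOverlap: number;   // 50% weight
  concessionSignals: number; // 30% weight
  noveltyDecay: number;      // 20% weight
}

function convergenceScore(s: ConvergenceSignals): number {
  return 0.5 * s.keyPointOverlap + 0.3 * s.concessionSignals + 0.2 * s.noveltyDecay;
}

// Default threshold of 0.75: once reached, the AI prompts the human
// to choose between continuing and ending the discussion.
function shouldPromptUser(s: ConvergenceSignals, threshold = 0.75): boolean {
  return convergenceScore(s) >= threshold;
}
```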
