MCP Rubber Duck

@nesquikm

Query multiple LLMs in parallel from AI coding tools. Send a question to OpenAI, Gemini, Groq, any OpenAI-compatible API, or CLI agents such as Claude Code, Codex, and Gemini CLI, then compare the answers, run consensus votes, structured debates, and judge evaluations. Rubber duck debugging, but the ducks talk back.
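The core idea of fanning a question out to several providers and taking a consensus vote can be sketched as follows. This is a minimal illustration, not the server's actual implementation: the `ask_*` functions are hypothetical stand-ins for real API calls, and the vote is a simple majority over the raw answer strings.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for real provider calls (OpenAI, Gemini, Groq);
# in practice each would call the provider's API and return the reply text.
def ask_openai(question: str) -> str: return "42"
def ask_gemini(question: str) -> str: return "42"
def ask_groq(question: str) -> str: return "41"

PROVIDERS = [ask_openai, ask_gemini, ask_groq]

def consensus(question: str):
    # Fan the question out to every provider in parallel.
    with ThreadPoolExecutor() as pool:
        answers = list(pool.map(lambda ask: ask(question), PROVIDERS))
    # Majority vote: the most common answer wins.
    winner, votes = Counter(answers).most_common(1)[0]
    return winner, votes, answers

winner, votes, answers = consensus("What is 6 * 7?")
print(winner, votes)  # -> 42 2
```

A real implementation would also normalize answers before voting (two models rarely phrase a reply identically), which is where structured debates and judge evaluations come in.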

Server Config

{
  "mcpServers": {
    "rubber-duck": {
      "command": "mcp-rubber-duck",
      "env": {
        "MCP_SERVER": "true",
        "OPENAI_API_KEY": "<YOUR_OPENAI_KEY>",
        "GEMINI_API_KEY": "<YOUR_GEMINI_KEY>",
        "DEFAULT_PROVIDER": "openai"
      }
    }
  }
}