
ownrig-mcp

1.0.0 • Public • Published

OwnRig MCP Server

AI hardware compatibility data for any MCP-compatible assistant. Query 50 models, 25 devices, 14 builds, 9 ready-to-buy machines, and 663 compatibility entries.

Transport: stdio

Install

npm install -g ownrig-mcp

Or use directly with npx:

npx ownrig-mcp
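
Because the transport is stdio, a connected client exchanges newline-delimited JSON-RPC 2.0 messages over the server's stdin/stdout. Per the MCP protocol, a session opens with an `initialize` request; a sketch of what a client would send (the `protocolVersion` value and client name here are illustrative assumptions):

```json
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"example-client","version":"0.1.0"}}}
```

MCP-compatible clients such as Claude Desktop and Cursor handle this handshake automatically; you only need it when wiring up a custom client.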

Tools

| Tool | Description |
| --- | --- |
| `query_model` | Get details for a specific AI model (VRAM, formats, use cases) |
| `query_device` | Get specs for a GPU or Apple Silicon device |
| `query_compatibility` | Check if a model runs on a device (tokens/sec, VRAM fit) |
| `list_models` | List models with optional use_case / family filter |
| `list_devices` | List devices with optional type / min_vram filter |
| `list_builds` | List curated builds with optional tier / profile filter |
| `list_systems` | List ready-to-buy machines (Mac, Dell, ASUS) with optional brand / type filter |
| `query_system` | Get full details for a specific ready-to-buy machine |
| `recommend_build` | Full recommendation engine with 3 paths (model→hw, workflow→hw, hw→models) |
| `find_models_for_device` | "What can I run on my RTX 4090?" |
| `find_devices_for_model` | "What GPU do I need for Llama 3.1 70B?" |
| `list_workflows` | List workflow profiles (tools → hardware requirements) |
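
Tool invocations use the standard MCP `tools/call` request shape. As a hypothetical example, a client might call `query_compatibility` like this (the argument names `model` and `device` are assumptions; check the actual input schema via a `tools/list` request):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "query_compatibility",
    "arguments": {
      "model": "llama-3.1-70b",
      "device": "rtx-4090"
    }
  }
}
```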

Usage with Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage with Cursor

Add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage from source (development)

If you have the OwnRig repo cloned:

# From project root
npm install
npm run generate:rec-data
npm run mcp

The mcp script builds a self-contained bundle via esbuild (resolving all @/ path aliases) then runs it. Running tsx mcp-server/index.ts directly does not work because the engine uses TypeScript path aliases that tsx cannot resolve transitively across module boundaries.
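
As a rough sketch of what that script does (the exact flags and paths are assumptions, not the repo's actual `package.json` entry), the bundle-then-run step is equivalent to something like:

```json
{
  "scripts": {
    "mcp": "esbuild mcp-server/index.ts --bundle --platform=node --format=esm --alias:@=./src --outfile=mcp-server/dist/index.mjs && node mcp-server/dist/index.mjs"
  }
}
```

The `--alias:@=./src` flag is what resolves the `@/` path aliases at bundle time, which is the step tsx cannot perform on its own.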

For your MCP client config, point to the built bundle:

{
  "mcpServers": {
    "ownrig": {
      "command": "node",
      "args": ["mcp-server/dist/index.mjs"],
      "cwd": "/path/to/ownrig"
    }
  }
}

Example queries

Once connected, ask your AI assistant:

  • "What GPU do I need to run Llama 3.1 70B locally?"
  • "Can an RTX 4090 run Qwen 3 32B?"
  • "Recommend a build for running AI coding tools with Cursor"
  • "What models can I run on my M4 Max MacBook Pro?"
  • "Compare the Mac Studio M4 Ultra vs a custom build for AI"

Data

This package includes a snapshot of OwnRig's verified hardware compatibility data. The data is updated with each package release.

  • 50 AI models with VRAM requirements per quantization level
  • 25 GPUs and Apple Silicon devices with specs and pricing
  • 14 curated PC builds with component lists and benchmarks
  • 9 ready-to-buy machines (Mac, Dell, ASUS, NVIDIA)
  • 663 model × device × quantization compatibility entries
  • 7 workflow profiles mapping AI tools to hardware needs
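
To illustrate what a compatibility entry encodes, here is a simplified VRAM-fit check in the spirit of `query_compatibility`. This is not the package's actual engine, which uses verified per-quantization measurements; the bytes-per-parameter figures and the flat overhead are illustrative assumptions.

```typescript
// Illustrative only: a rough VRAM-fit check. Real compatibility data is
// measured per model x device x quantization, not derived from a formula.
type Quant = "Q4_K_M" | "Q8_0" | "FP16";

// Approximate bytes per parameter at each quantization level (assumptions).
const bytesPerParam: Record<Quant, number> = {
  Q4_K_M: 0.5625, // ~4.5 bits/param
  Q8_0: 1.0625,   // ~8.5 bits/param
  FP16: 2,
};

// paramsB: model size in billions of parameters; vramGB: device VRAM.
// overheadGB approximates KV cache and runtime buffers (assumption).
function fitsInVram(
  paramsB: number,
  quant: Quant,
  vramGB: number,
  overheadGB = 1.5,
): boolean {
  const weightsGB = paramsB * bytesPerParam[quant];
  return weightsGB + overheadGB <= vramGB;
}

// A 70B model at Q4_K_M needs ~39.4 GB of weights alone, so a 24 GB
// RTX 4090 cannot hold it; an 8B model at Q8_0 (~8.5 GB) fits easily.
console.log(fitsInVram(70, "Q4_K_M", 24)); // false
console.log(fitsInVram(8, "Q8_0", 24));    // true
```

In practice the server's measured entries also capture throughput (tokens/sec), which a formula like this cannot predict.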

Source: ownrig.com | License: MIT
