Research And Data
5067 results found
Search1API
One API for Search, Crawling, and Sitemaps
Jina AI MCP Tools
A Model Context Protocol (MCP) server that integrates with Jina AI Search Foundation APIs.
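As a rough illustration of what this server wraps, Jina's Reader endpoint turns any page into LLM-ready text by prefixing the target URL with `r.jina.ai`. This is a hedged sketch based on Jina's public API shape; the exact tool names the MCP server exposes may differ.

```python
# Hedged sketch: calling Jina's Reader endpoint directly.
# The r.jina.ai prefixing convention follows Jina's public docs;
# this is not the server's own code.

READER_BASE = "https://r.jina.ai/"

def reader_url(target: str) -> str:
    """Build a Reader request URL that returns `target` as LLM-ready text."""
    return READER_BASE + target

# Fetching it (requires an HTTP client and a Jina API key):
#   GET reader_url("https://example.com")
#   with header  Authorization: Bearer <JINA_API_KEY>
```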
Aws Kb Retrieval Server
An MCP server implementation for retrieving information from the AWS Knowledge Base using the Bedrock Agent Runtime.
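Under the hood, such a server calls the Bedrock Agent Runtime `retrieve` API. The sketch below shows the request shape that call takes; the knowledge base ID, region, and result count are placeholders, and this is not the server's actual code.

```python
# Hedged sketch: querying an AWS Knowledge Base via the Bedrock
# Agent Runtime `retrieve` API. All identifiers are placeholders.

def build_retrieve_request(kb_id: str, query: str, max_results: int = 5) -> dict:
    """Shape the kwargs for bedrock-agent-runtime's `retrieve` call."""
    return {
        "knowledgeBaseId": kb_id,
        "retrievalQuery": {"text": query},
        "retrievalConfiguration": {
            "vectorSearchConfiguration": {"numberOfResults": max_results}
        },
    }

# With AWS credentials configured, the call itself looks like:
#   import boto3
#   client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")
#   resp = client.retrieve(**build_retrieve_request("KBID12345", "What is MCP?"))
#   texts = [r["content"]["text"] for r in resp["retrievalResults"]]
```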
Perplexity Ask MCP Server
A Model Context Protocol server connector for the Perplexity API, enabling web search without leaving the MCP ecosystem.
AgentQL MCP Server
Model Context Protocol server that integrates AgentQL's data extraction capabilities.
Sentry
Retrieving and analyzing issues from Sentry.io
Sequential Thinking
An MCP server implementation that provides a tool for dynamic and reflective problem-solving through a structured thinking process.
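For illustration, a call to the reference `sequentialthinking` tool carries the current thought plus bookkeeping fields; the sketch below follows the reference server's published schema, though field names may drift between versions.

```python
# Hedged sketch of a sequential-thinking tool-call payload,
# modeled on the reference server's schema (not taken from this listing).

def thought_step(thought: str, number: int, total: int, more: bool) -> dict:
    return {
        "thought": thought,          # the current reasoning step
        "thoughtNumber": number,     # 1-based index of this step
        "totalThoughts": total,      # current estimate, can be revised
        "nextThoughtNeeded": more,   # False signals the chain is done
    }

step = thought_step("Break the problem into sub-goals", 1, 3, True)
```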
Serper MCP Server
An MCP server for the Serper web search API.
A Sleek AI Assistant & MCP Client
5ire is a cross-platform desktop AI assistant and MCP client. It is compatible with major service providers and supports local knowledge bases and tools via Model Context Protocol servers.
Mcp Server Chatsum
Summarize your chat messages.
TestDino
TestDino MCP boosts your AI assistant with powerful tools and analysis capabilities. It lets your AI analyze test runs, perform root-cause analysis, and detect failure patterns.
Regennexus Uap
## Features

- **Device Control**: GPIO, PWM, sensors, cameras
- **Robotic Arms**: Amber B1, Lucid One with gripper support
- **Mesh Networking**: Auto-discovery across LAN
- **Security**: AES-256 encryption, token auth

## Installation

`pip install regennexus`
Connectsafelyai
Connect ConnectSafely to AI assistants like Claude Desktop, Cursor, and n8n using the Model Context Protocol. Enable seamless AI-driven LinkedIn automation and smart workflows.
DocsGPT - Open Source
Open-source RAG platform to build secure, intelligent AI agents for enterprise data.
MCP-MESSENGER
**SlashMCP** is a production-grade AI workspace that connects LLMs to real-world data and tools through an intuitive chat interface. Built on the Model Context Protocol (MCP), it enables seamless interaction with multiple AI providers (OpenAI, Claude, Gemini) while providing powerful capabilities for document analysis, financial data queries, web scraping, and multi-agent workflow orchestration.

### Key Features:

- **Multi-LLM Support**: Switch between GPT-4, Claude, and Gemini at runtime with no restart needed
- **Smart Command Autocomplete**: Type `/` to discover and execute MCP server commands instantly
- **Document Intelligence**: Drag-and-drop documents with automatic OCR extraction and vision analysis
- **Financial Data Integration**: Real-time stock quotes, charts, and prediction market data via Alpha Vantage and Polymarket
- **Browser Automation**: Web scraping and navigation using Playwright MCP
- **Multi-Agent Orchestration**: Intelligent routing with specialized agents for command discovery, tool execution, and response synthesis
- **Dynamic MCP Registry**: Add and use any MCP server on the fly without code changes
- **Voice Interaction**: Browser-based transcription and text-to-speech support

### Use Cases:

- Research and analysis workflows
- Document processing and extraction
- Financial market monitoring
- Web data collection and comparison
- Multi-step task automation

**Live Demo:** [slashmcp.vercel.app](https://slashmcp.vercel.app)
**GitHub:** [github.com/mcpmessenger/slashmcp](https://github.com/mcpmessenger/slashmcp)
**Website:** [slashmcp.com](https://slashmcp.com)
Faim Time Series Forecasting
An MCP server for zero-shot time-series forecasting powered by foundation models such as Chronos 2 and TiRex, with support for multivariate and probabilistic forecasts.
Pixelmcp Ai自动配图画图
Connect your Cursor or Claude to AI image generation, so AI-generated web pages are no longer plain. Supports covers, illustrations, decorative images, game assets, and more; simple configuration, works out of the box.
Mcp Math Calculator
An MCP server for precise mathematical computation, giving AI 100% accurate math capabilities.
Wikimedia Image Search
An MCP server enabling AI assistants or agents to search Wikimedia Commons images with metadata and visual thumbnails.
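The public MediaWiki API behind Wikimedia Commons supports exactly this kind of image search with metadata and thumbnails. The sketch below builds such a request directly; parameter names follow the MediaWiki API, but the fields this particular MCP server returns may differ.

```python
# Hedged sketch: a direct Wikimedia Commons image search, the kind of
# query this server wraps. Not the server's own code.

from urllib.parse import urlencode

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def build_search_url(query: str, limit: int = 5, thumb_width: int = 200) -> str:
    params = {
        "action": "query",
        "generator": "search",
        "gsrsearch": query,
        "gsrnamespace": 6,           # the File: namespace (media files)
        "gsrlimit": limit,
        "prop": "imageinfo",
        "iiprop": "url|extmetadata",  # image URL plus descriptive metadata
        "iiurlwidth": thumb_width,    # ask for a thumbnail of this width
        "format": "json",
    }
    return f"{COMMONS_API}?{urlencode(params)}"
```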
Ai Music Mcp
The World's First AI Music MCP. Beyond images and video, your agent can now generate music.
Etsy Seo Assistant
# Etsy SEO Generator

AI-powered Etsy product listing generator for Claude Desktop. Generate perfect SEO titles, descriptions, and tags in seconds.

---

## What is this?

A Claude Desktop integration that generates complete, SEO-optimized Etsy product listings instantly. Perfect for Etsy sellers who want to:

- Save 3+ hours per product listing
- Rank higher in Etsy search results
- Write compelling product descriptions
- Never run out of creative tag ideas

## Quick Start

### A) CLI-only

1) Install and launch

```
npm install -g seerxo
seerxo
```

2) Sign in (recommended)

```
seerxo-mcp login
```

Sign in with Google in your browser and approve; the CLI saves your API key automatically (no manual envs needed).

3) Manual setup (optional)

```
seerxo-mcp configure --email your-email@example.com --api-key your-api-key
```

Use this if you already have an API key and just want to write it locally.

### Sample CLI session

```
SEERXO
SEERXO • Etsy SEO Agent • v1.2.53
Describe your Etsy product → get title, description & tags.

Interactive mode (help for all commands)
• Type a short description of your product
• Add a category with "|" (pipe) if you want
  Boho bedroom wall art set | Wall Art

Tip  Minimalist nursery wall art in black & white line art.
     Set of 3 abstract line art prints | Wall Art

Quick commands
  help       Show commands
  status     Show config & key state
  login      Open approval link to sign in
  configure  Set email & API key
  generate   Guided prompt (product/category)
  quit       Exit interactive mode

[seerxo] › login
Requesting SEERXO CLI login...
Open this link in your browser to approve CLI login:
https://api.seerxo.com/auth/google?redirect=...
Waiting for approval...
Login approved. Credentials saved locally.
You can now run "seerxo-mcp" in Claude Desktop.

[seerxo] › generate
Product: boho wall art
Category (optional): Wall Art

Title: Boho Wall Art Set of 3 | Minimalist Line Art Prints
Description: ...
Tags: boho wall art, line art prints, minimalist decor, ...

[seerxo] ›
```

### B) Claude Desktop + MCP

1) Install the CLI (same as above) and sign in with `seerxo-mcp login`.

2) Add this to your Claude Desktop config:

**macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
**Windows:** `%APPDATA%/Claude/claude_desktop_config.json`

```
{
  "mcpServers": {
    "seerxo": {
      "command": "seerxo-mcp",
      "env": {
        "SEERXO_EMAIL": "your-email@example.com",
        "SEERXO_API_KEY": "your-api-key"
      }
    }
  }
}
```

Note: SEERXO_EMAIL and SEERXO_API_KEY are written to `~/.seerxo-mcp/config.json` after CLI login; you can copy them from there if you prefer.

3) Restart Claude Desktop

Close and reopen Claude Desktop completely.

4) Start Using

That's it! Just ask Claude:

```
Generate an Etsy listing for my handmade ceramic coffee mug
```

**Free Tier:** 5 generations per month
**Premium:** Unlimited generations - upgrade at seerxo.com

Note: The previous package `seerxo-mcp` is deprecated. Use `npm install -g seerxo`.

---

## Examples

### Simple Request

```
Create Etsy SEO for "vintage leather journal"
```

### With Category

```
Generate an Etsy listing for handmade candles in the Home & Living category
```

### With Details

```
I'm selling boho macrame wall hangings. Create an optimized Etsy listing with title, description, and tags.
```

---

## What You Get

Each generation includes:

### SEO Title

- Under 140 characters (Etsy requirement)
- Primary keywords included
- Compelling and click-worthy

### Product Description

- Engaging opening hook
- Key features and benefits
- Usage scenarios
- Call-to-action

### 13 Optimized Tags

- Mix of broad and specific keywords
- Etsy search-optimized
- Trending search terms included

### Price Suggestion

- Based on similar Etsy products
- Market competitive range

---

## Web Interface

Prefer not to use Claude Desktop? Try our web interface: **seerxo.com**

- Live demo
- Instant results
- No installation needed

---

## Sample Output

**Input:** "Handmade ceramic coffee mug"

**Output:**

```
TITLE
Handmade Ceramic Coffee Mug | Artisan Pottery | Unique Kitchen Gift | Microwave Safe

DESCRIPTION
Elevate your morning coffee ritual with this beautifully handcrafted ceramic mug. Each piece is lovingly made by skilled artisans, ensuring no two mugs are exactly alike. The perfect addition to your kitchen collection or a thoughtful gift for coffee lovers. Featuring a comfortable ergonomic handle and smooth glazed finish.

Features:
• Handmade with premium ceramic
• Microwave and dishwasher safe
• 12oz capacity
• Unique one-of-a-kind design

Perfect for daily use or special occasions. Makes an excellent housewarming or birthday gift.

TAGS
handmade mug, ceramic coffee cup, pottery mug, artisan mug, unique gift, coffee lover gift, handcrafted, kitchen decor, tea cup, housewarming gift, birthday present, ceramic pottery, handmade gift

SUGGESTED PRICE
$28-$45
```

---

## Support

- GitHub Issues
- support@seerxo.com
- seerxo.com

---

## License

MIT License - see LICENSE file for details.

---

**Built for Etsy sellers by Seerxo**
Crawleo Mcp Server
Real-time web search and crawling capabilities for AI assistants through the Model Context Protocol (MCP).

## Overview

Crawleo MCP enables AI assistants to access live web data through two powerful tools:

- `web.search` - Real-time web search with multiple output formats
- `web.crawl` - Deep content extraction from any URL

## Features

- ✅ Real-time web search from any country/language
- ✅ Multiple output formats - Enhanced HTML, Raw HTML, Markdown, Plain Text
- ✅ Device-specific results - desktop, mobile, or tablet view
- ✅ Deep content extraction with JavaScript rendering
- ✅ Zero data retention - complete privacy
- ✅ Auto-crawling option for search results

## Getting Your API Key

1. Visit crawleo.dev
2. Sign up for a free account
3. Navigate to your dashboard
4. Copy your API key (starts with `sk_`)

## Setup Instructions

### 1. Claude Desktop

Config file location:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`

Configuration:

```json
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```

Replace `YOUR_API_KEY_HERE` with your actual API key from crawleo.dev.

Steps:

1. Open the config file in a text editor
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Claude Desktop completely (quit and reopen)
5. Start a new conversation and ask Claude to search the web!

Example usage:

```
"Search for the latest AI news and summarize the top 5 articles"
"Find Python web scraping tutorials and extract code examples"
```

### 2. Cursor IDE

Config file location:

- macOS: `~/.cursor/config.json` or `~/Library/Application Support/Cursor/config.json`
- Windows: `%APPDATA%\Cursor\config.json`
- Linux: `~/.config/Cursor/config.json`

Configuration: the same `mcpServers` block as for Claude Desktop.

Steps:

1. Locate and open your Cursor config file
2. Add the Crawleo MCP configuration
3. Save the file
4. Restart Cursor
5. The MCP tools will be available in your AI assistant

Example usage in Cursor:

```
"Search for React best practices and add them to my code comments"
"Find the latest documentation for this API endpoint"
```

### 3. Windsurf IDE

Config file location:

- macOS: `~/Library/Application Support/Windsurf/config.json`
- Windows: `%APPDATA%\Windsurf\config.json`
- Linux: `~/.config/Windsurf/config.json`

Configuration: the same `mcpServers` block as for Claude Desktop.

Steps:

1. Open the Windsurf config file
2. Add the Crawleo MCP server configuration
3. Save and restart Windsurf
4. Start using web search in your coding workflow

### 4. GitHub Copilot

For GitHub Copilot in VS Code or compatible editors, create or edit your MCP config file and add:

```json
{
  "servers": {
    "Crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
```

Steps:

1. Open your GitHub Copilot MCP configuration
2. Add the Crawleo server configuration
3. Save the file
4. Restart VS Code or your IDE
5. GitHub Copilot can now use Crawleo for web searches!

Example usage:

```
Ask Copilot: "Search for the latest Python best practices"
Ask Copilot: "Find documentation for this library"
```

### 5. OpenAI Platform (Direct Integration)

OpenAI supports MCP servers directly. Python example:

```python
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4",
    input=[
        {
            "role": "user",
            "content": [
                {"type": "input_text", "text": "search for latest news about openai models"}
            ]
        }
    ],
    text={"format": {"type": "text"}, "verbosity": "medium"},
    reasoning={"effort": "medium"},
    tools=[
        {
            "type": "mcp",
            "server_label": "Crawleo",
            "server_url": "https://api.crawleo.dev/mcp",
            "server_description": "Crawleo MCP Server - Real-Time Web Knowledge for AI",
            "authorization": "YOUR_API_KEY_HERE",
            "allowed_tools": ["web.search", "web.crawl"],
            "require_approval": "always"
        }
    ],
    store=True,
    include=["reasoning.encrypted_content", "web_search_call.action.sources"]
)

print(response)
```

Key parameters:

- `server_url` - Crawleo MCP endpoint
- `authorization` - your Crawleo API key
- `allowed_tools` - enable `web.search` and/or `web.crawl`
- `require_approval` - "always", "never", or "conditional"

Node.js example:

```javascript
import OpenAI from 'openai';

const client = new OpenAI();

const response = await client.responses.create({
  model: 'gpt-4',
  input: [
    {
      role: 'user',
      content: [{ type: 'input_text', text: 'search for latest AI developments' }]
    }
  ],
  tools: [
    {
      type: 'mcp',
      server_label: 'Crawleo',
      server_url: 'https://api.crawleo.dev/mcp',
      server_description: 'Crawleo MCP Server - Real-Time Web Knowledge for AI',
      authorization: 'YOUR_API_KEY_HERE',
      allowed_tools: ['web.search', 'web.crawl'],
      require_approval: 'always'
    }
  ]
});

console.log(response);
```

## Available Tools

### web.search

Search the web in real-time with customizable parameters.

Parameters:

- `query` *(required)* - Search term
- `max_pages` - Number of result pages (default: 1)
- `setLang` - Language code (e.g., "en", "ar")
- `cc` - Country code (e.g., "US", "EG")
- `device` - Device type: "desktop", "mobile", "tablet" (default: "desktop")
- `enhanced_html` - Get clean HTML (default: true)
- `raw_html` - Get raw HTML (default: false)
- `markdown` - Get Markdown format (default: true)
- `page_text` - Get plain text (default: false)
- `auto_crawling` - Auto-crawl result URLs (default: false)

Example:

```
Ask your AI: "Search for 'Python web scraping' and return results in Markdown"
```

### web.crawl

Extract content from specific URLs.

Parameters:

- `urls` *(required)* - List of URLs to crawl
- `rawHtml` - Return raw HTML (default: false)
- `markdown` - Convert to Markdown (default: false)
- `screenshot` - Capture screenshot (optional)
- `country` - Geographic location

Example:

```
Ask your AI: "Crawl https://example.com and extract the main content in Markdown"
```

## Troubleshooting

### MCP server not appearing

1. **Check config file location** - Make sure you're editing the correct file
2. **Verify JSON syntax** - Use a JSON validator to check for syntax errors
3. **Restart the application** - Completely quit and reopen (not just reload)
4. **Check API key** - Ensure your API key is valid and active at crawleo.dev

### Authentication errors

- Verify your API key is correct (it should start with `sk_`)
- Make sure the key is wrapped in quotes
- Check that the "Bearer " prefix is included in the Authorization header (for Claude/Cursor/Windsurf)
- For the OpenAI Platform, use the key directly in the `authorization` field
- Confirm your account has available credits at crawleo.dev

### No results returned

- Check your internet connection
- Verify the search query is not empty
- Try a simpler search query first
- Check API status at crawleo.dev

### Tool names not recognized

Make sure you're using the correct tool names:

- Use `web.search` (not `search_web`)
- Use `web.crawl` (not `crawl_web`)

## Usage Examples

Research assistant:

```
"Search for recent developments in quantum computing and summarize the key findings"
```

Content analysis:

```
"Search for competitor pricing pages and extract their pricing tiers"
```

Code documentation:

```
"Find the official documentation for FastAPI and extract the quickstart guide"
```

News monitoring:

```
"Search for today's news about artificial intelligence from US sources"
```

Market research:

```
"Search for customer reviews of iPhone 15 and analyze sentiment"
```

## Pricing

Crawleo MCP uses the same affordable pricing as our API:

- 10,000 searches → $20
- 100,000 searches → $100
- 250,000 searches → $200

Check your usage and manage your subscription at crawleo.dev.

## Privacy & Security

- ✅ Zero data retention - we never store your search queries or results
- ✅ Secure authentication - API keys transmitted over HTTPS
- ✅ No tracking - your usage patterns remain private

## Support

- Documentation: crawleo.dev/docs
- API Status: crawleo.dev/status
- Contact: support@crawleo.dev

## Links

- 🌐 Website: crawleo.dev
- 📚 Documentation: crawleo.dev/docs
- 🔑 Get API Key: crawleo.dev

Built with ❤️ by Ahmed Ellaban. Empowering AI with real-time web knowledge.
Crawleo MCP Server
Real-time web search and crawling capabilities for AI assistants through Model Context Protocol (MCP)
Withings
A Model Context Protocol (MCP) server that brings your Withings health data into your MCP client. Access your sleep patterns, body measurements, workouts, heart data, and more through natural conversation.

🔒 Privacy First: This is my personal project, and the repository is intentionally public to demonstrate transparency. The code shows that no personal information is logged or stored maliciously. All sensitive data (tokens, user IDs) is encrypted at rest and automatically redacted from logs. You can review the entire codebase to verify this commitment to privacy.

⚠️ Disclaimer: This server is provided as-is without any guarantees or warranties. While I've made every effort to ensure security and privacy, I make no guarantees about availability, data integrity, or security. Use at your own risk. For production use cases, consider self-hosting your own instance.
Xpoz
Remote-only social media MCP server for Twitter/X, Instagram, and TikTok data & insights.
HONO AI HCM MCP Server
Enterprise-Grade HR Assistant leveraging Model Context Protocol for seamless employee data and HR policy queries
Plus Ai Mcp
A Model Context Protocol (MCP) server for automatically generating professional PowerPoint and Google Slides presentations using the Plus AI presentation API
P Link.io
Gives agents the power to:

- purchase things online using a link (HTTP 402 compliant link)
- send money to an email address or phone number (eventually creating an account for the recipient and sending them the credentials)
- create payment request links (HTTP 402 compliant = payable by agents)
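HTTP 402 Payment Required is a real status code; an agent consuming such a link would detect the 402 and read the accompanying payment details before retrying with proof of payment. The sketch below models that flow; the JSON payload fields (`amount`, `currency`, `payTo`) are illustrative assumptions, not P Link.io's actual schema.

```python
# Hedged sketch of an agent-side HTTP 402 flow. The payment-payload
# field names are assumptions for illustration only.

import json
from dataclasses import dataclass

@dataclass
class PaymentRequest:
    amount: str
    currency: str
    pay_to: str

def parse_402_body(body: str) -> PaymentRequest:
    """Parse an assumed JSON body accompanying a 402 response."""
    data = json.loads(body)
    return PaymentRequest(data["amount"], data["currency"], data["payTo"])

def handle_response(status: int, body: str):
    """Route a response: 402 means pay first, anything else passes through."""
    if status == 402:
        return ("payment_required", parse_402_body(body))
    return ("ok", body)

# Example: a 402 response carrying an illustrative payment payload
payload = '{"amount": "5.00", "currency": "USD", "payTo": "merchant@example.com"}'
```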
Langfuse MCP
Production-ready Langfuse MCP server with 28 tools for LLM analytics, cost monitoring, and dataset management. Features secure dual-mode operation (readonly and readwrite), real-time cost analysis, trace debugging, and comprehensive usage insights. Deploy instantly with npx for seamless integration with Claude Desktop and other LLM clients.
Stack Overflow
The Stack Overflow Model Context Protocol (MCP) enables AI tools and agents to access trusted developer knowledge from Stack Overflow. It standardizes how knowledge is exposed, discovered, and retrieved for use in agent workflows and developer environments.
Discourse Mcp Server
A Model Context Protocol (MCP) stdio server that exposes Discourse forum capabilities as tools for AI agents. Works with any Discourse deployment, self-hosted or cloud-hosted; just generate an API key to connect.
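Under the hood, such a server issues authenticated Discourse REST calls. `Api-Key` and `Api-Username` are Discourse's documented authentication headers; the forum URL below is a placeholder, and this is a sketch rather than the server's actual code.

```python
# Hedged sketch: an authenticated Discourse search request, the kind of
# call this server makes on an agent's behalf. Forum URL is a placeholder.

from urllib.parse import urlencode

def search_request(base_url: str, api_key: str, username: str, query: str):
    """Return (url, headers) for a Discourse search, ready for any HTTP client."""
    url = f"{base_url}/search.json?{urlencode({'q': query})}"
    headers = {"Api-Key": api_key, "Api-Username": username}
    return url, headers

url, headers = search_request("https://forum.example.com", "KEY", "system", "mcp tools")
```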
Hyperliquid Mcp Trading
Model Context Protocol (MCP) server that integrates with Hyperliquid, a decentralized perpetual futures exchange. This server exposes Hyperliquid's trading APIs as MCP tools, allowing AI assistants like Claude to interact with the exchange programmatically.
Scrapfly MCP Server
Get real-time web data, automatically handle anti-bot systems, extract structured data, and capture screenshots with your favorite tools, such as Cursor or LangChain.