MCP Gemini Server

@bsmi021

This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
Overview

What is MCP Gemini Server?

MCP Gemini Server wraps the @google/genai SDK and exposes Google's Gemini model capabilities as standard MCP tools, letting other LLMs and MCP-compatible systems use Gemini as a backend workhorse.

How to use MCP Gemini Server?

To use the MCP Gemini Server, clone the project, install dependencies, build the project, and configure your MCP client with the server settings. Ensure you have a valid API key from Google AI Studio.
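A client-side configuration typically looks like the following sketch. The server name, install path, and environment-variable name shown here are illustrative assumptions; check the project's README for the exact values your build expects.

```json
{
  "mcpServers": {
    "gemini-server": {
      "command": "node",
      "args": ["/path/to/mcp-gemini-server/dist/server.js"],
      "env": {
        "GOOGLE_GEMINI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```

Most MCP clients (Cline, Claude Desktop, and similar) accept a block in this shape and launch the server as a child process over stdio.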

Key features of MCP Gemini Server?

  • Core text generation capabilities (standard and streaming).
  • Function calling to execute client-defined functions.
  • Stateful chat management across multiple turns.
  • File handling for uploading, listing, retrieving, and deleting files.
  • Context caching to reuse large prompt content across requests and reduce token costs.

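Under the hood, each of the features above is invoked through the standard MCP `tools/call` request. The sketch below shows the JSON-RPC 2.0 message an MCP client would send for text generation; the tool name `gemini_generateContent` and its arguments are illustrative assumptions, so consult the server's `tools/list` response for the actual names and schemas.

```typescript
// Shape of an MCP tool invocation (JSON-RPC 2.0 over stdio).
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: {
    name: string;                        // tool to invoke
    arguments: Record<string, unknown>;  // tool-specific parameters
  };
}

// Hypothetical call to a text-generation tool exposed by the server.
const request: ToolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "gemini_generateContent",
    arguments: {
      modelName: "gemini-1.5-flash",
      prompt: "Summarize the Model Context Protocol in one sentence.",
    },
  },
};

// Over a stdio transport, the client writes one JSON message per line.
const wire = JSON.stringify(request);
console.log(wire);
```

The same framing carries the chat, file-handling, and caching tools; only `params.name` and `params.arguments` change per tool.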
Use cases of MCP Gemini Server?

  1. Integrating Gemini model capabilities into various applications.
  2. Enabling advanced conversational AI features in chat applications.
  3. Managing and processing files in conjunction with AI model interactions.

FAQ about MCP Gemini Server

  • What are the prerequisites for using MCP Gemini Server?
    You need Node.js (v18 or later) and an API key from Google AI Studio.

  • Can I use Vertex AI credentials?
    No. The file handling and caching APIs only work with Google AI Studio API keys, so Vertex AI credentials are not supported for those features.

  • What types of errors can I expect?
    Common errors include invalid API keys, invalid parameters, and issues related to file handling.
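A client consuming the server will want to branch on these error categories. The sketch below shows one way to classify error messages into the kinds the FAQ mentions; the matched substrings are illustrative assumptions, not the server's exact error strings.

```typescript
// Hypothetical client-side classification of errors returned by the server.
type GeminiServerError =
  | { kind: "auth"; detail: string }            // invalid or missing API key
  | { kind: "invalid_params"; detail: string }  // bad tool arguments
  | { kind: "file"; detail: string }            // file-handling failure
  | { kind: "unknown"; detail: string };

function classifyError(message: string): GeminiServerError {
  const m = message.toLowerCase();
  if (m.includes("api key")) return { kind: "auth", detail: message };
  if (m.includes("invalid argument") || m.includes("parameter"))
    return { kind: "invalid_params", detail: message };
  if (m.includes("file")) return { kind: "file", detail: message };
  return { kind: "unknown", detail: message };
}

console.log(classifyError("API key not valid").kind); // → "auth"
```

Auth errors are usually non-retryable (fix the key), while parameter errors point at the calling tool's arguments, so distinguishing them early simplifies client retry logic.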

© 2025 MCP.so. All rights reserved.