What is MCP Gemini Server?
MCP Gemini Server is a dedicated server that wraps the @google/genai SDK, exposing Google's Gemini model capabilities as standard MCP tools. It allows other LLMs or MCP-compatible systems to leverage Gemini's features as a backend workhorse.
How to use MCP Gemini Server?
To use MCP Gemini Server, install it via Smithery or set it up manually: clone the repository, install dependencies, build the project, and configure your MCP client with the server settings.
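Manual setup typically ends with registering the server in your MCP client's configuration. A minimal sketch, assuming a Claude-Desktop-style JSON config and that the server reads its key from a `GOOGLE_GEMINI_API_KEY` environment variable (the install path and the variable name here are illustrative, not confirmed by this document):

```json
{
  "mcpServers": {
    "gemini-server": {
      "command": "node",
      "args": ["/path/to/mcp-gemini-server/dist/server.js"],
      "env": {
        "GOOGLE_GEMINI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```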
Key features of MCP Gemini Server?
- Core text generation capabilities (standard and streaming).
- Function calling to execute client-defined functions.
- Stateful chat management across multiple turns.
- File handling for uploading, listing, retrieving, and deleting files.
- Caching to optimize prompts and manage cached content.
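Each of the features above is exposed as an MCP tool invoked over JSON-RPC. As a sketch of what a text-generation call might look like on the wire (the `tools/call` method is part of the MCP specification, but the tool name `gemini_generateContent` and its argument names are assumptions for illustration):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "gemini_generateContent",
    "arguments": {
      "modelName": "gemini-1.5-flash",
      "prompt": "Summarize the MCP specification in one sentence."
    }
  }
}
```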
Use cases of MCP Gemini Server?
- Integrating Gemini's text generation capabilities into applications.
- Managing conversational AI interactions with stateful chat.
- Handling files and caching for efficient data management in AI applications.
FAQ from MCP Gemini Server?
- What are the prerequisites for using MCP Gemini Server?
  You need Node.js (v18 or later) and an API key from Google AI Studio.
- Is there support for Vertex AI credentials?
  No, the server does not support Vertex AI authentication; it only works with Google AI Studio API keys.
- How do I handle errors?
  The server returns structured errors using the standard MCP `McpError` type, which includes error codes and messages for troubleshooting.
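As a sketch of what consuming those structured errors might look like on the client side (the `McpErrorLike` shape below is a minimal stand-in for the SDK's `McpError`, and the mapping logic is illustrative, not the server's actual behavior):

```typescript
// Minimal stand-in for the SDK's McpError: a JSON-RPC style code plus a message.
interface McpErrorLike {
  code: number;
  message: string;
}

// Illustrative handler: map well-known JSON-RPC error codes to friendlier text.
function describeError(err: McpErrorLike): string {
  switch (err.code) {
    case -32602: // JSON-RPC "Invalid params"
      return `Invalid tool arguments: ${err.message}`;
    case -32601: // JSON-RPC "Method not found"
      return `Unknown tool: ${err.message}`;
    default:
      return `Tool error ${err.code}: ${err.message}`;
  }
}

console.log(describeError({ code: -32602, message: "missing 'prompt'" }));
// → Invalid tool arguments: missing 'prompt'
```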