A minimal agentic app for interacting with OLLAMA models, leveraging multiple MCP server tools through the BeeAI framework.
Overview
What is mcp-ollama-beeai?
mcp-ollama-beeai is a minimal client application designed to interact with local OLLAMA models by leveraging multiple MCP server tools through the BeeAI framework.
How to use mcp-ollama-beeai?
To use mcp-ollama-beeai, you need to set up a local OLLAMA server and configure your MCP agents in the mcp-servers.json file. After that, clone the repository, install the dependencies, and start the application to access it via your browser.
Key features of mcp-ollama-beeai?
- Interaction with local OLLAMA models.
- Configuration of multiple MCP agents for enhanced functionality.
- User-friendly interface for selecting agents and tools.
- Integration with the BeeAI framework for easy setup of ReAct agents (see the sketch below).
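As a rough illustration of that last point, the snippet below wires a locally served OLLAMA model and the tools exposed by one MCP server into a BeeAI ReAct agent. This is a minimal sketch, assuming the TypeScript beeai-framework and the official @modelcontextprotocol/sdk packages; the import paths, class names (ReActAgent, OllamaChatModel, MCPTool, TokenMemory), the model name, and the example MCP server are assumptions and may differ from what this project actually uses.

```ts
// Minimal sketch: an OLLAMA-backed ReAct agent using tools from one MCP server.
// Assumes the TypeScript beeai-framework and @modelcontextprotocol/sdk packages;
// exact import paths and class names may differ between framework versions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import { MCPTool } from "beeai-framework/tools/mcp";
import { OllamaChatModel } from "beeai-framework/adapters/ollama/backend/chat";
import { ReActAgent } from "beeai-framework/agents/react/agent";
import { TokenMemory } from "beeai-framework/memory/tokenMemory";

async function main() {
  // Connect to an MCP server over stdio. In the real app the command and args
  // would come from mcp-servers.json; the filesystem server here is only an example.
  const client = new Client({ name: "mcp-ollama-beeai", version: "1.0.0" });
  await client.connect(
    new StdioClientTransport({
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "./data"],
    })
  );

  // Expose every tool the MCP server advertises as a BeeAI tool.
  const tools = await MCPTool.fromClient(client);

  // Build a ReAct agent on top of a locally running OLLAMA model.
  const agent = new ReActAgent({
    llm: new OllamaChatModel("llama3.1"),
    memory: new TokenMemory(),
    tools,
  });

  const response = await agent.run({ prompt: "List the files in ./data" });
  console.log(response.result.text);
}

main().catch(console.error);
```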
Use cases of mcp-ollama-beeai?
- Building AI-driven applications that require model interactions.
- Utilizing various MCP agents for different tasks like database operations and data fetching.
- Experimenting with local AI models in a controlled environment.
FAQ about mcp-ollama-beeai
- What are the prerequisites for using mcp-ollama-beeai?
You need to have a local OLLAMA server running and sufficient memory (at least 16GB RAM) for the models to perform effectively.
- Can I use remote servers instead of local?
Yes, you can configure the application to use remote servers for model interactions.
- How do I configure MCP agents?
You can add your MCP agents in the mcp-servers.json file located in the root folder of the application.
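The exact schema is not documented here, but assuming mcp-servers.json follows the common MCP client convention (a mcpServers map keyed by server name, each entry giving the command and arguments used to launch that server), it might look like the sketch below; the server names, commands, and connection string are illustrative only:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "./data"]
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
```

Each configured server shows up in the application as a selectable agent, with its advertised tools available to the model.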