An MCP server with Candle inference
Overview
What is open-web-agent-rs?
open-web-agent-rs is a Rust-based web agent with an embedded OpenAI-compatible inference server, designed specifically to serve Gemma models.
How to use open-web-agent-rs?
To use open-web-agent-rs, follow these steps:
- Copy the example environment file:
  `cp .env.example .env`
- Install dependencies:
  `bun i`
- Run the local inference server:
  `(cd local_inference_server && cargo run --release -- --server)`
- Start the SearXNG Docker container:
  `docker compose up -d searxng`
- Start the development server:
  `bun dev`
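Once the inference server is running, it can be queried like any OpenAI-compatible endpoint. The sketch below is illustrative only: the base URL and port are assumptions (this README does not state which port the server binds), the `/v1/chat/completions` route is the standard OpenAI-compatible path, and the model id `"gemma"` is a placeholder.

```typescript
// Assumed base URL for the local inference server (port not stated in this README).
const baseUrl = "http://localhost:8080";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build a standard OpenAI chat-completions request body.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

// Send a single user question and return the assistant's reply.
async function ask(question: string): Promise<string> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(
      buildChatRequest("gemma", [{ role: "user", content: question }]),
    ),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

The request/response shape follows the OpenAI chat-completions convention, so existing OpenAI client libraries should also work by pointing their base URL at the local server.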
Key features of open-web-agent-rs?
- Rust-based implementation for performance and safety
- Embedded OpenAI-compatible inference server
- Support for Gemma models only
Use cases of open-web-agent-rs?
- Building AI-driven applications that require inference capabilities.
- Developing web agents that can interact with users and provide intelligent responses.
- Experimenting with Rust and AI model integration.
FAQ from open-web-agent-rs?
- What models does open-web-agent-rs support?
It currently supports Gemma models only.
- Is open-web-agent-rs easy to set up?
Yes! The setup process is straightforward with clear instructions provided.
- What programming language is used?
The project is developed in Rust.