
open-web-agent-rs

@seemueller-io

An MCP server with Candle inference
Overview

What is open-web-agent-rs?

open-web-agent-rs is a Rust-based web agent with an embedded OpenAI-compatible inference server built on the Candle framework, currently designed to serve Gemma models.
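Because the embedded server speaks the OpenAI API, any standard OpenAI client or plain HTTP request should be able to talk to it. A minimal sketch with curl — note that the host, port, and model name below are assumptions for illustration (they are not documented on this page; check the inference server's configuration or startup output for the actual values):

```shell
# Query the embedded OpenAI-compatible server via the standard
# chat-completions endpoint.
# NOTE: localhost:8080 and the model name "gemma" are assumptions,
# not values documented here.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

The /v1/chat/completions path is the standard OpenAI chat endpoint, which is what "OpenAI compatible" conventionally implies.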

How to use open-web-agent-rs?

To use open-web-agent-rs, follow these steps:

  1. Copy the example environment file: cp .env.example .env
  2. Install dependencies: bun i
  3. Run the local inference server: (cd local_inference_server && cargo run --release -- --server)
  4. Start the SearXNG search container: docker compose up -d searxng
  5. Start the development server: bun dev
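The steps above can be collected into a single setup script (a sketch only; paths and service names are taken directly from the steps above, and the script assumes a POSIX shell with bun, cargo, and docker on PATH):

```shell
#!/usr/bin/env sh
# Setup sketch for open-web-agent-rs, following the five steps above.
# Assumes bun, cargo, and docker are installed and on PATH.
set -eu

# 1. Copy the example environment file (skipped if .env already exists)
[ -f .env ] || cp .env.example .env

# 2. Install JavaScript dependencies
bun i

# 3. Build and run the local inference server in the background
(cd local_inference_server && cargo run --release -- --server) &

# 4. Start the SearXNG container
docker compose up -d searxng

# 5. Start the development server (foreground)
bun dev
```

Running the inference server in the background with & is a convenience for a single terminal; in practice you may prefer separate terminals so you can watch each process's logs.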

Key features of open-web-agent-rs?

  • Rust-based implementation for performance and safety
  • Embedded OpenAI-compatible inference server
  • Support for Gemma models only

Use cases of open-web-agent-rs?

  1. Building AI-driven applications that require inference capabilities.
  2. Developing web agents that can interact with users and provide intelligent responses.
  3. Experimenting with Rust and AI model integration.

FAQ about open-web-agent-rs

  • What models does open-web-agent-rs support?

It currently supports Gemma models only.

  • Is open-web-agent-rs easy to set up?

Yes. Setup takes five commands, listed in the steps above.

  • What programming language is used?

The project is developed in Rust.

© 2025 MCP.so. All rights reserved.
