#local-llm
11 results found
local-llm-obsidian-knowledge-base
A template repository with a dev container for running a local LLM alongside an included knowledge base. Add a git repo using `git subtree` or `git submodule`, then update it through an MCP client/server relationship, e.g. a VS Code extension such as `Cline` paired with the `Filesystem` or `Obsidian-MCP` MCP server.
MCP-Ollama Client
A lightweight MCP client that uses a local Ollama LLM to query multiple MCP servers defined in `config.json`.
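A minimal sketch of this pattern using the official `mcp` Python SDK and the `ollama` client; the `mcpServers` schema in `config.json`, the model name, and the prompt are assumptions, not this project's documented interface.

```python
import asyncio
import json

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Assumed config shape: {"mcpServers": {"name": {"command": ..., "args": [...]}}}
    with open("config.json") as f:
        config = json.load(f)

    for name, spec in config["mcpServers"].items():
        params = StdioServerParameters(command=spec["command"], args=spec.get("args", []))
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                names = [tool.name for tool in tools.tools]
                # Hand the tool list to a local model so it can pick one.
                reply = ollama.chat(
                    model="llama3.2",  # assumed locally pulled model
                    messages=[{
                        "role": "user",
                        "content": f"Server '{name}' exposes tools {names}. "
                                   "Which one would read a file from disk?",
                    }],
                )
                print(name, "->", reply["message"]["content"])


asyncio.run(main())
```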
Oboyu (覚ゆ)
Self-hosted MCP server for Japanese text indexing and search: chunking and embeddings with hybrid BM25 × vector reranking.
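The BM25 × vector fusion can be sketched in a few lines. This is a generic weighted-sum rerank over min-max normalized scores with an illustrative `alpha` weight, not necessarily Oboyu's actual fusion rule.

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))


def minmax(scores: list[float]) -> list[float]:
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]


def hybrid_rerank(bm25_scores, query_vec, doc_vecs, alpha=0.5):
    """Rank documents by alpha * normalized BM25 + (1 - alpha) * normalized cosine."""
    vec_scores = [cosine(query_vec, v) for v in doc_vecs]
    fused = [alpha * b + (1 - alpha) * v
             for b, v in zip(minmax(bm25_scores), minmax(vec_scores))]
    return sorted(range(len(fused)), key=lambda i: fused[i], reverse=True)


# Toy example: three documents with precomputed BM25 scores and 3-d embeddings.
print(hybrid_rerank([2.1, 0.4, 1.3],
                    [0.1, 0.9, 0.2],
                    [[0.1, 0.8, 0.3], [0.9, 0.1, 0.0], [0.2, 0.7, 0.1]]))
```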
PydanticAI MCP Experiment
Minimalist examples of providing your own MCP servers to your local LLM models.
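As a sketch of the server side of such experiments, here is a minimal MCP server built with `FastMCP` from the official Python SDK; the `word_count` tool is hypothetical, not one of the repository's examples.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")


@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())


if __name__ == "__main__":
    # Serve over stdio so any MCP client (PydanticAI, Cline, ...) can spawn it.
    mcp.run()
```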
LangChain MCP Client Streamlit App
This Streamlit application provides a user interface for connecting to MCP (Model Context Protocol) servers and interacting with them using different LLM providers (OpenAI, Anthropic, Google...).
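A bare-bones sketch of that kind of UI in Streamlit, with illustrative widget labels and a placeholder connect action rather than the app's real layout and connection logic:

```python
import streamlit as st

st.title("MCP Client")
provider = st.selectbox("LLM provider", ["OpenAI", "Anthropic", "Google"])
server_url = st.text_input("MCP server URL", "http://localhost:8000/sse")

if st.button("Connect"):
    # The real app would open an MCP session here and route chat turns
    # through the selected provider.
    st.write(f"Would connect to {server_url} using {provider}.")
```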
Cursor Chat History Vectorizer & Dockerized Search MCP
API service to search vectorized Cursor IDE chat history using LanceDB and Ollama
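The vectorize-and-search flow might look roughly like this, assuming a `nomic-embed-text` embedding model in Ollama and a made-up table schema rather than the project's actual one:

```python
import lancedb
import ollama


def embed(text: str) -> list[float]:
    # Local embedding via Ollama; model choice is an assumption.
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]


db = lancedb.connect("./cursor_chats_db")
table = db.create_table(
    "chats",
    data=[{"vector": embed(t), "text": t} for t in [
        "How do I fix this borrow checker error?",
        "Refactor the Streamlit sidebar layout",
    ]],
    mode="overwrite",
)

# Vector search: embed the query and return the nearest stored snippet.
hits = table.search(embed("rust borrow checker")).limit(1).to_list()
print(hits[0]["text"])
```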
Tome - Magical AI Spellbook
A magical desktop app that puts the power of LLMs and MCP in the hands of everyone.
MCP Client for Ollama (ollmcp)
A Python-based client for interacting with Model Context Protocol (MCP) servers using Ollama. Features include multi-server support, dynamic model switching, tool management, and a rich terminal interface.
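Dynamic model switching against a local Ollama daemon reduces to listing installed models and re-targeting `ollama.chat`; a sketch assuming ollama-python ≥ 0.4, where `list()` returns typed model entries:

```python
import ollama

# Enumerate locally installed models (field name assumes a recent ollama-python).
models = [m.model for m in ollama.list().models]
print("Available:", models)

choice = models[0]  # ollmcp would let the user pick interactively
reply = ollama.chat(model=choice, messages=[{"role": "user", "content": "Hello!"}])
print(reply["message"]["content"])
```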