#ai-memory
cognee-mcp-server
Mirror of
Mcp Knowledge Graph
MCP server enabling persistent memory for Claude through a local knowledge graph; a fork focused on local development.
Easy Memory
MCP persistent memory service that gives AI assistants long-term memory across sessions and projects. Powered by Qdrant vector search + Ollama/Gemini embeddings with hybrid retrieval (vector + BM25). Features include: semantic memory save/search/forget, content deduplication via SHA-256, automatic sensitive data redaction, dual-shell architecture (MCP stdio + HTTP REST), Web Admin Panel with analytics, multi-user auth & API key management, and audit logging.
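Content deduplication via SHA-256, as Easy Memory describes, can be sketched roughly as follows. This is an illustrative assumption, not Easy Memory's actual code: the `MemoryStore` class, the whitespace normalization, and the in-memory dict are all hypothetical.

```python
import hashlib

def content_key(text: str) -> str:
    """Hash a memory's content; trivially different copies hash the same."""
    normalized = " ".join(text.split())  # collapse whitespace before hashing
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

class MemoryStore:
    def __init__(self):
        self._by_hash = {}

    def save(self, text: str) -> bool:
        """Store a memory; return False if identical content already exists."""
        key = content_key(text)
        if key in self._by_hash:
            return False
        self._by_hash[key] = text
        return True
```

A real service would persist the hash alongside the vector record so duplicates are caught before an embedding call is made.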
The Pensieve MCP Server
One simply siphons the excess thoughts from one's mind, pours them into the basin, and examines them at one's leisure. It becomes easier to spot patterns and links, you understand, when they are in this form.
Cuba Memroys
Persistent memory MCP for AI agents — Knowledge graph + Hebbian learning + Anti-hallucination. 12 tools, 1 dependency, zero manual setup.
Penfield - Persistent Memory and Knowledge Graphs for AI Agents
Penfield gives AI agents long-term memory that compounds over time. Instead of starting every session from zero, your agent remembers conversations, learns preferences, connects ideas, and picks up exactly where it left off. Core capabilities:
- 16 tools covering memory CRUD, knowledge graph, context management, artifacts, and reflection
- Hybrid search with tunable weights: BM25 (keyword), vector (semantic), and graph traversal
- Knowledge graph with 24 relationship types across 8 categories (evolution, evidence, hierarchy, causation, implementation, conversation, sequence, dependencies)
- Context checkpoints for session handoff between agents or channels
- Artifact storage for files, diagrams, and reference docs
- 11 memory types with intelligent filtering
- Personality system with configurable personas via the portal
Also available as a Claude Connector, Claude Code integration, native OpenClaw plugin (4-5x faster), and direct API.
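Hybrid search with tunable weights typically fuses per-channel scores into one ranking. The sketch below shows one common approach (min-max normalization per channel, then a weighted sum); the weights, the normalization scheme, and the function names are assumptions for illustration, not Penfield's actual fusion logic.

```python
def normalize(scores: dict) -> dict:
    """Min-max normalize one channel's scores into [0, 1]."""
    if not scores:
        return {}
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {doc: 1.0 for doc in scores}
    return {doc: (s - lo) / (hi - lo) for doc, s in scores.items()}

def hybrid_rank(bm25: dict, vector: dict, graph: dict,
                weights=(0.4, 0.4, 0.2)) -> list:
    """Fuse three retrieval channels with tunable weights; best doc first."""
    channels = [normalize(bm25), normalize(vector), normalize(graph)]
    fused = {}
    for doc in set(bm25) | set(vector) | set(graph):
        # A doc missing from a channel contributes 0 from that channel.
        fused[doc] = sum(w * ch.get(doc, 0.0)
                         for w, ch in zip(weights, channels))
    return sorted(fused, key=fused.get, reverse=True)
```

Tuning the weight tuple is what lets keyword-heavy queries lean on BM25 while conceptual queries lean on the vector channel.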
Strata
Strata is a self-hosted AI memory server. Your AI remembers everything across every session, on your own hardware. Key features:
- Semantic search (find memories by meaning, not keywords)
- Per-agent API keys with granular permissions
- 3D constellation viewer with live agent activity
- File vault (attach real documents to memories)
- CSV audit log (full transparency on every agent action)
- Pre-Strata history import (your memory doesn't start at install day)
- Global MCP kill switch (emergency brake, only a human can undo)
- Automatic deduplication
- 10 structured thought types
- PostgreSQL backend
- Runs on a Raspberry Pi
Always on, always local, always yours.
🧠 Memory MCP Server
An MCP (Model Context Protocol) server providing long-term memory for LLMs.
Claude Faf Mcp
Persistent project context for AI. 33 MCP tools for creating, scoring, and syncing project DNA via IANA-registered .faf format (application/vnd.faf+yaml). Tell AI who you're building for, what you're building, and why — it never forgets. 391 tests across 6 platforms.
Stateless Agent Memory Engine (SAME)
Memory with integrity for AI coding agents. SAME tracks provenance, flags stale knowledge, and surfaces contradictions, so your AI trusts what's current, not what's outdated. 17 MCP tools: semantic search, cross-vault federation, session handoffs, decision logging, knowledge graph, trust-aware retrieval, consolidation, health analysis. Provenance on every write. Stale notes rank lower automatically. SQLite + vector search. Ollama, OpenAI, LM Studio, or keyword-only. One 12MB Go binary. No cloud, no API keys, no telemetry. Your notes never leave your machine. Works with Claude Code, Cursor, Windsurf, Codex CLI, Gemini CLI, and any MCP client.