Cursor Chat History Vectorizer & Dockerized Search MCP

@markelaugust74

API service to search vectorized Cursor IDE chat history using LanceDB and Ollama
Overview

What is Cursor Chat History Vectorizer & Dockerized Search MCP?

Cursor Chat History Vectorizer & Dockerized Search MCP is an API service for searching vectorized chat history from the Cursor IDE, using Ollama to generate text embeddings and LanceDB to store and query them.

How to use Cursor Chat History Vectorizer?

To use this project, extract chat history from your local Cursor IDE data, generate text embeddings using a local Ollama instance, and store them in a LanceDB vector database. Then, run the Dockerized FastAPI application to access the search API.
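The exact scripts live in the repository, but the flow can be pictured with a minimal sketch. Everything named below is an assumption for illustration only: the state.vscdb path and the aiService.prompts key inside Cursor's workspace storage, the nomic-embed-text embedding model served by Ollama, and the chat_history table name.

    # Minimal sketch of the extract -> embed -> store pipeline (all names are assumptions).
    import json
    import sqlite3

    import lancedb
    import requests

    DB_PATH = "path/to/workspaceStorage/<workspace-id>/state.vscdb"  # placeholder path
    OLLAMA_URL = "http://localhost:11434/api/embeddings"

    def embed(text: str) -> list[float]:
        """Request an embedding vector from the local Ollama instance."""
        resp = requests.post(
            OLLAMA_URL,
            json={"model": "nomic-embed-text", "prompt": text},  # assumed model
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["embedding"]

    # Read user prompts from Cursor's SQLite key/value store
    # (the key and the "text" field are assumptions about Cursor's internal format).
    conn = sqlite3.connect(DB_PATH)
    row = conn.execute(
        "SELECT value FROM ItemTable WHERE key = 'aiService.prompts'"
    ).fetchone()
    prompts = [p["text"] for p in json.loads(row[0])] if row else []

    # Store prompt text and embedding vectors in a LanceDB table for the search API.
    db = lancedb.connect("lancedb_data")
    db.create_table(
        "chat_history",
        data=[{"prompt": p, "vector": embed(p)} for p in prompts],
        mode="overwrite",
    )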

Key features of Cursor Chat History Vectorizer?

  • Data extraction from Cursor IDE's SQLite files.
  • Generation of text embeddings for user prompts.
  • Storage of prompts and embeddings in a LanceDB database.
  • Dockerized FastAPI application for searching the vectorized history (see the sketch after this list).
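
For concreteness, the search side of the Dockerized FastAPI application might look roughly like the sketch below. The /search path, the lancedb_data directory, the chat_history table name, and the nomic-embed-text model are assumptions rather than the project's documented choices.

    # Minimal sketch of the search API (all names are assumptions).
    import lancedb
    import requests
    from fastapi import FastAPI

    app = FastAPI()
    db = lancedb.connect("lancedb_data")   # database produced by the extraction step
    table = db.open_table("chat_history")  # assumed table name

    def embed(text: str) -> list[float]:
        """Embed the query with the same Ollama model used at extraction time."""
        # From inside a container, Ollama may need to be reached via
        # host.docker.internal rather than localhost.
        resp = requests.post(
            "http://localhost:11434/api/embeddings",
            json={"model": "nomic-embed-text", "prompt": text},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["embedding"]

    @app.get("/search")
    def search(q: str, limit: int = 5):
        # Nearest-neighbour search over the stored prompt embeddings;
        # return only the prompt text (plus distance) for each hit.
        return table.search(embed(q)).select(["prompt"]).limit(limit).to_list()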

Use cases of Cursor Chat History Vectorizer?

  1. Searching through past chat interactions in the Cursor IDE.
  2. Performing Retrieval-Augmented Generation (RAG) with past prompts as the retrieval context.
  3. Integrating with local LLMs for advanced querying of chat history, as sketched after this list.
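
The RAG and local-LLM use cases can be combined in a short client sketch. It assumes the search API is exposed on localhost:8000 with a /search endpoint as above, and that a chat-capable model (here llama3, also an assumption) is available in the local Ollama instance.

    # Minimal RAG sketch: retrieve similar past prompts, then ask a local LLM about them.
    import requests

    question = "What have I previously asked about database migrations?"

    # 1. Retrieve similar past prompts from the search API (port and path are assumptions).
    hits = requests.get(
        "http://localhost:8000/search",
        params={"q": question, "limit": 5},
        timeout=30,
    ).json()
    context = "\n".join(hit["prompt"] for hit in hits)

    # 2. Pass them as context to a local model served by Ollama (model name is an assumption).
    answer = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",
            "prompt": f"Relevant past prompts:\n{context}\n\nQuestion: {question}",
            "stream": False,
        },
        timeout=120,
    ).json()["response"]

    print(answer)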

FAQ about Cursor Chat History Vectorizer

  • What is required to run the extraction script?
    You need Python 3.7+, Ollama installed, and access to your Cursor workspace storage.

  • How do I run the search API?
    After the extraction step has produced the LanceDB database, build and run the Docker container and access the API via the mapped port.

  • Can I use this project without Docker?
    While Docker is recommended for ease of use, you can run the FastAPI application directly, for example with an ASGI server such as Uvicorn, if you set up the environment correctly (see the sketch below).
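
A minimal sketch of that non-Docker path, assuming the FastAPI instance is defined as app in a module named main.py (both names are assumptions):

    # Serve the FastAPI app directly with Uvicorn instead of Docker.
    import uvicorn

    if __name__ == "__main__":
        uvicorn.run("main:app", host="0.0.0.0", port=8000)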
