TFG Documentation: Interconnection between Data Spaces and Generative Artificial Intelligence
@jaimealruiz
Design and Implementation of an Interconnection between an LLM and Data Spaces via the Model Context Protocol (MCP)
Overview
What is the project about?
The project designs and implements a functional, scalable architecture that allows a large language model (LLM) to interact with a data space through the Model Context Protocol (MCP).
How to use the project?
To use the project, the LLM client sends natural-language questions to the MCP server, which processes these queries and returns the results; the LLM never accesses the database directly.
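A minimal sketch of that client-side flow, assuming a `/query` REST endpoint on `localhost:8000` and a JSON body with a `question` field (the route, port, and payload shape are illustrative assumptions, not taken from the repository):

```python
import requests

# Hypothetical endpoint; the actual route and payload depend on the MCP server code.
MCP_SERVER_URL = "http://localhost:8000/query"

def ask(question: str) -> dict:
    """Send a natural-language question to the MCP server and return its JSON answer."""
    response = requests.post(MCP_SERVER_URL, json={"question": question}, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # The LLM client only talks HTTP to the MCP server; it never opens the database.
    print(ask("What was the total sales amount per region last month?"))
```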
Key features of the project?
- Secure and modular architecture separating LLM processing from data access.
- Utilizes FastAPI for the MCP server, exposing REST endpoints for data queries (see the sketch after this list).
- Supports SQL query generation from natural language questions.
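
As a rough illustration of these features, the sketch below shows what such a FastAPI endpoint might look like: a POST route receives a natural-language question, a stub stands in for the LLM-driven SQL generation, and the resulting query runs against an in-memory DuckDB database. The route name, demo table, and `generate_sql` stub are assumptions for illustration, not the project's actual implementation.

```python
from fastapi import FastAPI
from pydantic import BaseModel
import duckdb

app = FastAPI(title="MCP server sketch")

# In-memory demo database; the real server would attach the data space instead.
con = duckdb.connect(":memory:")
con.execute("CREATE TABLE sales (region VARCHAR, amount DOUBLE)")
con.execute("INSERT INTO sales VALUES ('north', 120.0), ('south', 80.5)")

class Question(BaseModel):
    question: str

def generate_sql(question: str) -> str:
    """Stand-in for the natural-language-to-SQL step delegated to the LLM."""
    # A real implementation would prompt the LLM with the schema and the question.
    return "SELECT region, SUM(amount) AS total FROM sales GROUP BY region"

@app.post("/query")
def query(q: Question) -> dict:
    sql = generate_sql(q.question)
    result = con.execute(sql)
    columns = [desc[0] for desc in result.description]
    rows = [dict(zip(columns, row)) for row in result.fetchall()]
    return {"sql": sql, "rows": rows}
```

Served with uvicorn, this endpoint is the kind of route the client sketch above would call.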
Use cases of the project?
- Enabling LLMs to retrieve and analyze data without direct database access.
- Facilitating complex data queries through natural language processing.
- Future scalability towards advanced architectures like Retrieval-Augmented Generation (RAG).
FAQ from the project?
- Can the LLM access the database directly?
No; all data access goes through the MCP server to preserve security and modularity (see the sketch at the end of this FAQ).
- What technologies are used in this project?
The project is written in Python, using FastAPI for the MCP server and DuckDB for data management.
- Is the architecture designed for scalability?
Yes, the architecture is modular and designed to evolve with future enhancements.
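
To make the first FAQ answer concrete, here is a minimal sketch of the data-access helper the MCP server could expose internally, assuming the data space is materialized as a local DuckDB file (the filename and function name are hypothetical):

```python
import duckdb

def run_query(sql: str) -> list[dict]:
    """Execute SQL on behalf of the LLM and return plain rows it can reason over."""
    # read_only=True prevents the query layer from modifying the data space;
    # the LLM itself never receives a connection handle, only these results.
    with duckdb.connect("dataspace.duckdb", read_only=True) as con:
        result = con.execute(sql)
        columns = [desc[0] for desc in result.description]
        return [dict(zip(columns, row)) for row in result.fetchall()]
```

Keeping a helper like this as the single point of contact with DuckDB is what makes the architecture modular: the storage backend can later evolve (for example, toward a retrieval layer in a RAG setup) without changing the LLM-facing interface.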