Access Jewish texts and commentaries via Sefaria MCP server for efficient retrieval and reference
The Sefaria Jewish Library MCP Server provides a standardized interface for Large Language Models (LLMs) and other AI applications to access and reference a vast collection of Jewish texts. This server leverages the Model Context Protocol (MCP) to facilitate seamless integration between AI tools and rich, annotated text data from the Sefaria library.
The Sefaria MCP Server boasts two core capabilities that are crucial for AI integration:
- Retrieve Jewish Texts by Reference: fetch specific Jewish texts by their reference, such as "Genesis 1:1" or "משנה ברכות פרק א משנה א", so that LLMs and other applications can easily query the server for the content they need.
- Retrieve Commentaries on a Given Text: fetch the commentaries on a specific textual reference, allowing users to gain deeper insight into texts through historical and scholarly perspectives.
Imagine an LLM that needs to summarize Genesis 1:1 in a way that is both accurate and contextually rich. With the Sefaria MCP Server enabled, the LLM can issue a command like this:
reference: "Genesis 1:1"
Upon receiving the reference, the server retrieves the text along with its associated commentaries from the Sefaria database. The AI application can then process this data to generate a summary that includes both the literal translation and scholarly insights.
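To make this concrete, here is a minimal sketch of how an MCP client could issue that request programmatically with the official `mcp` Python SDK. The tool name `get_text` is an assumption for illustration; the snippet lists the server's tools first so the real names can be confirmed. The command, directory, and environment variable match the configuration shown later in this document.

```python
# Minimal sketch: connect to the Sefaria MCP server over stdio and request Genesis 1:1.
# The tool name "get_text" is illustrative; list_tools() reveals the actual names.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server_params = StdioServerParameters(
    command="uv",
    args=["--directory", "C:/dev/mcp-sefaria-server", "run", "sefaria_jewish_library"],
    env={"PYTHONIOENCODING": "utf-8"},
)

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server actually exposes.
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

            # Ask the server for the text of Genesis 1:1.
            result = await session.call_tool("get_text", {"reference": "Genesis 1:1"})
            for item in result.content:
                print(getattr(item, "text", item))

asyncio.run(main())
```

The returned text and commentaries can then be passed to the LLM as context for summarization.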
A researcher working on historical texts might need to analyze a specific passage in context, such as Shemot, perek bet, pasuk gimel (Exodus 2:3), by sending a request like this:
reference: "שמות פרק ב פסוק ג"
The Sefaria MCP Server will provide the full text and relevant commentaries, facilitating thorough historical research using modern AI tools.
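Continuing the client sketch above, the commentary lookup could be wrapped in a small helper. The tool name `get_commentaries` is again an assumption for illustration; Hebrew references can be passed as-is.

```python
# Continuation of the client sketch above; "get_commentaries" is an assumed tool name.
from mcp import ClientSession

async def fetch_commentaries(session: ClientSession, reference: str) -> list[str]:
    """Ask the server for commentaries on a reference (Hebrew references work as-is)."""
    result = await session.call_tool("get_commentaries", {"reference": reference})
    return [getattr(item, "text", str(item)) for item in result.content]

# Inside the session from the previous example:
#     comments = await fetch_commentaries(session, "שמות פרק ב פסוק ג")
```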
The Sefaria MCP Server is built on the principles of the Model Context Protocol (MCP), which defines a set of rules and actions for AI applications to interact with external data sources. By adhering to MCP, this server ensures compatibility with various AI clients and tools.
To run the server locally, follow these steps:
Clone the repository:
git clone https://github.com/sivan22/mcp-sefaria-server.git
cd mcp-sefaria-server
Run the server directly using:
uv --directory path/to/directory run sefaria_jewish_library
Alternatively, configure the server to be used by MCP clients through a configuration file:
{
  "mcpServers": {
    "sefaria_jewish_library": {
      "command": "uv",
      "args": [
        "--directory",
        "C:/dev/mcp-sefaria-server",
        "run",
        "sefaria_jewish_library"
      ],
      "env": {
        "PYTHONIOENCODING": "utf-8"
      }
    }
  }
}
Here is a Mermaid diagram outlining the flow of data and commands between an AI application, the MCP client, the Sefaria MCP Server, and the underlying Sefaria database:
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Sefaria Database]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
This diagram illustrates how the Sefaria MCP Server interacts with the Sefaria database, providing a structured data architecture:
graph TD
A[Data Source] --> B[MCP Server]
C[Sefaria Database] --> B
style A fill:#e8f5e8
style B fill:#f3e5f5
To install the server, make sure Git, Python, and the uv package manager are available on your system, then:
Clone the repository:
git clone https://github.com/sivan22/mcp-sefaria-server.git
cd mcp-sefaria-server
Install the necessary dependencies:
pip install -r requirements.txt
Run the server with:
uv --directory path/to/directory run sefaria_jewish_library
The Sefaria MCP Server is designed to be compatible with several MCP clients, each offering different levels of support for retrieving and interacting with data. Here is a matrix summarizing the compatibility status:
| MCP Client | Resources | Tools & Texts | Prompts & Comments | Status |
|---|---|---|---|---|
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ❌ | Tools Only |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
By integrating the Sefaria MCP Server with an LLM, developers can create applications that not only translate and summarize texts but also automatically generate detailed commentaries based on historical context and scholarly insights.
Researchers can use the server to fetch multiple references at once, enabling them to perform in-depth analyses across different passages. The MCP client ensures seamless interaction between the AI application and the data sources, streamlining the research process.
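As a sketch of that batch-style workflow (assuming the illustrative `get_text` tool name and the client session from the earlier example), a researcher could loop over a list of references within a single session:

```python
# Sketch of batch retrieval; assumes the session and tool names from the earlier client example.
from mcp import ClientSession

REFERENCES = ["Genesis 1:1", "Exodus 2:3", "Mishnah Berakhot 1:1"]

async def fetch_passages(session: ClientSession) -> dict[str, str]:
    """Retrieve several passages in one session, keyed by reference."""
    passages: dict[str, str] = {}
    for ref in REFERENCES:
        result = await session.call_tool("get_text", {"reference": ref})
        passages[ref] = "\n".join(getattr(item, "text", str(item)) for item in result.content)
    return passages
```

Because the session stays open across calls, each additional reference costs only a single round trip to the server.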
The Sefaria MCP Server supports integration with various clients through a standardized protocol. To include it in your AI workflow, add it to your MCP client settings using the same configuration shown above.
The Sefaria MCP Server demonstrates robust performance and compatibility across a range of AI applications. Below is a detailed matrix that showcases its broad support in different environments.
| AI Application | Resource Access | Tool Execution | Prompt Generation |
|---|---|---|---|
| Claude Desktop | Full | Full | Full |
| Continue | Full | N/A | N/A |
For advanced users and developers, the Sefaria MCP Server offers several configuration options to enhance performance and security. A key setting is the PYTHONIOENCODING environment variable, which should be set to utf-8 to ensure proper text encoding for Hebrew sources.

The server leverages efficient database queries and caching mechanisms to manage vast datasets, ensuring quick responses even under heavy load.
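As an illustration of the kind of caching described above (a sketch only; the server's actual internals may differ), repeated lookups can be memoized so identical references never trigger a second query. The endpoint below is Sefaria's public texts API, used here purely for demonstration; it also requires the third-party `requests` package.

```python
# Illustrative sketch of reference-level caching; not the server's actual implementation.
from functools import lru_cache

import requests  # third-party dependency, used here only for the sketch

@lru_cache(maxsize=1024)
def fetch_text(reference: str) -> dict:
    """Fetch a text from Sefaria's public API, caching repeated lookups in memory."""
    resp = requests.get(f"https://www.sefaria.org/api/texts/{reference}", timeout=10)
    resp.raise_for_status()
    return resp.json()

# The second call for the same reference is served from the cache, not the network.
print(fetch_text("Genesis.1.1")["text"])
print(fetch_text("Genesis.1.1")["he"])
```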
Can the server support other text traditions? While the current implementation focuses on Jewish texts, the architecture could be extended to other cultural and religious traditions with slight modifications.

How is the server used within an existing project? You can configure it as a custom server in your project's MCP client settings, ensuring seamless data retrieval and processing.

Can the server's capabilities be extended? Yes, you can develop and integrate custom MCP tools that go beyond its current features.
Contributions are welcome! If you wish to contribute, open an issue or submit a pull request on the GitHub repository.
Explore more about Model Context Protocol (MCP) and its ecosystem at the official Model Context Protocol GitHub page.
For developers looking to integrate MCP servers into their projects, Sefaria's API documentation is a valuable resource. Visit the Sefaria API documentation for more information.
By following these guidelines and integrating this server with your AI workflows, you can unlock new possibilities in text analysis and generation, paving the way for more sophisticated applications in Jewish studies and beyond.