Seamlessly access Contentful content via AI with natural language queries and comprehensive content management features
The Contentful Delivery MCP (Model Context Protocol) Server bridges the gap between natural language queries and content stored in Contentful. It lets AI applications access real-time content from Contentful through a standard protocol, much as USB-C provides a single connector for many kinds of devices.
The server integrates with a range of MCP-capable AI applications, including Claude Desktop, Continue, and Cursor. Through it, these applications can retrieve content and look up assets with natural language queries, with support for rich text handling, pagination, and content type schema access across platforms.
graph TD
A[AI Application] -->|MCP Client| B[MCP Protocol]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
| MCP Client | Resources | Tools | Prompts |
| --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
The Contentful Delivery MCP Server is built on the Model Context Protocol (MCP), which gives AI applications a standardized way to interact with external services and data sources. It runs on Node.js and sits between the MCP client and Contentful, as shown below.
graph TD
A[AI Application] --> B[MCP Client]
B --> C[MCP Server]
C --> D[Data Source/Tool]
style A fill:#e1f5fe
style C fill:#f3e5f5
style D fill:#e8f5e8
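In practice, an MCP client launches the server as a subprocess and exchanges protocol messages with it over stdio. The sketch below is illustrative only and assumes the official @modelcontextprotocol/sdk TypeScript client; the client name and version strings are arbitrary placeholders.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
// Spawn the Contentful Delivery MCP server as a subprocess over stdio
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@mshaaban0/contentful-delivery-mcp-server@latest"],
  env: {
    CONTENTFUL_SPACE_ID: process.env.CONTENTFUL_SPACE_ID ?? "",
    CONTENTFUL_ACCESS_TOKEN: process.env.CONTENTFUL_ACCESS_TOKEN ?? ""
  }
});
// The client identity below is a placeholder; use your application's own name
const client = new Client({ name: "architecture-demo", version: "1.0.0" });
await client.connect(transport);
// Discover which tools the server exposes
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
await client.close();
The Mastra examples later in this document wrap this same handshake behind MastraMCPClient.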
The server also integrates with Mastra AI, so Mastra agents can work with real-time Contentful data. To set up the integration, follow the steps below.
Install the package in your project via npm:
npm install @mshaaban0/contentful-delivery-mcp-server
Or globally:
npm install -g @mshaaban0/contentful-delivery-mcp-server
Set up your Contentful credentials by exporting environment variables:
export CONTENTFUL_SPACE_ID="your_space_id"
export CONTENTFUL_ACCESS_TOKEN="your_access_token"
# Optional: Restrict content to specific content types
export CONTENTFUL_CONTENT_TYPE_IDS="blogPost,article,product"
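Before wiring the server into an AI application, it can be worth confirming that the credentials resolve to a readable space. The optional check below is a sketch that calls the official contentful JavaScript SDK directly; it is separate from the MCP server itself.
import { createClient } from "contentful";
// Reuse the same environment variables the MCP server reads
const contentful = createClient({
  space: process.env.CONTENTFUL_SPACE_ID!,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN!
});
// Fetch a single entry to verify the space ID and access token are valid
const { total } = await contentful.getEntries({ limit: 1 });
console.log(`Credentials OK: the space contains ${total} entries`);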
Imagine an e-commerce company that wants to generate personalized product recommendations from customer queries. By connecting the Contentful Delivery MCP Server to Mastra AI or Claude Desktop, the company can let customers ask for recommendations in natural language and receive answers grounded in the product content stored in Contentful.
import { MastraMCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
// Initialize the MCP client
const contentfulClient = new MastraMCPClient({
name: "contentful-delivery",
server: {
command: "npx",
args: ["-y", "@mshaaban0/contentful-delivery-mcp-server@latest"],
env: {
CONTENTFUL_ACCESS_TOKEN: "your_access_token",
CONTENTFUL_SPACE_ID: "your_space_id",
// Optional: Restrict content to specific content types
CONTENTFUL_CONTENT_TYPE_IDS: "blogPost,article,product"
}
}
});
// Create an AI agent with access to Contentful
const assistant = new Agent({
name: "Product Assistant",
instructions: `
You are a helpful assistant for product recommendations.
`,
model: "gpt-4",
});
// Connect and register tools
await contentfulClient.connect();
const tools = await contentfulClient.tools();
assistant.__setTools(tools);
const response = await assistant.chat("Recommend some summer shoes");
A blog platform can pair the Contentful Delivery MCP Server with Continue to work with its published content through an AI interface: writers and editors can use natural language to find, summarize, and cross-reference entries without leaving their editor.
The Contentful Delivery MCP Server can be integrated into a variety of AI applications. The following example demonstrates how such an integration is set up:
import { MastraMCPClient } from "@mastra/mcp";
import { Agent } from "@mastra/core/agent";
// Initialize the MCP client
const contentfulClient = new MastraMCPClient({
name: "contentful-delivery",
server: {
command: "npx",
args: ["-y", "@mshaaban0/contentful-delivery-mcp-server@latest"],
env: {
CONTENTFUL_ACCESS_TOKEN: "your_access_token",
CONTENTFUL_SPACE_ID: "your_space_id",
// Optional: Restrict content to specific content types
CONTENTFUL_CONTENT_TYPE_IDS: "blogPost,article,product"
}
}
});
// Create an AI agent with access to Contentful
const assistant = new Agent({
name: "Content Assistant",
instructions: `
You are a helpful assistant with access to our content database.
Use the available tools to find and provide accurate information.
`,
model: "gpt-4",
});
// Connect and register tools
await contentfulClient.connect();
const tools = await contentfulClient.tools();
assistant.__setTools(tools);
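With the Contentful tools registered, the agent can be queried in the same way as in the e-commerce example above; the question below is only illustrative.
// Ask a question that requires looking up content in Contentful
const response = await assistant.chat("Summarize our latest blog posts");
console.log(response);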
The Contentful Delivery MCP Server is built to handle production workloads. The compatibility matrix below outlines the supported MCP clients and the protocol features each one can use:
| MCP Client | Resources | Tools | Prompts |
| --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ |
| Continue | ✅ | ✅ | ✅ |
| Cursor | ❌ | ✅ | ❌ |
export CONTENTFUL_SPACE_ID=""
export CONTENTFUL_ACCESS_TOKEN=""
# Optional: Restrict content to specific content types
export CONTENTFUL_CONTENT_TYPE_IDS=""
Ensure that your environment variables are stored securely and not exposed in version control or public repositories. Use secure methods such as environment variable management tools provided by cloud platforms.
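One straightforward way to follow this advice is to read credentials from the environment at runtime rather than hard-coding them in the client configuration, for example:
import { MastraMCPClient } from "@mastra/mcp";
// Credentials come from the environment, so they never appear in source control
const contentfulClient = new MastraMCPClient({
  name: "contentful-delivery",
  server: {
    command: "npx",
    args: ["-y", "@mshaaban0/contentful-delivery-mcp-server@latest"],
    env: {
      CONTENTFUL_SPACE_ID: process.env.CONTENTFUL_SPACE_ID ?? "",
      CONTENTFUL_ACCESS_TOKEN: process.env.CONTENTFUL_ACCESS_TOKEN ?? ""
    }
  }
});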
Data in transit between the server and the Contentful API is encrypted over HTTPS, and access is limited to what the supplied Contentful access token permits. Sensitive values such as API keys are provided through environment variables rather than being stored in code.
Yes, multiple AI applications can connect to the same Contentful instance simultaneously through the Contentful Delivery MCP Server. The server handles concurrent connections efficiently to ensure seamless performance.
The server queries the Contentful Delivery API at request time, so AI applications always see the latest published content and content type definitions without a separate sync step.
The set of tools is flexible and covers the common Contentful query patterns: retrieving an entry by ID, listing entries by content type, and natural language search across content. Results can also be restricted to specific content types via the CONTENTFUL_CONTENT_TYPE_IDS variable.
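For a rough sense of what those query patterns correspond to on Contentful's side, the sketch below exercises them directly with the official contentful SDK rather than through the MCP tools; the entry ID and search string are placeholders.
import { createClient } from "contentful";
const cda = createClient({
  space: process.env.CONTENTFUL_SPACE_ID!,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN!
});
// 1. Retrieve a single entry by its ID (placeholder ID)
const entry = await cda.getEntry("exampleEntryId");
// 2. List entries of a given content type
const posts = await cda.getEntries({ content_type: "blogPost" });
// 3. Full-text search, the building block for natural language lookups
const matches = await cda.getEntries({ query: "summer shoes" });
console.log(entry.sys.id, posts.total, matches.total);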
Yes, the Contentful Delivery MCP Server is designed to handle large datasets efficiently through smart pagination and caching mechanisms. These optimizations ensure that large volumes of data can be processed without significant performance degradation.
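For context, the Delivery API pages large result sets with skip and limit parameters; the loop below sketches that underlying pattern with the contentful SDK (the page size is arbitrary), which is the kind of mechanism the server's pagination builds on.
import { createClient } from "contentful";
const cda = createClient({
  space: process.env.CONTENTFUL_SPACE_ID!,
  accessToken: process.env.CONTENTFUL_ACCESS_TOKEN!
});
// Walk a large result set one page at a time
const pageSize = 100;
let skip = 0;
let total = 0;
do {
  const page = await cda.getEntries({ limit: pageSize, skip });
  total = page.total;
  console.log(`Fetched ${page.items.length} entries (${Math.min(skip + pageSize, total)}/${total})`);
  skip += pageSize;
} while (skip < total);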
To contribute to the development of the Contentful Delivery MCP Server, clone the repository and follow these steps:
# Clone the repo
git clone https://github.com/mshaaban0/contentful-delivery-mcp-server.git
# Install dependencies
npm install
# Build
npm run build
# Development with auto-rebuild
npm run watch
# Run the inspector
npm run inspector
For more information about the Model Context Protocol and related integrations, see the official Model Context Protocol documentation and the project repository.
This project is released under the MIT License.
This documentation covers the key features and integration options of the Contentful Delivery MCP Server, a practical tool for developers connecting AI applications to external data sources through the Model Context Protocol.