Procore MCP server enables seamless API integration for project management with LLMs and automation tools
Procore is an MCP (Model Context Protocol) server designed to facilitate interaction between AI applications, such as Claude Desktop, Continue, and Cursor, and the Procore API platform. By adhering to the universal standards set by the Model Context Protocol, this server enables seamless integration, making it easier for developers to build powerful AI workflows that leverage Procore's rich data structures.
The Procore MCP server is equipped with a suite of tools and prompts designed to simplify interactions with Procore. These features enhance the capabilities of AI applications, offering a standardized method for setting context and querying company information, project information, submittals, and more.
Tools:

- procore_set_context: Set the current company and/or project context.
- procore_list_companies: Retrieve and list all companies accessible to the authenticated user.
- procore_get_company_details: Obtain detailed information about a specific company.
- procore_list_projects_for_company: List projects associated with a particular company.
- procore_get_project_details: Fetch details of a specified project.
- procore_list_submittals: Retrieve submittal records for a project, using filters to narrow the results.

Prompts:

- company_summary: Generate a summary of a company's information based on provided data.
- project_summary: Create a concise overview of a project's details and metadata.
- submittals_summary: Summarize submittal information for a given project, optionally with specific filters.

These tools and prompts are invaluable for constructing versatile AI workflows that can dynamically adapt to different Procore contexts. By offering these standardized interactions, the Procore MCP server ensures compatibility across various AI applications.
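To illustrate how these tools surface over the protocol, here is what a standard MCP tools/call request for procore_set_context could look like. Note that the company_id and project_id argument names are illustrative assumptions, not documented parameters of this server:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "procore_set_context",
    "arguments": {
      "company_id": 12345,
      "project_id": 67890
    }
  }
}
```

Subsequent calls to tools like procore_list_submittals can then rely on the context set here instead of repeating identifiers in every request.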
The architecture of the Procore MCP server is designed to be both extensible and interoperable. The core components include:

- Authorization Layer: Handles OAuth flows to securely connect with Procore APIs. The init_token.py script initiates the authorization process by guiding users through the authentication flow and saving the resulting tokens in a JSON file (procore_token.json).
- API Adapter Layer: Wraps Procore's API endpoints, ensuring that they conform to MCP standards for the different tools and prompts.
- Tool & Prompt Executor: Executes specific commands such as procore_set_context and procore_list_companies based on the MCP protocol structure.
- UI/CLI Interface: Provides a user-friendly interface or command-line tool to interact with the server, making it accessible to both developers and end users.
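As a sketch of what the Authorization Layer's first step might look like, the snippet below builds the browser URL for Procore's authorization-code flow. The login.procore.com endpoint and parameter set are assumptions based on standard OAuth 2.0 conventions, not code from this project:

```python
from urllib.parse import urlencode

# Assumed Procore OAuth authorize endpoint; verify against Procore's docs.
AUTHORIZE_URL = "https://login.procore.com/oauth/authorize"

def build_authorization_url(client_id: str, redirect_uri: str) -> str:
    """Return the URL the user opens in a browser to grant access."""
    params = {
        "response_type": "code",  # authorization-code grant
        "client_id": client_id,
        "redirect_uri": redirect_uri,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

if __name__ == "__main__":
    print(build_authorization_url("your_client_id", "your_redirect_uri"))
```

After the user approves access, Procore redirects to the given URI with a one-time code, which init_token.py then exchanges for the tokens it writes to procore_token.json.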
To get started with the Procore MCP server, follow these steps:

1. Clone the repository from GitHub:

```shell
git clone https://github.com/your/repo/procore-mcp-server.git
```

2. Install the required dependencies by running:

```shell
uv sync
```

3. Create a .env file in the root directory to store Procore API credentials and other configuration variables:

```
PROCORE_CLIENT_ID=your_client_id
PROCORE_CLIENT_SECRET=your_client_secret
PROCORE_REDIRECT_URI=your_redirect_uri
```

4. Initialize the authentication token by running:

```shell
python init_token.py
```

You will be guided through an authorization process where you log into Procore using a browser and then supply an authorization code to complete the setup.
The Procore MCP server is particularly well-suited for environments where multiple AI applications need to interact with the same data source. Such use cases leverage the MCP protocol's flexibility to dynamically fetch data from Procore in a consistent and reliable manner.
The Procore MCP server supports integration with several popular MCP clients:
| MCP Client | Resources | Tools | Prompts | Status |
| --- | --- | --- | --- | --- |
| Claude Desktop | ✅ | ✅ | ✅ | Full Support |
| Continue | ✅ | ✅ | ✅ | Full Support |
| Cursor | ❌ | ✅ | ❌ | Tools Only |
The compatibility matrix highlights the current support status for each client: Claude Desktop and Continue provide full support across resources, tools, and prompts, while Cursor currently supports tools only.
The Procore MCP server has been rigorously tested to ensure optimal performance with Procore APIs. The following table outlines key metrics:

| Metric | Value |
| --- | --- |
| Response Time (ms) | < 200 |
| Latency Stability | ≤ 1% variation |
| Throughput | > 10 requests/second |
These metrics reflect the server's ability to handle high-frequency interactions while maintaining low response times and stable performance.
For advanced configurations, developers can fine-tune the Procore MCP server through custom .env settings. Additionally, security best practices should be followed.

Here's an example configuration snippet:
```json
{
  "mcpServers": {
    "procore-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/procore-mcp-server",
        "run",
        "procore-mcp-server",
        "--ssl-cert=/path/to/cert.pem",
        "--ssl-key=/path/to/key.pem"
      ]
    }
  }
}
```
This example demonstrates the addition of SSL certificate parameters during server initialization.
To initialize your MCP token, run the init_token.py script after setting up your .env credentials. This initiates an OAuth flow in which you log into Procore using a browser and provide authorization.
At this moment, not every client supports every feature; consult the compatibility matrix and ensure that your chosen AI application supports the functionality you need before proceeding.
Yes, you can adjust the server's SSL certificate and key paths during initialization through command-line arguments as shown in the configuration sample.
The .env file is used to store sensitive information like API credentials outside of version control. Additionally, HTTPS connections are enforced using SSL/TLS certificates for secure communication.
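The .env convention above can be sketched with a minimal stdlib-only loader; real projects typically use the python-dotenv package instead, and this sketch makes no claim about how the server itself reads its configuration:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Read KEY=value lines into os.environ, skipping blanks and comments."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault lets real environment variables override the file.
        os.environ.setdefault(key.strip(), value.strip())
```

Because the file sits outside version control, each developer keeps their own credentials locally while the repository stays free of secrets.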
First, check the response time and latency stability metrics provided in the server's diagnostic output. If issues persist, consider tuning underlying network configurations or adjusting resource allocation.
Contributions to the Procore MCP Server are welcome from developers around the world! To contribute, create a descriptively named feature branch such as feature/company-context-improvements. Detailed instructions are available in the project's CONTRIBUTING.md file.
The Procore MCP server is part of a broader ecosystem dedicated to advancing Model Context Protocol standards across various sectors. Explore additional resources and projects on our official website:
Join our developer community for updates, support, and collaboration opportunities!
This comprehensive technical documentation highlights the capabilities of the Procore MCP server, emphasizing its role in enhancing AI application interactions through standardized protocols. Whether you're a developer working on AI applications or integrating different tools within your organization, this setup provides robust foundational support.