MCP Server Integration Guide
Streamline your AI applications by connecting to the Praxis AI Middleware’s MCP (Model Context Protocol) server. The MCP server enables powerful features such as the search-instance-rag tool for in-depth context retrieval from digital twins. This guide walks you through configuration, connection, digital twin linking, and troubleshooting.
Overview
The Praxis AI Middleware includes an embedded MCP server, which exposes essential AI endpoints for seamless LLM integration. With tools like search-instance-rag, you can enhance your educational AI with real-time, context-aware reasoning—building robust, interconnected AI systems.
Key Benefits
- Exposes your digital twin’s knowledge to your favorite LLM.
- Integrates instances securely.
- Powers sophisticated educational workflows.
Configure your Digital Twin as an MCP Server
Enable MCP Server support and save your digital twin in the Configuration and Integrations panel.
Connection Configuration
To connect your AI or middleware app to the MCP server, use the following settings:
- Server URL: https://pria.praxislxp.com/api/mcp
- Server Label: pria-mcp
- Description: (Optional) The description of your MCP server instructs the AI on when to use the configured tools. We suggest the following instruction:
  -
- Authorization Header: Use Bearer authentication with your Digital Twin’s MCP Secret, found in the Configuration and Integrations panel of your Digital Twin. Example token: Bearer f831501f-b645-481a-9cbb-331509aaf8c1
- Primary Tools Available:
  - search-instance-rag (contextual retrieval for digital twins)
  - api-admin-docs (documents the Admin functions of the Praxis AI middleware REST API)
  - api-rt-docs (documents the User-based Runtime functions of the Praxis AI middleware REST API)
  - Additional tools may be available depending on your configuration.
Testing
You can use the MCP Inspector (see the docs) or any other client tool to validate requests, view live responses, and interact with your MCP server.
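Alongside the Inspector, you can script a quick check by building a tools/call request yourself. A minimal sketch, assuming the JSON-RPC tools/call method from the MCP specification; the query argument name is hypothetical — discover the tool’s real input schema via tools/list first.

```python
import json

def build_tool_call(name: str, arguments: dict, req_id: int = 2) -> dict:
    """JSON-RPC 2.0 envelope for the MCP tools/call method."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical arguments -- check the tool's actual schema via tools/list first.
body = build_tool_call("search-instance-rag", {"query": "course outline"})
print(json.dumps(body, indent=2))
```

POST this body to the Server URL with the same Bearer header as above to see a live tool response.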
Link Digital Twins with Connector MCP
One practical use case for linking two digital twins via Connector MCP is knowledge sharing—the document collection and contextual knowledge of one digital twin remain centralized, while other digital twins or systems can securely query and leverage this content¹. This approach avoids unnecessary data duplication, ensures that updates to source knowledge are instantly available to all connected twins, and allows different AI instances to collaborate or access real-time context without copying or synchronizing datasets. Ultimately, it streamlines data management, strengthens consistency, and unlocks powerful distributed workflows within connected AI ecosystems.

To link two digital twins (e.g., “Twin B” connects to “Twin A”):
- In Digital Twin B:
  - Navigate to Connectors.
  - Add a new connector with:
    - Type: Connector MCP
    - Target: Digital Twin A’s MCP Server endpoint (from Configuration)
    - Bearer token: Digital Twin A’s MCP Secret
  - Set up the connection parameters. The Bearer token you provide here must match Digital Twin A’s MCP Secret; this ensures secure, authenticated communication.
  - Save and enable your connector.
- Launch a first request to test the connectivity.
- Review the Agent Details.
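Under the hood, the connector makes the same authenticated call you configured for Twin A, so you can probe connectivity yourself before enabling it. A sketch using urllib and the example endpoint and secret from this guide (substitute your own values):

```python
import json
from urllib import request

# Illustrative values: Twin A's MCP endpoint and MCP Secret from this guide
TWIN_A_URL = "https://pria.praxislxp.com/api/mcp"
TWIN_A_SECRET = "f831501f-b645-481a-9cbb-331509aaf8c1"

def probe_connector(url: str, secret: str) -> bool:
    """Return True if the target MCP server answers an authenticated tools/list."""
    payload = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    req = request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {secret}",
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
        method="POST",
    )
    try:
        with request.urlopen(req, timeout=15) as resp:
            return resp.status == 200
    except OSError:  # DNS failure, refused connection, HTTP error, timeout
        return False

# Example (not run here):
# print(probe_connector(TWIN_A_URL, TWIN_A_SECRET))
```

If the probe returns False, check the endpoint URL and the MCP Secret before debugging the connector itself.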
Troubleshooting & Tips
- Invalid Token: Ensure your Authorization header matches an existing Digital Twin’s MCP Secret.
- Connection Fails: Double-check endpoint URLs and make sure connectors are enabled in both twins.
- Not using the connector: Verify that the server label is configured correctly and that the tools are named properly in the instance. Then ask a specific query like: “using the pria-mcp, execute the search-instance-rag tool to find information on ‘xyz’”
- Debugging: Use the MCP Inspector to trace issues step-by-step.
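For the Invalid Token case, one quick sanity check is the shape of the header value: the literal word Bearer, a space, then the exact MCP Secret. The UUID pattern below matches the example secret in this guide; treating every secret as a UUID is an assumption, so adjust the pattern if your twin’s secret differs.

```python
import re

def looks_like_bearer_uuid(header: str) -> bool:
    """Check the 'Bearer <secret>' shape, assuming a UUID-style MCP Secret."""
    return re.fullmatch(
        r"Bearer [0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
        r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}",
        header,
    ) is not None

print(looks_like_bearer_uuid("Bearer f831501f-b645-481a-9cbb-331509aaf8c1"))  # True
print(looks_like_bearer_uuid("f831501f-b645-481a-9cbb-331509aaf8c1"))         # False: missing 'Bearer '
```

A False result usually means the “Bearer ” prefix was dropped or the secret was copied with extra whitespace.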
More information
More information on the Model Context Protocol is available at MCP.io.
Footnotes
¹ Model Context Protocol Documentation – Outlines knowledge sharing and centralized data strategies with MCP-connected digital twins.