Using Mia-Platform as Both Producer and Consumer of an MCP Server

8 minute read
23 October 2025

Overview

  • Agentic AI integration into software applications requires fast delivery and standardized solutions.
  • The model context protocol enables seamless developer operation across layered platform ecosystems.
  • An IDP that produces and consumes MCP servers becomes an efficient self-service hub to accelerate development workflows.

Gartner suggests that a rapidly growing number of software applications will integrate agentic AI over the next few years. But what’s the missing link that brings together increasingly complex agentic applications, the need to deliver software faster and the urgency to standardize solutions? The model context protocol (MCP) might be the key to abstracting this complexity and enabling developers to operate seamlessly across all layers of the platform ecosystem.

Most importantly, an internal developer platform (IDP) acting as both a producer and a consumer of an MCP server enables smoother, more secure and more efficient tool sharing within the ecosystem. It can produce MCP servers that expose useful APIs or tools and consume other MCP servers, making the platform a self-service hub that accelerates development workflows across multiple purposes: discovery, composability, creation, deployment and monitoring.

 

What is the Model Context Protocol?

The spread of generative AI (GenAI) caused data sources and services to grow exponentially, overcomplicating the already vast and varied landscape of existing APIs for function discovery and calling. The MCP is revolutionary because of its mediation role: it facilitates the interaction between LLMs and services.

Basically, the MCP is an open-source standard that lets data sources, tools, workflows and AI systems connect to and understand one another. Developers can choose to share their data via an MCP server or create AI applications (MCP clients) that interact with that server. It’s like a communication bridge that helps parties who speak different languages understand each other, streamlining and optimizing the interaction. In essence, it is a standardized protocol that allows AI to request information from, and execute actions on, external systems. This standardization significantly shuffles the cards on the table, because such a mediation layer simplifies API integration regardless of the underlying API protocols.
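Under the hood, MCP messages are JSON-RPC 2.0 objects. The following minimal sketch builds one such exchange in plain Python; the tool name (`get_weather`) and its arguments are hypothetical examples, not part of any real server.

```python
import json

# Illustrative sketch: an MCP tool invocation is a JSON-RPC 2.0 request.
# The tool name and arguments below are made up for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",           # hypothetical tool exposed by a server
        "arguments": {"city": "Milan"},  # arguments matching the tool's schema
    },
}

# The server answers with a result keyed to the same request id.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "18°C, partly cloudy"}]},
}

print(json.dumps(request))
```

Because every server speaks this same wire format, a client can discover and invoke tools without knowing anything about the REST, GraphQL or gRPC APIs behind them.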

 

Anatomy of the Model Context Protocol

The MCP is a communication framework with a dual role. One is exposing capabilities (tools, data and functions) to AI systems, making them visible to the model in a standardized way; the other is retrieving real-time contextual information and executing complex actions on external systems in response to a query. The foundational elements of the MCP are:

  • MCP Server: the intermediary that provides context, data and functionality to MCP clients, converting information from external sources into formats that AI models (such as LLMs) can understand.
  • MCP Client: the messenger that carries requests between the MCP server and the AI model.
  • MCP Host: the AI application that encapsulates the MCP client. It’s the interaction point with the user, whether it’s an intelligent IDE, a conversational interface or a custom agent.

Tools and catalogs of resources complete the picture to offer, respectively, specific functions and retrievable context. With a dash of imagination, the MCP could be compared to the human brain’s command center. Just like the brain dynamically accesses specific memories and motor skills to formulate a response or take an action by using a common language of signals, the MCP framework coordinates the flow of information to facilitate and improve the AI model’s output by defining a standardized protocol for accessing diverse sources.
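The three roles can be sketched with plain Python stand-ins. This is a deliberately simplified model of the message flow, not the real MCP SDK; all class and method names here are illustrative.

```python
# Minimal sketch of the three MCP roles (no real MCP SDK involved).

class McpServer:
    """Exposes tools in a standardized, discoverable form."""
    def __init__(self):
        self._tools = {"echo": lambda text: f"echo: {text}"}

    def list_tools(self):
        return list(self._tools)

    def call_tool(self, name, **kwargs):
        return self._tools[name](**kwargs)

class McpClient:
    """The messenger: carries requests between the host's model and the server."""
    def __init__(self, server):
        self.server = server

    def request(self, tool, **kwargs):
        return self.server.call_tool(tool, **kwargs)

class McpHost:
    """The AI application the user interacts with (IDE, chat UI, custom agent)."""
    def __init__(self, client):
        self.client = client

    def handle_user_message(self, message):
        # A real host would let the model decide which tool to call;
        # here the choice is hard-wired for illustration.
        return self.client.request("echo", text=message)

host = McpHost(McpClient(McpServer()))
print(host.handle_user_message("hello"))  # → echo: hello
```

The key design point survives the simplification: the host never talks to the server directly, and the server never sees the model; the client mediates every exchange.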

Note, however, that the MCP is different from retrieval-augmented generation (RAG), though the two can work together and complement each other. RAG aims to retrieve information from a knowledge base to enrich an AI’s response, but the outcome remains static. The MCP focuses on empowering agents so that they can perform critical, multi-step actions in the real world, in dynamic fashion.

So, let’s say you would like to cook your favourite dish from your trusted, well-worn recipe book. Retrieving the memory of one specific recipe is only the first step: it also triggers the brain’s commands to the hands to mix ingredients in the right proportions, the eyes to read the notes and the tongue to taste for perfection. Similarly, the MCP doesn’t just retrieve context; it mediates so that agents can execute real-world actions based on that very context.

 

Mia-Platform as a Smart Self-Service Hub 

Imagine developers can seamlessly interact with a cohesive platform ecosystem directly from their IDE to control platform resources. Now, consider that this platform’s AI capabilities are a fundamental building block in a larger AI-driven ecosystem. Mia-Platform exemplifies this duality well because it can fulfill both roles in the model context protocol communication architecture. In other words, an internal developer platform acting as both a producer and a consumer of an MCP server becomes an intelligent self-service hub: a unique ecosystem that produces and consumes services to enable a seamless journey throughout the software life cycle.

 

Mia-Platform as a Producer of an MCP Server

The Mia-Platform Console hosts and serves an MCP server that exposes the platform’s assets to AI applications acting as consumers. This MCP server mediates and informs AI applications about the available platform tools and knowledge base and how to interact with them. For example, the MCP server can grant an AI assistant, like Mia-Assistant, conversational access to platform resources, such as catalogs, to interact with the platform in an intelligent, context-aware manner. This producer role is basically about making platform capabilities discoverable and usable via the MCP.

A developer can prompt Mia-Assistant directly to create a project:

  1. Internal Action Request: The user’s request is to “Create the project Mia-Platform Playground in the Demo Live tenant using all defaults”. This is an administrative and configuration action that takes place entirely within the Mia-Platform Console’s environment.
  2. Internal Tool Execution: Mia-Assistant, acting as the MCP client, calls functions exposed by the platform’s internal APIs. The steps involve:
      • list_tenants…  (to verify the tenant).
      • create_project_from_template  (the primary action tool).
  3. Providing Internal Context: The entire detailed response, which includes the Project ID, Host URLs, Namespaces, GitLab repository, and list of enabled features (Security, Monitoring, Enhanced workflow template), is proprietary operational data generated and managed by the Mia-Platform Console.
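The steps above can be sketched as follows. The tool names (`list_tenants`, `create_project_from_template`) come from the walkthrough; their signatures, the tenant data and the returned fields are assumptions made for illustration, not the real Console API.

```python
# Hedged sketch of the producer-side flow: verify the tenant, then run
# the primary action tool. All data below is made up for illustration.

TENANTS = {"Demo Live": {"id": "demo-live"}}

def list_tenants():
    """Stand-in for the Console's tenant-listing tool."""
    return list(TENANTS)

def create_project_from_template(name, tenant, template="default"):
    """Stand-in for the Console's primary project-creation tool."""
    if tenant not in TENANTS:
        raise ValueError(f"unknown tenant: {tenant}")
    # The real Console would also provision repos, namespaces, host URLs, ...
    return {
        "projectId": f"{TENANTS[tenant]['id']}-{name.lower().replace(' ', '-')}",
        "tenant": tenant,
        "template": template,
    }

# 1-2. The assistant first verifies the tenant, then executes the action.
assert "Demo Live" in list_tenants()
project = create_project_from_template("Mia-Platform Playground", "Demo Live")
print(project["projectId"])  # → demo-live-mia-platform-playground
```

In the real flow, the returned operational context (Project ID, host URLs, namespaces, repository) is what Mia-Assistant surfaces back to the developer conversationally.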

The Mia-Platform Console MCP server thus produces its core functionality (project creation and configuration) and internal context for Mia-Assistant (the client) to consume conversationally. The platform is the source and executor of the task, using its own data, APIs and internal logic.

Interestingly, you can also continue the conversation with additional prompts—creating a microservice, testing the code, monitoring—and answers will always be contextually relevant and fit your needs.

However, the Mia-Platform Console MCP server is also accessible remotely, meaning that external AI clients (like Visual Studio Code) can connect to it by adding the available server endpoint to their configuration. This improves flexibility: users can communicate with and perform operations on their projects through an MCP server that is always running and always updated with the latest tools and prompts from the Mia-Platform Console.
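As a rough illustration, a remote MCP server can be registered in Visual Studio Code through a `.vscode/mcp.json` file. The server name and URL below are placeholders, not the actual Mia-Platform endpoint; consult the Console documentation for the real value.

```json
{
  "servers": {
    "mia-platform-console": {
      "type": "http",
      "url": "https://console.example.com/mcp"
    }
  }
}
```

Once configured, the editor’s MCP client discovers the server’s tools and makes them available to its AI features, with no per-tool setup on the client side.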

 

Mia-Platform as a Consumer of an MCP Server

When Mia-Assistant acts as a consumer, its built-in MCP client can connect to and request information from multiple MCP servers in addition to the Console’s MCP server. In this role, Mia-Assistant acts as an integrated MCP client capable of querying and interacting with external systems, such as specific DevOps tools like Git, Grafana and Prometheus, or others like Jira, via MCP servers built by those systems. This lets Mia-Assistant retrieve data from external systems and perform actions as part of completing tasks or workflows, expanding the reach and automation potential of the platform beyond its own boundaries. Take the following example:

A developer asks Mia-Assistant to help implement a feature by saying something like: “On the Insurance Hub project, what am I missing to implement the features described in the BCK_001 story?”

  1. Information Retrieval: Mia-Assistant (the consumer) uses a pre-configured tool to connect to an external Jira MCP Server (the producer). It calls the GET_JIRA_ISSUE function to retrieve the details of the BCK_001 story ticket.
  2. Analysis and Action: Mia-Assistant analyzes the ticket information and determines that a new data collection is required.
  3. Proposal: The AI responds to the developer with a clear, actionable proposal: “Based on the BCK_001 story, you need to add a collection with the following fields… Do you want me to proceed with the creation of the missing collections?”
  4. Execution: When the developer says “Yes”, Mia-Assistant calls another tool—this time a CREATE_COLLECTIONS function from the Mia-Platform MCP Server—to integrate the new data collections into the project.
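The four steps above can be sketched as follows. `GET_JIRA_ISSUE` and `CREATE_COLLECTIONS` are the tool names from the walkthrough; the ticket fields, collection names and return values are made up for illustration.

```python
# Hedged sketch of the consumer flow: retrieve from one MCP server
# (Jira), then act through another (the platform's own server).

JIRA = {"BCK_001": {"summary": "Track claims", "needs": ["claims"]}}
PROJECT_COLLECTIONS = []  # collections already present in the project

def get_jira_issue(key):
    """Stand-in for the external Jira MCP server's retrieval tool."""
    return JIRA[key]

def create_collections(names):
    """Stand-in for the Mia-Platform MCP server's creation tool."""
    PROJECT_COLLECTIONS.extend(names)
    return PROJECT_COLLECTIONS

# 1. Information retrieval from the external producer.
issue = get_jira_issue("BCK_001")
# 2-3. Analysis: determine which required collections are missing.
missing = [c for c in issue["needs"] if c not in PROJECT_COLLECTIONS]
# 4. Execution, once the developer confirms with "Yes".
print(create_collections(missing))  # → ['claims']
```

The point of the sketch is the hand-off: the same client that consumed the external Jira server turns around and invokes the platform’s own server to complete the task.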

This seamless flow shows how the platform acts as a smart consumer, retrieving information from an external service (Jira) to proactively propose and execute a task within its own environment.

 

Conclusion

Agentic applications have started flooding the market, demanding innovation and standardization without compromising fast shipping or security. The model context protocol (MCP) is a standardized communication bridge that allows AI applications to communicate easily with external systems, abstracting away this complexity.

Mia-Platform fulfills both roles of the model context protocol communication architecture, not only exposing the Console APIs via its own MCP server, but also consuming functions from multiple external MCP servers.

Expand your possibilities and choose Mia-Platform for full operational agentic applications that accelerate your workflows.
