An introduction to MCP on Azure

Over the last few weeks I seem to be having more and more conversations about MCP servers, be it with customers looking at agentic AI solutions or through announcements from the big tech providers.

With this in mind, I thought I'd write this blog to help engineers and architects understand what I've learnt about how MCP servers fit into the agentic solutions that data science teams are building today.

What are MCP servers?

An MCP (Model Context Protocol) server is ultimately an integration layer that makes it easier for data science teams to plug APIs, datasets, and other resources into large language model (LLM) solutions. Within these setups, each capability the server exposes - an API call, a dataset query, etc. - is referred to as a tool.

Historically, integrating a single tool into an agentic solution has been relatively straightforward; but that effort doesn't scale linearly. As more and more tools are added, the complexity rapidly increases - driving up both development and maintenance costs.

To solve this challenge, Anthropic introduced MCP. The protocol standardises the way tools are integrated into agentic solutions and attaches a description to each tool - meaning the LLM can identify the right tool based on the user prompt, and the tool's output can be used to dynamically create the final prompt that the LLM answers from.
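To make this concrete, below is a minimal sketch of an MCP server exposing a single tool, written with the open source MCP Python SDK (pip install mcp). The server name, tool, and figures are hypothetical illustrations rather than a production implementation - the point is that the function signature and docstring become the description the LLM uses to pick the right tool.

  # Minimal MCP server sketch using the open source MCP Python SDK.
  from mcp.server.fastmcp import FastMCP

  mcp = FastMCP("sales-data")  # hypothetical server name shown to clients

  @mcp.tool()
  def total_sales(region: str) -> float:
      """Return total sales for a region.

      The docstring becomes the tool description the LLM uses to decide
      when (and how) to call this tool.
      """
      # Hypothetical lookup - a real server would query an API or dataset.
      return {"emea": 1250.0, "amer": 2100.0}.get(region.lower(), 0.0)

  if __name__ == "__main__":
      mcp.run()  # defaults to the stdio transport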


How does MCP work?

With this flow in mind, we can now look at how MCP works. Behind the scenes, MCP consists of two components:
  1. MCP client
  2. MCP server
In terms of MCP clients, most of the common LLM applications available today (chat interfaces, IDE assistants, etc.) have an MCP client built in. These clients understand the protocol and handle the communication with MCP servers on the model's behalf.

Turning to look at MCP servers, these come in two flavours:
  1. COTS solutions from existing software vendors, such as Microsoft's recent announcements around MCP servers for Dataverse, Fabric, etc.
  2. Building and hosting your own MCP server (potentially accelerated by many of the open source projects available today).
Either way, we end up with the same setup: an MCP client (built into the LLM application) connects to one or more MCP servers, discovers the tools they expose, and calls them on the model's behalf - as the client-side sketch below illustrates.
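For those building their own, here's a matching client-side sketch using the same Python SDK. It launches the hypothetical server above as a subprocess, lists its tools, and calls one; in practice this client logic is embedded inside the LLM application itself, so treat this purely as a view of what the protocol exchange looks like.

  # Minimal MCP client sketch - connects to the server above over stdio.
  import asyncio

  from mcp import ClientSession, StdioServerParameters
  from mcp.client.stdio import stdio_client

  async def main() -> None:
      # Launch the example server as a subprocess (file name is illustrative).
      params = StdioServerParameters(command="python", args=["server.py"])
      async with stdio_client(params) as (read, write):
          async with ClientSession(read, write) as session:
              await session.initialize()
              tools = await session.list_tools()  # tool names + descriptions
              print([tool.name for tool in tools.tools])
              result = await session.call_tool("total_sales", {"region": "emea"})
              print(result.content)

  asyncio.run(main())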



What does this mean for Azure?

Across blog announcements, Build, and the Databricks Data + AI Summit, we have seen a number of announcements in this space over the last month. Product by product, they include:

  • Microsoft Fabric. Microsoft have released an open source MCP server for Fabric. Today it covers the Real-Time Intelligence (RTI) workload, so that LLMs can write KQL against Eventhouses. Microsoft have also announced that a number of additional Fabric objects will be supported soon (check the blog for more details).
  • Azure MCP Server. In public preview, Microsoft have released the Azure MCP Server. This allows LLMs to interact with a number of Azure components such as Cosmos DB, the Azure CLI, etc. Check out the associated blog for more info.
  • Dataverse. We now get an MCP server for the data stored in Dataverse, making it easy for those running D365 to integrate their data into agentic solutions.
  • Copilot Studio. MCP support has been added to Copilot Studio, making it easier than before to add tools to the copilots being built.
  • AI Foundry. Azure AI Foundry has all the tools you need to build your own MCP server for use within agentic solutions, and it is integrated with the Azure AI Agent Service.
  • Azure Databricks. At the Data + AI Summit, Databricks announced MCP support for both Unity Catalog and Mosaic AI, making it easier than ever to integrate data hosted in Databricks into agentic solutions.
With all these announcements, one thing is clear: MCP is here to stay, and it looks set to become the de facto standard for integrating tools into LLM-based solutions.
