Agentic AI with Azure MCP Server

The Agentic Web is here. The idea behind the Azure MCP Server is that you can tap into Azure services (those Microsoft decides to expose, at their discretion) through agents in your LLM-infused applications. This is NOT just a set of APIs being listed and returned to your client. The promise of MCP is that whichever server is serving resources can serve varied capabilities that fall under the primitives 'Tools', 'Prompts' and 'Resources' (there is a good point on this made here). Currently, the Azure MCP Server exposes 'Tools' such as the Azure CLI, through which you can interact with services such as Azure Blob Storage or Azure Monitor.

You will need MCP client code that invokes these tools, prompts and resources from the MCP Server using natural language. Additionally, for .NET, we will assume the following:

  • You are using .NET 8 or higher
  • An OpenAI Developer API Key with credits available

Azure MCP Server Usage: Quick Example

To expose the capabilities of Azure MCP Server to our application/agent, we can write:

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using ModelContextProtocol.Client;
// ...

var builder = Kernel.CreateBuilder();

builder.Services.AddOpenAIChatCompletion(
    modelId: "gpt-4o", // or other OpenAI model of your choice
    apiKey: "Your OpenAI Dev API Key");
Kernel kernel = builder.Build();

await using IMcpClient mcpClient = await McpClientFactory.CreateAsync(new StdioClientTransport(new()
{
    Name = "Azure",
    Command = "npx",
    Arguments = ["-y", "@azure/mcp@latest", "server", "start"]
}));

var tools = await mcpClient.ListToolsAsync().ConfigureAwait(false);
foreach (var tool in tools)
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}

Listing Azure MCP Server capabilities

For the above we will need the following NuGet packages:

 <PackageReference Include="Microsoft.SemanticKernel" Version="1.54.0" />
 <PackageReference Include="Microsoft.SemanticKernel.Agents.Core" Version="1.54.0" />
 <PackageReference Include="ModelContextProtocol" Version="0.2.0-preview.1" />

From what I discovered at the time of writing, the Azure MCP Server exposes a range of tools and capabilities, which the listing code above will print out.

We need to add the Azure MCP capabilities to the Kernel.

However, it must be noted that at the time of writing, function names containing a - character (like the ones we have for Azure MCP Server) cannot be added as Plugins to Semantic Kernel, since plugin function names may only contain letters, digits and underscores:

List<McpClientTool> renamedTools = new List<McpClientTool>();
foreach (var function in tools)
{
    // Must rename all Azure MCP Server function names, since names containing - cannot be added.
    // Replace - with _
    string newName = function.Name.StartsWith("azmcp-") ? function.Name.Replace("-", "_") : function.Name;
    McpClientTool mcpClientTool = function.WithName(newName).WithDescription(function.Description);
    renamedTools.Add(mcpClientTool);
}

#pragma warning disable SKEXP0001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
kernel.Plugins.AddFromFunctions("Azure", renamedTools.Select(aiFunction => aiFunction.AsKernelFunction()));
#pragma warning restore SKEXP0001 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.

Add Azure MCP Server functions to Semantic Kernel after renaming

💡
This renaming allows us to carry the Azure MCP Server functions into the Kernel without causing errors. The models are smart enough to use the plain-text descriptions of the functions to decide which ones to call.
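The rename rule itself boils down to a one-liner; as a standalone sketch (the helper name here is hypothetical):

```csharp
// Semantic Kernel function names may only contain letters, digits and
// underscores, so hyphens in MCP tool names are swapped for underscores.
static string ToKernelName(string toolName) => toolName.Replace('-', '_');

Console.WriteLine(ToKernelName("azmcp-appconfig-account-list"));
// azmcp_appconfig_account_list
```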

Next we can start prompting the Azure MCP Server. Note that the model must have some awareness of your Azure subscriptionId and tenantId in order to query Azure correctly about your resources. Without them, the model simply reports that it could not access your resources due to authorisation and authentication issues.

The easiest way to supply these is to state them within the prompt itself (proceed with caution!):

OpenAIPromptExecutionSettings executionSettings = new()
{
    Temperature = 0,
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(options: new() { AllowConcurrentInvocation =true })
};

var prompt = "Can you tell me about my blob container called 'A' in my 'B' storage account? My subscriptionId is X and my tenantId is X.";
var result = await kernel.InvokePromptAsync(prompt, new(executionSettings)).ConfigureAwait(false);
Console.WriteLine($"{result}");

Prompting the Azure MCP Server through Semantic Kernel and GPT4o

From this code, the model will agentically decide which tools are appropriate for the task based on what is available within Semantic Kernel (there could be thousands of tools and abilities), reach into Azure via MCP, retrieve a result, and respond to whatever the prompt asked for (the prompt can be more complex than what is shown above).

And finally, we can call individual functions from the Azure MCP Server directly, as follows, as long as we know the exact name of the MCP function and its required parameters (which we can see from the tools list printed earlier):

var directToolCall = await mcpClient.CallToolAsync("azmcp-appconfig-account-list", new Dictionary<string, object>() { ["subscription"] = "[Your_subscriptionId]", ["tenant"] = "[Your_tenantId]" });

A direct MCP Function call to query Azure App Configuration

Current Limitations with Azure MCP Server

The Azure MCP Server is still rather 'deterministic' in the sense that the arguments/parameters passed to MCP functions need to be exact. For example, if you have an auto-generated test storage account with a hard-to-remember name full of random letters and numbers, asking the model to figure out the full name from a 'guess' or approximation can result in no information being retrieved at all. It helps to tell the model your exact thoughts and give educated approximations about resources, so that it executes generic listing functions against the MCP server and then assesses the results for you, rather than supplying non-existent resource names to Azure, having those names treated as absolute, and getting no information back. Work with the model productively!
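One pattern that tends to work well in practice (all resource names and placeholders below are hypothetical) is to ask for a broad listing first and let the model do the approximate matching against the real results:

```csharp
// Instead of guessing an exact resource name, ask the model to list
// resources and match your approximation against what actually exists.
string approximateName = "testacct"; // what you remember of the name
string subscriptionId = "[Your_subscriptionId]";
string tenantId = "[Your_tenantId]";

string prompt =
    $"List my storage accounts. My subscriptionId is {subscriptionId} and " +
    $"my tenantId is {tenantId}. Then tell me which account names look like " +
    $"'{approximateName}' and describe their blob containers.";

Console.WriteLine(prompt);
```

This keeps the exact-match burden on Azure's listing call rather than on a guessed resource name, and lets the model reason over the returned list.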

Finally, depending on the AI model you choose, you are still subject to rate limits, and prompts that trigger a large number of recursive calls to Azure will tend to use up more quota with your AI model provider, so be cautious.