Azure AI Foundry MCP Server

Enables interaction with Azure AI Foundry services including model exploration and deployment, knowledge management through AI Search, evaluation of text and agents, and fine-tuning operations. Provides unified access to Azure's AI capabilities through natural language commands.

README

MCP Server that interacts with Azure AI Foundry (experimental)

A Model Context Protocol server for Azure AI Foundry, providing a unified set of tools for models, knowledge, evaluation, and more.


Available Tools

Capabilities: Models

Explore
  • list_models_from_model_catalog: Retrieves a list of supported models from the Azure AI Foundry catalog.
  • list_azure_ai_foundry_labs_projects: Retrieves a list of state-of-the-art AI models from Microsoft Research available in Azure AI Foundry Labs.
  • get_model_details_and_code_samples: Retrieves detailed information for a specific model from the Azure AI Foundry catalog.

Build
  • get_prototyping_instructions_for_github_and_labs: Provides comprehensive instructions and setup guidance for starting to work with models from Azure AI Foundry and Azure AI Foundry Labs.

Deploy
  • get_model_quotas: Get model quotas for a specific Azure location.
  • create_azure_ai_services_account: Creates an Azure AI Services account.
  • list_deployments_from_azure_ai_services: Retrieves a list of deployments from Azure AI Services.
  • deploy_model_on_ai_services: Deploys a model on Azure AI Services.
  • create_foundry_project: Creates a new Azure AI Foundry project.
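
In normal use, GitHub Copilot (or another MCP client) invokes these tools for you in response to natural-language prompts. Under the hood each invocation is a standard MCP tools/call request. The sketch below is illustrative only; the argument name model_name is an assumption, since the actual input schema for each tool is advertised by the server at runtime via tools/list.

    {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "get_model_details_and_code_samples",
            "arguments": { "model_name": "Phi-4-reasoning" }
        }
    }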

Capabilities: Knowledge

Index
  • list_index_names: Retrieve all names of indexes from the AI Search Service
  • list_index_schemas: Retrieve all index schemas from the AI Search Service
  • retrieve_index_schema: Retrieve the schema for a specific index from the AI Search Service
  • create_index: Creates a new index
  • modify_index: Modifies the index definition of an existing index
  • delete_index: Removes an existing index

Document
  • add_document: Adds a document to the index
  • delete_document: Removes a document from the index

Query
  • query_index: Searches a specific index to retrieve matching documents
  • get_document_count: Returns the total number of documents in the index

Indexer
  • list_indexers: Retrieve all names of indexers from the AI Search Service
  • get_indexer: Retrieve the full definition of a specific indexer from the AI Search Service
  • create_indexer: Create a new indexer in the Search Service with the skill, index and data source
  • delete_indexer: Delete an indexer from the AI Search Service by name

Data Source
  • list_data_sources: Retrieve all names of data sources from the AI Search Service
  • get_data_source: Retrieve the full definition of a specific data source

Skill Set
  • list_skill_sets: Retrieve all names of skill sets from the AI Search Service
  • get_skill_set: Retrieve the full definition of a specific skill set

Content
  • fk_fetch_local_file_contents: Retrieves the contents of a local file path (sample JSON, document etc)
  • fk_fetch_url_contents: Retrieves the contents of a URL (sample JSON, document etc)
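
The index tools operate on standard Azure AI Search index definitions. The exact argument shape expected by create_index is reported by the server's own tool schema, but as a rough sketch (the index name and field names below are hypothetical), an index definition follows the usual Azure AI Search format:

    {
        "name": "docs-index",
        "fields": [
            { "name": "id", "type": "Edm.String", "key": true },
            { "name": "content", "type": "Edm.String", "searchable": true },
            { "name": "category", "type": "Edm.String", "filterable": true, "facetable": true }
        ]
    }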

Capabilities: Evaluation

Evaluator Utilities
  • list_text_evaluators: List all available text evaluators.
  • list_agent_evaluators: List all available agent evaluators.
  • get_text_evaluator_requirements: Show input requirements for each text evaluator.
  • get_agent_evaluator_requirements: Show input requirements for each agent evaluator.

Text Evaluation
  • run_text_eval: Run one or multiple text evaluators on a JSONL file or content.
  • format_evaluation_report: Convert evaluation output into a readable Markdown report.

Agent Evaluation
  • agent_query_and_evaluate: Query an agent and evaluate its response using selected evaluators. End-to-End agent evaluation.
  • run_agent_eval: Evaluate a single agent interaction with specific data (query, response, tool calls, definitions).

Agent Service
  • list_agents: List all Azure AI Agents available in the configured project.
  • connect_agent: Send a query to a specified agent.
  • query_default_agent: Query the default agent defined in environment variables.
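
run_text_eval consumes evaluation data as JSONL. The required fields vary by evaluator, so check get_text_evaluator_requirements first; as a hedged illustration, a record for a relevance- or groundedness-style evaluator commonly uses fields such as query, response, context, and ground_truth:

    {"query": "What is Azure AI Foundry?", "response": "Azure AI Foundry is Microsoft's platform for building and deploying AI applications.", "context": "Azure AI Foundry provides models, agents, and evaluation tooling.", "ground_truth": "Azure AI Foundry is Microsoft's AI application platform."}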

Capabilities: Finetuning

  • fetch_finetuning_status: Retrieves detailed status and metadata for a specific fine-tuning job, including job state, model, creation and finish times, hyperparameters, and any errors.
  • list_finetuning_jobs: Lists all fine-tuning jobs in the resource, returning job IDs and their current statuses for easy tracking and management.
  • get_finetuning_job_events: Retrieves a chronological list of all events for a specific fine-tuning job, including timestamps and detailed messages for each training step, evaluation, and completion.
  • get_finetuning_metrics: Retrieves training and evaluation metrics for a specific fine-tuning job, including loss curves, accuracy, and other relevant performance indicators for monitoring and analysis.
  • list_finetuning_files: Lists all files available for fine-tuning in Azure OpenAI, including file IDs, names, purposes, and statuses.
  • execute_dynamic_swagger_action: Executes any tool dynamically generated from the Swagger specification, allowing flexible API calls for advanced scenarios.
  • list_dynamic_swagger_tools: Lists all dynamically registered tools from the Swagger specification, enabling discovery and automation of available API endpoints.
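
These tools follow the same MCP calling pattern as the model tools above. For example, a client might check the status of a job roughly like this; the argument name job_id and the job ID value are hypothetical, and the real input schema comes from the server's tools/list response:

    {
        "jsonrpc": "2.0",
        "id": 2,
        "method": "tools/call",
        "params": {
            "name": "fetch_finetuning_status",
            "arguments": { "job_id": "ftjob-example-123" }
        }
    }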

Prompt Examples

Models

Explore models

  • How can you help me find the right model?
  • What models can I use from Azure AI Foundry?
  • What OpenAI models are available in Azure AI Foundry?
  • What are the most popular models in Azure AI Foundry? Pick me 10 models.
  • What models are good for reasoning? Show me some examples in two buckets, one for large models and one for small models.
  • Can you compare Phi models and explain differences?
  • Show me the model card for Phi-4-reasoning.
  • Can you show me how to test a model?
  • What does free playground in Azure AI Foundry mean?
  • Can I use GitHub token to test models?
  • Show me latest models that support GitHub token.
  • Who are the model publishers for the models in Azure AI Foundry?
  • Show me models from Meta.
  • Show me models with MIT license.

Build prototypes

  • Can you describe how you can help me build a prototype using the model?
  • Describe how you can build a prototype that uses an OpenAI model with my GitHub token. Don't try to create one yet.
  • Recommend me a few scenarios to build prototypes with models.
  • Tell me about Azure AI Foundry Labs.
  • Tell me more about Magentic One
  • What is Omniparser and what are potential use cases?
  • Can you help me build a prototype using Omniparser?

Deploy OpenAI models

  • Can you help me deploy OpenAI models?
  • What steps do I need to take to deploy OpenAI models on Azure AI Foundry?
  • Can you help me understand how I can use OpenAI models on Azure AI Foundry using GitHub token? Can I use it for production?
  • I already have an Azure AI services resource. Can I deploy OpenAI models on it?
  • What does quota for OpenAI models mean on Azure AI Foundry?
  • Get me current quota for my AI services resource.

Quick Start with GitHub Copilot

Use The Template

This GitHub template has minimal setup with MCP server configuration and all required dependencies, making it easy to get started with your own projects.

Install in VS Code

This helps you automatically set up the MCP server in your VS Code environment under user settings. You will need uvx installed in your environment to run the server.
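
If the one-click install is not available in your VS Code build, a roughly equivalent user-level configuration can be added by hand. The sketch below assumes your VS Code version reads MCP servers from the "mcp" section of the user settings.json (newer builds use a user-level mcp.json instead); it mirrors the workspace configuration shown under Manual Setup, minus the workspace-specific --envFile argument:

    {
        "mcp": {
            "servers": {
                "mcp_foundry_server": {
                    "type": "stdio",
                    "command": "uvx",
                    "args": [
                        "--prerelease=allow",
                        "--from",
                        "git+https://github.com/azure-ai-foundry/mcp-foundry.git",
                        "run-azure-ai-foundry-mcp"
                    ]
                }
            }
        }
    }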

Manual Setup

  1. Install uv by following Installing uv.

  2. Start a new workspace in VS Code.

  3. (Optional) Create .env file in the root of your workspace to set environment variables.

  4. Create .vscode/mcp.json in the root of your workspace.

    {
        "servers": {
            "mcp_foundry_server": {
                "type": "stdio",
                "command": "uvx",
                "args": [
                    "--prerelease=allow",
                    "--from",
                    "git+https://github.com/azure-ai-foundry/mcp-foundry.git",
                    "run-azure-ai-foundry-mcp",
                    "--envFile",
                    "${workspaceFolder}/.env"
                ]
            }
        }
    }
    
  5. Click the Start button for the server in the .vscode/mcp.json file.

  6. Open GitHub Copilot chat in Agent mode and start asking questions.

See More examples for advanced setup for additional ways to configure the MCP server.

Setting the Environment Variables

To securely pass information to the MCP server, such as API keys, endpoints, and other sensitive data, you can use environment variables. This is especially important for tools that require authentication or access to external services.

You can set these environment variables in a .env file in the root of your project. You can pass the location of the .env file when setting up the MCP server, and the server will automatically load these variables when it starts.

See the example .env file for a sample configuration; a sketch is also included after the notes below.

Model
  • GITHUB_TOKEN (optional): GitHub token for testing models for free with rate limits.

Knowledge
  • AZURE_AI_SEARCH_ENDPOINT (required): The endpoint URL for your Azure AI Search service. It should look like this: https://<your-search-service-name>.search.windows.net/.
  • AZURE_AI_SEARCH_API_VERSION (optional): API version to use. Defaults to 2025-03-01-preview.
  • SEARCH_AUTHENTICATION_METHOD (required): service-principal or api-search-key.
  • AZURE_TENANT_ID (required with service-principal): The ID of your Azure Active Directory tenant.
  • AZURE_CLIENT_ID (required with service-principal): The ID of your Service Principal (app registration).
  • AZURE_CLIENT_SECRET (required with service-principal): The secret credential for the Service Principal.
  • AZURE_AI_SEARCH_API_KEY (required with api-search-key): The API key for your Azure AI Search service.

Evaluation
  • EVAL_DATA_DIR (required): Path to the JSONL evaluation dataset.
  • AZURE_OPENAI_ENDPOINT (required for text quality evaluators): Endpoint for Azure OpenAI.
  • AZURE_OPENAI_API_KEY (required for text quality evaluators): API key for Azure OpenAI.
  • AZURE_OPENAI_DEPLOYMENT (required for text quality evaluators): Deployment name (e.g., gpt-4o).
  • AZURE_OPENAI_API_VERSION (required for text quality evaluators): Version of the OpenAI API.
  • AZURE_AI_PROJECT_ENDPOINT (required for agent services): Used for Azure AI Agent querying and evaluation.

Note

Model

  • GITHUB_TOKEN is used to authenticate with the GitHub API for testing models. It is not required if you are only exploring models from the Foundry catalog.

Knowledge

  • See Create a search service to learn more about provisioning a search service.
  • Azure AI Search supports multiple authentication methods. You can use either Microsoft Entra authentication or key-based authentication to authenticate your requests. The choice of authentication method depends on your security requirements and the Azure environment you are working in.
  • See Authentication to learn more about authentication methods for a search service.

Evaluation

  • If you're using agent tools or safety evaluators, make sure the Azure project credentials are valid.
  • If you're only doing text quality evaluation, the OpenAI endpoint and key are sufficient.
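
As a concrete sketch of what such a .env file might contain (all values are placeholders; include only the variables for the capabilities you plan to use, per the table above):

    # Models (optional, for free testing via GitHub)
    GITHUB_TOKEN=<your-github-token>

    # Knowledge (Azure AI Search)
    AZURE_AI_SEARCH_ENDPOINT=https://<your-search-service-name>.search.windows.net/
    SEARCH_AUTHENTICATION_METHOD=api-search-key
    AZURE_AI_SEARCH_API_KEY=<your-search-api-key>

    # Evaluation
    EVAL_DATA_DIR=<path-to-jsonl-eval-data>
    AZURE_OPENAI_ENDPOINT=https://<your-openai-resource>.openai.azure.com/
    AZURE_OPENAI_API_KEY=<your-azure-openai-key>
    AZURE_OPENAI_DEPLOYMENT=gpt-4o
    AZURE_OPENAI_API_VERSION=<api-version>

    # Agent services
    AZURE_AI_PROJECT_ENDPOINT=<your-azure-ai-project-endpoint>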

License

MIT License. See LICENSE for details.
