Factifai MCP Server

License: MIT npm version <img alt="Install in VS Code (npx)" src="https://img.shields.io/badge/VS_Code-VS_Code?style=plastic&label=Install&color=0098FF"> <img alt="Install in VS Code Insiders (npx)" src="https://img.shields.io/badge/VS_Code_Insiders-VS_Code_Insiders?style=plastic&label=Install&color=24bfa5"> <img alt="Install in Cursor (npx)" src="https://img.shields.io/badge/Cursor-Cursor?style=plastic&label=Install&color=1A1A1A">

<p> <img style="margin-right:18px;" src="assets/img/hai.png" alt="Hai Build" /> <img style="margin-right:18px;" src="assets/img/amazon-q.png" alt="Amazon Q" /> <img style="margin-right:18px;" src="assets/img/vsc.png" alt="VS Code" /> <img style="margin-right:18px;" src="assets/img/cursor.png" alt="Cursor" /> <img style="margin-right:18px;" src="assets/img/windsurf.png" alt="Windsurf" /> <img style="margin-right:18px;" src="assets/img/zed.png" alt="Zed" /> </p>

A Model Context Protocol (MCP) server for Factifai integration with any MCP-compatible AI tool. The server is tool-agnostic, meaning it can be used with any client that supports the MCP protocol, and it currently exposes tools to create tests asynchronously and retrieve their results.
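For illustration, the snippet below is a minimal sketch of how any MCP client could spawn and connect to this server over stdio using the official TypeScript MCP SDK (@modelcontextprotocol/sdk). The client code is an assumption for demonstration only; in practice your IDE or MCP client launches the server from the JSON configurations shown later in this README.

// Minimal sketch (assumption): connecting a generic MCP client to the Factifai MCP
// server over stdio. Most users will instead let their IDE spawn the server from
// the JSON configs shown below.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import {
	StdioClientTransport,
	getDefaultEnvironment,
} from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
	command: "npx",
	args: ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
	env: {
		...getDefaultEnvironment(), // keep PATH etc. so npx can be resolved
		MODEL_PROVIDER: "openai", // or "bedrock" with the AWS_* variables instead
		OPENAI_API_KEY: process.env.OPENAI_API_KEY ?? "",
	},
});

const client = new Client({ name: "factifai-example-client", version: "1.0.0" });
await client.connect(transport);

// List the tools the server exposes (see "Available Tools" below).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));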

Table of Contents

  • Requirements
  • Installation
  • Configuration
  • Factifai MCP integration with popular IDEs and extensions
  • Available Tools
  • Contributing
  • Security
  • License

Requirements

  • Node.js >= 16.0.0
  • Hai Build, Cursor, Windsurf, Claude Desktop, or any MCP client

Installation

# Latest version
npx --yes @presidio-dev/factifai-mcp-server@latest

# Specific version
npx --yes @presidio-dev/factifai-mcp-server@1.2.3

We recommend using npx to install the server, but you can use any Node package manager you prefer, such as yarn, pnpm, or bun.

Installation Note

⚠️ Important: The first time you install Factifai MCP Server, it will automatically download and install browser dependencies using Playwright. This process may take several minutes depending on your internet connection and system specifications.

The installation includes:

  • Downloading browser binaries (Chromium, Firefox, WebKit)
  • Installing browser dependencies
  • Setting up the necessary environment

This happens only once, and subsequent runs will be much faster as the browsers are already installed.

Pre-Installation Tip

⚠️ Recommended for First-Time Installation: Many MCP clients have strict timeout limits for server startup. The browser installation process during first-time setup may exceed these timeouts, causing the installation to fail or appear non-responsive.

To avoid timeout issues, we strongly recommend pre-installing Playwright browsers manually:

# Step 1: Install Playwright browsers manually before installing the MCP server
npx playwright install --with-deps

# Step 2: Then install the MCP server (will be much faster and avoid timeouts)
npx --yes @presidio-dev/factifai-mcp-server@latest

This pre-installation step:

  1. Ensures browsers are downloaded without MCP client timeout constraints
  2. Significantly speeds up the MCP server's first-time installation
  3. Prevents installation failures due to timeout issues in your IDE or MCP client

Configuration

With npx, using the latest version:

{
	"factifai": {
		"command": "npx",
		"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
		"env": {
			"MODEL_PROVIDER": "bedrock|openai",
			"OPENAI_API_KEY": "<your-openai-api-key>",
			"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
			"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
			"AWS_DEFAULT_REGION": "<your-aws-region>"
		},
		"disabled": false,
		"autoApprove": []
	}
}

With npx, using a specific version:

{
	"factifai": {
		"command": "npx",
		"args": ["--yes", "@presidio-dev/factifai-mcp-server@1.2.3"],
		"env": {
			"MODEL_PROVIDER": "bedrock|openai",
			"OPENAI_API_KEY": "<your-openai-api-key>",
			"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
			"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
			"AWS_DEFAULT_REGION": "<your-aws-region>"
		},
		"disabled": false,
		"autoApprove": []
	}
}

Environment Variables

| Variable Name | Description |
| --- | --- |
| MODEL_PROVIDER | The model provider to use (bedrock or openai) |
| OPENAI_API_KEY | The API key for the OpenAI model provider |
| AWS_ACCESS_KEY_ID | The AWS access key ID for the Bedrock model provider |
| AWS_SECRET_ACCESS_KEY | The AWS secret access key for the Bedrock model provider |
| AWS_DEFAULT_REGION | The AWS default region for the Bedrock model provider |

Model Provider Configuration Examples

Bedrock Configuration Example

{
	"factifai": {
		"command": "npx",
		"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
		"env": {
			"MODEL_PROVIDER": "bedrock",
			"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
			"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
			"AWS_DEFAULT_REGION": "<your-aws-region>"
		},
		"disabled": false,
		"autoApprove": []
	}
}

OpenAI Configuration Example

{
	"factifai": {
		"command": "npx",
		"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
		"env": {
			"MODEL_PROVIDER": "openai",
			"OPENAI_API_KEY": "<your-openai-api-key>"
		},
		"disabled": false,
		"autoApprove": []
	}
}

Factifai MCP integration with popular IDEs and extensions

See the setup instructions for each client below.

<details>

<summary><b>Install in Hai Build</b></summary>

Add the following to your hai_mcp_settings.json file. To open this file from Hai Build, click the "MCP Servers" icon, select the "Installed" tab, and then click "Configure MCP Servers".

See the Hai Build MCP documentation for more info.

{
	"mcpServers": {
		"factifai": {
			"command": "npx",
			"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
			"env": {
				"MODEL_PROVIDER": "bedrock|openai",
				"OPENAI_API_KEY": "<your-openai-api-key>",
				"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
				"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
				"AWS_DEFAULT_REGION": "<your-aws-region>"
			}
		}
	}
}

</details>

<details>

<summary><b>Install in Amazon Q Developer</b></summary>

Add the following to your Amazon Q Developer configuration file. See MCP configuration for Q Developer in the IDE for more details.

The configuration file can be stored globally at ~/.aws/amazonq/mcp.json to be available across all your projects, or locally within your project at .amazonq/mcp.json.

{
	"mcpServers": {
		"factifai": {
			"command": "npx",
			"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
			"env": {
				"MODEL_PROVIDER": "bedrock|openai",
				"OPENAI_API_KEY": "<your-openai-api-key>",
				"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
				"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
				"AWS_DEFAULT_REGION": "<your-aws-region>"
			}
		}
	}
}

</details>

<details>

<summary><b>Install in VS Code (Copilot)</b></summary>

<img alt="Install in VS Code (npx)" src="https://img.shields.io/badge/VS_Code-VS_Code?style=plastic&label=Install&color=0098FF"> <img alt="Install in VS Code Insiders (npx)" src="https://img.shields.io/badge/VS_Code_Insiders-VS_Code_Insiders?style=plastic&label=Install&color=24bfa5">

First, enable MCP support in VS Code by opening Settings (Ctrl+,), searching for mcp.enabled, and checking the box.

Then, add the following configuration to your user or workspace settings.json file. See the VS Code MCP documentation for more info.

"mcp": {
  "servers": {
    "factifai": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
      "env": {
        "MODEL_PROVIDER": "bedrock|openai",
        "OPENAI_API_KEY": "<your-openai-api-key>",
        "AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
        "AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
        "AWS_DEFAULT_REGION": "<your-aws-region>"
      }
    }
  }
}

</details>

<details> <summary><b>Install in Cursor</b></summary>

The easiest way to install is with the one-click installation button below.

<img alt="Install in Cursor (npx)" src="https://img.shields.io/badge/Cursor-Cursor?style=plastic&label=Install&color=1A1A1A">

Alternatively, you can manually configure the server by adding the following to your mcp.json file. This file can be located globally at ~/.cursor/mcp.json or within a specific project at .cursor/mcp.json. See the Cursor MCP documentation for more information.

{
	"mcpServers": {
		"factifai": {
			"command": "npx",
			"args": ["--yes", "@presidio-dev/factifai-mcp-server@latest"],
			"env": {
				"MODEL_PROVIDER": "bedrock|openai",
				"OPENAI_API_KEY": "<your-openai-api-key>",
				"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
				"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
				"AWS_DEFAULT_REGION": "<your-aws-region>"
			}
		}
	}
}

</details>

<details> <summary><b>Install in Windsurf</b></summary>

Add the following to your ~/.codeium/windsurf/mcp_config.json file. See the Windsurf MCP documentation for more information.

{
	"mcpServers": {
		"factifai": {
			"command": "npx",
			"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
			"env": {
				"MODEL_PROVIDER": "bedrock|openai",
				"OPENAI_API_KEY": "<your-openai-api-key>",
				"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
				"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
				"AWS_DEFAULT_REGION": "<your-aws-region>"
			}
		}
	}
}

</details>

<details> <summary><b>Install in Zed</b></summary>

You can add the Factifai MCP server in Zed by editing your settings.json file (accessible via the zed: settings action) or by using the Agent Panel's configuration UI (agent: open configuration). See the Zed MCP documentation for more information.

Add the following to your settings.json:

{
	"context_servers": {
		"factifai": {
			"command": {
				"path": "npx",
				"args": ["-y", "@presidio-dev/factifai-mcp-server@latest"],
				"env": {
					"MODEL_PROVIDER": "bedrock|openai",
					"OPENAI_API_KEY": "<your-openai-api-key>",
					"AWS_ACCESS_KEY_ID": "<your-aws-access-key-id>",
					"AWS_SECRET_ACCESS_KEY": "<your-aws-secret-access-key>",
					"AWS_DEFAULT_REGION": "<your-aws-region>"
				}
			}
		}
	}
}

</details>

Available Tools

| Tool Name | Description |
| --- | --- |
| testWithFactifai | Start a test with Factifai |
| getFactifaiSessionResult | Get test result |
| listFactifaiSessions | List tests |
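
Because tests run asynchronously, a typical flow is to start a test with testWithFactifai and then poll getFactifaiSessionResult until the session finishes. The sketch below shows this flow with the MCP TypeScript SDK, reusing the connected client from the earlier connection sketch. The argument names used here (testPrompt, sessionId) are illustrative assumptions, not a documented schema; inspect each tool's inputSchema via listTools() for the actual contract.

// Hypothetical async test flow (assumption), reusing the connected `client` from the
// connection sketch near the top of this README.
// NOTE: the argument names below (testPrompt, sessionId) are illustrative guesses;
// check each tool's inputSchema from client.listTools() for the real parameters.

// Start a test asynchronously.
const started = await client.callTool({
	name: "testWithFactifai",
	arguments: {
		testPrompt: "Open https://example.com, log in, and verify the dashboard loads", // hypothetical field
	},
});
console.log("testWithFactifai:", started.content);

// Later (or on an interval), fetch the result of the session.
const result = await client.callTool({
	name: "getFactifaiSessionResult",
	arguments: { sessionId: "<session-id-from-testWithFactifai>" }, // hypothetical field
});
console.log("session result:", result.content);

// Enumerate previously created test sessions.
const sessions = await client.callTool({ name: "listFactifaiSessions", arguments: {} });
console.log("sessions:", sessions.content);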

Contributing

We welcome contributions to the Factifai MCP Server! Please see our Contributing Guide for more information on how to get started.

Security

For information about our security policy and how to report security vulnerabilities, please see our Security Policy.

License

This project is licensed under the MIT License - see the LICENSE file for details.
