
---
license: mit
title: Tensorus MCP
sdk: python
emoji: 🐠
colorFrom: blue
colorTo: yellow
short_description: Model Context Protocol server and client for Tensorus tensor database
---
Tensorus MCP
Model Context Protocol (MCP) server and client for Tensorus tensor database operations. This package provides a standardized interface for AI agents and LLMs to interact with Tensorus capabilities using the Model Context Protocol.
Features
- MCP Server: Python implementation using `fastmcp` for tensor database operations
- MCP Client: Python client library for easy integration with MCP servers
- Tensor Operations: Complete set of tensor manipulation tools via MCP
- Dataset Management: Create, list, and manage tensor datasets
- Demo Mode: Pre-configured mock data for testing and demonstration
Installation
pip install fastmcp
pip install -r requirements.txt
Quick Start
Starting the MCP Server
python -m tensorus_mcp.server
For web endpoint support:
python -m tensorus_mcp.server --transport streamable-http
Demo Mode
For demonstration or testing purposes, run the server in demo mode:
python -m tensorus_mcp.server --demo-mode
Using the Python Client
import asyncio

from tensorus_mcp.client import TensorusMCPClient

async def example():
    async with TensorusMCPClient.from_http("http://localhost:8000/mcp/") as client:
        # List available datasets
        datasets = await client.list_datasets()
        print(f"Available datasets: {datasets}")

        # Create a new dataset
        await client.create_dataset("my_dataset")

        # Ingest a tensor
        result = await client.ingest_tensor(
            dataset_name="my_dataset",
            tensor_shape=[2, 2],
            tensor_dtype="float32",
            tensor_data=[[1.0, 2.0], [3.0, 4.0]],
            metadata={"source": "example"}
        )
        print(f"Ingested tensor with ID: {result['record_id']}")

asyncio.run(example())
MCP Demo Script
Prerequisites
- Tensorus MCP Server running (`python -m tensorus_mcp.server`)
- For live mode: Tensorus backend API accessible
- For demo mode: No additional setup required
Demo Scenario: MCP Client Interaction
Goal: Demonstrate how an external AI agent can leverage Tensorus via MCP.
- Start MCP Server:

  python -m tensorus_mcp.server --demo-mode
- Connect via Python Client:

  from tensorus_mcp.client import TensorusMCPClient

  async def demo():
      async with TensorusMCPClient.from_http("http://localhost:8000/mcp/") as client:
          # List available datasets
          datasets = await client.list_datasets()
          print(f"Available datasets: {datasets}")

          # Create a new dataset
          result = await client.create_dataset("demo_dataset")
          print(f"Created dataset: {result}")

          # Ingest sample tensor data
          tensor_result = await client.ingest_tensor(
              dataset_name="demo_dataset",
              tensor_shape=[3, 3],
              tensor_dtype="float32",
              tensor_data=[[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]],
              metadata={"source": "mcp_demo", "type": "sample_matrix"}
          )
          print(f"Ingested tensor: {tensor_result}")

          # Apply tensor operation (transpose)
          op_result = await client.apply_operation(
              operation="transpose",
              dataset_name="demo_dataset",
              record_id=tensor_result["record_id"],
              dim0=0,
              dim1=1
          )
          print(f"Applied transpose operation: {op_result}")
- Conceptual Client Interaction (JavaScript):

  // Example of how other AI agents could interact via MCP
  async function mcpDemo() {
    // List available tools
    const { tools } = await client.request({ method: 'tools/list' }, {});
    console.log("Available Tensorus Tools:", tools.map(t => t.name));

    // Create dataset via MCP
    const createResponse = await client.request({ method: 'tools/call' }, {
      name: 'tensorus_create_dataset',
      arguments: { dataset_name: 'mcp_demo_dataset' }
    });
    console.log("Dataset created:", createResponse.content[0].text);

    // Ingest tensor via MCP
    const ingestResponse = await client.request({ method: 'tools/call' }, {
      name: 'tensorus_ingest_tensor',
      arguments: {
        dataset_name: 'mcp_demo_dataset',
        tensor_shape: [2, 2],
        tensor_dtype: 'float32',
        tensor_data: [[1.0, 2.0], [3.0, 4.0]],
        metadata: { source: 'mcp_demo' }
      }
    });
    console.log("Tensor ingested:", ingestResponse.content[0].text);
  }
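The same tool-level flow can also be driven from Python without the typed wrapper. The sketch below assumes fastmcp's generic `Client` with `list_tools()` and `call_tool()`; treat the exact API as version-dependent and check the installed fastmcp release:

```python
# Sketch only: assumes fastmcp's generic Client; method names may vary by version.
import asyncio

from fastmcp import Client

async def raw_tool_demo() -> None:
    async with Client("http://localhost:8000/mcp/") as client:
        # Discover the Tensorus tools exposed by the server
        tools = await client.list_tools()
        print("Available Tensorus Tools:", [tool.name for tool in tools])

        # Call a tool directly by name, mirroring the JavaScript example
        result = await client.call_tool(
            "tensorus_create_dataset",
            {"dataset_name": "mcp_demo_dataset"},
        )
        print("Dataset created:", result)

asyncio.run(raw_tool_demo())
```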
Available MCP Tools

Dataset Management

- `tensorus_list_datasets`: Lists all available datasets
- `tensorus_create_dataset`: Creates a new dataset
- `tensorus_delete_dataset`: Deletes an existing dataset

Tensor Operations

- `tensorus_ingest_tensor`: Ingests a new tensor into a dataset
- `tensorus_get_tensor_details`: Retrieves tensor data and metadata
- `tensorus_delete_tensor`: Deletes a specific tensor
- `tensorus_update_tensor_metadata`: Updates tensor metadata

Tensor Computations

- `tensorus_apply_unary_operation`: Operations like `log`, `reshape`, `transpose`, `sum`, `mean`
- `tensorus_apply_binary_operation`: Operations like `add`, `subtract`, `multiply`, `matmul`
- `tensorus_apply_list_operation`: Operations like `concatenate` and `stack`
- `tensorus_apply_einsum`: Einstein summation operations
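Through the Python client these computations are reached via `apply_operation`, as in the transpose step of the demo above. A hedged sketch for a unary operation that needs no extra arguments; operation-specific keywords (such as `dim0`/`dim1` for transpose) are passed as additional keyword arguments:

```python
import asyncio

from tensorus_mcp.client import TensorusMCPClient

async def log_tensor(dataset_name: str, record_id: str) -> None:
    async with TensorusMCPClient.from_http("http://localhost:8000/mcp/") as client:
        # Apply an element-wise log to a stored tensor
        result = await client.apply_operation(
            operation="log",
            dataset_name=dataset_name,
            record_id=record_id,
        )
        print(f"log result: {result}")

# Placeholder record ID; use one returned by ingest_tensor
asyncio.run(log_tensor("demo_dataset", "RECORD_ID"))
```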
Diagnostic Tools

- `mcp_server_status`: Check server operational status
- `connection_test`: Lightweight connectivity check
- `backend_ping`: Test backend API health endpoint
- `backend_connectivity_test`: Verify backend communication
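The diagnostic tools are ordinary MCP tools and can be called by name for a quick health check. Another sketch in the raw tool-call style used above (assuming fastmcp's generic `Client`, and assuming these tools need no arguments):

```python
import asyncio

from fastmcp import Client  # assumed generic MCP client; API may vary by version

async def health_check() -> None:
    async with Client("http://localhost:8000/mcp/") as client:
        # Confirm the MCP server itself is responding
        status = await client.call_tool("mcp_server_status", {})
        print("Server status:", status)

        # Confirm the server can reach the Tensorus backend API
        backend = await client.call_tool("backend_connectivity_test", {})
        print("Backend connectivity:", backend)

asyncio.run(health_check())
```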
Configuration
API Key Management
When not in demo mode, provide authentication via:
- Global API Key: Set when starting the server:

  python -m tensorus_mcp.server --mcp-api-key YOUR_API_KEY

- Per-Tool API Key: Pass an `api_key` parameter in tool calls
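For the per-tool option, the key travels as a regular tool argument. A hedged sketch in the raw tool-call style; how each tool accepts and validates `api_key` is backend-dependent:

```python
import asyncio

from fastmcp import Client  # assumed generic MCP client; API may vary by version

async def create_with_key(api_key: str) -> None:
    async with Client("http://localhost:8000/mcp/") as client:
        # Supply the API key per call instead of configuring it server-wide
        result = await client.call_tool(
            "tensorus_create_dataset",
            {"dataset_name": "secure_dataset", "api_key": api_key},
        )
        print("Created:", result)

asyncio.run(create_with_key("YOUR_API_KEY"))
```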
Environment Variables
- `TENSORUS_API_BASE_URL`: Backend API URL (default: `https://tensorus-core.hf.space`)
- `TENSORUS_MINIMAL_IMPORT`: Set to `1` for lightweight imports
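One way to set these variables is a small launcher script that configures the environment and then starts the server. A minimal sketch using only the standard library; the values shown are the defaults and options documented above:

```python
import os
import subprocess
import sys

# Point the server at a specific Tensorus backend and enable lightweight imports
env = dict(os.environ)
env["TENSORUS_API_BASE_URL"] = "https://tensorus-core.hf.space"
env["TENSORUS_MINIMAL_IMPORT"] = "1"

# Launch the MCP server with the configured environment
subprocess.run([sys.executable, "-m", "tensorus_mcp.server"], env=env)
```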
Demo Examples
Interactive Notebook
See `examples/demo_notebook.ipynb` for a complete interactive example.
Streamlit App
Launch the demo Streamlit app:
streamlit run examples/demo_app.py
Development
Running Tests
# Install test dependencies
pip install -r examples/requirements.txt
# Run MCP-specific tests
pytest tests/test_mcp_integration.py
Project Structure
tensorus_mcp/
├── __init__.py # Package initialization
├── server.py # MCP server implementation
├── client.py # MCP client library
└── config.py # Configuration management
examples/
├── demo_app.py # Streamlit demo application
├── demo_notebook.ipynb # Interactive Jupyter notebook
└── requirements.txt # Demo dependencies
tests/
└── test_mcp_integration.py # Integration tests
Usage in Claude Desktop
Add to your Claude Desktop MCP settings:
{
"mcpServers": {
"tensorus": {
"command": "python",
"args": ["-m", "tensorus_mcp.server"],
"env": {
"TENSORUS_API_BASE_URL": "https://tensorus-core.hf.space"
}
}
}
}
API Reference
TensorusMCPClient Methods
- `list_datasets()`: Get all available datasets
- `create_dataset(name, schema=None)`: Create a new dataset
- `ingest_tensor(dataset_name, tensor_shape, tensor_dtype, tensor_data, metadata)`: Add a tensor to a dataset
- `get_tensor_details(dataset_name, record_id)`: Retrieve tensor information
- `apply_operation(operation, dataset_name, record_id, **kwargs)`: Apply tensor operations
Contributing
Contributions are welcome! Please feel free to open issues or submit pull requests.
License
MIT License