
ClaudeSmalltalk

Connect Claude Desktop to a live Smalltalk programming environment. Browse classes, evaluate expressions, define methods, and run autonomous code review — all against a running Squeak or Cuis image.

Developed by John M McIntosh, Corporate Smalltalk Consulting Ltd. 2026

What It Does

The Squeak VM provides 14 MCP tools — evaluate code, browse classes, read/write methods, navigate hierarchies, and save the image. Claude Desktop accesses them via smalltalk_task, which delegates all Smalltalk interaction to a locally-configured LLM (Ollama for free/local, or Anthropic/OpenAI/xAI) — no source code leaves your machine.

You → Claude Desktop → smalltalk_task → Your LLM → Live Smalltalk Image (TCP)
                        (MCP server)     (Ollama)    (Squeak or Cuis)

The agent isolates Smalltalk reasoning from your chat model. Claude Desktop triggers the work, but a separate model (which can be local and free) does the actual Smalltalk coding.
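The delegation pattern described above can be sketched in Python. This is a simplified illustration only — the function names and message shapes here are assumptions, not the project's actual API; the real loop lives in smalltalk_agent.py:

```python
# Illustrative sketch of the smalltalk_task delegation loop.
# Names and message shapes are hypothetical; see smalltalk_agent.py
# for the real implementation.
def run_task(task, llm, vm_tools):
    """Forward a task to a local LLM; let it call VM tools until done."""
    messages = [{"role": "user", "content": task}]
    while True:
        reply = llm(messages)  # e.g. a local Ollama model
        if "tool_call" not in reply:
            return reply["content"]  # final answer back to Claude Desktop
        call = reply["tool_call"]
        # Each tool call is a TCP round-trip to the live Smalltalk image
        result = vm_tools[call["name"]](**call["args"])
        messages.append({"role": "tool", "content": result})
```

The point of the structure: the chat model in Claude Desktop never sees the tool traffic — only the local LLM inside the loop talks to the image.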

Quick Start

1. Get a Smalltalk VM and Image

Squeak (recommended):

  • Download Squeak 6.0 — the All-in-One package includes VM and image
  • Follow SQUEAK-SETUP.md to install the MCP server into the image

Cuis Smalltalk:

macOS note: Place the VM and image files in /Applications/ or your home directory. Files in ~/Documents/ or ~/Desktop/ may be blocked by macOS privacy restrictions (TCC). See macOS Permissions below.

2. Create a Configuration File and Install

Follow CLAUDE-README-MCPB.md — it covers creating your smalltalk-mcp.json config and installing the desktop extension.

Copy a starter config from examples/ and set your VM paths:

cp examples/smalltalk-mcp-ollama.json smalltalk-mcp.json
# Edit vm.binary and vm.image to match your install

The VM auto-starts on first use — no manual launch needed. Token auth is handled automatically.

See examples/ for Anthropic, OpenAI, xAI, and MQTT variants.
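For orientation, a minimal Ollama config might look like the following. Only the provider key and the vm.binary/vm.image paths are documented above — treat this shape as a sketch and copy the real starter file from examples/:

```json
{
  "provider": "ollama",
  "vm": {
    "binary": "/path/to/Squeak.app/Contents/MacOS/Squeak",
    "image": "/path/to/Squeak6.0.image"
  }
}
```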

3. Configure Claude Desktop

Edit ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):

{
  "mcpServers": {
    "smalltalkAgent": {
      "command": "python3",
      "args": ["/path/to/ClaudeSmalltalk/smalltalk_agent_mcp.py"],
      "env": {
        "SMALLTALK_MCP_CONFIG": "/path/to/smalltalk-mcp.json"
      }
    }
  }
}

Requires Python 3.10+ and the httpx package (pip install httpx). For the MQTT transport, also install paho-mqtt (pip install paho-mqtt).

4. Verify It Works

Open Claude Desktop and ask:

"List all Smalltalk classes that start with String"

Available Tools

Claude Desktop (1 tool)

  • smalltalk_task — Delegate any Smalltalk task to a locally configured LLM. No source code leaves your machine.

VM Tools (14, available to the agent)

  • smalltalk_evaluate — Execute Smalltalk code and return the result
  • smalltalk_browse — Get class metadata (superclass, ivars, methods)
  • smalltalk_method_source — View the source code of a method
  • smalltalk_define_class — Create or modify a class definition
  • smalltalk_define_method — Add or update a method
  • smalltalk_delete_method — Remove a method from a class
  • smalltalk_delete_class — Remove a class from the system
  • smalltalk_list_classes — List classes matching a prefix
  • smalltalk_hierarchy — Get the superclass chain
  • smalltalk_subclasses — Get immediate subclasses
  • smalltalk_list_categories — List all system categories
  • smalltalk_classes_in_category — List classes in a category
  • smalltalk_save_image — Save the current image in place
  • smalltalk_save_as_new_version — Save the image/changes as the next version number

All 14 tools are also available directly via the st CLI (openclaw/smalltalk.py).

Configuration Reference

Supported LLM Providers

  • Ollama — /api/chat (native); free (local); "provider": "ollama"
  • Anthropic — Messages API; paid; "provider": "anthropic"
  • OpenAI — /v1/chat/completions; paid; "provider": "openai"
  • xAI — /v1/chat/completions; paid; "provider": "xai"

Transport Options

  • tcp — token-authenticated TCP to the Squeak VM. Recommended; the VM acts as its own server.
  • mqtt — MQTT broker to a remote image. For remote images, or Cuis with the MQTT handler.

tcp is the default. The Squeak VM runs MCPTcpTransport and listens on a local port. The agent auto-starts the VM on first use, generates a UUID token, and connects per-request with JSON-RPC + token auth.

mqtt connects through an MQTT broker. Used for remote images or Cuis with the MQTT LLM handler.
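The per-request protocol described above is a small JSON-RPC message carrying the session token. A Python sketch follows — the exact field that carries the token, and the newline-delimited framing, are assumptions; the authoritative wire format is in MCP-Server-Squeak.st:

```python
import json

def build_request(req_id, method, params, token):
    """Build a token-authenticated JSON-RPC 2.0 request.

    NOTE: carrying the auth token inside params is an assumption made
    for illustration; consult MCP-Server-Squeak.st for the real format.
    """
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": dict(params, token=token),
    }

def encode(request):
    # Newline-delimited JSON is a common framing for line-based TCP
    # transports; the framing used by MCPTcpTransport may differ.
    return (json.dumps(request) + "\n").encode("utf-8")

msg = build_request(1, "smalltalk_evaluate", {"code": "3 + 4"}, "uuid-token")
wire = encode(msg)
```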

macOS Permissions

macOS Transparency, Consent, and Control (TCC) restricts which directories applications can access.

Safe locations (no extra permissions needed):

  • /Applications/ — recommended for VM and image files
  • ~/ (home directory root) — works for config files

Restricted locations (will cause errors):

  • ~/Documents/, ~/Desktop/, ~/Downloads/

Other Integration Options

  • OpenClaw — Telegram/Discord ↔ OpenClaw ↔ Squeak; see OPENCLAW-SETUP.md

Security

The extension connects only to a local Smalltalk image over TCP (localhost only). With the Ollama provider and TCP transport, no Smalltalk source code is sent to cloud APIs — nothing leaves your machine.

Dual security audit details: SECURITY.md

Files

  • Claude.SmalltalkInterface.mcpb — Desktop extension; double-click to install
  • CLAUDE-README-MCPB.md — Setup guide bundled with the extension
  • smalltalk_agent_mcp.py — MCP server (stdio JSON-RPC for Claude Desktop)
  • smalltalk_agent.py — Agent with the tool-calling loop, TcpBridge and MqttBridge
  • openclaw/smalltalk.py — st CLI; direct TCP access to all 14 tools
  • openclaw/mqtt_bridge.py — MQTT CLI bridge for Cuis/remote images
  • smalltalk-mcp-example.json — Starter config; copy and edit
  • SKILL.md — Drag into Claude Desktop for Smalltalk best practices
  • MCP-Server-Squeak.st — MCP server fileIn for Squeak 6.0 (TCP transport)
  • MCP-Server.pck.st — MCP server package for Cuis
  • examples/ — Config examples for all providers and transports

License

MIT License — see LICENSE
