About
TalkToFigma is an MCP integration that connects AI coding assistants (Cursor, Claude Code) directly to Figma, enabling programmatic reading and modification of design files. It bridges AI agents and design workflows through a WebSocket server and a companion Figma plugin.

Key capabilities:
- Read and analyze Figma design files programmatically through natural language commands in AI agents
- Automate design modifications, including bulk text content replacement across multiple layers
- Propagate component instance overrides from a source instance to multiple targets for design consistency
- Streamline design-to-code workflows by allowing AI agents to inspect and manipulate Figma components, styles, and layout properties
README
Talk to Figma MCP
This project implements a Model Context Protocol (MCP) integration between AI agents (Cursor, Claude Code) and Figma, allowing an AI agent to communicate with Figma to read designs and modify them programmatically.
https://github.com/user-attachments/assets/129a14d2-ed73-470f-9a4c-2240b2a4885c
Project Structure
- src/talk_to_figma_mcp/ - TypeScript MCP server for Figma integration
- src/cursor_mcp_plugin/ - Figma plugin for communicating with Cursor
- src/socket.ts - WebSocket server that facilitates communication between the MCP server and the Figma plugin

How to use
1. Install Bun if you haven't already:
curl -fsSL https://bun.sh/install | bash
2. Run setup; this also installs the MCP server in your Cursor's active project:
bun setup
3. Start the WebSocket server:
bun socket
4. NEW: Install the Figma plugin from the Figma Community page, or install it locally
Quick Video Tutorial
Design Automation Example
Bulk text content replacement
Thanks to @dusskapark for contributing the bulk text replacement feature. Here is the demo video.
Instance Override Propagation

Another contribution from @dusskapark: propagate component instance overrides from a source instance to multiple target instances with a single command. This feature dramatically reduces repetitive design work when working with component instances that need similar customizations. Check out our demo video.
Manual Setup and Installation
MCP Server: Integration with Cursor
Add the server to your Cursor MCP configuration in ~/.cursor/mcp.json:
{
"mcpServers": {
"TalkToFigma": {
"command": "bunx",
"args": ["cursor-talk-to-figma-mcp@latest"]
}
}
}
WebSocket Server
Start the WebSocket server:
bun socket
Figma Plugin
1. In Figma, go to Plugins > Development > New Plugin
2. Choose "Link existing plugin"
3. Select the src/cursor_mcp_plugin/manifest.json file
4. The plugin should now be available in your Figma development plugins
Windows + WSL Guide
1. Install bun via powershell
powershell -c "irm bun.sh/install.ps1|iex"
2. Uncomment the hostname 0.0.0.0 in src/socket.ts
// uncomment this to allow connections in windows wsl
hostname: "0.0.0.0",
3. Start the WebSocket server:
bun socket
Usage
1. Start the WebSocket server
2. Install the MCP server in Cursor
3. Open Figma and run the Cursor MCP Plugin
4. Connect the plugin to the WebSocket server by joining a channel using join_channel
5. Use Cursor to communicate with Figma using the MCP tools
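The channel join in step 4 can be pictured as a small JSON handshake over the WebSocket connection. The sketch below is illustrative only: the message type and field names are assumptions, not the plugin's actual wire format (see src/socket.ts for the real protocol).

```typescript
// Hypothetical shape of a channel-join message. Both the MCP server and
// the Figma plugin would send one of these after connecting, so the
// WebSocket server can pair them up.
interface JoinMessage {
  type: "join";
  channel: string;
}

// Build the JSON payload a client sends right after the connection opens.
function buildJoinMessage(channel: string): string {
  const msg: JoinMessage = { type: "join", channel };
  return JSON.stringify(msg);
}

// Example: both sides join the same channel name.
console.log(buildJoinMessage("my-design-session"));
```

The important point is simply that both sides must use the same channel name; the server then relays messages only between clients sharing that channel.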
Local Development Setup
To develop locally, update your MCP config to point to your local repository:
{
"mcpServers": {
"TalkToFigma": {
"command": "bun",
"args": ["/path-to-repo/src/talk_to_figma_mcp/server.ts"]
}
}
}
MCP Tools
The MCP server provides the following tools for interacting with Figma:
Document & Selection
- get_document_info - Get information about the current Figma document
- get_selection - Get information about the current selection
- read_my_design - Get detailed node information about the current selection without parameters
- get_node_info - Get detailed information about a specific node
- get_nodes_info - Get detailed information about multiple nodes by providing an array of node IDs
- set_focus - Set focus on a specific node by selecting it and scrolling the viewport to it
- set_selections - Set the selection to multiple nodes and scroll the viewport to show them

Annotations
- get_annotations - Get all annotations in the current document or a specific node
- set_annotation - Create or update an annotation with markdown support
- set_multiple_annotations - Batch create/update multiple annotations efficiently
- scan_nodes_by_types - Scan for nodes with specific types (useful for finding annotation targets)

Prototyping & Connections
- get_reactions - Get all prototype reactions from nodes with a visual highlight animation
- set_default_connector - Set a copied FigJam connector as the default connector style for creating connections (must be set before creating connections)
- create_connections - Create FigJam connector lines between nodes, based on prototype flows or custom mapping

Creating Elements
- create_rectangle - Create a new rectangle with position, size, and optional name
- create_frame - Create a new frame with position, size, and optional name
- create_text - Create a new text node with customizable font properties
Related MCP Servers
AI Research Assistant
hamid-vakilzadeh
AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key.

Key features include:
- Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
- Title matching with confidence scoring for finding specific papers
- Batch operations supporting up to 500 papers per request
- Citation analysis and network exploration for understanding research relationships
- Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
- Rate limits of 100 requests per 5 minutes, with options to request higher limits through Semantic Scholar
Linkup
LinkupPlatform
Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks.

Key features of Linkup:
- Real-time web search using natural language queries to find current information, news, and data
- Page fetching to extract and read content from any webpage URL
- Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
- Source-backed results with citations and context from relevant, trustworthy websites
- JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages
Math-MCP
EthanHenrickson
Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning.

Key features of Math-MCP:
- Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
- Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
- Rounding utilities: floor, ceiling, and nearest integer rounding
- Trigonometric functions: sine, cosine, tangent, and their inverses with degrees and radians conversion support