About
Framelink MCP for Figma is a Model Context Protocol server that bridges Figma design files with AI coding assistants like Cursor. It retrieves design data from Figma and translates it into structured, AI-optimized context for implementing designs in code.

Key features:

- Fetches design specifications directly from Figma files, frames, and groups via the Figma API
- Translates complex Figma data into simplified layout and styling information relevant for code generation
- Reduces AI context window usage by filtering out unnecessary design metadata before passing data to coding agents
- Enables one-shot design implementation in any frontend framework when used with Cursor's agent mode
- Provides direct integration with Cursor and other MCP-compatible AI coding tools
README
Framelink MCP for Figma

Give your coding agent access to your Figma data. Implement designs in any framework in one shot.
Give Cursor and other AI-powered coding tools access to your Figma files with this Model Context Protocol server.
When Cursor has access to Figma design data, it's way better at one-shotting designs accurately than alternative approaches like pasting screenshots.
See quickstart instructions →
Demo
Watch a demo of building a UI in Cursor with Figma design data
[Watch the demo on YouTube](https://youtu.be/6G9yb-LrEqg)
How it works
1. Open your IDE's chat (e.g. agent mode in Cursor).
2. Paste a link to a Figma file, frame, or group.
3. Ask Cursor to do something with the Figma file, e.g. implement the design.
4. Cursor will fetch the relevant metadata from Figma and use it to write your code.
This MCP server is specifically designed for use with Cursor. Before responding with context from the Figma API, it simplifies and translates the response so only the most relevant layout and styling information is provided to the model.
Reducing the amount of context provided to the model helps make the AI more accurate and the responses more relevant.
Getting Started
Many code editors and other AI clients use a configuration file to manage MCP servers.
The figma-developer-mcp server can be configured by adding the following to your configuration file.
> NOTE: You will need to create a Figma access token to use this server. Instructions on how to create a Figma API access token can be found here.
MacOS / Linux
```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--figma-api-key=YOUR-KEY", "--stdio"]
    }
  }
}
```
Windows
```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "cmd",
      "args": ["/c", "npx", "-y", "figma-developer-mcp", "--figma-api-key=YOUR-KEY", "--stdio"]
    }
  }
}
```
Alternatively, you can set `FIGMA_API_KEY` and `PORT` in the `env` field instead of passing them as command-line arguments.
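As a sketch of what that could look like, here is the macOS/Linux configuration rewritten to pass the API key through the `env` field (the variable names `FIGMA_API_KEY` and `PORT` come from the note above; the exact shape of your config file may vary by client):

```json
{
  "mcpServers": {
    "Framelink MCP for Figma": {
      "command": "npx",
      "args": ["-y", "figma-developer-mcp", "--stdio"],
      "env": {
        "FIGMA_API_KEY": "YOUR-KEY"
      }
    }
  }
}
```

Keeping the key in `env` rather than in `args` avoids exposing it in process listings.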
If you need more information on how to configure the Framelink MCP for Figma, see the Framelink docs.
Learn More
The Framelink MCP for Figma is simple but powerful. Get the most out of it by learning more at the Framelink site.
Related MCP Servers
AI Research Assistant
hamid-vakilzadeh
AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key.

Key features include:

- Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
- Title matching with confidence scoring for finding specific papers
- Batch operations supporting up to 500 papers per request
- Citation analysis and network exploration for understanding research relationships
- Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
- Rate limits of 100 requests per 5 minutes with options to request higher limits through Semantic Scholar
Linkup
LinkupPlatform
Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks.

Key features of Linkup:

- Real-time web search using natural language queries to find current information, news, and data
- Page fetching to extract and read content from any webpage URL
- Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
- Source-backed results with citations and context from relevant, trustworthy websites
- JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages
Math-MCP
EthanHenrickson
Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning.

Key features of Math-MCP:

- Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
- Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
- Rounding utilities: floor, ceiling, and nearest integer rounding
- Trigonometric functions: sine, cosine, tangent, and their inverses with degrees and radians conversion support