X Twitter Server


by rafaljanicki


About

X Twitter Server enables AI assistants to interact with Twitter (X) through natural language commands using the official Twitter API v2. Key capabilities include:

  • User profile management: fetch profiles, follower lists, and following lists
  • Tweet operations: post new tweets, delete tweets, and favorite posts
  • Search and discovery: search tweets and trending topics on Twitter
  • Personal organization: manage bookmarks and view personalized timelines
  • Rate limit protection: built-in handling of Twitter API rate limits
  • Secure authentication: proper API key and token-based access

README

X (Twitter) MCP server

[Smithery](https://smithery.ai/server/@rafaljanicki/x-twitter-mcp-server) · [PyPI](https://badge.fury.io/py/x-twitter-mcp)

A Model Context Protocol (MCP) server for interacting with Twitter (X) via AI tools. It lets you fetch tweets, post tweets, search Twitter, manage followers, and more, all through natural language commands in your AI tool.

Features

  • Fetch user profiles, followers, and following lists.
  • Post, delete, and favorite tweets.
  • Search Twitter for tweets and trends.
  • Manage bookmarks and timelines.
  • Built-in rate limit handling for the Twitter API.
  • Uses Twitter API v2 with proper authentication (API keys and tokens), avoiding the username/password hack to minimize the risk of account suspensions.
  • Provides a complete implementation of Twitter API v2 endpoints for user management, tweet management, timelines, and search functionality.

Prerequisites

  • Python 3.10 or higher: Ensure Python is installed on your system.
  • Twitter Developer Account: You need API credentials (API Key, API Secret, Access Token, Access Token Secret, and Bearer Token) from the Twitter Developer Portal.
  • Claude Desktop (optional): Download and install the Claude Desktop app from the Anthropic website.
  • Node.js (optional, for MCP integration): Required for running MCP servers in Claude Desktop.
  • A package manager like uv or pip for Python dependencies.

Installation

    Option 1: Installing via Smithery (Recommended)

    To install X (Twitter) MCP server for Claude Desktop automatically via Smithery:

    npx -y @smithery/cli install @rafaljanicki/x-twitter-mcp-server --client claude
    

    Option 2: Install from PyPI

    The easiest way to install x-twitter-mcp is via PyPI:

    pip install x-twitter-mcp
    

    Option 3: Install from Source

    If you prefer to install from the source repository:

    1. Clone the Repository:

       git clone https://github.com/rafaljanicki/x-twitter-mcp-server.git
       cd x-twitter-mcp-server
       

    2. Set Up a Virtual Environment (optional but recommended):

       python -m venv .venv
       source .venv/bin/activate  # On Windows: .venv\Scripts\activate
       

    3. Install Dependencies: Using uv (recommended, as the project uses uv.lock):

       uv sync
       
    Alternatively, using pip:
       pip install .
       

    4. Configure Environment Variables:
       - Create a .env file in the project root (you can copy .env.example if provided).
       - Add your Twitter API credentials:

          TWITTER_API_KEY=your_api_key
          TWITTER_API_SECRET=your_api_secret
          TWITTER_ACCESS_TOKEN=your_access_token
          TWITTER_ACCESS_TOKEN_SECRET=your_access_token_secret
          TWITTER_BEARER_TOKEN=your_bearer_token
          
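
    Before launching the server, it can be worth confirming that all five credentials actually made it into the environment; a missing or misspelled variable name in .env otherwise only fails at the first API call. A minimal sketch (the `missing_credentials` helper is illustrative, not part of the package):

    ```python
    import os

    # The five credentials the server expects, per the .env example above.
    REQUIRED_VARS = [
        "TWITTER_API_KEY",
        "TWITTER_API_SECRET",
        "TWITTER_ACCESS_TOKEN",
        "TWITTER_ACCESS_TOKEN_SECRET",
        "TWITTER_BEARER_TOKEN",
    ]

    def missing_credentials(env=os.environ):
        """Return the names of required credentials that are unset or empty."""
        return [name for name in REQUIRED_VARS if not env.get(name)]
    ```

    Calling `missing_credentials()` right after loading .env surfaces typos in variable names before the server starts.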

    Running the Server

    Preferred transport is Streamable HTTP. Use one of the following:

    Recommended: Streamable HTTP (Docker/Smithery)

    Run the server as an HTTP service with Streamable HTTP and SSE endpoints.

    1. Build the Docker image:

       docker build -t x-twitter-mcp .
       

    2. Run the container (Smithery uses PORT; default here is 8081):

       docker run -p 8081:8081 -e PORT=8081 x-twitter-mcp
       

    3. Endpoints:
       - Streamable HTTP (JSON-RPC over HTTP): POST http://localhost:8081/mcp
       - SSE (Server-Sent Events): GET http://localhost:8081/sse

    4. Pass config per-request (recommended in Smithery) via base64-encoded config query parameter. Example config JSON:

       {"twitterApiKey":"...","twitterApiSecret":"...","twitterAccessToken":"...","twitterAccessTokenSecret":"...","twitterBearerToken":"..."}
       
    Encode and call initialize:
       CONFIG_B64=$(printf '%s' '{"twitterApiKey":"YOUR_KEY","twitterApiSecret":"YOUR_SECRET","twitterAccessToken":"YOUR_TOKEN","twitterAccessTokenSecret":"YOUR_TOKEN_SECRET","twitterBearerToken":"YOUR_BEARER"}' | base64)

    curl -sS -X POST "http://localhost:8081/mcp?config=${CONFIG_B64}" \
      -H 'content-type: application/json' \
      -d '{"jsonrpc":"2.0","id":"1","method":"initialize","params":{"capabilities":{}}}'
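
    The same encoding can be scripted in Python if you prefer to build the URL programmatically. A sketch (the `encode_config` helper is illustrative; note that standard base64 output can contain `+` and `/`, so the result is URL-escaped for safe use in a query string):

    ```python
    import base64
    import json
    from urllib.parse import quote

    def encode_config(config: dict) -> str:
        """JSON-encode the config, base64 it, and URL-escape it for ?config=."""
        raw = json.dumps(config, separators=(",", ":")).encode("utf-8")
        return quote(base64.b64encode(raw).decode("ascii"))

    config = {
        "twitterApiKey": "YOUR_KEY",
        "twitterApiSecret": "YOUR_SECRET",
        "twitterAccessToken": "YOUR_TOKEN",
        "twitterAccessTokenSecret": "YOUR_TOKEN_SECRET",
        "twitterBearerToken": "YOUR_BEARER",
    }
    print(f"http://localhost:8081/mcp?config={encode_config(config)}")
    ```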

    Notes:

  • A POST / will return 404; use /mcp for Streamable HTTP and /sse for SSE.
  • When deployed via Smithery, smithery.yaml is configured for runtime: container and startCommand.type: http.
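
    If you'd rather not shell out to curl, the same initialize call can be issued from Python's standard library. A sketch (the `mcp_initialize_request` helper is illustrative; the request is only sent when you pass it to `urlopen`, so the container must already be running):

    ```python
    import json
    import urllib.request

    def mcp_initialize_request(base_url="http://localhost:8081"):
        """Build (but do not send) a JSON-RPC initialize request for /mcp."""
        payload = {
            "jsonrpc": "2.0",
            "id": "1",
            "method": "initialize",
            "params": {"capabilities": {}},
        }
        return urllib.request.Request(
            f"{base_url}/mcp",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )

    # With the server up: urllib.request.urlopen(mcp_initialize_request())
    ```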

    Streamable HTTP (Local, no Docker)

    Run the ASGI server directly.

    If installed from PyPI:

    python -m x_twitter_mcp.http_server
    

    If installed from source, activate your virtual environment and run the same command.
