
NotebookLM MCP Server

by PleasePrompto


About

NotebookLM MCP Server connects AI coding assistants directly to Google's NotebookLM knowledge base, enabling zero-hallucination research on your documentation with citation-backed answers synthesized by Gemini. Key features:

  • Direct integration between CLI agents (Claude Code, Cursor, Codex) and NotebookLM
  • Persistent authentication and local library management for organizing notebooks
  • Cross-client sharing of NotebookLM resources
  • Grounded responses with citations from uploaded documents
  • Automatic follow-up sequencing for deep research on implementation details and edge cases

README

NotebookLM MCP Server

Let your CLI agents (Claude, Cursor, Codex...) chat directly with NotebookLM for zero-hallucination answers based on your own notebooks

[TypeScript](https://www.typescriptlang.org/) · [Model Context Protocol](https://modelcontextprotocol.io/) · [npm](https://www.npmjs.com/package/notebooklm-mcp) · [Claude Code Skill](https://github.com/PleasePrompto/notebooklm-skill) · [GitHub](https://github.com/PleasePrompto/notebooklm-mcp)

Installation · Quick Start · Why NotebookLM · Examples · Claude Code Skill · Documentation

---

The Problem

When you tell Claude Code or Cursor to "search through my local documentation", here's what happens:

  • Massive token consumption: Searching through documentation means reading multiple files repeatedly
  • Inaccurate retrieval: Searches for keywords, misses context and connections between docs
  • Hallucinations: When it can't find something, it invents plausible-sounding APIs
  • Expensive & slow: Each question requires re-reading multiple files

The Solution

Let your local agents chat directly with NotebookLM — Google's zero-hallucination knowledge base powered by Gemini 2.5 that provides intelligent, synthesized answers from your docs.

    Your Task → Local Agent asks NotebookLM → Gemini synthesizes answer → Agent writes correct code
    

The real advantage: No more manual copy-paste between NotebookLM and your editor. Your agent asks NotebookLM directly and gets answers straight back in the CLI. It builds deep understanding through automatic follow-ups — Claude asks multiple questions in sequence, each building on the last, getting specific implementation details, edge cases, and best practices. You can save NotebookLM links to your local library with tags and descriptions, and Claude automatically selects the relevant notebook based on your current task.
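Because the agent talks to the server through standard MCP tool calls, the same follow-up loop can also be scripted by hand. Below is a minimal sketch using the official @modelcontextprotocol/sdk client for TypeScript. The package name notebooklm-mcp and the SDK calls are real, but the `ask_question` tool name and its argument shape are assumptions for illustration only; list the server's tools to see the actual interface.

```typescript
// Minimal sketch of a custom MCP client driving the server over stdio.
// The imports and client methods come from the official MCP TypeScript SDK;
// the tool name "ask_question" and its arguments are hypothetical.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["notebooklm-mcp@latest"],
});

const client = new Client({ name: "notebooklm-demo", version: "0.1.0" });
await client.connect(transport);

// Follow-up sequencing: each question builds on the previous answer.
const questions = [
  "How does this library handle authentication?",
  "What edge cases should I handle when a session expires?",
];
for (const question of questions) {
  const result = await client.callTool({
    name: "ask_question", // hypothetical tool name; discover real ones via listTools()
    arguments: { question },
  });
  console.log(result.content);
}

await client.close();
```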

---

Why NotebookLM, Not Local RAG?

| Approach | Token Cost | Setup Time | Hallucinations | Answer Quality |
|----------|------------|------------|----------------|----------------|
| Feed docs to Claude | 🔴 Very high (multiple file reads) | Instant | Yes - fills gaps | Variable retrieval |
| Web search | 🟡 Medium | Instant | High - unreliable sources | Hit or miss |
| Local RAG | 🟡 Medium-High | Hours (embeddings, chunking) | Medium - retrieval gaps | Depends on setup |
| NotebookLM MCP | 🟢 Minimal | 5 minutes | Zero - refuses if unknown | Expert synthesis |

What Makes NotebookLM Superior?

1. Pre-processed by Gemini: Upload docs once, get instant expert knowledge
2. Natural language Q&A: Not just retrieval — actual understanding and synthesis
3. Multi-source correlation: Connects information across 50+ documents
4. Citation-backed: Every answer includes source references
5. No infrastructure: No vector DBs, embeddings, or chunking strategies needed

---

Installation

Claude Code

    claude mcp add notebooklm npx notebooklm-mcp@latest
    

Codex

    codex mcp add notebooklm -- npx notebooklm-mcp@latest
    

Gemini

    gemini mcp add notebooklm npx notebooklm-mcp@latest
    

Cursor

Add to ~/.cursor/mcp.json:

    {
      "mcpServers": {
        "notebooklm": {
          "command": "npx",
          "args": ["-y", "notebooklm-mcp@latest"]
        }
      }
    }
    

amp

    amp mcp add notebooklm -- npx notebooklm-mcp@latest
    

VS Code

    code --add-mcp '{"name":"notebooklm","command":"npx","args":["notebooklm-mcp@latest"]}'
    

Other MCP clients

Generic MCP config:

    {
      "mcpServers": {
        "notebooklm": {
          "command": "npx",
          "args": ["notebooklm-mcp@latest"]
        }
      }
    }
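If your client is not listed above, any MCP-capable host can launch the server with the same command and args shown in the generic config. To verify the integration by hand, a short discovery script also works; this sketch uses the official MCP TypeScript SDK and only inspects metadata, so no tool names need to be assumed.

```typescript
// Sketch: connect to the server over stdio and print its advertised tools.
// Uses the official MCP TypeScript SDK; nothing here assumes the server's
// specific tool names, it simply asks the server what it exposes.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["notebooklm-mcp@latest"],
});

const client = new Client({ name: "tool-discovery", version: "0.1.0" });
await client.connect(transport);

// List tool names and descriptions so you can wire them into your own client.
const { tools } = await client.listTools();
for (const tool of tools) {
  console.log(`${tool.name}: ${tool.description ?? "(no description)"}`);
}

await client.close();
```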
    

---

Alternative: Claude Code Skill

Prefer Claude Code Skills over MCP? This server is now also available as a native Claude Code Skill with a simpler setup:

NotebookLM Claude Code Skill - Clone to ~/.claude/skills and start using immediately

Key differences:

  • MCP Server (this repo): Persistent sessions, works with Claude Code, Codex, Cursor, and other MCP clients
  • Claude Code Skill: Simpler setup, Python-based, stateless queries, works only with local Claude Code
  • Both use the same browser automation technology and provide zero-hallucination answers

Related MCP Servers

AI Research Assistant

by hamid-vakilzadeh

AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key. Key features:

  • Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
  • Title matching with confidence scoring for finding specific papers
  • Batch operations supporting up to 500 papers per request
  • Citation analysis and network exploration for understanding research relationships
  • Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
  • Rate limits of 100 requests per 5 minutes, with options to request higher limits through Semantic Scholar

Web & Search

Linkup

by LinkupPlatform

Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks. Key features:

  • Real-time web search using natural language queries to find current information, news, and data
  • Page fetching to extract and read content from any webpage URL
  • Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
  • Source-backed results with citations and context from relevant, trustworthy websites
  • JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages

Web & Search

Math-MCP

by EthanHenrickson

Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning. Key features:

  • Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
  • Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
  • Rounding utilities: floor, ceiling, and nearest integer rounding
  • Trigonometric functions: sine, cosine, tangent, and their inverses, with degrees and radians conversion support

Developer Tools