About
LibraLM is a book summarization service providing AI-generated summaries and detailed chapter breakdowns for business, self-help, and educational books. It grants access to a curated library of over 50 professionally summarized titles with extracted key insights and actionable takeaways.

Key features of LibraLM:
- Search books by title, author, or ISBN to locate specific titles instantly
- Comprehensive book summaries covering main themes and frameworks
- Chapter-by-chapter breakdowns with detailed summaries of individual sections
- Complete table of contents view with chapter descriptions
- Key insights extraction highlighting main concepts and actionable takeaways
- Secure API authentication for protected access to the summary library
README
LibraLM MCP Server
[Smithery](https://smithery.ai/server/@libralm-ai/libralm_mcp_server)
Access 50+ book summaries and chapter breakdowns directly in Claude Desktop through the Model Context Protocol (MCP).
Overview
LibraLM MCP Server brings a library of AI-generated book summaries to your Claude Desktop conversations. Search for books, read comprehensive summaries, explore chapter-by-chapter breakdowns, and get instant access to key insights from business, self-help, and educational books.
Features
Installation
Installing via Smithery
To install libralm_mcp_server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @libralm-ai/libralm_mcp_server --client claude
Prerequisites
Quick Install
1. Clone the repository:
git clone https://github.com/libralm-ai/libralm_mcp_server.git
cd libralm_mcp_server
2. Install dependencies:
pip install -r requirements.txt
3. Get your API key:
   - Visit libralm.com
   - Sign in with Google or GitHub
   - Copy your API key from the dashboard
4. Configure Claude Desktop:
Add to your Claude Desktop configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
  "mcpServers": {
    "libralm": {
      "command": "uvx",
      "args": ["--from", "libralm-mcp-server", "libralm-mcp-server"],
      "env": {
        "LIBRALM_API_KEY": "your_api_key_here"
      }
    }
  }
}
5. Restart Claude Desktop
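If you prefer to script step 4, the LibraLM entry can be merged into an existing config file without clobbering other configured servers. A minimal sketch (the helper name is ours, and the example writes to a temporary path rather than the real config location):

```python
import json
import os
import tempfile

def add_libralm_server(config_path: str, api_key: str) -> dict:
    """Merge the LibraLM entry into claude_desktop_config.json,
    preserving any other MCP servers already configured."""
    config = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
    # Values below mirror the config snippet shown above.
    config.setdefault("mcpServers", {})["libralm"] = {
        "command": "uvx",
        "args": ["--from", "libralm-mcp-server", "libralm-mcp-server"],
        "env": {"LIBRALM_API_KEY": api_key},
    }
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Demo: write into a temporary file instead of the real config path.
path = os.path.join(tempfile.mkdtemp(), "claude_desktop_config.json")
cfg = add_libralm_server(path, "your_api_key_here")
print(sorted(cfg["mcpServers"]))  # → ['libralm']
```

Because the script reads the file first and only touches the `libralm` key, re-running it (for example, to rotate the API key) is safe.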
Available Tools
🔍 search_books
Search for books by title, author, or ISBN.
Search for "Atomic Habits"
Find books by James Clear
Look up ISBN 0735211299
📖 get_book_info
Get detailed information about a specific book.
Get details for book ID 0735211299
Show me information about this book
📝 get_book_summary
Get the comprehensive AI-generated summary of a book.
Summarize "Atomic Habits"
Give me the main points of this book
📋 get_table_of_contents
View the complete chapter list with descriptions.
Show me the chapters in "Atomic Habits"
What topics does this book cover?
📄 get_chapter_summary
Get a detailed summary of a specific chapter.
Summarize chapter 3 of "Atomic Habits"
What's in the first chapter?
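Under the hood, Claude Desktop invokes these tools over MCP's JSON-RPC transport. A `tools/call` request for `search_books` looks roughly like this (the argument name `query` is an assumption about this server's tool schema, not confirmed by the docs above):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_books",
    "arguments": { "query": "Atomic Habits" }
  }
}
```

You normally never write this by hand; it is shown only to clarify what happens when you type a prompt like "Search for Atomic Habits".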
Example Usage
Here are some example prompts you can use with Claude:
Configuration
Environment Variables
LIBRALM_API_KEY (required): Your LibraLM API key
API Limits
Troubleshooting
"Invalid API key" error
"Resource not found" error
No books showing up
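For the "Invalid API key" case, a quick sanity check is to confirm the key is actually visible to the server process before digging further. A minimal sketch (the helper name and the `demo_key` value are stand-ins, not part of LibraLM):

```python
import os

def check_api_key() -> str:
    """Fail fast with a clear message if LIBRALM_API_KEY is unset or blank."""
    key = os.environ.get("LIBRALM_API_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "LIBRALM_API_KEY is not set; copy it from the libralm.com "
            "dashboard into the 'env' block of claude_desktop_config.json."
        )
    return key

os.environ["LIBRALM_API_KEY"] = "demo_key"  # stand-in for a real key
print(check_api_key())  # → demo_key
```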
Contributing
We welcome contributions! Please see our Contributing Guide for details.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
Related Projects
---
Built with ❤️ by the LibraLM team
Related MCP Servers
AI Research Assistant
hamid-vakilzadeh
AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key.

Key features include:
- Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
- Title matching with confidence scoring for finding specific papers
- Batch operations supporting up to 500 papers per request
- Citation analysis and network exploration for understanding research relationships
- Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
- Rate limits of 100 requests per 5 minutes with options to request higher limits through Semantic Scholar
Linkup
LinkupPlatform
Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks.

Key features of Linkup:
- Real-time web search using natural language queries to find current information, news, and data
- Page fetching to extract and read content from any webpage URL
- Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
- Source-backed results with citations and context from relevant, trustworthy websites
- JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages
Math-MCP
EthanHenrickson
Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning.

Key features of Math-MCP:
- Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
- Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
- Rounding utilities: floor, ceiling, and nearest integer rounding
- Trigonometric functions: sine, cosine, tangent, and their inverses with degrees and radians conversion support