About
Protolint is a pluggable linting and fixing utility for Protocol Buffer files (proto2 and proto3). It operates without a compiler for fast performance and enforces the official Protocol Buffer style guide.

Key features of Protolint:
- Fast linting that works without a Protocol Buffer compiler
- Automatic fixing of style guide violations
- Plugin system for loading custom lint rules
- Rule suppression via inline comments for maintaining API compatibility
- MCP server support for AI assistant integration
- Integration with editors, GitHub Actions, and CI pipelines
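As an illustration of the inline-comment suppression listed above, a `protolint:disable` directive silences a rule for the next line. The message and field names here are hypothetical; the rule name is one of protolint's built-in rules:

```proto
syntax = "proto3";

message LegacyExample {
  // protolint:disable:next FIELD_NAMES_LOWER_SNAKE_CASE
  string legacyFieldName = 1;  // kept camelCase for API compatibility
}
```

A `protolint:disable:this` comment on the same line works similarly, and a file-level `// protolint:disable RULE_NAME` comment suppresses a rule for the whole file.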
protolint
[Releases](https://github.com/yoheimuta/protolint/releases) · [License](https://github.com/yoheimuta/protolint/blob/master/LICENSE) · [Docker Hub](https://hub.docker.com/r/yoheimuta/protolint)
protolint is a pluggable linting/fixing utility for Protocol Buffer files (proto2 and proto3).
Demo
Once the MCP server is configured, you can ask an MCP client such as Claude Desktop to lint and fix your Protocol Buffer files.
vim-protolint provides similar integration for Vim.
MCP Server
protolint now includes support for the Model Context Protocol (MCP), which allows AI models to interact with protolint directly.

Usage
protolint --mcp
For detailed documentation on how to use and integrate protolint's MCP server functionality, see the MCP documentation.
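As a sketch, an MCP client such as Claude Desktop can be pointed at protolint's MCP server with a configuration entry along these lines. The exact file location and schema depend on the client, and the bare `protolint` command assumes the binary is on your PATH:

```json
{
  "mcpServers": {
    "protolint": {
      "command": "protolint",
      "args": ["--mcp"]
    }
  }
}
```

Check your MCP client's documentation for where this configuration file lives.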
Installation
Via Homebrew
protolint can be installed on macOS or Linux using Homebrew via the yoheimuta/protolint tap.
brew tap yoheimuta/protolint
brew install protolint
Since homebrew-core also includes protolint, a plain brew install protolint (without tapping) works as well. That route is simpler, but the homebrew-core formula is not maintained by the protolint author. To get updates promptly, I recommend running brew tap yoheimuta/protolint first.
Via GitHub Releases
You can also download a pre-built binary from the GitHub releases page. In the downloads section of each release, the pre-built binaries are provided as .tar.gz archives.
Use the maintained Docker image
protolint ships a Docker image yoheimuta/protolint that allows you to use protolint as part of your Docker workflow.
❯❯❯ docker run --volume "$(pwd):/workspace" --workdir /workspace yoheimuta/protolint lint _example/proto
[_example/proto/invalidFileName.proto:1:1] File name should be lower_snake_case.proto.
[_example/proto/issue_88/oneof_options.proto:11:5] Found an incorrect indentation style "    ". "  " is correct.
[_example/proto/issue_88/oneof_options.proto:12:5] Found an incorrect indentation style "    ". "  " is correct.
From Source
The binary can be built from source if Go is available. However, I recommend using one of the pre-built binaries instead, because a binary built with go install does not embed the release version info.
go install github.com/yoheimuta/protolint/cmd/protolint@latest
Within JavaScript / TypeScript
You can install protolint with a Node.js package manager such as npm or yarn.
$ npm install protolint --save-dev
This adds protolint as a development dependency in your local package.json.
During installation, the postinstall.js script is run. It downloads the matching protolint release from GitHub. Just like @electron/get, you can redirect the download using the following environment variables:
| Environment Variable | Default value | Description |
|-------------------------------|---------------------------------------|------------------------------------------------|
| PROTOLINT_MIRROR_HOST | https://github.com | HTTP/Web server base URL hosting the binaries |
| PROTOLINT_MIRROR_REMOTE_PATH | yoheimuta/protolint/releases/download | Path to the archives on the remote host |
| PROTOLINT_MIRROR_USERNAME | | HTTP Basic auth user name |
| PROTOLINT_MIRROR_PASSWORD | | HTTP Basic auth password |
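For example, to fetch the binary from an internal mirror instead of GitHub, you could export the variables above before installing. The host, path, and credentials below are hypothetical placeholders:

```shell
# Hypothetical internal mirror; substitute your own values.
export PROTOLINT_MIRROR_HOST="https://artifacts.example.com"
export PROTOLINT_MIRROR_REMOTE_PATH="mirrors/protolint"
export PROTOLINT_MIRROR_USERNAME="ci-bot"
export PROTOLINT_MIRROR_PASSWORD="example-password"

# The postinstall script will now download from the mirror:
# npm install protolint --save-dev
```

This is useful in CI environments that block direct access to github.com.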