## About

Ruflo (formerly Claude Flow) enables enterprise teams to deploy and coordinate large-scale swarms of AI agents for automated software engineering workflows through Claude Code. Key features of Ruflo:

- Orchestrates 60+ specialized agents that collaborate on complex development tasks with consensus-based fault tolerance.
- Self-learning capabilities that optimize agent coordination and task routing algorithms over time.
- Enterprise-grade security with Rust-based WASM kernels powering policy engines, embeddings, and cryptographic proofs.
- Production infrastructure supporting 100+ agent deployments with seamless Claude Code integration.
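The consensus-based fault tolerance mentioned above can be illustrated with a minimal majority-vote sketch. This is an assumption for illustration only, not Ruflo's actual consensus implementation (the README lists Raft/BFT/Gossip/CRDT):

```typescript
// Minimal sketch of consensus-based fault tolerance among agents:
// accept a result only when a strict majority of independent agents agree.
// Hypothetical illustration; Ruflo's real consensus protocols differ.

function majorityVote(votes: string[]): string | null {
  const counts = new Map<string, number>();
  for (const v of votes) counts.set(v, (counts.get(v) ?? 0) + 1);
  for (const [value, count] of counts) {
    if (count > votes.length / 2) return value; // strict majority wins
  }
  return null; // no quorum: escalate or retry
}
```

With three agents, a single faulty or disagreeing agent cannot force a wrong answer through, which is the basic fault-tolerance property the feature list refers to.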
# 🌊 RuFlo v3.5: Enterprise AI Orchestration Platform
[GitHub](https://github.com/ruvnet/claude-flow) · [npm](https://www.npmjs.com/package/claude-flow) · [ruv.io](https://ruv.io) · [Discord](https://discord.com/invite/dfxmpwkG2D) · [MIT License](https://opensource.org/licenses/MIT)

---

[X](https://x.com/ruv) · [LinkedIn](https://www.linkedin.com/in/reuvencohen/) · [YouTube](https://www.youtube.com/@ReuvenCohen)
Production-ready multi-agent AI orchestration for Claude Code
*Deploy 100+ specialized agents in coordinated swarms with self-learning capabilities, fault-tolerant consensus, and enterprise-grade security.*

> Why Ruflo? Claude Flow is now Ruflo — named by Ruv, who loves Rust, flow states, and building things that feel inevitable. The "Ru" is the Ruv. The "flo" is the flow. Underneath, WASM kernels written in Rust power the policy engine, embeddings, and proof system. 6,000+ commits later, this is v3.5.
## Getting into the Flow
Ruflo is a comprehensive AI agent orchestration framework that transforms Claude Code into a powerful multi-agent development platform. It enables teams to deploy, coordinate, and optimize specialized AI agents working together on complex software engineering tasks.
### Self-Learning/Self-Optimizing Agent Architecture
```
User → Ruflo (CLI/MCP) → Router → Swarm → Agents → Memory → LLM Providers
                            ↑                                     ↓
                            └─────────── Learning Loop ←──────────┘
```
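The routing-plus-feedback flow above can be sketched in TypeScript. All names here (`Agent`, `Router`, `TaskResult`) are hypothetical illustrations of the idea, not Ruflo's actual API:

```typescript
// Illustrative sketch of the request flow: a router picks an agent,
// the result is scored, and the score feeds back into routing weights.
// Hypothetical names; not Ruflo's real interfaces.

type TaskResult = { agent: string; output: string; score: number };

interface Agent {
  name: string;
  handle(task: string): TaskResult;
}

class Router {
  private weights = new Map<string, number>();

  constructor(private agents: Agent[]) {
    for (const a of agents) this.weights.set(a.name, 1); // uniform prior
  }

  route(task: string): TaskResult {
    // Pick the agent with the highest learned weight.
    const best = [...this.agents].sort(
      (a, b) => (this.weights.get(b.name) ?? 0) - (this.weights.get(a.name) ?? 0)
    )[0];
    const result = best.handle(task);
    this.learn(result); // the "Learning Loop" feedback edge
    return result;
  }

  private learn(r: TaskResult): void {
    const w = this.weights.get(r.agent) ?? 1;
    this.weights.set(r.agent, w + r.score); // reinforce successful agents
  }
}
```

Each completed task feeds its score back into the routing weights, so agents that succeed on a kind of task get chosen more often, which is the feedback edge in the diagram.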
📐 Expanded Architecture — Full system diagram with RuVector intelligence
```mermaid
flowchart TB
    subgraph USER["👤 User Layer"]
        U[User]
    end
    subgraph ENTRY["🚪 Entry Layer"]
        CLI[CLI / MCP Server]
        AID[AIDefence Security]
    end
    subgraph ROUTING["🧭 Routing Layer"]
        QL[Q-Learning Router]
        MOE["MoE - 8 Experts"]
        SK["Skills - 130+"]
        HK["Hooks - 27"]
    end
    subgraph SWARM["🐝 Swarm Coordination"]
        TOPO["Topologies<br/>mesh/hier/ring/star"]
        CONS["Consensus<br/>Raft/BFT/Gossip/CRDT"]
        CLM["Claims<br/>Human-Agent Coord"]
    end
    subgraph AGENTS["🤖 100+ Agents"]
        AG1[coder]
        AG2[tester]
        AG3[reviewer]
        AG4[architect]
        AG5[security]
        AG6[...]
    end
    subgraph RESOURCES["📦 Resources"]
        MEM[("Memory<br/>AgentDB")]
        PROV["Providers<br/>Claude/GPT/Gemini/Ollama"]
        WORK["Workers - 12<br/>ultralearn/audit/optimize"]
    end
    subgraph RUVECTOR["🧠 RuVector Intelligence Layer"]
        direction TB
        subgraph ROW1[" "]
            SONA["SONA<br/>Self-Optimize<br/>&lt;0.05ms"]
            EWC["EWC++<br/>No Forgetting"]
            FLASH["Flash Attention<br/>2.49-7.47x"]
        end
        subgraph ROW2[" "]
            HNSW["HNSW<br/>150x-12,500x faster"]
            RB["ReasoningBank<br/>Pattern Store"]
            HYP["Hyperbolic<br/>Poincaré"]
        end
        subgraph ROW3[" "]
            LORA["LoRA/Micro<br/>128x compress"]
            QUANT["Int8 Quant<br/>3.92x memory"]
            RL["9 RL Algos<br/>Q/SARSA/PPO/DQN"]
        end
    end
    subgraph LEARNING["🔄 Learning Loop"]
        L1[RETRIEVE] --> L2[JUDGE] --> L3[DISTILL] --> L4[CONSOLIDATE] --> L5[ROUTE]
    end
    U --> CLI
```
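The RETRIEVE → JUDGE → DISTILL → CONSOLIDATE → ROUTE loop in the diagram can be sketched as a five-stage pipeline over a pattern store. The `ReasoningBank` class below is a hypothetical illustration of those stages, not Ruflo's internals:

```typescript
// Hypothetical sketch of the learning loop; a real implementation would
// back this with AgentDB and the RL-based router rather than a Map.

type Pattern = { key: string; advice: string; wins: number; uses: number };

class ReasoningBank {
  private patterns = new Map<string, Pattern>();

  // RETRIEVE: find stored patterns relevant to this task.
  retrieve(taskKey: string): Pattern[] {
    return [...this.patterns.values()].filter((p) =>
      p.key.startsWith(taskKey + "|")
    );
  }

  // JUDGE: score an outcome (simplified here to a success flag).
  judge(success: boolean): number {
    return success ? 1 : 0;
  }

  // DISTILL: turn one concrete outcome into a reusable pattern.
  distill(taskKey: string, advice: string, score: number): Pattern {
    return { key: `${taskKey}|${advice}`, advice, wins: score, uses: 1 };
  }

  // CONSOLIDATE: merge the new pattern into the store.
  consolidate(p: Pattern): void {
    const existing = this.patterns.get(p.key);
    if (existing) {
      existing.wins += p.wins;
      existing.uses += p.uses;
    } else {
      this.patterns.set(p.key, p);
    }
  }

  // ROUTE: bias future routing toward the best-performing pattern.
  route(taskKey: string): string | undefined {
    const candidates = this.retrieve(taskKey);
    candidates.sort((a, b) => b.wins / b.uses - a.wins / a.uses);
    return candidates[0]?.advice;
  }
}
```

Each pass through the loop distills an outcome into a pattern, consolidates it with prior experience, and uses the accumulated win rates to route the next task, which is how the loop closes back into the routing layer.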