CleverBee Documentation
Welcome to the CleverBee documentation! CleverBee is a Python-based research agent that uses advanced AI models (Claude/Gemini), Playwright for web browsing, and a suite of tools to extract and summarize information from various sources.
This documentation will guide you through installing, configuring, and using CleverBee to conduct comprehensive research on any topic.
Important Note
CleverBee is strongly optimized for Gemini models due to their cost-effectiveness and performance. While other models are supported, we recommend Gemini 2.5 Pro for planning and the final report, Gemini 2.5 Flash for the agentic flow, and Gemini 2.0 for summarization.
CleverBee is open source but intended for personal use, research purposes, and non-commercial applications. It is designed to bring fair, balanced, and critical knowledge research to everyone.
Getting Started
Select one of the following sections to quickly get started with CleverBee:
Installation
Set up CleverBee on your system with our simple installation guide. Covers dependencies, API keys, and platform-specific instructions.
Installation Guide
Configuration
Learn how to configure CleverBee's models, browser behavior, and research tools to tailor the experience to your needs.
Configuration Guide
Usage
Discover how to use CleverBee effectively for research, including how to formulate queries and understand results.
Usage Guide
Tools & Features
Explore CleverBee's powerful research tools, including web browsing, content extraction, and AI summarization capabilities.
Tools & Features
Key Features
- Interactive Web UI via Chainlit - User-friendly interface for research interactions
- Multi-LLM Research - Distinct AI models for planning, next-step decision making, and summarization
- Automated Web Browsing - Uses Playwright for content extraction from HTML
- Content Cleaning and Summarization - Processes raw content into useful research material
- Token Tracking and Cost Estimation - Monitor resource usage in real-time
- Configurable Settings - Customize models, limits, and features via config.yaml
- MCP Tool Integration - Support for specialized tools via standardized Model Context Protocol
- Optional Local Models - Use lightweight models for intermediate tasks to reduce costs
- Hardware-Aware Setup - Optimizes configuration based on your CPU/GPU capabilities
- NormalizingCache (SQLite-based) - Improve performance and reduce costs with smart caching
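As a rough illustration of how these features come together in config.yaml, a configuration might assign separate models to each role and toggle the optional features. The key names below are hypothetical, not CleverBee's actual schema; see the Configuration Guide for the real options:

```yaml
# Illustrative sketch only -- key names are hypothetical, not the actual schema.
models:
  planning: gemini-2.5-pro         # planning and final report
  agentic: gemini-2.5-flash        # next-step decision making
  summarization: gemini-2.0-flash  # content cleaning and summarization
browser:
  headless: true                   # Playwright browsing behavior
cache:
  enabled: true                    # NormalizingCache (SQLite-based)
limits:
  max_research_steps: 10           # cap agent iterations to control cost
```

The separate model entries reflect CleverBee's multi-LLM design: a stronger model where reasoning quality matters most (planning, final report) and cheaper models for high-volume intermediate work.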
System Requirements
- Operating System: macOS, Linux, or Windows
- Python: 3.8 or higher
- RAM:
- Minimum: 4GB (using cloud models only)
- Recommended: 16GB+ (for local model support)
- Storage: 2GB+ for base installation, 10GB+ if using local models
- Hardware Acceleration: Optional but recommended
- NVIDIA GPU with CUDA support
- Internet Connection: Required for research functionality and cloud model access