v0.1.0 — Now Available

AI Assistant That Fits
in 3 Megabytes

Ultra-lightweight personal AI assistant framework. Multi-provider support, extensible tools, container runtime isolation — all in a binary smaller than a selfie.

3MB binary
🔒 Zero dependencies
🚀 7+ LLM providers
zeptoclaw — agent mode
$ zeptoclaw agent "Analyze the codebase"
🤖 ZeptoClaw — I'll analyze the codebase for you.

read_file Reading project structure...
shell Running cargo check...
Found 37 Rust source files across 12 modules
Architecture: Async trait-based with MessageBus
Providers: Anthropic, OpenAI, Gemini, Groq + 3 more
Tools: shell, read_file, write_file, list_dir, edit_file
Analysis complete in 1.2s
$

Small Binary, Big Capabilities

ZeptoClaw proves that size isn't everything: it's packed with features typically found in frameworks 10x larger.

🤖

Multi-Provider LLM

Seamlessly switch between Claude, GPT-4, Gemini, Groq, and more. Use the best model for each task without changing your workflow.

🛠️

Extensible Tools

Built-in tools for file operations, shell commands, and web search. Write custom tools in Rust with the simple Tool trait.
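A custom tool might look like the sketch below. This is illustrative only: ZeptoClaw's actual Tool trait is async and its method signatures will differ, and the `WordCount` example is hypothetical.

```rust
// Hypothetical, simplified Tool trait (the real one is async; names assumed).
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    fn execute(&self, input: &str) -> Result<String, String>;
}

// An example custom tool: count the words in the input.
struct WordCount;

impl Tool for WordCount {
    fn name(&self) -> &str { "word_count" }
    fn description(&self) -> &str { "Count words in the given text" }
    fn execute(&self, input: &str) -> Result<String, String> {
        Ok(input.split_whitespace().count().to_string())
    }
}

fn main() {
    let tool = WordCount;
    // The agent would look the tool up by name() and call execute().
    println!("{} -> {}", tool.name(), tool.execute("hello zepto world").unwrap());
}
```

The trait-object shape means new tools register alongside the built-ins without touching the agent loop.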

📦

Container Runtime Isolation

Execute shell commands in Docker or Apple Container for security. Falls back to native runtime when containers aren't available.
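The fallback logic can be sketched roughly as follows; a minimal illustration assuming runtimes are detected by probing the PATH (function names and the probe strategy are assumptions, not ZeptoClaw's actual implementation).

```rust
use std::process::Command;

// Probe whether a runtime binary is present and responds.
fn runtime_available(bin: &str) -> bool {
    Command::new(bin)
        .arg("--version")
        .output()
        .map(|o| o.status.success())
        .unwrap_or(false)
}

// Prefer container isolation; fall back to the native shell.
fn pick_runtime() -> &'static str {
    if runtime_available("docker") {
        "docker"
    } else if runtime_available("container") { // Apple Container CLI
        "container"
    } else {
        "native"
    }
}

fn main() {
    println!("shell runtime: {}", pick_runtime());
}
```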

💬

Multi-Channel Gateway

Chat with your AI via Telegram, Discord, Slack, or CLI. Same assistant, your choice of interface.

🧠

Session Persistence

Long-running conversations with context. Sessions survive restarts with filesystem or in-memory storage.

🦀

Written in Rust

Memory safety without garbage collection. Zero runtime crashes. Minimal resource usage even under heavy load.

Bring Your Own API Key

Claude
GPT-4
Gemini
Groq
OpenRouter
Zhipu
vLLM

Message-Driven Design

Built on a MessageBus architecture for loose coupling and easy extensibility.

Channels

Telegram, Discord, Slack, CLI — unified message interface

Agent Loop

MessageBus → LLM provider → Tool execution → Response

Providers

Pluggable LLM backends with unified interface
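The flow above can be sketched with standard channels standing in for the MessageBus; everything here (the `Message` enum, the echo "provider") is a simplified assumption, not ZeptoClaw's real types.

```rust
use std::sync::mpsc;
use std::thread;

// Simplified bus messages (real types assumed to differ).
enum Message {
    Inbound { channel: String, text: String },
    Outbound { channel: String, text: String },
}

// Stand-in for the LLM provider call + tool execution step.
fn handle(text: &str) -> String {
    format!("echo: {text}")
}

fn main() {
    let (bus_tx, bus_rx) = mpsc::channel::<Message>();
    let (out_tx, out_rx) = mpsc::channel::<Message>();

    // Agent loop: consume inbound messages, produce responses.
    let agent = thread::spawn(move || {
        for msg in bus_rx {
            if let Message::Inbound { channel, text } = msg {
                let reply = handle(&text);
                out_tx.send(Message::Outbound { channel, text: reply }).unwrap();
            }
        }
    });

    // A channel adapter (e.g. CLI) publishes onto the bus.
    bus_tx.send(Message::Inbound { channel: "cli".into(), text: "hi".into() }).unwrap();
    drop(bus_tx); // close the bus so the agent loop exits

    agent.join().unwrap();
    if let Ok(Message::Outbound { channel, text }) = out_rx.recv() {
        println!("[{channel}] {text}"); // prints: [cli] echo: hi
    }
}
```

Because channels and providers only touch the bus, swapping Telegram for Discord, or Claude for Groq, leaves the agent loop unchanged.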

3MB
Release Binary
12
Core Modules
7
LLM Providers
0
Runtime Crashes