Get unlimited AI working in 30 seconds. One command sets up local models, connects your Google account to the Gemini pool, and configures intelligent routing.
```bash
nself doctor --ai
```

This command checks your system, installs Ollama if needed, pulls a recommended model for your hardware, and walks you through connecting a Google account to the Gemini pool.
```bash
nself ai status
```

Expected output:

```
AI Stack Status
──────────────
Local (Ollama): ✓ Running (gemma2:2b loaded, 14.2 tok/s)
Gemini Pool: ✓ 1 account, 18/20 RPD remaining
Routing: ✓ 11 task classes configured
Total capacity: Unlimited local + 20 RPD free cloud
```

AI is now available to any ɳSelf app, and the routing engine handles everything automatically. For ɳClaw users, open the app and start chatting: AI requests route through the zero-config stack with no additional setup.
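To build intuition for what the routing engine does with those task classes, here is an illustrative sketch in Python. The task classes, model tags, and quota numbers are assumptions for illustration, not nself's actual configuration; the real router is built in.

```python
from dataclasses import dataclass

# Hypothetical routing table: which task classes the local model handles
# and which are worth spending scarce cloud quota on. Illustrative only.
LOCAL_TASKS = {"chat", "summarize", "autocomplete"}
CLOUD_TASKS = {"long-context", "vision"}

@dataclass
class Router:
    rpd_remaining: int = 20  # free-tier Gemini requests per day (RPD)

    def route(self, task: str) -> str:
        # Prefer the unlimited local model whenever it can handle the task.
        if task in LOCAL_TASKS:
            return "ollama:gemma2:2b"
        # Spend cloud quota only on tasks that genuinely need it.
        if task in CLOUD_TASKS and self.rpd_remaining > 0:
            self.rpd_remaining -= 1
            return "gemini-pool"
        # Fall back to local when quota is exhausted or the task is unknown.
        return "ollama:gemma2:2b"

r = Router()
print(r.route("chat"))          # routes to the local model
print(r.route("long-context"))  # routes to the pool, consuming one RPD
```

The key design point: local capacity is unlimited, so the router's only real decision is when a task justifies burning one of the day's cloud requests.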
```bash
# Add another Google account (adds 20 RPD)
nself ai pool add

# Pull a larger local model
nself ai local pull llama3.1:8b

# Check what the router would pick for a task
nself ai route --task chat --dry-run
```

`nself doctor --ai` detects available RAM and recommends a local model size.
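The RAM-based recommendation amounts to picking the largest model tier that fits in memory with headroom. A minimal sketch of that idea, assuming hypothetical thresholds and model tags (not nself's actual table):

```python
# Illustrative RAM-to-model mapping, largest tier first.
# Thresholds and model tags are assumptions for illustration.
RECOMMENDATIONS = [
    (32, "llama3.1:8b-q8"),  # >= 32 GB: larger quantization, better quality
    (16, "llama3.1:8b"),     # >= 16 GB: 8B model fits comfortably
    (8,  "gemma2:2b"),       # >= 8 GB: small model, leaves headroom
]

def recommend_model(ram_gb: int) -> str:
    """Return the largest model tier that fits in the given RAM."""
    for min_ram, model in RECOMMENDATIONS:
        if ram_gb >= min_ram:
            return model
    return "gemma2:2b-q4"  # heavily quantized fallback for low-memory machines

print(recommend_model(16))  # llama3.1:8b
```

The margin matters: a model whose weights barely fit will swap under load, so each tier deliberately leaves RAM free for the OS and the model's runtime context.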