
DEV Community

HIMANSHU LOHANI

Why I Built a SQLite Brain for AI Coding (and How It Saves 70-90% Tokens)

The Problem Nobody Talks About

AI coding tools are incredible for the first 30 minutes. Then quality drops.

By the time you're on your 5th file edit, Claude is:

  • Forgetting your project conventions
  • Breaking imports it created 10 minutes ago
  • Re-asking questions you already answered
  • Producing increasingly generic, copy-paste code

This is context rot — as the context window fills with file reads, error
messages, and previous task artifacts, the signal-to-noise ratio collapses.

The Fix: A SQLite Knowledge Graph

I built ShipFast, a framework that gives each task a fresh context window backed by a persistent SQLite database.

How It Works

npm i -g @shipfast-ai/shipfast
cd your-project
shipfast init # indexes codebase in <1 second
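To make the idea concrete, here is a rough Python sketch of what an index step like this might do: walk the project tree and record files and their top-level symbols in SQLite. This is an illustrative example, not ShipFast's actual implementation; the schema, the `.py`-only filter, and the regex heuristic are all assumptions.

```python
# Illustrative codebase indexer -- NOT ShipFast's real code.
# Walks a tree, stores file paths and top-level Python symbols in SQLite.
import os
import re
import sqlite3

def build_index(root: str, db_path: str = ":memory:") -> sqlite3.Connection:
    """Index every .py file under `root` into a small SQLite 'brain'."""
    db = sqlite3.connect(db_path)
    db.executescript("""
        CREATE TABLE IF NOT EXISTS files(path TEXT PRIMARY KEY);
        CREATE TABLE IF NOT EXISTS symbols(
            name TEXT, file TEXT REFERENCES files(path)
        );
    """)
    # Crude heuristic: top-level defs and classes only.
    pattern = re.compile(r"^(?:def|class)\s+(\w+)", re.M)
    for dirpath, _, names in os.walk(root):
        for name in names:
            if not name.endswith(".py"):
                continue
            path = os.path.join(dirpath, name)
            db.execute("INSERT OR REPLACE INTO files VALUES (?)", (path,))
            with open(path, encoding="utf-8", errors="ignore") as f:
                for sym in pattern.findall(f.read()):
                    db.execute("INSERT INTO symbols VALUES (?, ?)", (sym, path))
    db.commit()
    return db
```

Because the index is plain SQLite, later sessions can query it in milliseconds instead of re-reading files into the context window.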

Then in your AI tool:

/sf-do add dark mode toggle

Behind the scenes:

  1. Analyze — intent detection, complexity scoring (zero tokens)
  2. Optimize — selects which agents to skip based on brain.db learnings
  3. Plan — Scout researches, Architect creates task list (fresh context)
  4. Execute — Builder implements each task in a separate fresh context
  5. Verify — Critic reviews, consumer check, stub scan, build verify
  6. Learn — Records decisions + patterns for next time
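The six steps above can be sketched as a single loop. Everything here is an illustrative assumption: the agent names, the `learnings` table, and the toy "skip the critic once an intent has passed review" rule stand in for whatever ShipFast actually records in brain.db.

```python
# Hedged sketch of the analyze/optimize/plan/execute/verify/learn loop.
# Agent callables and the learnings schema are illustrative, not ShipFast's API.
import sqlite3

def run_task(db: sqlite3.Connection, intent: str, agents: dict) -> list:
    """Run one task through a simplified version of the six-step pipeline."""
    db.execute("CREATE TABLE IF NOT EXISTS learnings(intent TEXT, skipped_agent TEXT)")
    # 2. Optimize: consult past learnings to decide which agents to skip.
    skipped = {row[0] for row in db.execute(
        "SELECT skipped_agent FROM learnings WHERE intent = ?", (intent,))}
    plan = agents["architect"](intent)                     # 3. Plan (fresh context)
    results = [agents["builder"](step) for step in plan]   # 4. Execute per task
    if "critic" not in skipped:
        agents["critic"](results)                          # 5. Verify
        # 6. Learn: toy rule -- once an intent passes review, skip the critic.
        db.execute("INSERT INTO learnings VALUES (?, 'critic')", (intent,))
        db.commit()
    return results
```

The key design point is that the learnings persist in SQLite between runs, so the second invocation of the same intent does strictly less work than the first.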

The Numbers

Session     Without ShipFast    With ShipFast
1st time    ~100K tokens        ~30K (70% saved)
2nd time    ~100K tokens        ~15K (85% saved)
3rd time    ~100K tokens        ~5K  (95% saved)
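The savings column is straight arithmetic on the token counts (the ~100K baseline and per-session figures are the article's own estimates):

```python
# Recompute the percent-saved column from the table's token counts.
baseline = 100_000
for used in (30_000, 15_000, 5_000):
    pct_saved = 100 * (1 - used / baseline)
    print(f"~{used // 1000}K tokens -> {pct_saved:.0f}% saved")
```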

The brain gets smarter every session...
