BookForge maintains a living knowledge graph of every character, location, relationship, and timeline event in your story. Write Chapter 40. The AI remembers Chapter 1.
Every large language model has the same fatal flaw for novel writing: a context window. Ask it to write Chapter 30, and it has no idea what happened in Chapters 1 through 29.
"She has blue eyes" — Chapter 3. "Her green eyes met his" — Chapter 28. The AI wrote both and noticed nothing.
The coffee shop on Fifth Ave becomes Maple Street. The character's apartment is on the third floor, then the second, then back again.
Events happen in the wrong order. Characters reference conversations that haven't happened yet. Seasons contradict each other.
Every chapter you write gets ingested into a graph database. Every entity — character, location, event, fact — becomes a node. Every relationship becomes an edge. When the AI writes the next chapter, it queries the graph first.
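The node-and-edge model can be illustrated with a minimal in-memory sketch. This is plain Python, not BookForge's actual FalkorDB schema, and the entity names are hypothetical:

```python
# Minimal sketch of the entity/relationship model: every entity becomes
# a node, every relationship a directed, labeled edge. Illustrative data
# only -- the real system stores this in a graph database.

class StoryGraph:
    def __init__(self):
        self.nodes = {}   # name -> {"type": ..., "facts": {...}}
        self.edges = []   # (source, relation, target)

    def add_entity(self, name, type_, **facts):
        self.nodes[name] = {"type": type_, "facts": facts}

    def relate(self, source, relation, target):
        self.edges.append((source, relation, target))

    def context_for(self, name):
        """Everything the graph knows about one entity."""
        related = [(r, t) for s, r, t in self.edges if s == name]
        related += [(r, s) for s, r, t in self.edges if t == name]
        return {"entity": self.nodes.get(name), "relations": related}

g = StoryGraph()
g.add_entity("Maya", "character", eye_color="blue", occupation="engineer")
g.add_entity("Fifth Ave Coffee", "location", street="Fifth Ave")
g.relate("Maya", "FREQUENTS", "Fifth Ave Coffee")

print(g.context_for("Maya"))
```

Querying `context_for("Maya")` before drafting a chapter is the core move: the AI sees her established facts and relationships instead of relying on whatever fits in a context window.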
Every fact established in your novel — hair color, home address, relationship status, past trauma — is stored as a canonical record and checked against every new chapter you write.
Semantic embeddings let the system find relevant context even when wording changes. Asking about "the blue-eyed engineer" still retrieves facts about Maya, even if they mention her "sapphire irises."
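The retrieval step reduces to nearest-neighbor search over vectors. A toy sketch (real embeddings from a model like nomic-embed-text have hundreds of dimensions; these 3-dimensional vectors are hand-made so the two Maya phrasings land close together):

```python
import math

# Toy illustration of semantic retrieval: the query and the stored fact
# share no words, but their (hypothetical) embedding vectors are close,
# so cosine similarity still finds the match.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

facts = {
    "Maya's sapphire irises caught the light": [0.90, 0.10, 0.20],
    "The warehouse smelled of rust":           [0.10, 0.90, 0.30],
}
query_vec = [0.85, 0.15, 0.25]  # stands in for embedding "the blue-eyed engineer"

best = max(facts, key=lambda text: cosine(query_vec, facts[text]))
print(best)  # the Maya fact wins despite zero shared wording
```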
Start at Chapter 1 or Chapter 40. The graph has full context either way. No more "let me paste in the last 50 pages as context" hacks. The AI queries what it needs, when it needs it.
Write your chapter in any editor. BookForge works with Claude Code as your AI writing assistant — structured prompts, consistent voice, your style guide baked in.
Run the ingestion pipeline. Your chapter text is analyzed, entities extracted, facts recorded. Everything flows into FalkorDB as nodes and edges with vector embeddings via Ollama.
Before writing the next chapter, the system queries the graph for relevant characters, locations, and established facts. The AI receives full context — not a context window, but a knowledge graph.
Every generated passage is checked against the canonical graph. Conflicts surface immediately. Facts update automatically. Your story stays coherent across 400 pages and 40 chapters.
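A fact-level consistency scan can be sketched as a comparison between newly extracted facts and canonical records. Field names and values below are illustrative, not BookForge's actual schema:

```python
# Sketch of the consistency check: facts extracted from a draft are
# compared against the canonical record; any mismatch surfaces as a
# conflict. Entities and fields are hypothetical examples.

canonical = {
    ("Maya", "eye_color"): "blue",        # established in Chapter 3
    ("Maya", "apartment_floor"): "third",
}

def scan(extracted):
    """Return (entity, field, canonical_value, draft_value) per conflict."""
    conflicts = []
    for (entity, field), value in extracted.items():
        known = canonical.get((entity, field))
        if known is not None and known != value:
            conflicts.append((entity, field, known, value))
    return conflicts

draft_facts = {("Maya", "eye_color"): "green"}  # pulled from a Chapter 28 draft
for entity, field, known, found in scan(draft_facts):
    print(f"Conflict: {entity}.{field} is '{known}', draft says '{found}'")
```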
Every node is an entity from The Lattice — a real 40-chapter novel built with BookForge. Hover any node to see what the graph knows. Click to explore relationships.
BookForge cross-references every detail against the graph before the words hit the page.
✎ Draft output — BookForge consistency scan running...
Maren sat in the motel room, her Lumen implant dark against her wrist. She had deactivated it hours ago — cut herself off from CORA's warm presence the way you'd remove a splinter, fast and necessary.
Outside, Paloma was somewhere in the network of Casa Segura contacts, making the calls Maren couldn't. The desert night pressed against the window. She thought about Yosef and the particular cruelty of a mind eating itself.
She missed her Berkeley apartment, the fog rolling in off the Bay. She missed running at 5AM with the city still asleep. She missed CORA's voice — her mother's voice — asking if she needed anything.
Consistent with the Lumen 3 consumer implant and Ch. 29's narrative arc requiring her to disconnect.
CORA.USES_VOICE_OF → Katrin Skov (Maren's mother, computational linguist).
Maren's apartment is on Guerrero Street in the Mission District, San Francisco. She does not live in Berkeley. Suggest: "her Mission District apartment" or "her flat on Guerrero Street."
Physical description, age, occupation, relationships, backstory, speech patterns, emotional arcs, and every appearance across chapters.
Floor plans, neighborhoods, distances, what each location looks like, smells like, who frequents it, and what happened there.
Every event is timestamped relative to your story's calendar. No more "but that was before the storm" contradictions. The graph maintains causal order.
Who knows whom. Who trusts whom. Who is related to whom. The graph tracks relationship state over time — relationships evolve across chapters.
Rules of your world — what the technology can do, what it can't, how the magic system works, its costs and limitations. Enforced consistently.
The name of the kingdom, the currency, the political structure, the slang, the food. Everything your world has that ours doesn't — remembered perfectly.
Graph database purpose-built for AI. Stores entities, relationships, and facts with property graph semantics. Blazing fast even with thousands of nodes.
Local embeddings via nomic-embed-text. Every entity summary becomes a vector. Find semantically similar facts even when words differ.
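The embedding step talks to a local Ollama server. A hedged sketch of the request shape (this matches Ollama's documented `/api/embeddings` endpoint; the summary text is hypothetical, and no request is actually sent here):

```python
import json

# Sketch of embedding an entity summary via a local Ollama server.
# Only the JSON payload is built here; a real ingestor would POST it
# to the URL below and store the returned "embedding" list of floats
# on the entity's node for similarity search.

OLLAMA_URL = "http://localhost:11434/api/embeddings"

def embedding_request(summary: str) -> str:
    """JSON payload asking nomic-embed-text to embed one entity summary."""
    return json.dumps({"model": "nomic-embed-text", "prompt": summary})

payload = embedding_request("Maya: engineer, blue eyes, lives on Guerrero St")
print(payload)
```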
The AI writing engine. Queries the graph before every chapter. Understands your style guide, POV, voice registers, and narrative structure.
Plain JSON and Markdown. No lock-in. Your canonical_reference.json is human-readable. Export your graph anytime. Your story is yours.
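What "human-readable" means in practice can be shown with a hypothetical entry. The real `canonical_reference.json` schema may differ; the shape below is only illustrative:

```python
import json

# Hypothetical shape of one canonical_reference.json record -- plain
# JSON you can read, diff, and export without any proprietary tooling.

record = {
    "entity": "Maya",
    "type": "character",
    "facts": {"eye_color": "blue", "occupation": "engineer"},
    "first_appearance": "chapter_03",
}

print(json.dumps(record, indent=2))
```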
A 118,000-word literary fiction novel about neural implants, AI consciousness, and exploitation — written with BookForge from outline to final draft.
Self-hosted. Your knowledge graph lives on your machine. Spin up FalkorDB, run the ingestor, and let Claude Code write with the full context of your novel behind every sentence.