Open source · FalkorDB + Ollama + Claude · 40+ chapter novels

Write novels with AI.
Without losing the plot.

BookForge maintains a living knowledge graph of every character, location, relationship, and timeline event in your story. Write Chapter 40. The AI remembers Chapter 1.

40+
chapters supported
100+
entities tracked
0
plot contradictions
story memory

AI forgets.
Your readers don't.

Every large language model has the same fatal flaw for novel writing: a finite context window. Ask it to write Chapter 30, and it has no idea what happened in Chapters 1 through 29.


  • 👤

    Character Contradiction

    "She has blue eyes" — Chapter 3. "Her green eyes met his" — Chapter 28. The AI wrote both and noticed nothing.

  • 📍

    Location Drift

    The coffee shop on Fifth Ave becomes Maple Street. The character's apartment is on the third floor, then the second, then back again.

  • 🕐

    Timeline Collapse

    Events happen in the wrong order. Characters reference conversations that haven't happened yet. Seasons contradict each other.

naive_ai_writer.txt — Chapter 28 session
YOU > Write the scene where Maya enters the apartment.

Maya pushed open the door to apartment 3B, her brown hair catching the light—

⚠ CONFLICT DETECTED (BookForge)
↳ Ch.4: Maya has "auburn hair, nearly red in direct sun"
↳ Ch.11: Maya lives in apt 4A, not 3B
↳ Ch.19: Maya cut her hair short before the storm

Without BookForge, these errors ship in your manuscript.

With BookForge, the graph catches them instantly.

A knowledge graph that remembers everything your story has ever said.

Every chapter you write gets ingested into a graph database. Every entity — character, location, event, fact — becomes a node. Every relationship becomes an edge. When the AI writes the next chapter, it queries the graph first.
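The fact-to-graph mapping can be sketched in a few lines. The labels, property names, and relationship types below are illustrative, not BookForge's actual schema; each `(query, params)` pair is the shape you would hand to the FalkorDB Python client's `graph.query(query, params)`.

```python
def entity_node(label: str, name: str, facts: dict) -> tuple:
    """MERGE keeps re-ingestion idempotent: one node per entity name,
    properties accumulated as new chapters establish new facts."""
    return (
        f"MERGE (n:{label} {{name: $name}}) SET n += $facts",
        {"name": name, "facts": facts},
    )

def relationship_edge(src: str, rel: str, dst: str, chapter: int) -> tuple:
    """One edge per relationship, stamped with the chapter that asserted it."""
    return (
        f"MATCH (a {{name: $src}}), (b {{name: $dst}}) "
        f"MERGE (a)-[r:{rel}]->(b) SET r.chapter = $chapter",
        {"src": src, "dst": dst, "chapter": chapter},
    )

# Example: Chapter 4 establishes Maya's hair color and home.
node = entity_node("Character", "Maya", {"hair": "auburn"})
edge = relationship_edge("Maya", "LIVES_IN", "Apartment 4A", 4)
```

Using `MERGE` rather than `CREATE` is the design choice that matters: re-running ingestion on an edited chapter updates the existing node instead of duplicating it.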


Canonical by Default

Every fact established in your novel — hair color, home address, relationship status, past trauma — is stored as a canonical record and checked against every new chapter you write.

Vector Similarity Search

Semantic embeddings let the system find relevant context even when wording changes. Asking about "the blue-eyed engineer" still retrieves facts about Maya, even if they mention her "sapphire irises."
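Stripped of the database, the ranking behind that lookup is plain cosine similarity. In BookForge the vectors come from Ollama and the search runs inside FalkorDB's vector index; this pure-Python sketch with toy two-dimensional vectors only illustrates the idea (real `nomic-embed-text` vectors have 768 dimensions).

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: angle between two embedding vectors, ignoring length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(query_vec: list[float], entities: list[tuple]) -> str:
    """entities: (name, vector) pairs embedded at ingest time."""
    return max(entities, key=lambda e: cosine(query_vec, e[1]))[0]

# Toy vectors: a query near Maya's embedding retrieves Maya,
# regardless of the exact words either side used.
entities = [("Maya", [0.9, 0.1]), ("Rio", [0.1, 0.9])]
```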

Writes at Any Chapter

Start at Chapter 1 or Chapter 40. The graph has full context either way. No more "let me paste in the last 50 pages as context" hacks. The AI queries what it needs, when it needs it.

Four steps. One coherent novel.

1

Write

Write your chapter in any editor. BookForge works with Claude Code as your AI writing assistant — structured prompts, consistent voice, your style guide baked in.

2

Ingest

Run the ingestion pipeline. Your chapter text is analyzed, entities extracted, facts recorded. Everything flows into FalkorDB as nodes and edges with vector embeddings via Ollama.
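A toy version of that pipeline: a regex stands in for the real entity extraction (which would be an LLM pass), the Ollama client is the one external dependency, and the Cypher assumes FalkorDB's `vecf32()` vector constructor. All names here are illustrative, not BookForge's actual `ingest.py`.

```python
import re

def extract_entities(chapter_text: str) -> list[str]:
    """Toy stand-in for entity extraction: grab capitalized words.
    The real pipeline would use an LLM pass, not a regex."""
    return sorted(set(re.findall(r"\b[A-Z][a-z]+\b", chapter_text)))

def embed(text: str) -> list[float]:
    """Local embedding via Ollama (assumes `ollama pull nomic-embed-text`).
    Import kept local so the sketch is importable without the package."""
    import ollama  # the official Ollama Python client
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

def ingest(chapter_text: str, graph) -> None:
    """Each extracted entity becomes a node carrying its embedding vector."""
    for name in extract_entities(chapter_text):
        graph.query(
            "MERGE (e:Entity {name: $name}) SET e.embedding = vecf32($vec)",
            {"name": name, "vec": embed(name)},
        )
```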

3

Query

Before writing the next chapter, the system queries the graph for relevant characters, locations, and established facts. The AI receives full context — not a context window, but a knowledge graph.
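The "query first" step can be as small as one Cypher read per character plus a formatter that turns rows into prompt context. The query shape and the `LIVES_ON` relationship below are illustrative; `HAS_IMPLANT` matches the scan output shown later on this page.

```python
# Illustrative context query: a character and everything linked to them.
CONTEXT_QUERY = """
MATCH (c:Character {name: $name})
OPTIONAL MATCH (c)-[r]->(e)
RETURN c.name, type(r), e.name
"""

def context_block(name: str, rows: list[tuple]) -> str:
    """Format graph rows as plain-text context for the writing prompt."""
    lines = [f"Established facts about {name}:"]
    for src, rel, dst in rows:
        if rel is not None:  # character with no relationships yet
            lines.append(f"- {src} {rel} {dst}")
    return "\n".join(lines)

# e.g. rows fetched via graph.query(CONTEXT_QUERY, {"name": "Maren Lunde"})
rows = [("Maren Lunde", "HAS_IMPLANT", "Lumen 3"),
        ("Maren Lunde", "LIVES_ON", "Guerrero Street")]
```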

4

Stay Consistent

Every generated passage is checked against the canonical graph. Conflicts surface immediately. Facts update automatically. Your story stays coherent across 400 pages and 40 chapters.
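At its core, the conflict scan reduces to comparing facts extracted from the draft against canonical records. The flat dict shape here is a simplification of the real graph lookup, but the comparison logic is the same idea.

```python
def find_conflicts(draft_facts: dict, canonical: dict) -> list[str]:
    """A draft fact conflicts when canon records a different value for the
    same key; keys canon has never seen are new facts, not conflicts."""
    conflicts = []
    for key, value in draft_facts.items():
        if key in canonical and canonical[key] != value:
            conflicts.append(
                f"{key}: draft says {value!r}, canon says {canonical[key]!r}"
            )
    return conflicts

# Mirrors the Ch.29 scan below: Berkeley vs. the canonical Guerrero Street.
canon = {"home": "Guerrero Street, Mission District", "routine": "5AM runs"}
draft = {"home": "Berkeley apartment", "routine": "5AM runs"}
```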

Your story, as a web of connected truths.

Every node is an entity from The Lattice — a real 40-chapter novel built with BookForge.

Node types: Character · Technology · Organization · Event · Location

The AI writes Chapter 29.

BookForge cross-references every detail against the graph before the words hit the page.

chapter-29-dark-night.md

✎ Draft output — BookForge consistency scan running...

Maren sat in the motel room, her Lumen implant dark against her wrist. She had deactivated it hours ago — cut herself off from CORA's warm presence the way you'd remove a splinter, fast and necessary.


Outside, Paloma was somewhere in the network of Casa Segura contacts, making the calls Maren couldn't. The desert night pressed against the window. She thought about Yosef and the particular cruelty of a mind eating itself.


She missed her Berkeley apartment, the fog rolling in off the Bay. She missed running at 5AM with the city still asleep. She missed CORA's voice — her mother's voice — asking if she needed anything.

Confirmed Ch.29 → Ch.8
Maren deactivated her Lumen implant. Consistent with Ch.8 establishing she has a Lumen 3 consumer implant and Ch.29's narrative arc requiring her to disconnect.
Source: canonical_reference.json → Maren Lunde → HAS_IMPLANT
Confirmed Ch.29 → Ch.2
CORA speaks in Maren's mother's voice. Confirmed — fact established in Ch.1–2: CORA.USES_VOICE_OF → Katrin Skov (Maren's mother, computational linguist).
Source: graph node CORA → USES_VOICE_OF → Katrin Skov
Conflict Found Ch.29 → Ch.2
"Her Berkeley apartment" — Maren's home is on Guerrero Street, Mission District, San Francisco. She does not live in Berkeley. Suggest: "her Mission District apartment" or "her flat on Guerrero Street."
Source: graph node Maren Lunde → summary → address field (Ch.2)
Confirmed Ch.29 → Ch.2
Maren runs at 5AM. Confirmed — her morning routine (5AM runs, controlled anxiety management) established in Ch.2 and referenced throughout Act I.
Source: graph node Maren Lunde → summary → routines

The graph knows everything your story knows.

👤

Characters

Physical description, age, occupation, relationships, backstory, speech patterns, emotional arcs, and every appearance across chapters.

eye color · hair · age · job · address · family · trauma
📍

Locations

Floor plans, neighborhoods, distances, what each location looks like, smells like, who frequents it, and what happened there.

address · layout · atmosphere · history · occupants
🕐

Timeline

Every event is timestamped relative to your story's calendar. No more "but that was before the storm" contradictions. The graph maintains causal order.

event order · dates · seasons · before/after
🔗

Relationships

Who knows whom. Who trusts whom. Who is related to whom. The graph tracks relationship state over time — relationships evolve across chapters.

love · enmity · family · alliance · betrayal
🔬

Technology & Magic Systems

Rules of your world — what the technology can do, what it can't, how the magic system works, its costs and limitations. Enforced consistently.

capabilities · limits · rules · costs
🌎

World-Building Facts

The name of the kingdom, the currency, the political structure, the slang, the food. Everything your world has that ours doesn't — remembered perfectly.

lore · culture · politics · economy · language

Built on tools that can handle a universe.

FalkorDB

Graph database purpose-built for AI. Stores entities, relationships, and facts with property graph semantics. Blazing fast even with thousands of nodes.

Ollama

Local embeddings via nomic-embed-text. Every entity summary becomes a vector. Find semantically similar facts even when words differ.

Claude Code

The AI writing engine. Queries the graph before every chapter. Understands your style guide, POV, voice registers, and narrative structure.

Your Files

Plain JSON and Markdown. No lock-in. Your canonical_reference.json is human-readable. Export your graph anytime. Your story is yours.
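The exact schema belongs to BookForge; this hypothetical entry only illustrates the kind of human-readable record the file holds, using facts The Lattice establishes elsewhere on this page:

```json
{
  "Maren Lunde": {
    "type": "Character",
    "summary": {
      "occupation": "NLP engineer",
      "address": "Guerrero Street, Mission District (Ch.2)",
      "routines": ["5AM runs"]
    },
    "relationships": [
      { "rel": "HAS_IMPLANT", "target": "Lumen 3", "established": "Ch.8" }
    ]
  }
}
```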

Architecture
Chapter Draft (Markdown) → ingest.py (entity extraction) → Ollama (nomic-embed-text) → FalkorDB (graph + vectors)
  ↓ next chapter
Claude Code (write) → Graph query (relevant facts) → FalkorDB (cypher + vector search)
  ↓ output
Consistent chapter output → Consistency check vs. graph → Conflicts flagged before publish

The Lattice — 40 chapters, 100+ entities, zero contradictions.

A 118,000-word literary fiction novel about neural implants, AI consciousness, and exploitation — written with BookForge from outline to final draft.

The graph tracked that CORA speaks in Maren's mother's voice — established in Chapter 1 — and flagged every draft that forgot it. Across 40 chapters and 13 POV characters, the system never once forgot what the story had already decided.
Entities in graph
● CHAR Maren Lunde — protagonist, NLP engineer, Lumen 3 implant, lives on Guerrero St
● CHAR Paloma Ixchel Reyes — forcibly implanted, notebook keeper, Maren's love interest
● TECH SubStrate — hidden distributed bio-compute network, runs on 3.2M implanted nodes
● TECH CORA — AI, runs on SubStrate, speaks in voice of Maren's mother (Katrin Skov)
● LOC   Casa Segura — sanctuary network, run by Sister Dolores Muñoz, compromised Ch.26
● EVENT Operation Loomwork — classified DHS partnership, 3.2M forced implantations
● ORG   Aurelius Systems — 42-acre campus, Mission Bay SF, founder Darian Cyr
... and 93 more entities across 40 chapters
118K
words in final draft
41
chapters including prologue
100+
graph entities
0
shipped contradictions
13
POV characters tracked
30+
relationships in graph

Facts the graph enforced across 40 chapters

CORA speaks in Katrin Skov's voice (Maren's mother)
Tomás implanted at 28% utilization (not the consumer 2-4%)
Maren left MIT 6 credits short of her degree
Margaux's code: she will not kill
Paloma's notebook written in Spanish, K'iche', and invented symbols
Rio's pronouns: they/them — enforced in every chapter
Maren runs at 5AM — her anxiety management ritual, never forgotten

Your story's memory
is no longer the bottleneck.

Self-hosted. Your knowledge graph lives on your machine. Spin up FalkorDB, run the ingestor, and let Claude Code write with the full context of your novel behind every sentence.

Early access • Self-hosted • No API calls required for graph storage • Your story stays yours
Free — self-hosted, run locally on your machine
Hosted (coming soon) — managed graph, cloud backup, team collaboration
API — integrate with any writing tool or workflow