How do you transmit complex experiential state from one mind to another across time and space, using only a serial, lossy communication channel?
You can’t send the raw experience. It’s too high-dimensional, too context-dependent, too entangled with the sender’s entire cognitive state. You need to compress it into something that can survive transmission through a narrow channel (speech, writing), arrive at the destination (another mind), and reconstruct something close enough to the original state to be useful.
This is the state replication problem in distributed systems. And stories are the protocol humans evolved to solve it.
The Constraints
The protocol must work within hard limits:
Bandwidth: Working memory holds roughly 7 items. Transmission is serial — one word or image at a time. Attention is finite and contested.
Lossy channel: Oral retelling introduces errors. Memory decays. Each receiver interprets through their own cognitive lens. Details get forgotten, distorted, recombined.
Fidelity requirements: Despite the lossy channel, core meaning must survive. Emotional and moral state must transfer. Causal structure must remain intact. Cultural knowledge must propagate accurately across generations.
These constraints — limited bandwidth, lossy transmission, high fidelity requirements — are the same constraints that shape every state replication protocol in distributed computing. And they produce the same structural solutions.
Compression
You can’t transmit raw experience. You extract the salient features — the state changes that matter.
Characters are state trackers. The protagonist is a variable we follow through state space. “Hero’s journey” isn’t Jungian mysticism — it’s the canonical state transformation sequence (initial state → perturbation → transformation → new stable state). Supporting characters are context that drives the transformation.
Plot is a state transition sequence. Beginning: initial state. Middle: perturbations and responses. End: final state (stable or catastrophic). Every plot point is a state change. Remove a state change and the sequence breaks.
Setting is initial conditions. “Once upon a time in a kingdom far away” loads the minimum context needed to understand why subsequent transitions occur. It’s initializing the state machine before running it.
Archetypal characters are lossy compression with high resilience. “Wise old mentor” encodes a complex behavioral pattern in three words. It’s a compression scheme that trades detail for transmission reliability — stereotypes survive noise better than nuanced character studies, which is why folklore characters are archetypes and literary characters are detailed.
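The plot-as-transition-sequence idea above can be sketched in code. This is a minimal illustration, not anything from the original essay; the `replay` function and the state names are invented for the sketch:

```python
def replay(plot, initial):
    """Replay a plot as a chain of state transitions.

    Each step is (from_state, to_state); the chain breaks if a step
    does not start where the previous one ended -- i.e. remove a
    state change and the sequence no longer parses.
    """
    state = initial
    for src, dst in plot:
        if src != state:
            raise ValueError(f"broken causal link: at {state!r}, "
                             f"but next transition expects {src!r}")
        state = dst
    return state

# The canonical "hero's journey" shape: initial state -> perturbation
# -> transformation -> new stable state.
plot = [
    ("ordinary_world", "call_to_adventure"),
    ("call_to_adventure", "trials"),
    ("trials", "return_transformed"),
]
print(replay(plot, "ordinary_world"))  # the full chain reaches the final state

# Dropping a middle state change breaks the causal chain.
broken = [plot[0], plot[2]]
try:
    replay(broken, "ordinary_world")
except ValueError as e:
    print("corrupted:", e)
```

Running the intact plot reaches the final state; deleting the middle transition raises an error, which is exactly the "remove a state change and the sequence breaks" property.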
Error Correction
Information degrades with each retelling. Stories have built-in error correction:
The Rule of Three. Fairy tales repeat key patterns three times — three wishes, three tasks, three brothers. This isn’t aesthetic convention. It’s redundancy. If one instance gets corrupted in transmission, the other two allow reconstruction. Same principle as triple modular redundancy in safety-critical systems.
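The triple-redundancy claim can be made concrete with a majority vote over three independently corrupted retellings. A hypothetical sketch; `majority_reconstruct` and the example data are invented here:

```python
from collections import Counter

def majority_reconstruct(retellings):
    """Triple modular redundancy: recover each story element by
    majority vote across independently corrupted copies."""
    return [Counter(copies).most_common(1)[0][0]
            for copies in zip(*retellings)]

original = ["first wish", "second wish", "third wish"]

# Each retelling corrupts a different element in transmission.
retellings = [
    ["first wish", "second wish", "third fish"],
    ["first wish", "seventh wish", "third wish"],
    ["worst wish", "second wish", "third wish"],
]
print(majority_reconstruct(retellings))  # recovers the original
```

As long as no single element is corrupted in two of the three copies, the vote reconstructs the original sequence, which is the same guarantee triple modular redundancy gives in hardware.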
Rhyme and rhythm (in oral traditions). Sound patterns provide error detection — if a line doesn’t rhyme, something’s wrong. Meter provides a synchronization signal — deviation from the expected pattern flags corruption. The Iliad survived centuries of oral transmission partly because its dactylic hexameter is a built-in checksum.
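The meter-as-checksum analogy can be caricatured with an actual checksum, using zlib's CRC-32 as a stand-in for the expected sound pattern. Purely illustrative; the function name and the sample lines are invented:

```python
import zlib

def line_checksum(line):
    """Stand-in for meter: a compact signature that changes when the
    line is corrupted, flagging the error without storing the line."""
    return zlib.crc32(line.encode("utf-8"))

line = "Sing, O goddess, the anger of Achilles son of Peleus"
corrupted = "Sing, O goddess, the anger of Achilles son of Zeus"

expected = line_checksum(line)
print(line_checksum(line) == expected)       # intact line passes the check
print(line_checksum(corrupted) == expected)  # corruption is detected
```

Like meter, the checksum only detects corruption; reconstruction then falls back on the redundancy mechanisms above.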
Callbacks and foreshadowing. Critical state is encoded at multiple points in the narrative. Chekhov’s gun (if a gun appears in Act 1, it must fire in Act 3) is redundant state registration — the same information appears twice, allowing the receiver to verify that the state persisted correctly through the intervening narrative.
Engagement as Attention Management
The protocol must compete for scarce cognitive resources. Stories solve this by exploiting the brain’s prediction machinery:
Tension is prediction error. Set up expectations (this is what should happen), then violate them (but this happens instead). The brain’s prediction error signal — the gap between expected and actual — is what maintains attention. A story without tension is a story without prediction error, and the brain stops allocating processing resources to it.
Resolution is error collapse. Eventually resolve the accumulated prediction error. The satisfaction of a good ending is the transition from uncertainty to certainty — a high-information event that explains everything the brain has been tracking.
Pacing is load management. Alternate high-information scenes (action, revelation) with low-information scenes (reflection, travel). This prevents cognitive overload and allows consolidation of state changes. Same principle as backpressure in stream processing — you can’t push data faster than the consumer can process it.
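The backpressure analogy can be sketched with a bounded queue standing in for the audience's working memory. An invented illustration, assuming a fixed three-scene budget:

```python
import queue

# A bounded buffer as a stand-in for the audience's working memory:
# high-information scenes queue up, and when the buffer is full the
# teller must slow down instead of pushing more plot.
attention = queue.Queue(maxsize=3)

def push_scene(q, scene):
    """Try to deliver a scene; report backpressure if the consumer
    can't keep up."""
    try:
        q.put_nowait(scene)
        return True
    except queue.Full:
        return False

for scene in ["ambush", "revelation", "betrayal", "duel"]:
    delivered = push_scene(attention, scene)
    print(scene, "delivered" if delivered else "-> backpressure, slow down")

attention.get_nowait()               # a quiet scene lets state consolidate
print(push_scene(attention, "duel")) # now there's room again
```

The fourth high-information scene is rejected until a low-information beat drains the buffer, which is the alternation the paragraph describes.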
Emotional beats are memory anchors. Emotionally salient moments get priority in memory encoding. When plot details decay, the emotional arc persists — “I don’t remember exactly what happened, but I remember how it made me feel.” The emotional signal provides a backbone that survives even when the detailed state is lost.
Replication Fitness
Which stories survive transmission across minds and generations?
Stories that solve all constraints simultaneously:
- “Good stories” = efficient compression + robust error correction + effective attention capture
- “Boring stories” = failed compression (too much irrelevant detail) or failed engagement (no prediction error)
- “Confusing stories” = corrupted state (missing causal links, broken transition sequence)
- “Unsatisfying stories” = failed error collapse (accumulated prediction error never resolved)
Stories that balance all constraints become cultural DNA — they replicate across minds and generations because they’re optimized for the transmission channel. The Odyssey, Cinderella, the prodigal son — these aren’t just good stories. They’re high-fitness replicators for the human cognitive channel.
Genre as Protocol Specialization
If storytelling is a state replication protocol, genre is protocol specialization for different transmission goals:
Mystery: Maximize prediction error while providing all necessary clues (fair play). The receiver should be able to reconstruct the correct state from the transmitted data, but the encoding makes this difficult until the reveal. It’s a puzzle protocol — all data present, assembly required.
Tragedy: Inevitable state collapse from initial conditions. The receiver watches a deterministic state machine execute to its predetermined end. The engagement comes from knowing the outcome while watching the transitions that produce it — dramatic irony as a protocol feature.
Comedy: Unexpected but satisfying state resolution. Set up prediction errors that resolve in surprising-but-coherent ways. The “twist” in comedy is a state transition that violates expectation while maintaining causal consistency.
Horror: Prediction error without resolution. Deliberately maintains uncertainty, refuses error collapse. The protocol intentionally leaves the receiver in an unsettled state — anxiety as design goal.
Serialization and State Budget
TV series can maintain higher state complexity than films because viewers keep state between episodes — the “previously on…” recap is a state synchronization mechanism for viewers whose local state has decayed.
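The recap-as-synchronization idea might be sketched as selective replay: resend only the prior state changes the upcoming episode depends on. The `previously_on` function and the beat tagging are invented for illustration:

```python
def previously_on(beats, upcoming_episode):
    """Resend only the prior state changes the new episode depends
    on -- a delta sync for viewers whose local state has decayed."""
    return [event for episode, event, needed_by in beats
            if episode < upcoming_episode and upcoming_episode in needed_by]

# (episode aired, event, episodes that depend on it)
beats = [
    (1, "hero finds the map", {3}),
    (2, "villain escapes custody", {3, 4}),
    (2, "comic-relief subplot", set()),
]
print(previously_on(beats, 3))  # only the state episode 3 depends on
```

The subplot that nothing depends on is never recapped, which matches how real recaps drop state the new episode doesn't read.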
Movies must compress the entire state journey into roughly two hours. Tight bandwidth budget forces aggressive compression.
Short stories must use archetypal compression because there’s no room to build novel state from scratch. Every word must carry maximum information density.
Fanfiction reuses established state (known characters, settings, relationships) to skip the state-loading phase entirely. This explains why “original characters” in fanfiction are often unpopular — they require additional state initialization that the protocol was designed to skip.
The Connection to Distributed Systems
This maps precisely to state machine replication:
- State machine: The human mind
- Replication protocol: The story
- Consistency model: Eventual consistency (details vary across receivers, but core state converges)
- Fault tolerance: Redundancy through repetition, archetype, rhyme
- Consensus: Cultural stories are the ones that achieved consensus across enough minds to persist
The key difference: biological state replication uses engagement and emotion as the replication mechanism. Distributed databases use cryptographic proofs and acknowledgment packets. Both serve the same function — ensuring that state successfully transfers from source to destination and persists there.
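The mapping above can be made concrete with the textbook form of state machine replication: deterministic replicas that apply the same ordered log converge to the same state. A minimal sketch with invented names:

```python
class Replica:
    """A deterministic state machine: same ordered log -> same state."""

    def __init__(self):
        self.state = {}

    def apply(self, op):
        key, value = op
        self.state[key] = value

# The story as a replication log of state changes.
log = [("hero", "transformed"), ("kingdom", "saved")]

teller, listener = Replica(), Replica()
for op in log:
    teller.apply(op)

# The listener missed an update, so the replicas have diverged...
listener.apply(log[0])
print(teller.state == listener.state)  # False

# ...until the retelling re-sends the missing log entry (eventual
# consistency: the replicas converge once all updates arrive).
listener.apply(log[1])
print(teller.state == listener.state)  # True
```

The divergence-then-convergence at the end is the eventual-consistency model from the list above: receivers differ in transit, but core state converges once transmission completes.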
Why This Matters
Understanding stories as constraint-driven protocols explains things that pure aesthetic theory struggles with:
Why “formula” stories work. They efficiently solve the constraints. Romance novels, action movies, fairy tales — all follow proven compression, error correction, and engagement patterns. Formula is a protocol that’s been optimized through millions of transmission cycles.
Why experimental narrative often fails commercially. Violating the constraints violates the replication fitness criteria. You can make art that deliberately corrupts the protocol — and some of it is brilliant — but it won’t replicate as widely because it’s fighting the transmission channel rather than working with it.
Why spoilers reduce enjoyment. Spoilers eliminate prediction error, removing the engagement mechanism. The protocol requires uncertainty to maintain attention allocation.
Why we retell stories everyone already knows. The value isn’t in the information content (which is already replicated). It’s in the state replication process itself — the shared emotional experience of watching state transitions unfold. Watching a story you know is like running a backup verification — confirming that the replicated state is still intact.
Stories aren’t art that happens to work. They’re protocols that evolved to solve the transmission of complex state through bandwidth-limited, lossy channels between human minds. The artistic innovation happens within the constraints, not by violating them.