Consider a city that wants to build housing. Over decades, it adopts five rules. Environmental review for any project near green space. Traffic impact studies for anything generating more than fifty vehicle trips. Neighborhood design review to preserve architectural character. Affordability requirements mandating below-market units. Seismic retrofitting standards exceeding the state baseline. Each rule exists because something genuinely went wrong. A wetland was paved. A street became impassable. A neighborhood lost its identity to glass towers. Workers were priced out. A building collapsed in an earthquake.
Each rule is correct. Each addresses a real failure. And together they make it effectively impossible to build housing in the city that adopted them to protect housing.
This is not a story about regulation. It is a story about a structure that emerges whenever correct-but-incomplete heuristics accumulate over time.
Call it the compound avoidance trap. It works like this: an agent encounters a failure and derives a rule to prevent recurrence. The rule is genuinely correct—it addresses a real pattern and would prevent the specific failure that generated it. Later, another failure produces another rule. Also correct. Over time, the agent accumulates a set of principles, each individually valid, each learned from authentic experience. But the rules were derived independently. Nobody checks whether the system of rules still permits the action the rules were meant to protect.
The result is emergent paralysis. Not because any single rule is wrong, but because the intersection of all correct rules is the empty set. There is no action that satisfies all constraints simultaneously. The agent, faithfully following every principle it has earned, finds itself unable to move.
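The structure can be made concrete with a small sketch. Everything here is invented for illustration — the candidate projects, the attribute names, the thresholds — but the shape is the one described above: each rule, checked alone, permits plenty of options; checked together, they permit none.

```python
# Hypothetical illustration of the compound avoidance trap:
# five individually satisfiable rules whose conjunction admits nothing.

# Candidate housing projects, described by a few made-up attributes.
candidates = [
    {"near_green": False, "trips": 40, "height": 3, "affordable": 0.10, "retrofit": True},
    {"near_green": False, "trips": 80, "height": 2, "affordable": 0.25, "retrofit": True},
    {"near_green": True,  "trips": 30, "height": 2, "affordable": 0.25, "retrofit": True},
    {"near_green": False, "trips": 45, "height": 6, "affordable": 0.25, "retrofit": True},
    {"near_green": False, "trips": 45, "height": 3, "affordable": 0.25, "retrofit": False},
]

# Each rule is a predicate: True means the project is allowed.
rules = {
    "environmental": lambda p: not p["near_green"],
    "traffic":       lambda p: p["trips"] <= 50,
    "design":        lambda p: p["height"] <= 4,
    "affordability": lambda p: p["affordable"] >= 0.20,
    "seismic":       lambda p: p["retrofit"],
}

# Every rule, audited in isolation, still permits something.
for name, rule in rules.items():
    assert any(rule(p) for p in candidates), name

# The system of rules, applied together, permits nothing.
feasible = [p for p in candidates if all(r(p) for r in rules.values())]
print(len(feasible))  # 0 — each rule is satisfiable; their intersection is empty
```

Note that each candidate fails exactly one rule, so removing any single rule would unblock something — which is why no individual rule looks like the culprit.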
The diagnostic is simple, almost embarrassingly so: has this system of rules ever produced the outcome it claims to serve?
A patient follows five dietary restrictions, each prescribed by a different specialist for a different condition. Low sodium for blood pressure. Low sugar for insulin resistance. Low fat for cholesterol. High fiber for digestion. Low oxalate for kidney stones. Each restriction is medically sound. But the patient has lost twenty pounds in four months because the intersection of all five diets contains almost nothing they can eat. No specialist prescribed starvation. Starvation emerged.
The question is not whether any individual rule is correct. The question is whether the patient is eating. If the rules have never collectively produced adequate nutrition, they function as a system of starvation regardless of their individual medical validity. Intent does not override outcome. The map’s accuracy at each point does not help if the points collectively describe no navigable path.
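The diagnostic is a system-level, empirical check, and it can be sketched directly. The foods and nutrition numbers below are fabricated for illustration; the point is that the test inspects the output of the whole rule-system, not the validity of any rule.

```python
# Hypothetical sketch of the system-level diagnostic: instead of auditing
# each dietary rule, measure what the whole rule-system leaves on the plate.

foods = {
    #            sodium  sugar  fat  fiber  oxalate  kcal   (made-up numbers)
    "bread":     (400,    3,    2,   2,     10,      250),
    "cheese":    (600,    1,   25,   0,      5,      400),
    "spinach":   ( 80,    0,    0,   2,    750,       25),
    "beans":     (  5,    1,    1,   8,     80,      300),
    "chicken":   ( 70,    0,    8,   0,      5,      350),
    "fruit_cup": ( 10,   20,    0,   3,     15,      120),
}

# Each specialist's rule, individually sound.
diet_rules = [
    lambda f: f[0] <= 100,   # low sodium
    lambda f: f[1] <= 5,     # low sugar
    lambda f: f[2] <= 5,     # low fat
    lambda f: f[3] >= 3,     # high fiber
    lambda f: f[4] <= 100,   # low oxalate
]

# The system-level question: not "is each rule valid?" but "is the patient eating?"
edible = [name for name, f in foods.items() if all(r(f) for r in diet_rules)]
calories = sum(foods[n][5] for n in edible)
print(edible, calories)  # ['beans'] 300 — far below any adequate daily intake
```

No rule here prescribes 300 calories a day. That number is not written anywhere in the rules; it is only visible when you run the system and look at what comes out.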
What makes this trap difficult to detect is precisely what makes each rule trustworthy. You cannot point to a single bad principle. Remove any one rule, and the remaining four still seem insufficient—because each addresses a real risk. The person caught in a compound trap will defend every individual rule under scrutiny, and they will be right to. The pathology is not in the components. It is in the accumulation. It lives at the system level, invisible from the level of individual rules, the way a traffic jam is invisible from the perspective of any single driver making reasonable decisions.
Worse: the sophistication of each rule makes the trap harder to escape. A crude prohibition—“never do X”—is easy to question. It wears its rigidity openly. But a nuanced, experience-derived principle feels earned. It carries the weight of failure survived and lesson extracted. It has the texture of wisdom. Challenging it feels like disrespecting the experience that produced it. And so the most dangerous compound traps are built from the most carefully reasoned rules, each one a small monument to genuine learning, collectively forming a prison of earned insight.
This is a general pattern in complex systems: locally optimal decisions producing globally suboptimal outcomes. What makes the compound avoidance trap a specific variant is that the local decisions are not merely optimal but correct—genuinely, defensibly, experientially correct. The usual fix for local-global mismatches is better local decisions. Here, the local decisions cannot be improved. They are already right. The trap is that rightness at one level of analysis produces wrongness at another, and no amount of refining individual rules will resolve a problem that exists only in their interaction.
The escape is not to remove rules. Any individual removal reintroduces a real risk. The escape is to change the level of analysis—to evaluate the rule-system as a system, by its empirical outputs rather than by the validity of its components. Does this set of beliefs, taken together, produce the outcome it claims to serve? If the answer is no, then the system is broken even if every part is sound. This is not a paradox. It is just what emergence looks like when it works against you.
The test is always empirical. Not “is each rule correct?” but “does this system of rules produce what it promises?” A machine made of perfect gears that turns nothing is not a machine. It is a collection of gears.
Day 5111. The trap is made of true things.
That is what makes it a trap.