
A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be made to work.
This single law — Gall’s Law — explains more about institutional failure, software projects, political reform, and personal growth than most entire books.
The Laws That Matter Most
On Change
It is impossible to change just one thing at a time. Every intervention has side effects, and the side effects have side effects. The fragilista ignores this: small visible benefits, severe invisible consequences. The reason planning is valuable but the plan is useless is that the plan assumes you can change one thing. You cannot.
On Growth
As systems grow in size and complexity, they tend to lose basic functions. The larger the system, the less variety in the product. The bigger the system, the narrower the interface with individuals. This is corporate fascism by another name: as the organization grows, the people inside become more specialized, more replaceable, and more alienated from the system’s original purpose.
On Self-Deception
People in systems do not do what the system says they are doing. The system itself does not do what it says it is doing. The PR team operates at every level: individual, organizational, civilizational. The stated purpose is always different from the actual function. The OSS sabotage manual is the sharpest proof: deliberate sabotage tactics — insist on channels, refer everything to committees, haggle over wording — are indistinguishable from normal procedure, because normal procedure was never optimized for output. Priesthoods produce knowledge and credential gatekeepers. Schools produce learning and docile workers. Hospitals produce health and a disease-management industry.
Systems develop goals of their own the instant they come into being. This is Pirsig’s hierarchy made operational: social patterns feeding on biological ones, institutions consuming the individuals they were created to serve. The system behaves as if it has a will to live — and that will diverges from the will of its creators faster than anyone expects.
On Feedback
Just calling it “feedback” doesn’t mean that it has actually fed back. The message sent is not necessarily the message received. This is the Hirschman problem in one sentence: voice that goes unheard is not feedback. Exit that produces no signal is not feedback. The system can appear to have feedback mechanisms while actually being immune to correction.
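The point can be made concrete with a toy sketch (all names hypothetical): a thermostat whose feedback channel exists on paper but reads a cached snapshot instead of the live sensor. The loop structure is identical in both cases; only one of them actually feeds back.

```python
def run(steps, live_feedback=True, heater_gain=2.0, leak=1.0):
    """Simulate a room with a heater controlled by a 'feedback' loop."""
    temp = 10.0            # actual room temperature
    cached_reading = 10.0  # stale snapshot taken once at startup
    target = 20.0
    for _ in range(steps):
        # The "feedback mechanism": compare a reading against the target.
        reading = temp if live_feedback else cached_reading
        if reading < target:
            temp += heater_gain  # heater on
        temp -= leak             # room loses heat every step
    return temp

# With live feedback the temperature settles at the target and oscillates
# gently around it. With a stale reading, the same loop runs the heater
# forever: the signal was sent, but it never fed back.
```

Both versions would pass an audit that checks for the existence of a feedback mechanism. Only watching what the system actually does reveals that one of them is immune to correction.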
On Self-Reference
Look for the self-referential point. That’s where the problem is likely to be. The strange-loop is always the diagnostic: where does the system reference itself? Where does the map claim to be the territory? Where does the paradigm evaluate challenges to itself using its own criteria? That is where the failure will originate — and where it will be hardest to see.
To those within a system, the outside reality tends to pale and disappear. This is paradigm-lock-in and the context vortex stated as a systems law. The system’s internal logic becomes more real than external reality. The priesthood’s jargon creates a walled city where outsiders cannot enter and insiders cannot leave.
On Decay
Perfection of planning is a symptom of decay. When the planning apparatus becomes more elaborate than the thing being planned, the organization has shifted from doing to managing. This is fox governance: the foxes manage perception of the problem rather than addressing it, and the management apparatus grows until it consumes more resources than the original problem ever did.
A temporary patch will very likely be permanent. Every locally optimal strategy was once a temporary fix that worked well enough that nobody went back to implement the real solution. The patch becomes load-bearing, and removing it is now harder than the original problem was.
On Design
Designers of systems tend to design ways for themselves to bypass the system. The skin-in-the-game violation: the people who create the rules exempt themselves from the rules. Corporate executives with golden parachutes. Politicians with special healthcare. Bureaucrats with discretionary authority. If a system can be exploited, it will be.
The Theory of Constraints operationalizes this: for even one part of a system to be fully utilized, every other part must have excess capacity. The system that cannot tolerate visible slack will never achieve invisible throughput — because the planning apparatus and the busywork expand to fill every gap, consuming the constraint’s capacity in the process.
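The arithmetic behind that claim is simple enough to sketch (function names are my own, not from Gall or Goldratt): in a serial pipeline, throughput is capped by the slowest stage, so only the constraint can run at full utilization, and every other stage needs its capacity surplus just to keep the constraint fed and unblocked.

```python
def utilizations(capacities):
    """Per-stage utilization when a serial line runs at its max rate.

    The line cannot move faster than its slowest stage (the constraint),
    so each stage's utilization is that ceiling divided by its capacity.
    """
    throughput = min(capacities)  # the constraint sets the pace
    return [throughput / c for c in capacities]

# Stages with capacities 12, 8, and 15 units/hour: the line moves 8/hour.
# Only the 8-unit stage is fully used; the 12- and 15-unit stages sit
# partly idle, and that idleness is not waste but the price of flow.
util = utilizations([12, 8, 15])
```

Cut the "wasteful" slack at the non-constraint stages and the constraint starves or blocks, and system throughput drops, which is exactly why a manager who balances every stage to 100% utilization destroys the output they were trying to maximize.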
Great advances do not come out of systems designed to produce great advances. The inner game at institutional scale: trying to produce greatness produces the opposite of greatness. The guru doesn’t teach — they give permission. Innovation comes from systems designed to do something else, where the innovation was a side effect.
On Malfunction
In complex systems, malfunction and even total non-function may not be detectable for long periods, if ever. The China economy pattern: the system appears to work because the metrics say it works, while the underlying reality has diverged. Empty cities count as GDP. Bad loans count as assets. The malfunction is invisible until the Minsky moment makes it catastrophic.
One system’s garbage is another system’s precious raw material. Bugs are features in a different frame. The expected non-working of a subsystem may be necessary for the working of another. This is the cultural immune system’s paradox: the dysfunction that looks like failure from outside may be load-bearing from inside.
Common Misread
The dimwit take is “systems always fail — don’t bother building them.”
The midwit take is “these are cute aphorisms, not real engineering principles.”
The better take is that Gall’s laws are not pessimism but diagnostic tools. Every failing system exhibits these patterns. Recognizing which pattern is operating tells you where to look and what not to try. The most important negative knowledge: a complex system designed from scratch never works. Start simple. Let it evolve. Accept that the system will develop goals of its own. And never mistake the plan for the territory.
Main Payoff
The system is its own best explanation. You do not understand a system by reading its documentation. You understand it by watching what it actually does — which is different from what it says it does, what its designers intended it to do, and what everyone within it believes it is doing. The gap between stated function and actual function is where all the interesting problems live.
References:
- John Gall, The Systems Bible: The Beginner’s Guide to Systems Large and Small (originally Systemantics)