
Musings Report 2025-16  4-19-25  How Things Break: Hyper-Optimization

You are receiving this email/post because you are a subscriber/patron of Of Two Minds / Charles Hugh Smith.

How Things Break: Hyper-Optimization

Tangible things break for a variety of reasons, but in most cases the causes are easy to understand: 1) a part wore out; 2) a part was defective; or 3) the device was designed to break and be unrepairable to force us to buy a new one, i.e. planned obsolescence.

Systems break for other less identifiable reasons, reasons that are often hidden beneath a superficial veneer of normalcy and stability. When systems break, those on the outside are surprised. Those on the inside are not surprised it broke; they're surprised it lasted this long because they witnessed the gradual decay, debasement and hollowing out of the system--all the consequences of hyper-optimization.

Optimization is a core mechanism of increasing productivity, efficiency and profits. When we optimize processes, we seek ways to do more with less, reduce waste and quality-control failures, cut costs, and increase market share and profits.

All optimization serves one goal: increase profits, as profits are the Measure of All Things, the one and only measure of success in the global economy.

Optimization reduces complex processes and systems to data that then guides the optimization.

If a steel barrel currently requires seven spot-welds and reducing this to six doesn't cause the barrel to leak, then the process is optimized by reducing the inputs (materials, resources, energy, labor, capital, etc.) while maintaining the output (barrels that don't leak).
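
A minimal sketch of this logic in Python (the weld counts and the pass/fail test are hypothetical): the loop keeps cutting an input so long as the one output it measures still passes, blind to anything it doesn't measure, such as fatigue life after years of rough handling.

```python
# Hypothetical weld-count optimization. The loop "sees" only the output
# it measures (does the barrel leak now?), not unmeasured effects.

def passes_leak_test(welds: int) -> bool:
    """Hypothetical acceptance test: six or more welds seal the barrel."""
    return welds >= 6

welds = 7
while welds > 1 and passes_leak_test(welds - 1):
    welds -= 1  # cut an input while the measured output still passes

print(f"Optimized weld count: {welds}")  # -> Optimized weld count: 6
```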

Optimization is the process of identifying what works to lower costs, eliminate competition, gain market share, etc., and doing more of what has worked well.

Consolidation is a key factor in optimization.  If scattered production facilities are relocated to one transportation hub, costs can be reduced.

Optimization focuses on the present, as profits are measured in the present. Externalities such as the future waste stream are not included in the optimization data because the enterprise is not responsible for those costs.

What happens to a community when the production facility that provided half its jobs is relocated to consolidate production is also not included, as the community's fate does not affect profits.

Optimization considers risks and returns. If the probabilities of disruption are low, then the process optimizes normalcy: since conditions are stable the vast majority of the time, the systemic "insurance" against disruption (redundant production facilities, warehousing spare parts, etc.) can be cut as an unnecessary expense. This optimization boosts profits.
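
A back-of-the-envelope illustration of that "insurance" calculation, with all figures invented: dropping the redundancy looks cheaper on average, but the average hides the ruinous tail.

```python
# Hypothetical numbers only. p_disruption is the assumed annual chance of
# a major disruption; insurance_cost is the yearly cost of redundant
# facilities and warehoused spares; disruption_loss is the hit taken if
# disruption strikes with no backup in place.

p_disruption = 0.02
insurance_cost = 1_000_000
disruption_loss = 40_000_000

expected_cost_insured = insurance_cost                    # paid every year
expected_cost_uninsured = p_disruption * disruption_loss  # 800,000 per year

print(f"insured:   {expected_cost_insured:,.0f} per year")
print(f"uninsured: {expected_cost_uninsured:,.0f} per year, on average")
# Cutting the insurance "saves" 200,000 a year -- until the one year in
# fifty arrives and the loss is 40,000,000 all at once.
```

Optimizing on the expected value alone is exactly how the systemic "insurance" gets booked as an unnecessary expense.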

Optimization cuts corners in ways that are not readily visible to those who weren't engaged in the decision-making process of choosing which data would be collected as the key metrics used to further optimize yields, gains or results--all versions of the same thing. 

The choices made about what would be measured and collected for analysis may have seemed obvious, but what those choices left out of the optimization process is not just less obvious; it may be invisible until the system breaks down.  By then it's too late.

This reliance on--indeed, worship of--data as the essential foundation of optimization leads to the promotion of a "bean counter" methodology and value system, in which those who massage the data become the leaders not just of the optimization process but of the system. 

Tangible objects are relatively straightforward to repair or replace, or if no repair or replacement is possible, bypass or patch with a kludgy fix--the equivalent of duct-taping it together until a more permanent fix becomes available.

Systems are not quite so forgiving, as they are complex and emergent: the entire assembly of parts and subsystems generates new effects that can't be predicted from the attributes of each part / subsystem.

Systems are also prone to phase shifts from linear states (predictable chains of causality) to non-linear states in which stability abruptly shifts into instability and chaotic behaviors that don't respond to the usual set of controls.
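
A standard toy model of such a shift (not from this essay) is the logistic map: the same simple rule settles to a stable value at one parameter setting and becomes chaotic at another.

```python
# The logistic map x -> r*x*(1-x). At r=2.8 the iterates converge to a
# fixed point (a predictable regime near equilibrium); at r=3.9 the
# identical rule produces chaotic behavior that never settles.

def logistic_trajectory(r: float, x: float = 0.5, steps: int = 54):
    out = []
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

for r in (2.8, 3.9):
    tail = [round(v, 4) for v in logistic_trajectory(r)[-4:]]
    print(f"r={r}: last iterates {tail}")
# r=2.8 -> all ~0.6429 (stable); r=3.9 -> values that never repeat or
# settle, and that respond unpredictably to small adjustments.
```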

Optimization is thus prone to the illusions of precision and predictability, which lend themselves to dismissing the probabilities of disruptive externalities (the famous Black Swans) or chaotic breakdowns arising from apparently low-scale failures.

Humans are part of systems, and the desire to eliminate them as non-optimal bits that can be replaced by low-cost, optimized algorithms is natural--but also deeply flawed. 

Humans are also complex systems, and so reducing their performance, motivations and incentives to data points fails to capture the essence of their roles in the system.

To understand all these inherent limits, vulnerabilities and points of failure in the processes of optimization, let's consider some examples.

An infamous example of reducing a very human activity--warfare--to data fields that invite optimization of results--in this case, winning the war in Vietnam--was the "Whiz Kids" taking key roles in the Pentagon in the mid-1960s as the war heated up to a full boil.

The data fields chosen to measure and tabulate included those killed on both sides of the conflict. In the Bean-Counter understanding of the problem--i.e. the context of the situation and the moving parts of the system that responded to Pentagon controls--Kill Ratios seemed like an easily optimizable data point: if U.S. forces managed to kill three enemy combatants for every American life lost, then the war could be wrapped up in X number of months.

If the Kill Ratio could be increased, so much the better.

Left out of these bloodless calculations were all the important factors of war from the perspective of those risking their lives to fight it.  This "body count" encouraged those counting the dead to include non-combatants (civilians) to plump up the numbers reflecting success (i.e. the equivalent of profits in commerce).

This incentivized what we now call "narrative control" via press conferences that gained the nickname the Five O'clock Follies.

Bean-counting also missed the bedrock historical and cultural motivations of the ordinary Vietnamese risking their lives to push the Americans out of Vietnam. 

In another example of "optimizing warfare," sending pilots up in inferior aircraft with minimal training and poor unit cohesion might make sense in terms of data points, but whatever data is collected doesn't include how these conditions affect the confidence of the pilots in actual combat.

Since the probability of external disruptions or internal failures is statistically low, it makes sense to fire senior workers with decades of experience because these workers draw higher pay due to their seniority. If the goal is to reduce costs--and that is always the goal in optimization--then offering a buyout to push highly paid senior workers out and replace them with lower-paid, less experienced workers makes undeniable financial sense.

But then something breaks in the refinery and the problem quickly escapes the narrow expertise of the inexperienced workers who were only trained to manage ordinary procedures. This narrowness of experience and training is called under-competence, and it's a topic I've addressed here.

The inexperienced workers are not incompetent; the problem is their competence has been optimized into a narrow band of possibilities based on flawed calculations of risk.

So the refinery burns down and managers declare this was "impossible to predict," even as redundancies and training were reduced or eliminated to trim costs and boost profits.

As for replacing humans with algorithms, consider the weather forecasts that continue to predict a wet rainy season while the drought continues. If the algorithm were a humanoid robot, it might take the human weather reporter by the shoulders and scream, "But the models are predicting rain and they must be correct!"  Except they're wrong.

I recently mentioned a vehicle diagnostic computer that failed to locate the source of the problem--a relay switch--because that part wasn't in the diagnostic codes.  That millions of lines of code--much of it cobbled together--might fail is considered low probability until failures occur.

Systems break in normal conditions--this is the nature of complex systems. The sociologist Charles Perrow explained why in his book Normal Accidents (1984): "Perrow argues that multiple and unexpected failures are built into society's complex and tightly coupled systems, and that accidents are unavoidable and cannot be designed around."

The problem with optimization--a process that has been pushed to hyper-optimization as the incentives to increase profits by any means available have become paramount in the global economy--is that data collection and analysis, calculating the probabilities of abnormal breakdowns, and focusing on cutting costs / boosting profits at the expense of system stability all have only one possible result: the superficial appearance of stability that masks the vulnerabilities and fragilities institutionalized by hyper-optimization.

With all the stabilizers stripped out or hollowed out, the system breaks down once any point of failure occurs. This non-linear cascade is self-reinforcing, and so once the system unravels it cannot be restored.
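
A minimal sketch of such a cascade, with hypothetical loads and capacities: because every unit runs near its limit (the spare capacity was optimized away), one failure sheds load onto the survivors and tips them over in turn.

```python
# Hypothetical cascade: six units near capacity. When units fail, their
# load is shared equally among survivors, which can overload survivors too.

capacity = 10.0
loads = [6.0, 9.0, 7.0, 9.5, 8.0, 9.0]
failed = {0}  # a single initial point of failure

while True:
    alive = [i for i in range(len(loads)) if i not in failed]
    if not alive:
        break
    shed = sum(loads[i] for i in failed) / len(alive)  # redistributed load
    newly_failed = {i for i in alive if loads[i] + shed > capacity}
    if not newly_failed:
        break
    failed |= newly_failed

print(f"{len(failed)} of {len(loads)} units failed")  # -> 6 of 6 units failed
```

With slack in the system (say, a capacity of 15), the same initial failure is absorbed and the cascade never starts; that slack is precisely what hyper-optimization strips out.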

That these kinds of failure are still limited in scope lends a false sense of stability to inherently unstable systems that we all depend on. Plan accordingly.


Highlights of the Blog 


The Family Home: From Shelter to Asset to Liability  4/18/25

This Nails It: The Doom Loop of Housing Construction Quality  4/16/25

Trade, Tariffs, Currencies, Colonialism, the Gold Watch and Everything  4/14/25


Best Thing That Happened To Me This Week 

Our little mango tree, barely a year old, produced fruit, most of which split or fell off, not unexpected given the tree's youth and small size. But one mango ripened enough to eat, which was quite a thrill for the gardener.



What's on the Book Shelf


Autumn in the Heavenly Kingdom: China, the West, and the Epic Story of the Taiping Civil War  Stephen R. Platt 


From Left Field

NOTE TO NEW READERS: This list is not composed of articles I agree with or that I judge to be correct or of the highest quality. It is representative of the content I find interesting as reflections of the current zeitgeist. The list is intended to be perused with an open, critical, occasionally amused mind.

Many links are behind paywalls. Most paywalled sites allow a few free articles per month if you register. It's the New Normal.


Trade Off: Financial system supply-chain cross contagion – a study in global systemic collapse.

Scientists uncover massive collateral damage tied to a routine practice: 'A necessary evil': Pesticides.

In a Tiny Gulf Town, Big Cheers for Trump’s Tariffs.

I got an early look at Epic Universe, Universal's new multibillion-dollar Florida theme park. Disney is about to get a run for its money.

10 Small Things Neurologists Wish You’d Do for Your Brain.

The Doom Loop of Generational Greed (via Chad D.)

Soundtrack to a Coup d’Etat review – superb study of how jazz got caught between the cold war and the CIA.

My husband works at Trader Joe's, and we both swear by these 12 purchases--all ultra-processed...

New books chart Biden’s downfall – and the picture is damning for Democrats.

Why Are These Clubs Closing? The Rent Is High, and the Alcohol Isn’t Flowing. The financial decline of some of the city’s most popular clubs has put a spotlight on the realities of nightlife.  -- $60,000 a month rent... hmm could that be a factor??

The 'Panic Industry' Boom--build your bunker on an artificial island with a moat that catches fire....

Hollywood Is Cranking Out Original Movies. Audiences Aren’t Showing Up.

"There are two ways to be fooled. One is to believe what isn't true; the other is to refuse to believe what is true." Soren Kierkegaard


Thanks for reading--
 
charles