Fragility & Efficiency

I am increasingly worried about the tradeoff between the robustness and the efficiency of systems, and what its effects will be on our society. This is a continuation of my thoughts on Fragility and Boundary Conditions.

Efficiency tends to increase fragility

As you optimize a system, you tend to make it more fragile in the process. There are lots of reasons for this, and plenty of examples, so I’ll pick a few illustrations. Many people have noticed this pattern before, of course; Nassim Taleb is perhaps the most prominent example.

Supply chains
The simplest supply chain would probably be a farmer making everything for their own consumption. This is very inefficient: the farmer cannot specialize and optimize, and needs to know how to make everything they want to have. However, it is quite robust to events elsewhere in the world, vulnerable only to local disturbances such as floods or droughts.

The most complex supply chain we have ever seen is the one we have today. Everything you buy is made up of hundreds of ingredients and parts that come from all over the globe, and everything is connected to everything else. It’s extremely efficient: you can buy a chocolate bar for €1 despite it containing ingredients from dozens of countries and involving hundreds of companies in its end-to-end production. Shops and factories use “just in time” shipping, effectively using moving container ships as their warehouses to avoid storing anything for long (and thereby avoid paying storage costs).

However, our current supply chains are in some sense also incredibly fragile. A single container ship stuck in the Suez Canal can make your milk more expensive. Covid-19 caused car prices to go up: remote working increased demand for consumer electronics, which created a shortage of the chips needed for automotive manufacturing. Your everyday life and the products you consume are affected by the chaotic system that is the entire global economy.

Algorithms
Anyone who has spent a lot of time optimizing code knows how messy and fragile it often ends up if you push it to the max. Extremely high-performing pieces of code tend to be hyper-optimized for their specific use cases and use clever but extremely opaque tricks to speed things up (the fast inverse square root function is a great example). They are often fragile to small changes in the code or to their inputs.
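To make that example concrete, here is a lightly adapted sketch of the trick (renamed from the original Q_rsqrt, with the pointer-cast bit hack replaced by memcpy, since the original cast is undefined behavior in modern C). Notice how many assumptions it bakes in: 32-bit IEEE 754 floats, a positive normal input, and tolerance for roughly 1% error. Change any of those and it silently breaks:

```c
#include <stdint.h>
#include <string.h>

/* Approximates 1.0f / sqrtf(x). Adapted from Quake III's Q_rsqrt. */
float fast_inv_sqrt(float x) {
    float half = 0.5f * x;
    uint32_t bits;
    memcpy(&bits, &x, sizeof bits);   /* reinterpret the float's raw bits as an integer */
    bits = 0x5f3759df - (bits >> 1);  /* "magic" constant yields a rough first guess */
    memcpy(&x, &bits, sizeof x);
    return x * (1.5f - half * x * x); /* one Newton-Raphson step sharpens the guess */
}
```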

Another good example is parallel versus single-threaded algorithms. Single-threaded programs are limited to the speed of the core they run on, but tend to be fairly robust and deterministic, since the code executes in the same order each time. Parallel, multithreaded programs are a different beast. They are often extremely hard to reason about and can produce extremely strange bugs and race conditions depending on how the threads interact with each other. But for a wide range of problems, parallel processing is of course much, much faster.
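As a minimal sketch of what can go wrong (the shared counter and loop count here are just for illustration), consider two threads incrementing an unsynchronized variable:

```c
#include <pthread.h>
#include <stdio.h>

long counter = 0;  /* shared and unsynchronized: this is the bug */

void *increment(void *arg) {
    (void)arg;
    for (int i = 0; i < 1000000; i++)
        counter++;  /* read-modify-write; two threads can interleave mid-update */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, increment, NULL);
    pthread_create(&b, NULL, increment, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld\n", counter);  /* expected 2000000; varies run to run */
    return 0;
}
```

The fix (a mutex or an atomic) costs performance, which is the efficiency-robustness tradeoff in miniature.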

Engineered components
A physical component that has been optimized for maximum efficiency in one particular circumstance will almost always be very fragile in all others. If you engineer something to resist the maximum possible tension while minimizing weight (steel wires are great for this), it will be practically useless when exposed to a compressive force (try pushing on a steel wire). There are more examples in a great thread started by the question “ELI5: How are spacecraft parts both extremely fragile and able to stand up to tremendous stress?”.

The exception is when systems are constantly destabilized

Antifragility is a concept coined by Nassim Nicholas Taleb to describe systems that, when shocked, increase their capability to survive future shocks. This property is interesting precisely because it is rare. Antifragile systems are the only systems I can think of that can increase efficiency without also increasing fragility.

Living things are antifragile. Immune systems respond with antibodies, building immunity against future diseases of the same kind. Muscles become stronger when you stress them. Populations recover quickly after a shock, because suddenly more abundant resources trigger increased procreation.

The forcing function that makes life antifragile, yet efficient, is evolution: the constant competition to propagate genes to the next generation. Evolutionary pressure constantly destabilizes life, forcing efficiency and robustness to evolve in parallel. Efficiency allows for effective use of resources, meaning more offspring, faster. Robustness defends against the constant attacks from the rest of nature: predators, viruses and resource scarcity. Both are crucial for any evolving species, and hence both keep evolving together.

The most fragile ecosystems in the world are the ones that have been the most protected. Quokkas on Rottnest Island have no natural predators; you can push one over and it will react as if a harmless tree branch had fallen on it. New Zealand’s ecosystem evolved in isolation from most of the world, so individual introduced species have wreaked havoc on the entire country: possums, koi, rabbits, skinks and wasps have each managed to thrive in an environment unprepared for them and defenseless against them.

On the other hand, the rats and cockroaches of major cities are basically invulnerable as populations because they have been evolving while exposed to total chaos. New York City is now at the stage of trying to use birth control to reduce the rat population.

Optimization creates efficiency, chaos creates robustness. Evolution drives both.

Our society is increasingly stable and efficient, and therefore also increasingly fragile

Our society has an analogue of evolution: free market capitalism. In the free market (in theory) companies compete against each other with goods and services. Consumer taste rewards or punishes companies, making the good ones thrive and grow and the bad ones go bankrupt and vanish.

This works well, and it creates massive efficiency: zoom out a few centuries on a chart of GDP per capita and it looks like we hit a vertical wall.

Free market capitalism also creates some level of robustness. The free market is a chaotic and complex system with many moving parts: interest rates, exchange rates and rapid innovation create constant small shocks to the system. Companies that are too sensitive to supply chain issues or interest rate changes will go bankrupt within a few years and exit the system. However, we only tend to experience certain kinds of shocks, and the severity of those shocks is quite bounded.

Wars have become less frequent, particularly wars that tangibly affect the largest free market economies, so our systems have never been selected for surviving wartime conditions. We should therefore expect most Western countries to be extremely vulnerable to the types of shocks that wartime would bring, such as:

  • shortages of critical components for infrastructure
  • supply chain issues for food, water, energy and medicine
  • labour market shocks from mandatory conscription

The free market also optimizes on very short timescales compared to planetary systems such as the climate. This means that the system we currently have has never been exposed to rare shocks such as massive solar flares and coronal mass ejections, cosmic ray bursts, supervolcano eruptions, asteroid impacts, 500-year earthquakes, tsunamis, droughts and floods.

We recently experienced a once-in-100-years event that had an incredibly disruptive effect on the entire world: a global pandemic. One huge reason our system survived it is that we were “lucky”: Covid-19 “only” had a ~0.1% death rate, instead of 5% or even 95%.

I am worried that as we continue to optimize society, we will end up so fragile that even small, local shocks could collapse large parts of our global systems, or all of them. Recent examples that have come across my feed:

If you are working on or thinking about this, feel free to reach out to me here.