#166: Booth's Rule #2, Black Elephant & the Swiss Cheese Model
3 Ideas in 2 Minutes on Avoiding Disaster
I. Booth's Rule #2
Booth's Rule #2 is named after legendary skydiving equipment pioneer Bill Booth. He's credited with making skydiving a whole lot safer…but doesn't seem too optimistic about the positive effect this had:
The safer skydiving gear becomes, the more chances skydivers will take, in order to keep the fatality rate constant.
—Bill Booth (allegedly)
It’s counterintuitive. But safety measures can indeed have unintended consequences, also known as the Peltzman Effect. If a sport becomes safer, let’s say by mandating helmets, people will be more comfortable taking risks, which leads to more injuries. At best.
II. Black Elephant
You’ve read about Black Swans. And you’re familiar with the proverbial Elephant in the Room. But what’s a Black Elephant? A cross between the two of course. Black Elephant events happen when “devastating disasters […] are predicted but ignored”:
Black Swans and Black Elephants both describe hazard events with high impact, high exposure, and high vulnerability but are viewed from two different vantage points. With knowledge but without acknowledgment, a Black Swan becomes a Black Elephant. Both have the potential for catastrophic impacts.
—Yolanda Lin et al., Asia's looming Black Elephant events
The term was originally coined by environmentalist and risk analyst Adam Sweidan. One of his examples was water supply pollution. Everyone knows the risks. But because it's too complex and disruptive to fix, we largely ignore it.
III. Swiss Cheese Model
Once the disaster has struck, it’s handy to find out what happened. The Swiss Cheese Model is a mental model used to understand how mistakes occur in complex systems. It visualises defences against failure as layers of holey Swiss cheese.
Each slice represents a barrier or safeguard. The holes in the cheese represent weaknesses or failures in those defences. An accident happens when the holes in all layers align, allowing a trajectory of accident opportunity to pass through. The model emphasises that accidents are typically the result of multiple, smaller failures rather than a single error.
Imagine an impending plane crash. The pilot incorrectly sets the autopilot to an unsafe altitude. The co-pilot should double-check, but he's distracted by the cheese platter the flight attendant just served him. The aircraft's automated safety system should detect the discrepancy in altitude and issue a warning. But it malfunctions.
Roughly speaking, the more such safety layers organisations establish, the greater the likelihood of failures being caught and corrected before leading to an accident.
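If you'd like to see that intuition in numbers, here's a minimal Python sketch. It's purely illustrative: the "hole" probabilities are made up, and it assumes the layers fail independently of each other, which real organisations can't count on.

```python
import random

def accident_occurs(layer_hole_probs, rng):
    """An accident slips through only if every defensive layer fails.

    Each probability is the chance that a layer's 'hole' happens to line
    up with the hazard's trajectory on this particular occasion.
    """
    return all(rng.random() < p for p in layer_hole_probs)

def estimate_accident_rate(layer_hole_probs, trials=100_000, seed=42):
    """Monte Carlo estimate of how often all the holes align."""
    rng = random.Random(seed)
    accidents = sum(accident_occurs(layer_hole_probs, rng) for _ in range(trials))
    return accidents / trials

# Hypothetical layers: pilot setting, co-pilot cross-check, automated warning.
# Each fails about 10% of the time in this toy example.
print(estimate_accident_rate([0.1]))            # one safeguard:   ~0.1
print(estimate_accident_rate([0.1, 0.1]))       # two safeguards:  ~0.01
print(estimate_accident_rate([0.1, 0.1, 0.1]))  # three safeguards: ~0.001
```

Under these toy assumptions, every extra slice of cheese multiplies the odds of a straight-through trajectory by that layer's failure rate. The caveat, of course, is that real layers often share weaknesses, so the holes aren't truly independent.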
👉 I’ve made a nice little graphic to illustrate the accident trajectories in the Swiss Cheese Model. Now available for Patreon members. 🐘
Have a great week,
Chris
themindcollection.com