While the term originated in financial risk analysis, it has been widely adopted in safety‑critical industries to describe disasters such as Piper Alpha, Deepwater Horizon and Fukushima Daiichi. In each case, the eventual chain of causation has been reconstructed in forensic detail, creating the impression that the accident was obvious and therefore preventable. Yet this impression is itself a cognitive illusion: as Taleb notes, our minds are highly effective ‘explanation machines’, adept at weaving coherent narratives after events have occurred.
The Piper Alpha disaster of 1988, which claimed 167 lives, is often cited as a case in point. Lord Cullen’s inquiry identified multiple systemic failures, including inadequate permit‑to‑work controls, flawed design assumptions and ineffective emergency response arrangements. With the benefit of hindsight, the vulnerabilities appear stark. Yet prior to the accident, the platform was regarded as a mature and well‑managed asset, and the specific combination of maintenance error, gas release, ignition and escalation had not been fully envisaged within prevailing risk models. Similarly, the Deepwater Horizon blowout in 2010 involved a cascade of technical and organisational failures across multiple contractors, culminating in an uncontrolled release from the Macondo well. The US National Commission concluded that the disaster was not the result of a single decision, action or failure, but of a series of failures that combined to overwhelm the safeguards. Such interactions are precisely the kinds of emergent factors that evade conventional risk assessment methods.
A key reason these events are subsequently framed as Black Swans lies in hindsight bias: the tendency to believe, after an event has occurred, that one would have predicted or expected it beforehand, the familiar ‘I‑knew‑it‑all‑along’ effect. In the context of accident investigation, hindsight bias can distort learning by exaggerating the foreseeability of the outcome and underestimating the uncertainty faced by decision‑makers at the time. Dekker has argued that this creates a moral asymmetry, in which past actors are judged against knowledge that was unavailable to them, encouraging simplistic stories about ‘missed warning signs’ and ‘obvious’ errors. This not only misrepresents reality but also risks reinforcing the belief that future accidents can be avoided simply by trying harder to spot what, in truth, only appears obvious in retrospect.