Coal is back in Utah’s policy conversation. Grid reliability is back in the news. Aircraft safety never really left. One of the clearest lessons about how disasters actually form was written years ago, inside the record of Crandall Canyon.
Disasters rarely begin with explosions.
They begin with a sentence that sounds reasonable.
Months before the Crandall Canyon mine collapsed in central Utah in 2007, a seismic “bounce” was recorded underground. It was not ignored. It was noted, discussed and carried forward. Monitoring would continue. Conditions would be watched. No immediate operational change was required.
That is how many disasters begin — not with a cartoon villain, not with one obviously reckless act, and not even with a lie. They begin when a warning enters a system and is translated into language the system can live with.
I know because I watched one happen.
In the days after Crandall Canyon collapsed, trapping six miners underground, I served as a spokesperson for several of the miners’ families. For nearly two weeks, the story moved from Huntington Canyon to national television, the White House and the United States Senate. Ten days after the first collapse, three rescuers trying to reach the trapped men were killed in a second collapse. The six miners were never recovered. They were sealed inside the mountain.
I spent those days in rooms where grief, engineering, media pressure and institutional language kept colliding in public.
Years later, working back through the public record — press briefings, transcripts, engineering statements, testimony, oversight hearings — I came to a conclusion that was harder to see in real time.
The collapse did not appear suddenly inside the record. It formed as a sequence.
It is easy, years later and with the pages laid out in order, to mistake sequence for foresight. The pages are calmer than the days were.
By the time a disaster is visible to everyone, it has already traveled through a chain of rooms: engineering reviews, management discussions, safety meetings, regulatory language and press briefings. At each step, uncertainty has to be translated into something the next room can accept.
And the sentence that survives is usually not the most alarming one. It is the one that sounds measured, procedural, responsible. It reduces friction and keeps the meeting moving.
We’ll monitor it.
Within limits.
No immediate change is required.
Those sentences were not reckless; they were institutional.
After the Boeing 737 MAX crashes in 2018 and 2019, which killed 346 people when a new flight-control system relied on flawed assumptions about sensor data and pilot response, investigators traced a sequence of approvals and judgments that made sense inside each room until the whole chain was visible at once.
In the Iberian Peninsula blackout, when cascading grid disturbances left large parts of Spain and Portugal without power, instability was translated into manageable language until the system ran out of places to hide the truth from itself. For ordinary people, that meant stalled elevators, stopped trains, rushed water purchases and daily life suddenly dependent on decisions made far away.
By the time a system fails publicly, the underlying decisions have usually spent weeks or months being managed in private.
Most people will never stand inside a mine collapse, a cockpit or a control room. They will still live with the consequences of what those rooms decide. You do not have to work inside a failing system to be carried by one.
Different industries, same structure.
A signal appears. It is acknowledged. Then it is converted into language that allows for continuity.
That is why catastrophic failures often look sudden from the outside and strangely orderly from the inside.
Utah knows Crandall Canyon as a mining disaster. But it was also a record of how risk moves through systems while the people inside them are still trying to do their jobs.
As Utah again debates coal, grid stability, energy security and the strain growth places on infrastructure, the Crandall Canyon record reads less like history than it should.
We keep acting surprised when large systems fail, then treating each failure as if it were unique.
They are not unique.
The language changes. The furniture changes. The physics do not.
Disasters rarely begin with explosions. They begin with a sentence that sounds reasonable enough to repeat.