The Risks of Hubris: Inside Risks, CACM 41(12), December 1998

Peter B. Ladkin

Article RVS-J-98-04

In Memoriam Ted Hughes, 28 October 1998

Hubris is risky: a tautologous claim. But how to recognise it? Phaethon, the human child of Phoebus the sun god, fed up with being ridiculed, visited his father to prove his progeniture. Happy to see his son, Phoebus granted him one request. Phaethon chose to drive the sun-chariot. In Ted Hughes' vivid rendering of Ovid, Phoebus, aghast, warns him

`[...] Be persuaded
The danger of what you ask is infinite -
To yourself, to the whole creation.
You are my son, but mortal. No mortal
Could hope to manage those reins.
Not even the gods are allowed to touch them.'
He admonished Phaethon to
` [...] avoid careening
Over the whole five zones of heaven.'
The chariot set off, Phaethon lost control, scorched the earth and ruined the whole day.

Was that vehicle safe? The sun continues to rise each day, so I guess in Phoebus's hands it is, and in Phaethon's it isn't. The two contrary answers give us a clue that the question was misplaced: safety cannot be a property of the vehicle alone. To proclaim the system `safe', we must include the driver, and the pathway travelled. Consider the Space Shuttle. Diane Vaughan pointed out in The Challenger Launch Decision that it flies with the aid of an extraordinary organisation devised to reiterate the safety and readiness case in detail for each mission. Without this organisation, few doubt there would be more failures. Safety involves human affairs as well as hardware and software. If NASA would be Phoebus, who would be Phaethon? Consider some opportunities.

A car company boasts that their new product has more computational power than was needed to take Apollo to the moon. (Programmers of a different generation would be embarrassed by that admission.) We may infer that high performance, physical or digital, sells cars. Should crashworthiness, physical and digital?

Safe flight in limited visibility requires reliable flight data - airspeed, altitude and so forth - now electronically displayed. A new failure mode, screen malfunction, has occurred in two accidents and other incidents. But display systems are not considered safety-critical.

In 1993, Airbus noted that the amount of airborne software in new aircraft doubled every two years (2MLOC for the A310, 1983; 4M for the A320, 1988; 10M for the A330/340, 1992). Has the ability to construct adequate software safety cases increased by similar exponential leaps? One method, extrapolation from the reliability of previous versions, does not apply: calculations show that testing or experience cannot increase one's confidence to the high level required. If not by this method, then how?
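The arithmetic behind that claim can be sketched. Under a simplifying assumption of statistically independent operating hours (an assumption for illustration, not a calculation from the article), the number of failure-free test hours needed to support a failure-rate bound at a given confidence is roughly -ln(1 - confidence)/bound. For the 10^-9 per flight hour commonly cited for catastrophic failure conditions, that comes to several billion hours:

```python
import math

def hours_required(rate_bound: float, confidence: float) -> float:
    """Failure-free operating hours needed to claim the hourly failure
    rate is below rate_bound with the given confidence, assuming each
    hour is an independent trial (a simplifying assumption)."""
    # P(no failure in n hours | rate = rate_bound) = (1 - rate_bound)^n.
    # We want that probability to fall below (1 - confidence).
    return math.log(1.0 - confidence) / math.log(1.0 - rate_bound)

# The 1e-9/hour figure commonly cited for catastrophic failure conditions:
print(f"{hours_required(1e-9, 0.99):.3g} hours")  # on the order of 4.6e9 hours
```

Several billion failure-free hours is far beyond any feasible test campaign, which is the point: statistical testing alone cannot carry a safety case at that level.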

If there's a deadly sin of safety-critical computing, Hubris must be one. But suppose we get away with it. What then? In Design Paradigms, Henry Petroski reports a study suggesting that the first failure of a new bridge type seems to occur some 30 years after its successful introduction. He offers thereby the second sin, Complacency. It is hard to resist suggesting a first axiom of safety-critical sinning:

(Hubris /\ ~Failure) ~> Complacency

where ~> is the temporal LEADSTO operator. (Compare Vaughan's starker concept, `normalisation of deviance'.)
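For readers who prefer to see LEADSTO executed, its standard semantics over a finite trace - every state satisfying p is eventually followed by a state satisfying q - can be checked mechanically. This is a toy sketch of that semantics, not anything from the article:

```python
def leads_to(trace, p, q):
    """True iff every state satisfying p is followed, at that point
    or later in the trace, by a state satisfying q."""
    return all(
        any(q(later) for later in trace[i:])
        for i, state in enumerate(trace)
        if p(state)
    )

# A toy project history: each state records which conditions hold.
trace = [
    {"hubris": True, "failure": False, "complacency": False},
    {"hubris": True, "failure": False, "complacency": True},
]
p = lambda s: s["hubris"] and not s["failure"]   # Hubris /\ ~Failure
q = lambda s: s["complacency"]                    # Complacency
print(leads_to(trace, p, q))  # True
```

On this toy trace the axiom holds: every hubris-without-failure state is eventually followed by complacency.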

Why might engineers used to modern logic look at such classical themes? Consider what happened to Phaethon. Jove, the lawmaker, acted:

With a splitting crack of thunder he lifted a bolt,
Poised it by his ear,
Then drove the barbed flash point-blank into Phaethon.
The explosion
Snuffed the ball of flame
As it blew the chariot to fragments. Phaethon
Went spinning out of his life.'
Then as now, although more the thousand cuts than the thunderbolt. Pursuant to an accident, Boeing is involved in legal proceedings concerning, amongst other things, error messages displayed by its on-board monitoring systems; Airbus is similarly involved in Japan concerning a specific design feature of its A300 autopilot. Whatever the merits of so proceeding, detailed technical design is coming under increasing legal scrutiny.

But what should we have expected? Recall: safety involves human affairs, of which the law is an instrument. This much hasn't changed since Ovid. To imagine otherwise was, perhaps, pure Hubris.