Does complexity guarantee “system failure”?
Thursday, 16 September, 2010
According to one journalist, whose speciality is deconstructing accidents, it does (see below). Naturally we at Ontonix would like to respond to this statement:
When complexity reaches the point of “critical complexity” system functionality is lost and failure can ensue.
System complexity can be managed… that is what we do!
More Complexity Facts from Ontonix
Nevertheless this is an interesting and worrying observation, one that, when taken in the context of Global Financial Services, raises an obvious question:
Will Basel III (or II for that matter) make things better or worse?
While you ponder this… as if it really needs too much thought… you may wish to read an extract from an interesting article, “Oil, complexity and the inevitability of ‘system accidents’”, dealing with THE spill in the Gulf of Mexico. It cites three fundamental truths:
First, the American economy runs on oil and will continue to do so for the foreseeable future, as much as we might wish otherwise.
Second, most oil in more accessible locations has been or is being pumped. There isn’t enough to satisfy our future needs.
Third, large new oil fields have been discovered in deep water in the Gulf and the Atlantic Ocean. Technology has advanced to the point that drilling those deep wells is both technically possible and economically feasible.
But doing so is complex, so complex that mistakes are inevitable. Unfortunately, there is precious little margin for error when they occur. The article goes on to distinguish three types of accidents:
• “Procedural” accidents, in which someone makes a mistake, as when pilot error causes a plane crash.
• “Engineered” accidents, in which materials or structures fail in ways that should have been foreseen by designers and engineers.
• “Systems accidents,” such as the Gulf oil spill, which occur because “the control and operation of some of the riskiest technologies require organizations so complex that serious failures are virtually guaranteed to occur.”
Among those “riskiest technologies” are the air transportation system, nuclear power plants, aircraft carriers and, as we now know, deep-water oil drilling. We accept the risks they entail because we like the rewards they provide.
Systems accidents don’t occur because the system failed; they occur because the system exists, and because it is so complicated that inevitably something will go wrong.
One of the implications of “systems accidents” is that when we try to address what went wrong we add even more complexity to an overburdened system. And that increases the risk of accidents.
This is not to say that we should abandon regulatory oversight, mandatory safety reviews or environmental assessments, as some people have claimed. Those are all important safety checks that can help prevent disaster.
It is to say that as long as we continue drilling for oil in deep water, another accident is inevitable. The cast of characters probably will be different, as will be the proximate causes. But — initially, at least — the result will be the same: oil in the water.
It’s a sobering and incredibly inconvenient truth.
Interestingly, Kenneth Rogoff (Professor of Economics and Public Policy at Harvard University, and formerly chief economist at the IMF) drew a similar conclusion, which was the subject of a previous blog item: Kenneth Rogoff: The BP Oil Spill’s Lessons for Regulation