Duncan Watts [presentation]:: The Myth of Common Sense


Duncan Watts is a clever guy! Not just because he is well educated, which he undoubtedly is, but because he has the ability to explain why “common sense” works in the appropriate domain(s) – simple, maybe even complicated – but is particularly dangerous in complex or chaotic domains. Then again, that is what this definition tells us: “sound and prudent judgment based on a simple perception of the situation or facts”. But when I talk about these different domains you should not visualise them as “islands” or separate entities. Think of them, rather, as various “conditions” or “states” that can be found, at any given time, within a single complex [adaptive] system, its sub-systems and networks, as it performs the many inter-connected processes that underpin functionality.

Why is this relevant? Because “common sense” isn’t much use if you are dealing with a system so complex that you CANNOT understand its complexity, track causality or anticipate the unintended outcomes (or consequences)! Where the smallest decisions can have enormous consequences and the smartest decisions can be counter-intuitive, how can they be validated when the crowd advocates “common sense”???

I urge you to watch the presentation (even read the book!) and, if this has whetted your appetite, you may also be interested in what Atul Gawande has to say about surgeons dealing with complexity, Tim Harford talking about oil rigs or Dave Snowden about a kids’ party!

Social problems…must be viewed not as the subject of rhetorical debates, but as scientific problems, in the sense that some combination of theory, data, and experiment can provide useful insights beyond that which can be derived through intuition and experience alone.

Freakonomics » The Myth of Common Sense: Why The Social World Is Less Obvious Than It Seems.

Too often we are guilty of over-estimating our own knowledge and under-estimating what appears familiar, even though we know that appearances can be deceptive – some “creatures” are particularly adept at exploiting this – and despite how much we have learnt by looking deeper (into space) or more closely (DNA, bacteria). Living systems come in all shapes and sizes, but their true nature, and an understanding of their “structure”, cannot be ascertained without observation at a variety of scales.

Insurance & Reinsurance…as simple as “A, B, C” but much more dangerous!


Apparently my (lone) voice isn’t sufficient to alert the UK financial & insurance industry to the folly of their perspective on “risk”! So, I am eternally grateful to Tim Harford for this presentation!

PLEASE watch this and don’t make the mistake of thinking that the “problem” relates only to oil disasters, financial or nuclear meltdowns. The lesson is that, if the means of communicating INFORMATION quickly and effectively between business units is impaired in complex systems (and that includes relatively small businesses), events WILL happen faster than you or “the system” can react, and can have HUGE, unforeseen [not unforeseeable] consequences. This is the nature of the world as we now know it.

Complexity & Close-coupling cause losses!!!

This should be required viewing for every underwriter, risk manager, insurance company executive, banker and regulator…except that many of them already KNOW precisely how risk cascades and spreads. I am constantly amazed at how many learned people in finance and insurance talk about “contagion” and “systemic risk” as if they were something they don’t have to worry about! THE PROBLEM IS, THEY DON’T KNOW HOW TO ADDRESS THE PROBLEM, SO HAVE FILED IT UNDER “INCONVENIENT TRUTH”, waiting for the time when the shit hits the fan (again) so they can try to convince us that failure was unforeseeable – a Black Swan event. They don’t want to have to admit how little they know about causality, preferring instead to rely upon historic risk data…as if our industrial past holds all the answers we need in our extremely complex, inter-connected, Digital present and future.
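The point about close-coupling can be made concrete with a toy simulation. This is purely my own illustrative sketch (nothing to do with Ontonix’s methodology): units sit on a ring, each failed unit topples each neighbour with probability equal to the “coupling” strength, and the names, topology and numbers are all assumptions chosen for clarity.

```python
import random

def cascade_size(n, coupling, seed=0):
    """Simulate one failure cascade on a ring of n units.

    Each failed unit topples each of its two neighbours with
    probability `coupling`. Returns how many units end up failed
    after a single initial failure at unit 0.
    """
    rng = random.Random(seed)
    failed = {0}
    frontier = [0]
    while frontier:
        unit = frontier.pop()
        for neighbour in ((unit - 1) % n, (unit + 1) % n):
            if neighbour not in failed and rng.random() < coupling:
                failed.add(neighbour)
                frontier.append(neighbour)
    return len(failed)

def average_cascade(n, coupling, runs=500):
    """Average cascade size over many independent runs."""
    return sum(cascade_size(n, coupling, seed=s) for s in range(runs)) / runs

loose = average_cascade(50, 0.3)   # loosely coupled system
tight = average_cascade(50, 0.9)   # tightly coupled system
print(f"loose coupling: ~{loose:.1f} units fail on average")
print(f"tight coupling: ~{tight:.1f} units fail on average")
```

Even in this crude model, turning up the coupling makes the average cascade dramatically larger: the same single trigger that loses one or two units in a loosely coupled system takes down a large fraction of a tightly coupled one. That, in miniature, is why “contagion” is not someone else’s problem.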

Tim talks in great detail about the failures that led to the loss of 167 lives on the Piper Alpha Oil rig and how the sheer volume of data means we can miss vital INFORMATION that could serve as a means of crisis anticipation.

Now, if you have read any of my previous blogs about complexity and risk, you will know that just because “they” say they don’t know the answer to the problem, doesn’t mean that there is no answer.

BECAUSE THERE IS! This is why I was so excited by what Ontonix, under the inspirational leadership of Dr Jacek Marczyk, had developed and why I keep going on about it DESPITE the enormous challenge of cracking “institutional inertia”.

I would like to highlight a previous article from 2010: Does complexity guarantee “system failure”?

NOT because I am trying to claim to be so far ahead of the curve here, BUT to try to illustrate that the knowledge is out there, yet too many people who have the power to do something about it AREN’T…go figure!!!

Tim Harford:: Trial, error and the God complex I TED Talks


We don’t know what we don’t know and, even though we know that we don’t know, we STILL don’t want to know!!!

A standout lecture from Tim Harford that would benefit little from some dull introduction by me! I couldn’t help but note some words and phrases that sprang to mind as I enjoyed this video…it isn’t a complete list, nor is it in a precise order – just what I could remember! So feel free to contribute.

THE BEGINNING: information – simplicity – system – interdependence – complexity – biology – survival – adaptability – empathy – iteration – fractal – scale – physics – risk – evolution – fallibility – innovation – politics – learning – statistics – correlation – transparency – assumption – uncertainty – psychological denial – mathematics – critical complexity – release* – simplification: THE END or another BEGINNING…