Both Aleatory and Epistemic Uncertainty Create Risk

Nice work, Glen! I have asked the question before but “at what point does the decision NOT to obtain accessible knowledge about ‘reducible exposures’ [epistemic uncertainty] – such as excessive complexity – become a Corporate Governance issue?”

Epistemic risk is modeled by defining the probability that the risk will occur, the time frame in which that probability is active, and the probability of an impact or consequence from the risk when it does occur…

…For these types of risks we can have an explicit or an implicit risk handling plan. I use the word handling with special purpose. We handle risks in a variety of ways. Mitigation is one of those ways. But the risk handling work is actual work. It is in the schedule. We are doing work to mitigate the risk. We are buying down the risk, or we are retiring the risk. In all cases, we are spending money and consuming time to reduce the probability that the risk will occur. Or we could be spending money and consuming time to reduce the impact of the risk when it does occur. In both cases we are taking action to address the risk.
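The quoted model – probability of occurrence, probability of impact, and handling work that "buys down" the risk – can be sketched as a small Monte Carlo simulation. The probabilities, the $100k cost, and the assumption that mitigation halves the occurrence probability are all illustrative numbers of mine, not figures from the source:

```python
import random

def expected_loss(p_occur, p_impact, cost, trials=100_000, seed=42):
    """Monte Carlo estimate of expected loss for an epistemic risk:
    the risk occurs with probability p_occur and, when it does,
    produces a consequence costing `cost` with probability p_impact."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        if rng.random() < p_occur and rng.random() < p_impact:
            total += cost
    return total / trials

# Baseline: 30% chance of occurring, 50% chance of a $100k impact.
baseline = expected_loss(0.30, 0.50, 100_000)

# After mitigation work (assumed here to halve the occurrence probability):
mitigated = expected_loss(0.15, 0.50, 100_000)
```

The point of the sketch is that handling work is a trade: the money and time spent on mitigation shows up as a measurable reduction in expected loss (here from roughly $15,000 to roughly $7,500).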

via Herding Cats: Both Aleatory and Epistemic Uncertainty Create Risk.

Synchronization versus collaboration:: uncertainty vs risk

The Lorenz attractor displays chaotic behaviour. These two plots demonstrate sensitive dependence on initial conditions within the region of phase space occupied by the attractor. (Photo credit: Wikipedia)

We don’t need an understanding of ‘Chaos Theory’ to know about the “Butterfly Effect”, nor do we need a medical qualification to grasp that (hitherto) unseen flaws in human DNA can have life-changing consequences for individuals.

In business terms, those of us who are concerned enough with ‘risk’ to look beyond what conventional “wisdom” tells us, KNOW that in the Digital Age of networks of inter-connected systems and sub-systems, apparently minor errors can have a MAJOR effect:

HILP – high impact, low probability events

Power Laws [fat tail] NOT Gaussian [thin tail]

Beyond probability…to the possible and plausible.
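The fat-tail versus thin-tail distinction above can be made concrete with a back-of-envelope comparison. The sketch below contrasts the tail probability of a standard Gaussian with a Pareto-type power law; the exponent of 2 and the threshold of 5 are illustrative choices, not parameters from the source:

```python
import math

def gaussian_tail(k):
    """P(X > k) for a standard normal distribution ("thin tail")."""
    return 0.5 * math.erfc(k / math.sqrt(2))

def power_law_tail(x, alpha=2.0):
    """P(X > x) for a Pareto-type power law with tail exponent alpha
    ("fat tail"), normalized so that P(X > 1) = 1."""
    return x ** -alpha

k = 5
thin = gaussian_tail(k)   # a "5-sigma" event: effectively impossible (~3 in 10 million)
fat = power_law_tail(k)   # the same magnitude under a power law: a 1-in-25 event
```

Under the Gaussian assumption a 5-unit deviation is a once-in-millions rarity; under the power law it is routine. That gap, of several orders of magnitude, is exactly why rating HILP events with "old world" thin-tail models understates the risk.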

Yet risk carriers, such as banks and insurers, still think and rate risks in “old world” terms.

That world is gone. Past. An era that will not return. And the problems being stored up, because risk carriers fail to embrace the facts, CANNOT be funded by informed customers!

Production did not drive our lives in the old world, but the complexity of the new societies is turning our lives into gears of a big machine. Thinking that human life has, as its only aim, being productive for society is the first error of any organizational system. Engineers design the pieces of a machine to be manufactured with a level of precision that provides good performance; however, human organizations cannot design people’s behaviour with the required precision…

via Synchronization versus collaboration.

We are ALL funding an industry’s “prediction addiction”

We “know” (well, understand) that we cannot predict the future – which should be pretty worrying for the financial sector, whose success or failure relies upon the frequency and cost of a variety of events that haven’t yet happened and may never happen. Except that the expiry date is approaching for relying upon a steady stream of (mis)information and discredited economic theories, and for spending vast amounts on personnel and technology that merely convey the impression of knowledge.

In the absence of “special powers”, insurers have to rely upon what they ‘know’, i.e. what they have learnt from the impact and frequency of past events that happened to OTHER, similar, risks!

We now have vast quantities of data, accumulated over many years, from a wide variety of sources. The type of information with which Statisticians, Actuaries, Economists (and Carol Vorderman) can have hours and hours of fun, aided by tried and tested techniques, using the most sophisticated technology in our history. But it doesn’t change the basic fact that we cannot predict the future, although we must learn from past events.

The invaluable lessons for our man-made world are that:

  • non-linear [real world] interactions CANNOT be modelled – concatenated probabilities are still linear
  • we should question what we think we know – you know what they say about assumptions!
  • we cannot manage risk – we CAN influence what is within the scope of our control
  • conventional tools CANNOT identify, map or measure complexity
  • resilience is a function of complexity
  • resilience (or, per Nassim Taleb “anti-fragility”) should be our primary concern
  • we can learn many more lessons from nature

Dogbert does Financial Planning:: applying myth or math?

Dogbert (flaw of large numbers)

Regulators want big, complex banks to hold larger buffers of capital to protect the financial system.

Big banks argue this is unnecessary because risk is diversified across their larger balance sheets.

Who is right? Natural sciences – especially epidemiology, ecology and genetics – provide clues…

Complex systems: The FLAW of large numbers.

A “law of large numbers” is one of several theorems expressing the idea that as the number of trials of a random process increases, the percentage difference between the expected and actual values goes to zero.
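The theorem can be demonstrated in a few lines. This sketch averages simulated fair-die rolls (my own illustrative example, not from the source): the more trials, the closer the sample mean drifts toward the expected value of 3.5:

```python
import random

def mean_of_rolls(n, seed=1):
    """Average of n simulated fair die rolls."""
    rng = random.Random(seed)
    return sum(rng.randint(1, 6) for _ in range(n)) / n

# As n grows, the sample mean converges toward the expected value, 3.5.
small_sample = mean_of_rolls(100)
large_sample = mean_of_rolls(100_000)
```

This is the "law" working exactly as advertised – but note that the guarantee rests on the trials being independent and identically distributed, which is precisely the assumption the "FLAW of large numbers" piece above challenges for big banks.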

If you REALLY want to get a deeper understanding of probability – and why it is wrong to assume too much from independent events (e.g. the roll of a die) and apply that knowledge to the real world of inter-connected, non-linear systems – PLEASE check out the “Physics Envy…” presentation by Andrew Lo (link below).
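The independence caveat is the crux of the banks' diversification argument. The simulation below is a crude one-factor sketch of my own (not a calibrated credit model, and not from the source): pooling 100 exposures shrinks the volatility of average losses dramatically when exposures are independent, and not at all when a common shock drives them all together:

```python
import random

def portfolio_loss_std(n, correlation, trials=10_000, seed=7):
    """Std dev of the average loss across n exposures, each defaulting
    (loss = 1) with probability 0.1. `correlation` mixes a shared shock
    with an idiosyncratic one: 0.0 = fully independent, 1.0 = one
    common factor drives every exposure. A rough illustration only."""
    rng = random.Random(seed)
    means = []
    for _ in range(trials):
        common = rng.random()
        losses = 0
        for _ in range(n):
            # Blend the shared factor with an exposure-specific draw.
            u = correlation * common + (1 - correlation) * rng.random()
            losses += u < 0.1
        means.append(losses / n)
    m = sum(means) / trials
    return (sum((x - m) ** 2 for x in means) / trials) ** 0.5

independent = portfolio_loss_std(100, correlation=0.0)  # volatility shrinks ~1/sqrt(n)
correlated = portfolio_loss_std(100, correlation=1.0)   # pooling does not help at all
```

With independent exposures, the law of large numbers does its job and the big balance sheet genuinely diversifies. With a common factor, the "diversified" portfolio is one big bet, which is the epidemiology/ecology point the regulators are making.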

Broaden your mind:: Nassim Taleb reading recommendations

You don’t have to agree with all that he says, like his (occasionally) abrasive manner, or even have much of an understanding of probability, financial markets, uncertainty, complexity and risk to find something of interest in Nassim Taleb’s book recommendations.

Another fascinating source of “challenging” reading material, to which Taleb contributes and which I would highly recommend, is Edge.

via Book Recommendations from Nassim Taleb | Farnam Street.