A wise man knows one thing – the limits of his knowledge


If you haven’t come across Prof John Kay before but “get” the following, I can recommend more of his writing. He is a regular contributor to the Financial Times and wrote an excellent book with invaluable lessons on modern business life: Obliquity

…I have been looking at some of the models people use, in both the public and private sectors, to predict events.

The models share a common approach. They pose the question: “How would we make our decision if we had complete knowledge of the world?” With such information you might make a detailed assessment drawing together many different pieces of relevant information on matters such as costs, benefits, and consequences.

But little of this knowledge exists. So you make the missing data up. You assume the future will be like the past, or you extrapolate a trend. Whatever you do, no cell on the spreadsheet may be left unfilled. If necessary, you put a finger in the air.

John Kay – A wise man knows one thing – the limits of his knowledge.

If we now know what we “don’t know”, we should also know that underestimating the unknown (unknowable?!) impact of future (unforeseen or unforeseeable) events, on the basis of assumptions, carries unknown dangers.

Organizations need to learn to distinguish between the kinds of problems that can be handled with traditional perspectives, where precise prediction and solution are possible, and the kinds of problems associated with unavoidable complexity.

Entrainment of thinking is an ever-present danger.

7 Responses to A wise man knows one thing – the limits of his knowledge

  1. David, the mathematician Laplace already considered the possibility:

    “We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.” Pierre-Simon Laplace, A Philosophical Essay on Probabilities (1814)

    Obviously, this was written in 1814, before the understanding of relativity, quantum physics and, most of all, Heisenberg’s Uncertainty Principle.

    Thus a wise man today doesn’t merely have to accept that he won’t possess such knowledge, but that it is in fact unattainable; further, with the added understanding of evolutionary emergence, we know that we can’t decompose complex adaptive systems to predict their future properties.

    While I agree that understanding a measure of complexity may be a good thing, I propose that measuring complexity as a number is of little actual value, as in some situations substantial complexity may be good, especially when it emerged. Man-made complexity is another story, and maybe this is where you see the benefit. Such complexity is not reduced by trying to reduce it through action, which only increases complexity again, but by LETTING GO and reducing human influence to at most 20% of interactions … Kauffman’s edge between order and chaos, at which evolution takes its course. Manually controlled systems eventually break from tension stress due to a lack of adaptive resilience.

    More here: http://isismjpucher.wordpress.com/2011/07/06/the-complexity-of-simplicity/

  2. Max: your comments are most welcome and add considerably to the message I am communicating. Thanks!

    I have previously visited your blog and had read the excellent article “The Complexity of Simplicity”. I broadly agree with what you say in the article and in your comment… the problem is that the “wise men” were/are silent as “intelligent” individuals and organisations attempted to control or manage complex systems, making them fragile: they aren’t “letting go” and haven’t been removed, but add more and more complexity in the form of debt (disguised as credit), and ineffective legislation and regulation, on top of structures that we KNOW to be self-serving to the point of being ethically, financially and structurally unsound… yet too big to fail!!?

    Where clarification is perhaps required is in relation to the “value” of a measure of complexity: being the total amount of structured information, it is measured in bits. According to our definition, complexity is a function of structure and entropy [C = f(structure, entropy)] in the information, that is, in the data that verify system functions: http://wp.me/p16h8c-Fv

    We gain intelligence NOT from statistical analyses of data but from identifying the relationships among all possible pairs of variables. The integrity of the information exchange reveals the nodes and hubs of the (otherwise hidden) silo-free structure in the data.

    The two key components of our complexity metric are the so-called System (or Complexity) Map – which reflects how information flows within a system – and entropy, which measures the degree of disorganization in the system.
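    Purely as an illustration (the actual method referred to here is proprietary and not specified in this post), one could approximate the idea using mutual information for the pairwise structure and Shannon entropy for the disorganisation. Below is a minimal Python sketch, assuming numeric data arranged as samples × variables; the names complexity_map, complexity, bins and threshold are hypothetical, not part of any real product:

    ```python
    import numpy as np
    from itertools import combinations

    def shannon_entropy(x, bins=10):
        """Shannon entropy (bits) of one variable, estimated from a histogram."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))

    def mutual_information(x, y, bins=10):
        """Mutual information (bits) between two variables, from a 2-D histogram."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()
        px = pxy.sum(axis=1, keepdims=True)   # marginal of x
        py = pxy.sum(axis=0, keepdims=True)   # marginal of y
        nz = pxy > 0
        return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

    def complexity_map(data, bins=10, threshold=0.05):
        """Toy 'System Map': adjacency matrix of significant pairwise dependencies."""
        n = data.shape[1]
        adj = np.zeros((n, n))
        for i, j in combinations(range(n), 2):
            mi = mutual_information(data[:, i], data[:, j], bins)
            if mi > threshold:                # keep only meaningful links
                adj[i, j] = adj[j, i] = mi
        return adj

    def complexity(data, bins=10, threshold=0.05):
        """One crude reading of C = f(structure, entropy): entropy-weighted structure."""
        adj = complexity_map(data, bins, threshold)
        h = np.array([shannon_entropy(data[:, k], bins) for k in range(data.shape[1])])
        w = 0.5 * (h[:, None] + h[None, :])   # mean entropy of each link's endpoints
        return np.sum(adj * w) / 2.0          # each link counted once
    ```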

    The Complexity Profile ranks the business parameters in terms of importance (= impact on overall complexity).

    The profile is computed using a technique known in genomics as “knock-out”.
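    Again as a hedged sketch only, reusing the toy complexity() from the previous snippet (the real profile computation is not described in this post): “knocking out” each variable in turn and measuring the drop in the overall score yields a ranking of this kind:

    ```python
    def complexity_profile(data, bins=10, threshold=0.05):
        """Rank variables by their impact on overall complexity ('knock-out')."""
        base = complexity(data, bins, threshold)
        impacts = []
        for k in range(data.shape[1]):
            reduced = np.delete(data, k, axis=1)   # knock out variable k
            impacts.append(base - complexity(reduced, bins, threshold))
        # largest impact first: these are the hubs
        return sorted(enumerate(impacts), key=lambda kv: kv[1], reverse=True)
    ```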

    The equation is as follows but the infographics in this article may help –
    http://wp.me/p16h8c-Br:

    Complexity (of the business model, or corporation) × Uncertainty (of the environment, market, economy) = Fragility (of the situation)

    The idea is to show how managing complexity is equivalent to managing risk and how, all other things being equal, it is better to be less complex. Evidently, U cannot be managed but C can.
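    In code the relationship is just a product; the figures below are purely illustrative, not taken from any real assessment:

    ```python
    def fragility(C, U):
        """Fragility as the product of complexity and uncertainty (equation above)."""
        return C * U

    # Illustrative numbers only: a moderately complex business in a very
    # uncertain market can be more fragile than a highly complex one in a
    # stable market. Only C is under management's control.
    print(fragility(C=40.0, U=0.9))   # 36.0
    print(fragility(C=80.0, U=0.2))   # 16.0
    ```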

    The best way to impact C is by starting from the top of the Complexity Profile. Because the profile is computed using a model-free method, there are no subjective weights to adjust. Basically, this guarantees that you hit the most important parameters first, i.e. the hubs.

    A complex system CANNOT perform the functions for which it was intended without first possessing and maintaining the minimum amount of complexity to do so. Therefore, by monitoring current complexity, the potential risks associated with loss of function(s) due to endogenous events are “pre-empted” by fluctuations in the complexity measure. The observer obtains a quantitative basis for “risk decisions” to maintain a healthy system, decisions that may otherwise appear counter-intuitive given the limits of past experience.

    The complexity measure serves as a single metric for the “fitness” of the system, with the ability to identify and manage sources of complexity and to avoid loss of function as a result of a loss of, or excessive, complexity: reducing uncertainty by extending the “risk horizon” to distinguish between unforeseen [epistemic] and unforeseeable [aleatory] uncertainty.

    Excessive complexity: too much of a good thing…
    Complexity cannot grow indefinitely. The laws of physics ensure that every system can sustain its own specific maximum amount of complexity before it becomes unmanageable and loses integrity. This limit is known as critical complexity. In the proximity of this threshold, systems become unstable: close to critical complexity, a corporation loses resilience and becomes fragile and vulnerable.
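    To make the monitoring idea concrete, here is a minimal, hypothetical health check, assuming a known critical-complexity value C_crit (how that limit is actually derived is not described here) and reusing the toy complexity() above; the warn_at margin of 0.8 is an arbitrary illustration:

    ```python
    def check_health(data_window, C_crit, warn_at=0.8, bins=10, threshold=0.05):
        """Flag when current complexity approaches the critical-complexity limit."""
        C = complexity(data_window, bins, threshold)
        if C >= C_crit:
            return C, "critical: integrity lost, system unmanageable"
        if C >= warn_at * C_crit:
            return C, "warning: approaching critical complexity, resilience falling"
        return C, "ok: complexity within sustainable bounds"
    ```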

    A fragile system is unprepared to face extreme events (the so-called Black Swans), whereas an agile system can adapt and is, therefore, resilient. The former is a contributor to systemic risk, the latter to systemic resilience.

    I hope that the foregoing helps clarify our position. In the light of your considerable understanding of the subject matter, I would be interested to get your own thoughts on the following:

    If the characteristics of a system are communicated into its ecosystem and connected networks, can systems with unknown and unmanaged complexity – that are susceptible to low-probability, high-impact endo./exo. events – claim to be sustainable?

    David

    • David, you will and do find many claims of sustainability that are utterly unreasonable, such as ever-increasing revenue, profits or productivity …

      I do understand your attempt to enumerate the complexity of systems (similar to the Kolmogorov complexity of a structure) to describe the risk to which they are exposed.

      But it is an assumption that less complexity automatically reduces the risk of the system not being resilient to uncertain outside changes. Even a very simple system may not possess the ability to perceive those outside changes and may simply collapse when they happen.

      For me, the risk reduction is linked to adaptability, and it may require a very complex system to make that adaptability usable by people in sufficient simplicity. You may say that this is ‘managing’ complexity, but in the real world this ability is nearly impossible to enumerate. Having some enumeration of complexity, and thus trying to reduce it when it becomes too large, may simply remove those capabilities that are actually required to react to outside uncertainty. The system seems less complex, but it is still unmanageable.

      Therefore the subjectivity-free weighting of complexity removes the most important element – SUBJECTIVITY! Meaning, we do need subjective human expertise to increase the resilience of complex systems. We simply have to give people transparency and control, and then even VERY complex systems are suddenly quite resilient to change.

  3. Hello again Max!

    Rather than it being an assumption that less complexity means greater resilience, through our work we have established some “Complexity Facts”: http://wp.me/P16h8c-eu. Of course, it would be fair to say that, depending upon the nature of the exogenous event, “system failure” may be unavoidable. After all, “fitness”, a hard hat and work boots count for little if you are under a collapsed building as a result of an earthquake or other natural disaster.

    We have established that a system, in order to perform the array of functions for which it was created, FIRSTLY requires the complexity to do so. We identify the nodes and hubs within the information flow – we don’t treat data from an interdependent system as if they were a series of independent inputs/outputs – and enumerate the level of “current complexity”. Increased functionality, firstly (again), requires the presence of more structured information, or complexity.

    Unlike exogenous (unforeseeable) events, an endogenous event is foreseeable as the system nears the point of “critical complexity”: the information flow loses structure, becoming more chaotic. With some systems it is possible to act to manage or mitigate the impact of such an event upon the system. In others, for example in Healthcare, that early warning may make the difference between life and death to someone in HDU or undergoing surgery. Of course, in some instances, the system’s ability to adapt and to survive deteriorates, it is/becomes too fragile and system failure is unavoidable.

    To be clear, advocating the “less complexity” option is a case of applying Occam’s Razor when presented with two or more means of resolving a given problem: simplicity rather than complexity.

    Complexity per se is not the threat… excessive complexity is. With too much structure and too much entropy, the system loses its ability to adapt, etc.

    The process is verifiable… otherwise it wouldn’t be deployed in environments such as Healthcare, Automotive, Aerospace, etc.

    What we do is give the owner/observer a transparent, objective insight into a system that they believe they know: proving that unforeseen is not the same as unforeseeable, and enabling them to interpret and act in the interests of maintaining the system in a “fit and resilient” state and to make better risk decisions.

    I HOPE I have addressed your various points and thanks again for your interest.
    Best,

    David

  4. David, thanks. Most people possess the ability to see those ‘excessive complexity’ signs through the simple pattern matching that our brains are quite capable of. The problem is not about knowing that this complexity is there, but doing something sensible about it. That is a political and not a knowledge problem.

    So I think it is more important not to add another warning light that the car is going too fast, but rather to add anti-lock brakes to the system to empower the operator. As it happens, it is equally difficult to convince businesses that they should invest the money in such risk-reducing features.

    Regards, Max

  5. Max: love the analogy, because the drivers (operators) of the current generation, particularly in Financial Services, are addicted to speed and have invested countless billions in competing and in creating the illusion of competence, working lights and brakes… the problem is that the race organisers took them at their word, whilst being left in no doubt that, unless they “looked the other way”, they would take their races to other/foreign tracks.

    These guys have been racing at breakneck speed whilst relying upon the view in their rear-view mirror and maps of the terrain as it was 50 years ago. It is like asking the wrong questions precisely and getting the wrong answers precisely: creating false positives.

    As I am sure you are very well aware, mistaking correlation for causality is easily done in complex systems, and often what is required to maintain the “health” of the system is counter-intuitive… like the well-intentioned CRO in a bank or petro-chemical company who lacks the quantitative basis for a particular course of action that is half as risky but twice as expensive as the figures contained in the financial projections for a new acquisition, process, product or oil field.

    On the subject of spend: bank RM $74bn by 2015! http://wp.me/p16h8c-Op
    Add to that the cost of ineffective regulation, and I reckon that, if the racers and track operators don’t change the game, they will lose the race fans to the “tracks” of, for example, BRIC nations with considerably fewer legacy issues to overcome: adapt and survive, or risk extinction…

    I would have thought, with your expertise in solutions for the IT industry, that the unbelievable figures quoted as a result of IT failure would have been something on your radar?
    Enormous cost of IT failure: $500 bn per month – complexity often cited as culprit! http://wp.me/p16h8c-9

    We should get our heads together and make the world a safer place… and I’ll settle for a 50/50 split of even 1% of the savings, if you will.
    Cheers,

    David

    • Hi David, you are quite right about failure being on my radar, but I don’t see complexity as the issue: http://isismjpucher.wordpress.com/2010/09/26/perceptions-are-reality/

      Look, I am not contradicting what you are saying. I said that the complexity warning light is equally hard to sell as the brakes to the current management mindset. My criticisms in regards to business intelligence that mistakes correlation for causality and somehow believes that the past predicts the future have been well documented.

      I am all for putting our heads together. We actually do a simplistic version of complexity assessment for processes already today. But we also do machine learning for process mining, and it is a really difficult sell as well. We are simply way ahead of the curve. That’s why I focus on issues that are reasonably accepted by the current management mindset, such as more dynamic processes. And even there we still discuss whether they exist … regards, Max
