Risk:: some things just CANNOT be modelled
Sunday, 28 October, 2012
Believe it or not, this is only an extract from a longer article by the Founder of Ontonix. I am very much a layman when it comes to computer models, but that is most certainly not the case with Jacek (Marczyk). However, even I know enough to question what I have come to refer to as the “prediction addiction” that afflicts the insurance and wider financial sector.
There is a fundamental principle – the Principle of Incompatibility – which states that as complexity increases, precision and relevance become mutually exclusive. In other words, as things get more complex (and they seem to be), your statements about them become less and less precise. This means that once something becomes highly complex you can forget building models: you need to change strategy and direction, because a new approach is needed. Large consulting firms claim otherwise.
The Earth is a huge computer. It does things for real; it doesn’t simulate them. In effect, we are actually living on the surface of one huge supercomputer, and it constantly floods us with unimaginable volumes of data. For free. However, we are being told to take some of that data, to fit models on top of it and to simulate – in other words, to produce synthetic versions of the reality which we already have or soon will have. To what end? In most cases, to predict the future. Prediction, however, is in most cases futile: because of the physics of our universe, the future is always under construction. This is why others, who realize that prediction is often irrelevant, use models instead to try to understand the underlying phenomena.
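The futility of precise long-range prediction can be illustrated with a toy sketch (my own illustration, not from the article): the logistic map is a fully deterministic one-line system, yet a change in the ninth decimal place of the starting value eventually destroys any forecast.

```python
# A toy, fully deterministic system: the logistic map x -> r*x*(1-x).
# Two starting points differing by one part in a billion stay close
# for a while, then diverge until the "prediction" is worthless.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-9)  # perturb the ninth decimal place

early_gap = abs(a[5] - b[5])                                # still tiny
late_gap = max(abs(x - y) for x, y in zip(a[30:], b[30:]))  # order one
print(f"gap after 5 steps: {early_gap:.2e}")
print(f"worst gap over steps 30-50: {late_gap:.2e}")
```

No amount of extra modelling effort fixes this: the imprecision is a property of the system, not of the model, which is the point of the Principle of Incompatibility above.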
So, it is a crime to be sitting in the middle of a gold mine of information (= Nature) and to build emasculated caricatures thereof. What we propose is:
1. Not to spend time and resources on building increasingly complex (and irrelevant) models but,
2. To focus our efforts on the analysis of REAL data which the system called Earth produces free of charge.
But a new problem emerges: we must resort to new means of analysing this mass of data and turning it into something useful. As the state of our global economy shows, conventional means of data analysis, coupled with simulation (so-called Business Analytics), are a bit outdated, to say the least. What is the alternative?
Model-free methods. As things get messy, chaotic and turbulent, a different approach is needed. In fact, with model-free methods you go to the next level. What you get is this:
- understanding of the structure of data – relationships, topology, hubs, information flow patterns, etc. Structure, not hundreds of pie charts, plots or surfaces
- new means of parameter ranking
- transformation of terabytes of data into megabytes of knowledge
- measures of complexity and critical complexity
- measures of resilience and fragility
- global patterns
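To make the first of these concrete, here is a minimal sketch of what “structure, not pie charts” can mean (my own illustration; Ontonix’s actual methods are proprietary and far more sophisticated): take raw multivariate data, compute pairwise correlations without fitting any model, and read off a dependency graph – edges, degrees, hubs.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Synthetic "raw data": one variable drives two others; one is independent.
n = 500
hub = [random.gauss(0, 1) for _ in range(n)]
data = {
    "hub": hub,
    "follower1": [h + random.gauss(0, 0.3) for h in hub],
    "follower2": [-h + random.gauss(0, 0.3) for h in hub],
    "noise": [random.gauss(0, 1) for _ in range(n)],
}

# Build a dependency graph: an edge wherever |correlation| is strong.
names = list(data)
edges = [
    (a, b)
    for i, a in enumerate(names)
    for b in names[i + 1:]
    if abs(pearson(data[a], data[b])) > 0.5
]
degree = {name: sum(name in e for e in edges) for name in names}
print("edges:", edges)
print("degrees:", degree)
# 'noise' ends up isolated; the structure of the data is recovered
# directly, without fitting or simulating anything.
```

The output is a graph (who is connected to whom, and how strongly), not a fitted model – exactly the kind of summary the bullet list above is pointing at.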
The most important of these is understanding. In order to understand Nature better we must analyse the data it provides us with in its pure form, not using methods which warp and distort the information it carries. With statistical (and other) techniques it is incredibly easy to destroy information. Model-free methods preserve all data in its original form and shape. Building models is NOT the only way to proceed.