Data Mining: Detecting patterns isn’t the same as looking for them
Monday, 13 June, 2011
In the fine tradition of Sesame Street, this blog is brought to you by the word APOPHENIA and the number 2 [two being the number of members of the LinkedIn "Risk, Regulation & Reporting" forum – Vladimir Seroff and Joe Erl – to whom I owe a debt of thanks for inspiring this blog].
This Ontonix presentation illustrates the limitations of conventional statistical analysis when data does not conform to a linear fit… how inconvenient, misleading and downright dangerous!
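The pitfall is easy to reproduce. As an illustrative sketch (my own, not the Ontonix methodology), fitting a straight line to obviously curved data still produces a reassuringly high correlation coefficient – the summary statistic looks excellent while the model is simply wrong:

```python
import numpy as np

# Quadratic data: y depends on x, but not linearly
x = np.linspace(0, 10, 50)
y = x ** 2

# Ordinary least-squares straight-line fit
slope, intercept = np.polyfit(x, y, 1)
r = np.corrcoef(x, y)[0, 1]  # Pearson correlation

print(f"r = {r:.3f}")  # ≈ 0.97: "looks like" an excellent linear relationship

# But the residuals are systematically curved (positive at both ends,
# negative in the middle), betraying the wrong model choice
residuals = y - (slope * x + intercept)
print(residuals[0] > 0, residuals[len(x) // 2] < 0, residuals[-1] > 0)
```

A single headline statistic hides the structure in the residuals – which is exactly the kind of information that "remains hidden" from conventional analysis.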
I hope that, having viewed the presentation, you will have a better understanding not only of the Ontonix methodology but also of WHY it IS such a breakthrough: a quantitative means of analysing system data – identifying, interpreting and applying knowledge (that otherwise remains hidden) for the benefit of:
a system owner, its stakeholders, connected networks & communities – reducing complexity & building resilience
The critical point is that the Ontonix methodology is 100% objective and quantitative. The difference between our approach and one that is subjective, even if based upon data, is hugely significant.
Patterns within data can undoubtedly be identified and put to use. However, if conventional techniques can't accurately interpret the information, we are not only missing opportunities but at serious risk of making decisions that add complexity – a finite system property.
The following article highlights the perils of subjective interpretation and of, subconsciously or consciously, fitting patterns to dearly held but flawed belief systems…
Psychologist David Pizarro discusses the phenomenon of “apophenia,” a tendency of the brain to find relationships based on superstition – such as an athlete wearing the same socks while winning (“Everyday apophenia,” Edge world question 2011, www.edge.org).
“The human brain is an amazing pattern-detecting machine. We possess a variety of mechanisms that allow us to uncover hidden relationships between objects, events and people. Without these, the sea of data hitting our senses would surely appear random and chaotic. But when our pattern-detecting systems misfire, they tend to err in the direction of perceiving patterns where none exist,” Pizarro notes.
In the same Edge collection, New York University psychology and linguistics professor Gary Marcus suggests these misfires occur because the mind is sensitive to context; it remembers things better in a familiar setting – such as recalling the name of a classmate while on campus ("Cognitive humility," Edge world question 2011).
“Perhaps the most dire consequence is that human beings tend almost invariably to be better at remembering evidence that is consistent with their beliefs than evidence that might disconfirm them. When two people disagree, it is often because their prior beliefs lead them to remember (or focus on) different bits of evidence.”
Confirmation bias results because people are prone not to consider alternatives to their beliefs, Marcus says.
- What was Machiavelli talking about? (via Get “fit for randomness” [with Ontonix UK]) (fitforrandomness.wordpress.com)
- Innovation: listen to Confucius…and look within (fitforrandomness.wordpress.com)
- Harvard Business Review: Complex Risk Management (fitforrandomness.wordpress.com)
- Ontonix: Complex Systems Management, Business Risk Management (fitforrandomness.wordpress.com)