From quantum complexity to monied tossers


Probability and Measure (Photo credit: John-Morgan)

I am no expert (in anything!) but, unless I am very much mistaken, these scientists are striving for the simplicity on the other side of complexity that Einstein craved.

When confronted with a complicated system, scientists typically strive to identify underlying simplicity which is then articulated as natural laws and fundamental principles. However, complex systems often seem immune to this approach, making it difficult to extract underlying principles.

Simplicity and quantum complexity.

I particularly like the reference to “these systems have memory and are predictable to some extent; they are more complex than a coin toss”.
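To make that distinction concrete, here is a minimal sketch (my own illustration, not from the article): a fair coin is memoryless, so knowing the last toss gives no predictive edge, whereas a "sticky" two-state Markov chain has memory and can be predicted well above chance simply by repeating its last state.

```python
import random

random.seed(42)

def coin_sequence(n):
    # Fair coin: each toss is independent, so the past has no predictive power.
    return [random.randint(0, 1) for _ in range(n)]

def markov_sequence(n, p_stay=0.9):
    # Two-state Markov chain: tends to repeat its last state, so it has memory.
    seq = [random.randint(0, 1)]
    for _ in range(n - 1):
        if random.random() < p_stay:
            seq.append(seq[-1])
        else:
            seq.append(1 - seq[-1])
    return seq

def predict_accuracy(seq):
    # Naive predictor: "the next symbol equals the last one".
    hits = sum(1 for a, b in zip(seq, seq[1:]) if a == b)
    return hits / (len(seq) - 1)

coin = coin_sequence(100_000)
markov = markov_sequence(100_000)
print(f"coin:   {predict_accuracy(coin):.3f}")    # ≈ 0.5 — no memory
print(f"markov: {predict_accuracy(markov):.3f}")  # ≈ 0.9 — memory helps
```

The same naive predictor scores at chance level on the coin but close to 90% on the Markov chain: the system with memory is "predictable to some extent".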

Which leads me, nicely, on to a recent paper by Nassim Taleb! “Why We Don’t Know What We Talk About When We Talk About Probability”

Taleb is one of the best-known and most widely published critics of the dangerously “naive” practice of applying raw mathematical probabilities [suited to individual or independent events, e.g. the coin toss or the spin of a roulette wheel] to the serious and very real world of finance and insurance*: where the problem is not so much ignorance of the subject as blatant disregard for the medium- and long-term impact upon corporate profitability and social resilience.

A manifestation of the unacceptable face of “Irresponsible Capitalism”


Complexity: size doesn’t always matter


Complexity is a measure of the total amount of structured information (measured in bits) contained within a system, and it reflects many of the system’s fundamental properties, such as:

  • Potential – the ability to evolve and survive
  • Functionality – the set of distinct functions the system is able to perform
  • Robustness – the ability to function correctly in the presence of endogenous/exogenous uncertainties

In biology, the above can be combined into a single property known as fitness.
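As a rough illustration of "structured information measured in bits" (Ontonix's actual metric is not public; this stand-in simply sums Shannon mutual information, in bits, over all pairs of system variables — more inter-variable structure means more bits):

```python
import math
from collections import Counter

def entropy(xs):
    # Shannon entropy of a discrete sequence, in bits.
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    # I(X;Y) = H(X) + H(Y) - H(X,Y), in bits.
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def structured_information(columns):
    # Sum of pairwise mutual information across all variable pairs:
    # a crude proxy for the structured information content of a system.
    total = 0.0
    for i in range(len(columns)):
        for j in range(i + 1, len(columns)):
            total += mutual_information(columns[i], columns[j])
    return total

# Toy system: y is fully coupled to x, while z is an unrelated pattern.
x = [0, 0, 1, 1] * 100
y = x[:]
z = [0, 1] * 200
print(f"{structured_information([x, y, z]):.2f} bits")  # prints "1.00 bits"
```

Only the coupled pair (x, y) contributes structure — 1 bit — while the independent variable z contributes nothing.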

Ontonix and Altair Engineering Sign Software Partnership Agreement.


 

Como, 7 November 2011. Ontonix and Altair Engineering (www.altair.com) sign a software partnership agreement to offer Ontonix software products through Altair’s HyperWorks Partner Alliance platform. The HyperWorks Partner Alliance strives to provide the most comprehensive offering of software applications across multiple relevant domains related to Computer Aided Engineering. Through the HWPA, Ontonix will provide OntoNet, its complexity and robustness management engine, which allows engineers to measure the robustness and complexity of engineered products.
“High complexity is a prelude to inefficiency and vulnerability; therefore it becomes necessary to use a measure of product complexity as a design attribute. Technology developed by Ontonix allows engineers to conceive new solutions and designs while keeping complexity in the CAE loop from day one. Along with stresses, frequencies, or fatigue life, complexity can also become a design target,” says Dr. J. Marczyk, Founder and Chief Technical Officer of Ontonix. “If CAE is to cope with the inevitable increase of product complexity, complexity must enter the CAE loop.
OntoNet also provides a unique measure of system robustness. Computation of robustness is based on data produced by Monte Carlo Simulation, Design of Experiments, Parametric/Sensitivity studies, or time-domain simulations, enabling engineers to exploit their extensive existing data to the fullest extent,” he concluded.
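OntoNet's algorithms are not public, but the general idea of Monte Carlo-based robustness can be sketched generically: perturb the design inputs around their nominal values and report the fraction of samples that still meet the specification. Everything below (the beam_deflection model, the load and stiffness distributions, the deflection limit) is hypothetical.

```python
import random

random.seed(0)

def beam_deflection(load, stiffness):
    # Hypothetical performance model: deflection proportional to load / stiffness.
    return load / stiffness

def robustness(n_samples=100_000, limit=1.2):
    # Monte Carlo sketch: sample uncertain inputs around their nominal values
    # and report the fraction of runs that still satisfy the design spec.
    ok = sum(
        beam_deflection(random.gauss(100.0, 10.0),   # uncertain operating load
                        random.gauss(100.0, 5.0))    # uncertain stiffness
        <= limit
        for _ in range(n_samples)
    )
    return ok / n_samples

r = robustness()
print(f"robustness ≈ {r:.3f}")
```

A robustness near 1.0 means the design keeps meeting its specification despite input uncertainty; the same sampling loop would work on data from Design of Experiments or time-domain runs.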

Power Laws & Complexity Management



Mark Buchanan is one of the best writers on this most complex of subjects, and this article covers exactly what it says on the tin!

I particularly appreciate that he tackles key aspects of the wider subject in such a manner as to make it readily understood by anyone with a desire to learn and apply the knowledge.

Business leaders NEED to understand the nature of complexity and the threat of self-generated risk (excessive complexity arising from poor structure, processes, etc.): risk resulting from the execution of the very processes that provide functionality.

They also need to appreciate the systemic risk exposures of the organisations with whom they trade, and without which they would fail. This reinforces the most pressing need for in-depth assessment of existing and prospective partners, both upstream and downstream.

Buchanan illustrates this point, and the exposure that comes from global supply networks, “beautifully” by recounting the sorry tale of the enforced departure of Swedish company Ericsson from the mobile handset market, thanks to a factory fire in New Mexico!
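The reason a single event can cripple a supply network is precisely the power-law point: in a heavy-tailed world, one observation can account for a visible share of the total in a way a Gaussian world never allows. A small illustrative sketch (my own, with made-up parameters):

```python
import random

random.seed(1)

def pareto_sample(alpha=1.5, xmin=1.0):
    # Inverse-transform sampling from a Pareto (power-law) distribution.
    return xmin / (random.random() ** (1.0 / alpha))

heavy = [pareto_sample() for _ in range(100_000)]              # power-law "losses"
light = [abs(random.gauss(3.0, 1.0)) for _ in range(100_000)]  # thin-tailed losses

heavy_share = max(heavy) / sum(heavy)
light_share = max(light) / sum(light)
print(f"power-law: biggest event = {100 * heavy_share:.2f}% of total")
print(f"gaussian:  biggest event = {100 * light_share:.4f}% of total")
```

With the same number of events and a comparable average, the single largest power-law event dwarfs (proportionally) the largest Gaussian one: the Ericsson fire in microcosm.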

 



ENGINEERING ANALYSIS: measure resilience on-line



COMPLEXITY MANAGEMENT
Special Edition: ENGINEERING  ANALYSIS

NEW ENGINEERING PORTAL  – MEASURE ROBUSTNESS ONLINE

Design for Resilience (D4R) is a web-based service that enables engineers to measure the robustness and complexity of systems.

Using the latest developments in maths and science, we deliver a graphical tool which not only measures how robust a system is, but also ranks the system variables in terms of their impact on functionality and resilience.

Complex products cannot be engineered to perform safely and efficiently if robustness and complexity are not measured. How robust are your designs? Can you afford not to know?

Read more.

View the tutorial

Check out the demo