When ‘Big Data’ is too much data…


Suppose you are monitoring a system, say a human brain, a chemical plant, an asset portfolio, or a traffic network. Suppose there are hundreds of parameters that you are monitoring. How do you get a sense of how things are going globally? Which parameter do you look at? How do you "add them up"? How can you blend all the information into a single parameter that conveys the overall state of the situation?

One way to map (transform) multiple channels of data onto a single scalar function is via complexity. Complexity is a scalar obtained from a sampled vector x(t) of N channels. It is computed as C = f(T, E), where T is the topology of the corresponding System Map (see examples of such maps for an EEG or an ECG) and E is entropy. Since entropy is measured in bits, C is also measured in bits, and it represents the total amount of structured information within the N-channel data set.
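The exact form of f(T, E) inside OntoNet™ is not public, so the sketch below is only an illustrative proxy, not the actual metric: the System Map topology T is approximated by the graph of channel pairs whose correlation exceeds a threshold, E by the binned Shannon entropy of each channel, and C by the total entropy (in bits) of the channels that participate in the map. All names and parameters here (`complexity`, `shannon_entropy`, `corr_threshold`) are hypothetical choices for the example.

```python
import numpy as np

def shannon_entropy(x, bins=16):
    """Shannon entropy of one channel, in bits, from a histogram estimate."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def complexity(X, corr_threshold=0.7, bins=16):
    """Crude complexity proxy C = f(T, E) for an (N, samples) array X.

    T: 'System Map' topology, approximated here by pairs of channels
       whose |Pearson correlation| exceeds corr_threshold.
    E: per-channel Shannon entropy, in bits.
    C: total entropy of the channels that appear in the map, in bits.
    NOTE: an illustrative stand-in, not the proprietary OntoNet(tm) measure.
    """
    R = np.corrcoef(X)                    # N x N correlation matrix
    T = np.abs(R) > corr_threshold        # topology: significant links only
    np.fill_diagonal(T, False)            # ignore self-correlation
    E = np.array([shannon_entropy(x, bins) for x in X])
    connected = T.any(axis=1)             # channels with at least one link
    return float(E[connected].sum())      # structured information, in bits
```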

If each of the N channels is sampled at a certain frequency within a moving window of a certain width, the result is a time-dependent complexity function C(t). The computation is fast and may be performed in real time using OntoNet™, our Quantitative Complexity Management engine, as illustrated in the scheme below (the blue arrow indicates the direction of time flow).

[Figure: real-time computation of C(t) with OntoNet™; the blue arrow indicates the direction of time flow.]
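Continuing the sketch above (and with the same caveat that this is only a proxy), a moving-window evaluation turns the scalar into a time series C(t). The window width and stride below are arbitrary example values.

```python
def complexity_over_time(X, window=256, stride=32, **kwargs):
    """Slide a window of `window` samples across the (N, samples) array X
    and evaluate the complexity proxy in each window, yielding C(t)."""
    n_samples = X.shape[1]
    times, values = [], []
    for start in range(0, n_samples - window + 1, stride):
        times.append(start + window)  # index of the window's trailing edge
        values.append(complexity(X[:, start:start + window], **kwargs))
    return np.array(times), np.array(values)

# Example: 8 synthetic channels, 5000 samples of random data
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 5000))
t, C = complexity_over_time(X)
```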
