The Shannon-Khinchin Axioms, and Uncertainty in Complex Systems

Volatility in financial markets, major fluctuations in weather, outbreaks of infectious disease, unpredictable interactions among animal species, and intermittent power-grid failures are phenomena from very different fields, yet they have two things in common. First, they are manifestations of high-to-extreme uncertainty; second, the environments in which they arise are complex systems.

This uncertainty can be best understood through the concept of entropy. Entropy is loosely defined as the average level of information, surprise, or uncertainty inherent in the possible outcomes of a random variable. Common measures include Shannon entropy, Rényi entropy, Tsallis entropy, and von Neumann entropy.
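
To make these measures concrete, here is a minimal Python sketch (the four-state distribution `p` is an arbitrary illustrative choice, not data from any of the systems above) computing Shannon, Rényi, and Tsallis entropies for a discrete distribution; the latter two reduce to Shannon entropy as their parameters tend to 1.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum(p_i * log p_i), in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of index q (q != 1); non-additive generalization."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# A skewed distribution carries little surprise on average, so every
# measure sits well below its uniform-distribution maximum.
p = [0.7, 0.2, 0.05, 0.05]
print(shannon_entropy(p))         # ~0.87 nats (maximum would be ln 4 ~ 1.39)
print(renyi_entropy(p, alpha=2))  # ~0.63 (the "collision" entropy)
print(tsallis_entropy(p, q=2))    # ~0.47
```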

Entropy in Complex Systems

Complex systems are generally considered to be stochastic in nature. This is on account of two factors. Firstly, the different components within a complex system, and/or the interactions among them, may not follow deterministic laws (as regular systems do) but evolve randomly due to endogenous or exogenous factors. Secondly, a complex system might evolve deterministically, but we may have only partial information about it (e.g., due to computational constraints) – and this limited understanding then introduces stochasticity, or a probabilistic description of the system.

Researchers generally consider three primary directions for understanding uncertainty in stochastic systems:

  • statistical mechanics for understanding the systemic properties of physical systems,
  • information theory for quantifying the information produced by probabilistic systems, and
  • the maximum entropy principle for predicting the distribution functions of probabilistic systems (a code sketch of this principle follows the list).
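
As a sketch of the third direction, consider Jaynes' classic loaded-die example (the target mean of 4.5 is a hypothetical constraint chosen purely for illustration): among all distributions over the faces 1..6 consistent with an observed average roll, the maximum entropy principle selects the one that is exponential in the face value, with the exponent fixed by the constraint.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum entropy over faces 1..6 subject to a prescribed mean gives
# p_i proportional to exp(-beta * i), with beta set by the constraint.
faces = np.arange(1, 7)
target_mean = 4.5  # hypothetical observed average roll

def mean_given_beta(beta):
    w = np.exp(-beta * faces)
    return np.sum(faces * w) / np.sum(w)

# The mean is monotone in beta, so a simple root-find pins it down.
# Since 4.5 > 3.5 (the fair-die mean), beta comes out negative here,
# tilting probability toward the high faces.
beta = brentq(lambda b: mean_given_beta(b) - target_mean, -5.0, 5.0)
p = np.exp(-beta * faces)
p /= p.sum()

print(np.round(p, 4))  # weights rise with the face value
print(p @ faces)       # ~4.5: the constraint is satisfied
```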

In the case of simple systems, all three aspects of entropy are equivalent, and they are captured by the formula below, known as the Boltzmann-Gibbs-Shannon entropy.

S = -k_B \sum_{i=1}^{W} p_i \log p_i,

where k_B is the Boltzmann constant, W is the number of microstates, and p_i is the probability of microstate i.
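
As a quick numerical sanity check (a minimal sketch; the value of W is an arbitrary choice): for W equiprobable microstates, p_i = 1/W and the sum collapses to Boltzmann's S = k_B ln W.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

W = 1000                 # arbitrary number of microstates
p = np.full(W, 1.0 / W)  # equiprobable microstates

S = -k_B * np.sum(p * np.log(p))
print(S)                 # ~9.54e-23 J/K
print(k_B * np.log(W))   # same value: the uniform case reduces to k_B ln W
```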

A set of four fundamental principles, jointly known as the Shannon-Khinchin axioms, characterizes the qualitative properties of entropy inherent in a system and helps approximate the uncertainty in probabilistic systems.

The Shannon-Khinchin Axioms

Axiom 1 – Continuity: Entropy is a continuous function of all its arguments. It depends only on the state probabilities, and not on other variables.

Axiom 2 – Maximality: Entropy is maximum when all states are equally probable, leading to the highest uncertainty.

Axiom 3 – Expandability: Entropy remains unchanged when a state with zero probability is added.

Axiom 4 – Additivity: The entropy of a system composed of two independent subsystems equals the sum of their individual entropies. If one subsystem depends on the other, the entropy of the combined system equals the entropy of the independent subsystem plus the conditional entropy of the dependent one.
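
A minimal numerical check of Axioms 2-4 for Shannon entropy (all distributions here are arbitrary illustrative choices):

```python
import numpy as np

def H(p):
    """Shannon entropy in nats; treats 0 * log 0 as 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Axiom 2 (Maximality): the uniform distribution has the largest entropy.
print(H([0.25, 0.25, 0.25, 0.25]))  # ln 4 ~ 1.386, the maximum for 4 states
print(H([0.7, 0.1, 0.1, 0.1]))      # ~0.94, strictly smaller

# Axiom 3 (Expandability): a zero-probability state changes nothing.
print(H([0.6, 0.4]) == H([0.6, 0.4, 0.0]))  # True

# Axiom 4 (Additivity): for independent systems A and B, the joint
# distribution is the outer product and entropies add: H(AB) = H(A) + H(B).
pA = np.array([0.6, 0.4])
pB = np.array([0.3, 0.5, 0.2])
pAB = np.outer(pA, pB).ravel()
print(np.isclose(H(pAB), H(pA) + H(pB)))    # True
```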

The Violation of Ergodicity Creates Uncertainty

Ergodicity, the central concept of ergodic theory, offers a powerful mathematical framework for understanding (and predicting) the long-term average behavior of complex systems. While conventional systems satisfy all four Shannon-Khinchin axioms, complex systems violate the fourth axiom (Additivity), precisely the part tied to ergodicity. Complex systems are evolutionary, path- and history-dependent, and possess long-range and co-evolving interactions. As a result, they are non-ergodic in nature.
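
To see what ergodicity breaking looks like in practice, here is a standard multiplicative-growth sketch (a textbook-style illustration, not an example from the source): the ensemble average over many trajectories grows, yet the typical individual trajectory decays, so time averages and ensemble averages disagree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Multiplicative coin toss: each step multiplies wealth by 1.5 or 0.6
# with equal probability. Ensemble-mean factor per step: 1.05 (growth).
# Time-average factor per step: sqrt(1.5 * 0.6) ~ 0.949 (decay).
steps, trajectories = 50, 100_000
factors = rng.choice([1.5, 0.6], size=(trajectories, steps))
wealth = np.prod(factors, axis=1)

print(wealth.mean())                     # ensemble average: near 1.05**50 ~ 11.5 (noisy)
print(np.median(wealth))                 # typical trajectory: ~0.07, i.e. ruin
print(np.exp(np.mean(np.log(factors))))  # time-average growth factor ~ 0.949
```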

Violating Axiom 4 implies that we cannot fully understand or predict the outcomes of complex systems when they are merged (or split); the final outcome depends on how the merge (or split) takes place. This violation has far-reaching consequences for modeling probabilistic systems (note: complex systems are probabilistic). Firstly, it shapes the fat-tailed and power-law distribution functions that play a significant role in complex systems. Secondly, it plays an influential role in predicting path-dependent processes.
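
Path dependence can be illustrated with the classic Pólya urn (again a standard example, not taken from the source): every run converges to a stable fraction of red balls, but the limit itself is random, frozen in by the early history of draws, so no single long-run average describes all trajectories.

```python
import numpy as np

def polya_urn(steps, seed):
    """Pólya urn: draw a ball, return it plus one more of the same color.
    Starts with one red and one black ball; returns the final red fraction."""
    rng = np.random.default_rng(seed)
    red, black = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + black):
            red += 1
        else:
            black += 1
    return red / (red + black)

# Each run settles down, but to a different limit set by its early draws:
# the process is path-dependent and non-ergodic.
print([round(polya_urn(10_000, seed=s), 3) for s in range(5)])
```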

Closing Comments

Complex systems are characterized by the fact that their different components or elements (i.e., the subsystems) interact very strongly with one another. Moreover, the states of these individual subsystems often change due to these interactions. As a complex system evolves, so do its individual components, and with them the boundary conditions under which the system operates. This generally leaves complex systems out of equilibrium by the standards set by simple systems.

Acknowledgement:

The Theory of Complex Systems – Thurner, Hanel & Klimek
