Dealing With Systemic Risk Shocks In Complex Systems: Early Over-reactions & Extreme Measures Hold The Key

The materialization of systemic risk in any system leads to high instability and turbulence. This disruption is even greater in the case of complex systems. Power grid failures, financial market collapses, natural disasters, and global pandemics are some examples. This paper attempts to explain why overreacting and taking extreme measures at the early stages of the systemic risk fallout are key to survival and recovery.

The Phenomena of Systemic Risk and Complex Systems: Inadequately Understood, Often Ignored

“At this juncture, however, the impact on the broader economy and financial markets of the problems in the subprime market seems likely to be contained.” – Ben Bernanke, then Chairman of the United States Federal Reserve, March 2007.

The above statement certainly did not age well. Shortly thereafter, we witnessed the most severe global economic crisis since the 1930s. Many other notable examples of economic shocks are indelibly etched in the annals of history – the well-recorded panics of the 1700s, 1800s & early 1900s, the Great Depression, the OPEC oil price shock of 1973, the Latin American debt crisis of the 1970s & 1980s, the Japanese asset price bubble of the late 1980s, the Asian financial crisis of 1997, and others.

As I write this paper, several institutions have started publishing reports projecting regional and global economic performance over the next several quarters. Almost all of them paint a grave, recessionary picture of the economy, and rightly so. However, a deeper dive into some of these reports reveals two fundamental flaws in how they arrive at their conclusions.

  1. They either ignore, or pay only nominal attention to, the most critical factor of economic modeling under high uncertainty – Systemic Risk.
  2. They treat the ecosystems of macro-economic & macro-financial agents as simple systems instead of what they really are – Complex Systems.

Many of these same institutions had confidently published a completely different set of numbers about the economy in the recent past. While very few could have predicted the emergence of COVID-19 as a global pandemic, the basic question remains: “If the forecasting models of these institutions proved ineffective at a time of lesser ambiguity (say, before November 2019), can we trust them to forecast better in the middle of extreme volatility and ambiguity?” Even if some of these forecasts do turn out to be accurate, it is likely to be more a matter of chance (i.e., guesstimates turning out to be correct) than of economic/statistical modeling ingenuity.

The Inevitability of Systemic Risk

Systemic risk refers to the serious risk that builds up within a system undetected and eventually sets off a chain of events that brings down the entire system, or at least a large part of it. Often unknown, it usually comes to light only after the risk materializes. It is inevitable and can never be fully identified, predicted, or prevented. At most, measures can be taken to regulate it to a certain extent.

Systemic risk has two important characteristics:

  1. A Trigger (or systemic) event that causes a series of downstream events.
  2. High tail risks (i.e., significant risks associated with low probability events), where the frequency and magnitude of the tail events are very difficult to predict.

Contrary to the practice of some risk engineers, a purely data-driven approach does not work in systemic risk modeling, for two reasons.

  1. The absence of historical evidence for a certain outcome is not evidence of that outcome’s absence, especially in the case of systemic risk, which often leads to unknown, unforeseen behavior.
  2. Causation, an important element in systemic risk modeling, can seldom be determined from the data alone. It also requires an understanding (or model) of the processes that generated the data.

The Dynamics of Complex Systems

Complex systems exhibit behaviors that are difficult to predict (or model) because there is no linear relationship between the nature and properties of the individual components (and their sub-components) and those of the system as a whole. The overall system behavior is driven by the dynamic interactions among these multiple components and sub-components, and even very minor perturbations can drastically change it within a short span of time. This emergence of complex phenomena (e.g., cascading failures) is extremely difficult to predict.

As a result, complex systems cannot be understood purely on the basis of their individual elements. This is in contrast to simple (regular) systems whose behaviors are measurable functions of their individual parts. The creation of a brain as a result of billions of neurons coming together; the formation of large social groups driven by the interaction of hundreds of people; and the generation of weather patterns as a result of the interaction of air currents, heat energy, and other physical forces are all examples of complex systems.
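To make the idea of emergence concrete, here is a minimal sketch of a threshold-contagion model of cascading failure. Every parameter (network size, degree, the range of node tolerances) is an illustrative assumption, not a calibrated value. Identical settings produce wildly different outcomes from one random seed to the next: some cascades die within a handful of nodes, while others engulf most of the network.

```python
import random

def simulate_cascade(n=300, k=4, trials=100, master_seed=42):
    """Toy threshold-contagion model: a node fails once the fraction of its
    failed neighbors exceeds its (randomly drawn) tolerance threshold.
    Returns the final cascade size for each trial.
    All parameters are illustrative assumptions."""
    sizes = []
    for trial in range(trials):
        rng = random.Random(master_seed + trial)
        # Each node links to k distinct random neighbors.
        neighbors = [rng.sample([j for j in range(n) if j != i], k)
                     for i in range(n)]
        # Heterogeneous tolerances: some nodes fail under slight stress.
        thresholds = [rng.uniform(0.05, 0.5) for _ in range(n)]
        failed = {rng.randrange(n)}           # a single seed failure
        changed = True
        while changed:                        # propagate until stable
            changed = False
            for node in range(n):
                if node in failed:
                    continue
                frac = sum(nb in failed for nb in neighbors[node]) / k
                if frac > thresholds[node]:
                    failed.add(node)
                    changed = True
        sizes.append(len(failed))
    return sizes

sizes = sorted(simulate_cascade())
# Outcomes typically span from near-zero to near-total failure.
print("min / median / max cascade size:",
      sizes[0], sizes[len(sizes) // 2], sizes[-1])
```

The same minor perturbation (one failed node) either fizzles out or brings down most of the system, depending on interactions no inspection of individual nodes would reveal.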

To explain further, let’s look at some of the known variables of the COVID-19 crisis: the unresolved healthcare problem; the massive reduction in economic activity due to large-scale lockdowns; the re-allocation (or migration) of capital and non-capital resources from a wide range of sectors to only a few (e.g., healthcare & emergency services); the disruption of global supply chains; and the deep impact on small & medium businesses, which often account for 25 to 40% of GDP and 40 to 60% of employment in most countries. Additionally, certain expected variables might also come into play: for instance, the extreme weather events predicted this year, the loss of perishable goods and massive price reductions due to a lack of inventory space, and the economic & non-economic disruptions due to the loss of human capital. Finally, new and unknown variables might make things worse. For instance, we do not know whether COVID-19 will be compounded by other disease outbreaks. If that happens, there could be a whole new set of shockwaves, especially in developing nations and smaller economies.

The emergent behavior of the global socio-economic-political ecosystem is a function of the interactions of the above variables at multiple levels. The trigger event (the outbreak of the virus) has cascaded into a chain of catastrophic events, and there is no indication of when this will stop. The introduction of newer variables in the near-to-mid-term will complicate things even more. No mathematical model can be built to effectively simulate, understand, or predict this type of complex, emergent phenomenon.

Overreact Early, Take Extreme Measures

It is important to understand three critical characteristics of systemic risks in complex systems.

Firstly, the shockwaves generated by the fallout of systemic risk in a complex system follow a power-law distribution. Events triggered by these shockwaves are point vectors on this curve. Apart from the direction of the vectors (intuitively downhill), there is no robust technique to accurately estimate their magnitude (i.e., to measure the extent of the fall and subsequent recovery), especially if the distribution has long or fat tails.
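A small numerical sketch makes the fat-tail point concrete; the Pareto exponent and sample size below are illustrative assumptions. Under a heavy-tailed (Pareto) distribution, the largest observed event dwarfs the typical one, whereas under a normal distribution the extremes stay within a few standard deviations of the mean. This is why normal-based estimates of shock magnitude break down.

```python
import random
import statistics

rng = random.Random(0)
N = 100_000

# Pareto with alpha = 1.5: heavy-tailed (infinite variance for alpha <= 2).
pareto = [rng.paretovariate(1.5) for _ in range(N)]

# Normal samples with the same mean, for comparison.
mu = statistics.fmean(pareto)
normal = [rng.gauss(mu, 1.0) for _ in range(N)]

for name, xs in (("pareto", pareto), ("normal", normal)):
    xs = sorted(xs)
    print(f"{name:7s} mean={statistics.fmean(xs):7.2f} "
          f"p99.9={xs[int(0.999 * N)]:9.1f} max={xs[-1]:10.1f}")
# Typical output: the Pareto maximum sits orders of magnitude above its
# mean; the normal maximum sits only a few standard deviations above it.
```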

Secondly, the real impact of the shockwaves is not just the magnitude of the downfall, but also the pace at which that fall accelerates. For instance, the contraction of the economy in many countries has been faster than witnessed during the previous recession. And this pace will only increase with time until the economy eventually bottoms out.

Thirdly, losses (the costs of shocks) may keep accumulating over a long period of time, well after the crisis is over. Hence, even though each individual loss event might be a small vector in itself, the resultant loss vector will be huge.

These three characteristics, in turn, necessitate two critical actions:

  1. Measures for response remediation cannot be calibrated to the normal distribution curve (as the chain of events follows the power-law curve.) Hence, they might need to be ‘what is considered extreme under normal circumstances’.
  2. Measures need to be deployed to take effect at an early stage of the cycle. Deployed late, they will achieve far less than intended (see the sketch below).
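A toy calculation illustrates the second point; the growth rate, recovery rate, and horizon below are illustrative assumptions, not estimates. Because the shock compounds until it is capped, the cumulative loss grows disproportionately with every period of delay.

```python
def cumulative_loss(g=0.30, intervene_at=2, horizon=12, recovery=0.10):
    """Toy model: a unit shock compounds at rate g per period until an
    intervention takes effect, after which it decays at rate `recovery`.
    All parameters are illustrative assumptions."""
    loss, total = 1.0, 0.0
    for t in range(horizon):
        if t < intervene_at:
            loss *= 1 + g          # unchecked compounding of the shock
        else:
            loss *= 1 - recovery   # slow decay once measures take effect
        total += loss
    return total

for t in (1, 2, 4, 6):
    print(f"intervene at period {t}: "
          f"cumulative loss = {cumulative_loss(intervene_at=t):6.1f}")
# Each period of delay raises the total loss disproportionately, which is
# why early (even seemingly excessive) action pays off.
```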

Managing the Fallout of Systemic Risk

Systemic risk is inevitable, and cannot be fully identified, predicted, or eliminated. While we can try to understand it better through modeling or simulation, this strategy works only to a small extent. The best approach is to respond better to the fallout when the risk materializes. Two principal strategies to ensure this are building resilience and reducing complexity.

1. Building Resilience (robustness against failures)

This strategy involves two approaches – (i) resilience through redundancy, and (ii) resilience through disruption.

The first approach involves maintaining redundant structures (e.g., disaster recovery sites, or backup services) to fall back upon in the event of partial or full failures of the original systems. While the ‘cost of redundancy’ may appear expensive, it often turns out to be less than the ‘cost of recovery’ when a crisis strikes. For instance, global trade is generally organized on the basis of economies of scale and cost minimization, leading to the creation of centralized hubs (e.g., China for manufacturing certain products.) While this approach may appear cost-efficient at face value, it creates high systemic risk, particularly for essential services like food supplies or medicine. In the event of a COVID-19-type situation, the fallout of this risk may have a catastrophic impact on global supply chains, and the cost of survival and recovery will be significantly higher than the cost savings from functional centralization.
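The cost-of-redundancy argument can be made concrete with a back-of-the-envelope expected-value comparison; every figure below (disruption probability, redundancy premium, recovery cost) is a made-up illustration, not a real estimate.

```python
# Hypothetical figures for illustration only.
p_disruption = 0.05      # assumed annual probability of a hub failure
redundancy_cost = 2.0    # assumed annual cost of a second sourcing hub ($M)
recovery_cost = 80.0     # assumed one-off cost of rebuilding supply ($M)

expected_loss_centralized = p_disruption * recovery_cost

print(f"expected annual loss, centralized: {expected_loss_centralized:.1f} $M")
print(f"annual cost of redundancy:         {redundancy_cost:.1f} $M")
# Here even a 5% annual disruption risk makes the redundancy premium
# cheaper than running centralized and paying for recovery when it fails.
```

And this simple expected-value view understates the case: with power-law tails, the realized recovery cost can far exceed its average.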

The second approach involves building robustness through constant exposure to stress and failures. Nassim Nicholas Taleb, in his book Antifragile: Things That Gain from Disorder, describes how certain systems become stronger when exposed to increased levels of shocks and stress. Bones getting stronger due to increased load (Wolff’s law) is a good example. Chaos Engineering is another one. Failures are constantly injected into a new system as it gets built; fixes are developed for remediation and robustness; and the result is a fault-tolerant, resilient, steady-state piece of engineering.
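As a sketch of the chaos-engineering idea, the snippet below wraps a stand-in service call so that faults are injected at random, forcing retry logic to exist before a real failure ever occurs. All names here (`TransientBackendError`, `fetch_price`, the failure rate) are hypothetical.

```python
import random

class TransientBackendError(Exception):
    """Simulated dependency failure injected by the chaos wrapper."""

def with_chaos(call, failure_rate=0.2, rng=random.Random(7)):
    """Wrap a callable so it fails at random, the way a flaky network
    dependency would. Used to harden callers before real faults hit."""
    def wrapped(*args, **kwargs):
        if rng.random() < failure_rate:
            raise TransientBackendError("injected fault")
        return call(*args, **kwargs)
    return wrapped

def fetch_price(item):
    """Stand-in for a real downstream service call."""
    return {"item": item, "price": 9.99}

flaky_fetch_price = with_chaos(fetch_price)

def fetch_price_resilient(item, attempts=5):
    """The retry loop that the injected faults force us to write and test."""
    for _ in range(attempts):
        try:
            return flaky_fetch_price(item)
        except TransientBackendError:
            continue    # a real system would back off and log here
    raise RuntimeError("all retries exhausted")

print(fetch_price_resilient("widget"))
```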

2. Reducing System Complexity

Complexity reduction means removing the non-essential elements within a system, or limiting its scope so that it functions under a known (or predictable) range of conditions. Two transformation strategies are generally deployed for this (a brief sketch of the first follows the list below).

  • Transformation by Abstraction – decoupling the components (or sub-components) within a complex system.
  • Transformation by Combination – grouping the functions of certain components, and then rationalizing the overall structure.
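A minimal sketch of transformation by abstraction, in code terms: the ordering component below depends only on an interface, so concrete providers can fail, change, or be swapped without the change propagating. All names (`PaymentGateway`, `PrimaryProvider`, and so on) are hypothetical.

```python
from typing import Protocol

class PaymentGateway(Protocol):
    """The abstraction boundary: callers depend on this interface alone,
    never on a concrete provider."""
    def charge(self, amount_cents: int) -> bool: ...

class PrimaryProvider:
    def charge(self, amount_cents: int) -> bool:
        return True   # stand-in for a real payment API call

class BackupProvider:
    def charge(self, amount_cents: int) -> bool:
        return True   # independent fallback implementation

def place_order(gateway: PaymentGateway, amount_cents: int) -> str:
    # Decoupled: swapping providers requires no change to this component.
    return "confirmed" if gateway.charge(amount_cents) else "failed"

print(place_order(PrimaryProvider(), 1999))
print(place_order(BackupProvider(), 1999))
```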

Effective deployment of the above strategies can make even small companies more robust against economic & non-economic shocks than some of their bigger peers. Of the two strategies, building resilience has the greater impact.

Closing Comments

Despite all our scientific and technological advancements, the concepts of systemic risk and complex systems are still not adequately grasped by most decision-makers. This is one of the primary reasons why measures undertaken by financial institutions and governments to avert financial crises, prevent fraudulent actions, or stop money laundering do not have a long-lasting impact or yield benefits beyond a certain point.

A high correlation has often been observed between the timing of extreme measures and the probability of survival and recovery during major crisis events. In other words, the earlier these extreme measures are undertaken to deal with the turbulence of systemic risk shocks, the greater the chances of an early recovery and long-term survival.

Finally, there is nothing ‘normal’ in the case of systemic risk fallouts. Just when we think we have things under control, the crisis may strike a second time or even a third. The key is to err on the side of overreaction and overcaution.

Acknowledgment:

Joseph Norman, “Global decentralization for risk mitigation and security,” Applied Complexity Science.
