CQuickie: Dealing with Complex Systems – The “God complex” and Other Biases


Note: ComplexiQuickies, short CQuickies, are quickly written idea sketches related to the subject of complexity. In contrast to regular blog posts on complexity research, they are based on little or no scientific literature to support them.


In this first CQuickie on dealing with complex systems, I'd like to reflect on what Tim Harford says about the God complex (a term he credits to Archie Cochrane) in his brilliant TED talk Trial, error and the God complex from 2011.
The God complex is the conviction that one is infallible. Or, as Tim Harford describes it in plain English:

In my own little world, I'm a god. I understand everything. I do not want to have my opinions challenged. I do not want to have my conclusions tested.

So, what happens when this attitude is applied to complex systems?
Before we try to answer this question, let's recall some of the reasons why complex systems tend to overwhelm language-based observers (these reasons are features of complexity understood as a non-essentialist problem concept; see The Scalability Problem | Valonqua):

(1) Lack of information
Complex systems can't be fully understood. And even when there are (less complex) models, interpretations, etc., the coupling of these artifacts might be ambivalent, conflicting, etc.

(2) Self-reference
Self-referential qualities (recursions / feedback loops) can be observed in complex systems. This leads to indeterminacy based on recursive causality.
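To make this concrete, here is a minimal sketch of recursive causality: a discrete predator-prey loop in which each population's change feeds back into the other's next state. The model form and all parameter values are illustrative assumptions, not taken from the literature above.

```python
# Minimal sketch of recursive causality: a discrete predator-prey model.
# Each population's growth feeds back on the other, so effects become
# causes in the next time step. All rates are toy values (assumptions).

prey, predators = 40.0, 9.0
a, b, c, d = 0.1, 0.02, 0.3, 0.01   # growth / interaction rates (illustrative)

for t in range(50):
    prey_next = prey + a * prey - b * prey * predators
    pred_next = predators - c * predators + d * prey * predators
    prey, predators = max(prey_next, 0.0), max(pred_next, 0.0)
    if t % 10 == 0:
        print(f"t={t:2d}  prey={prey:7.2f}  predators={predators:6.2f}")
```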

(3) Nonlinearity
Linear cause-and-effect attributions fail because there are too many causes, too many effects, and too many recursions. So, small causes can have large effects or the other way around.
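A classic toy illustration of this is the logistic map. The following sketch (parameter values are illustrative assumptions) shows how a minuscule difference between two starting values, a "small cause", grows into completely different trajectories; it also previews the unpredictability discussed under (6).

```python
# Minimal sketch of nonlinearity and sensitive dependence: the logistic map
# x_{t+1} = r * x_t * (1 - x_t). A tiny difference in the starting value
# grows into a completely different trajectory.

r = 3.9                      # parameter in the chaotic regime (assumption)
x, y = 0.5, 0.5 + 1e-9       # two almost identical initial states

for t in range(1, 41):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if t % 10 == 0:
        print(f"t={t:2d}  x={x:.6f}  y={y:.6f}  |x-y|={abs(x - y):.2e}")
```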

(4) Complex <> complicated (for more details, see one of the subsequent CQuickies)
Complex systems aren't the same as complicated systems. For the latter, there are usually more or less simple / simplified and replicable solutions. The former lack such solutions, but there are exceptions to this rule (see, for example, relatively simple swarm optimization algorithms that are able to simulate the complex collective behavior of natural / artificial swarms [Bonabeau / Dorigo / Theraulaz 1999; Dorigo / Birattari 2007]).
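As a hedged illustration of such a relatively simple swarm algorithm, here is a minimal particle swarm optimization sketch. It is not one of the specific algorithms from the cited works, and all parameter values are illustrative assumptions.

```python
# Minimal particle swarm optimization sketch: each particle follows simple
# local rules (pull toward its own best and the swarm's best position),
# yet the swarm as a whole finds the optimum.
import random

def f(x):                        # toy objective: minimum at x = 3
    return (x - 3.0) ** 2

n, w, c1, c2 = 20, 0.7, 1.5, 1.5  # swarm size, inertia, pull strengths
pos = [random.uniform(-10, 10) for _ in range(n)]
vel = [0.0] * n
best = pos[:]                    # each particle's personal best
gbest = min(pos, key=f)          # swarm-wide best

for _ in range(100):
    for i in range(n):
        vel[i] = (w * vel[i]
                  + c1 * random.random() * (best[i] - pos[i])
                  + c2 * random.random() * (gbest - pos[i]))
        pos[i] += vel[i]
        if f(pos[i]) < f(best[i]):
            best[i] = pos[i]
    gbest = min(best, key=f)

print(f"best x = {gbest:.4f}, f(x) = {f(gbest):.6f}")
```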

(5) Interactional and evolutionary dynamics
Complex systems aren't static but exhibit rich and dynamic interactions. Furthermore, they are subject to evolutionary processes, that is: variations (mutations), selections, and retentions.
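The variation-selection-retention cycle can be sketched in a few lines. The following toy evolutionary loop (fitness function and parameters are illustrative assumptions) shows all three steps:

```python
# Minimal sketch of an evolutionary cycle: variation (mutation),
# selection (keep the fitter individuals), retention (carry them over).
import random

def fitness(x):                  # toy fitness: peak at x = 5 (assumption)
    return -(x - 5.0) ** 2

population = [random.uniform(-10, 10) for _ in range(20)]
for gen in range(50):
    # variation: each individual produces a mutated offspring
    offspring = [x + random.gauss(0, 0.5) for x in population]
    # selection + retention: keep the best half of parents and offspring
    pool = sorted(population + offspring, key=fitness, reverse=True)
    population = pool[:20]

print(f"best individual = {population[0]:.3f}")
```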

(6) Unpredictability
Because of their self-referential indeterminacy, nonlinearity, and dynamic qualities, complex systems are observed as unpredictable, but not as chaotic in the sense of deterministic chaos [see Wikipedia 2016o]; the logistic map sketch under (3) illustrates this sensitive dependence.

(7) Emergence (for more details, see the subsequent regular blog post)
Complex systems are characterized by emergent properties that can't be reduced to the individual qualities of their elements. However, if such a reduction turns out to be possible, the supposedly emergent properties merely resulted from a lack of knowledge. So, emergence is an observer-dependent phenomenon that might vary with the degree of knowledge of language-based observers.
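A well-known toy example of emergence is Conway's Game of Life: the rules only mention local neighbor counts, yet patterns such as the "glider" move coherently across the grid, a property not visible in any single cell's rule. A minimal sketch:

```python
# Minimal sketch of emergence: Conway's Game of Life. A live cell survives
# with 2 or 3 live neighbors; a dead cell becomes alive with exactly 3.

glider = {(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)}   # classic glider cells

def step(live):
    counts = {}
    for (r, c) in live:
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr or dc:
                    counts[(r + dr, c + dc)] = counts.get((r + dr, c + dc), 0) + 1
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

cells = glider
for t in range(4):
    cells = step(cells)
print(sorted(cells))   # after 4 steps the glider has shifted one cell diagonally
```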

(8) Memory-based adaptation / learning (for more details, see one of the subsequent regular blog posts)
Memory functions enable complex systems to learn and adapt. But the more sophisticated these memory functions become in higher-developed complex systems, the less these systems have to adapt immediately to perturbations in their environments. So, higher complex systems essentially have three options (see the sketch after this list):
Option 1: Learn by adapting to the memory-based constructions that react to perturbations attributed to the environment or to systems in the environment.
Option 2: Wait and see what happens next.
Option 3: Don't learn or adapt. Instead, expect the environment or the systems in the environment to change.
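As a rough illustration of option 1 versus option 2 (option 3 appears only as a comment), here is a minimal sketch of memory-buffered adaptation. The smoothing factor, threshold, and environment shift are illustrative assumptions.

```python
# Minimal sketch of memory-buffered adaptation: an agent keeps an internal
# model (a running average as "memory") and only adapts when a perturbation
# deviates strongly from what it remembers; otherwise it waits and sees.
import random

memory, alpha, threshold = 0.0, 0.2, 2.0

for t in range(20):
    signal = random.gauss(0, 1) + (5 if t > 12 else 0)  # environment shifts at t=13
    if abs(signal - memory) > threshold:
        memory += alpha * (signal - memory)             # option 1: adapt via memory
        action = "adapt"
    else:
        action = "wait"                                 # option 2: wait and see
    # option 3 (not modeled): keep the memory fixed and expect the
    # environment to change back instead.
    print(f"t={t:2d}  signal={signal:6.2f}  memory={memory:5.2f}  {action}")
```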

(9) Other aspects of complex systems
Distributedness instead of central control or centralized coordination, flexibility, robustness, etc.

Given these characteristics of complex (especially artificial, biological, psychic, and social) systems, the God complex is often a sure recipe for failure or even disaster when dealing with complex systems and situations. But there's more to it: the God complex, with its claim of infallibility, represents just the tip of the iceberg of informational distortions. Underneath it, there are many biases that can distort information-processing and decision-making activities [see Wikipedia 2016p]. For example:

  • Confirmation bias: the selective processing of information so that one's preconceptions are confirmed, and the discarding of information that might contradict them [see, for instance, Nickerson 1998 on the pervasiveness of this bias].
  • Overconfidence bias:

    […] a well-established bias in which a person’s subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. [Wikipedia 2016q]

One main benefit of many of these biases might be that they act as complexity-reducing mechanisms: they can reduce uncertainty, ambivalence, etc. and, by doing so, calm the mind (→ less anxiety, worries, uneasiness, etc.).
But the big disadvantage of these biases is that they can also foster feasibility manias and interventionist illusions of control. As a result, projects, policies, etc. regarding complex systems are likely to fail – and, in a worst-case scenario, people get killed.

These cognitive biases are quite pervasive. And no one is exempt from them. So, what can we do?

  • Being humble vis-à-vis complex systems and situations is a good start. But it's not enough.
  • Being aware that such biases exist is a good start, too. But it's not enough.
  • A more systematic debiasing training for the handling of complex systems and situations is probably necessary (planning games, simulations, fail-fast and fail-safe experiments, etc.) so as to minimize or even prevent the harm done by systematic biases and inadequate strategies of complexity management.
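As a toy example of what such a debiasing exercise could look like (an illustrative sketch, not an established training protocol), the following simulation checks how often a forecaster's "90% confident" intervals actually contain the outcome. Overconfident, too-narrow intervals cover far less than 90% – exactly the kind of feedback a calibration training could provide.

```python
# Toy calibration check: a forecaster states 90% confidence intervals that
# are too narrow (overconfidence). How often do they actually cover the
# true outcome? All spreads are illustrative assumptions.
import random

true_sd = 10.0          # real spread of outcomes
stated_sd = 4.0         # forecaster's overconfident (too small) spread
z90 = 1.645             # half-width multiplier for a 90% normal interval

hits, trials = 0, 10_000
for _ in range(trials):
    outcome = random.gauss(0, true_sd)
    if abs(outcome) <= z90 * stated_sd:
        hits += 1
print(f"claimed coverage: 90%  actual coverage: {100 * hits / trials:.1f}%")
```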

So, what are your thoughts on this?

For some more adequate strategies of complexity management, see the subsequent CQuickie Dealing with Complex Systems – Possible Strategies.

References 

[Bonabeau / Dorigo / Theraulaz 1999] Bonabeau, E. / Dorigo, M. / Theraulaz, G. (1999), Swarm Intelligence. From Natural to Artificial Systems, New York / Oxford: Oxford University Press.

[Dorigo / Birattari 2007] Dorigo, M. / Birattari, M. (2007), Swarm intelligence, in: Scholarpedia, 2(9):1462.
URL: http://www.scholarpedia.org/article/Swarm_intelligence [accessed May 31, 2016].

[Nickerson 1998] Nickerson, R. S. (1998), Confirmation Bias: A Ubiquitous Phenomenon in Many Guises, in: Review of General Psychology, vol. 2, no. 2, 175-220.

[Wikipedia 2016o] Wikipedia (2016o), Chaos theory,
URL: https://en.wikipedia.org/wiki/Chaos_theory [accessed May 31, 2016].

[Wikipedia 2016p] – (2016p), List of cognitive biases,
URL: https://en.wikipedia.org/wiki/List_of_cognitive_biases [accessed June 1, 2016].

[Wikipedia 2016q] – (2016q), Overconfidence effect,
URL: https://en.wikipedia.org/wiki/Overconfidence_effect [accessed June 1, 2016].
