Category Archives: Complexity Science

Review – Dirk Baecker (2013), “Beobachter unter sich: Eine Kulturtheorie”

(1) Why should one read this book by Dirk Baecker?

(1a) In general
In my view, Dirk Baecker's texts, especially the more recent ones, make for exciting reading for two reasons:

  • Reason 1: While the initial project was to rewrite – and thereby sharpen – Bielefeld systems theory on the basis of Spencer Brown's calculus of forms, Baecker has for some years now been working on an alternative theoretical program. This program drops Niklas Luhmann's minimally ontologized system premise, “There are (self-referential social) systems”, in favor of form-based operations of observation (see [1]).
  • Reason 2: These form-based and abstract operations of observation promise a connection to complexity research at large, because the following question now opens up:
    Can a generalized concept of observation be used not only for consciousness or the social (interactional settings, organizations, etc.), but also for non-meaning-based dynamics on the machine, biological, and physico-chemical levels (for a study in this direction, see [3])?
    The follow-up question would then be: If this generalization of observation is possible, could it help to reduce the discipline-specific communication barriers that, in my opinion, exist in complexity research?
    Put differently: Could a form-based lingua franca be used to improve transdisciplinary communication in complexity research?

(1b) This book in particular
I would regard this text as a culmination of the years of work on a social calculus in the wake of George Spencer Brown's “Laws of Form”. Here, a generalized concept of observation is short-circuited with a corresponding understanding of culture.
In short: Anyone who wants to discuss Dirk Baecker's social calculus today can hardly avoid this reference text. Other (more general) texts on Baecker's social calculus can rather be regarded as theoretical preliminary work.

(2) How should one approach this book?
This highly ambitious academic text presupposes knowledge in at least the following areas:

  • The original calculus of forms by George Spencer Brown (helpful here: [5, 6])
  • Baecker's reading of it as a social calculus, see e.g. [2, 4].
  • Luhmann's systems theory since its autopoietic turn, because it serves, as it were, as the contrasting foil for Baecker's theoretical program. [7] offers a good entry point here.

Reading is further eased by some knowledge of

  • German Idealism (Fichte, Hegel, etc.),
  • post-structuralism (Derrida, etc.),
  • Gotthard Günther's kenogrammatic and polycontextural approach, as well as a basic acquaintance with Harrison C. White's network approach.

Reading this academic text is thus quite demanding. The first 60 pages in particular roll out such heavy form-theoretical artillery that one has to swallow hard at first. However, these pages can also be skipped initially and (repeatedly) returned to later.

(3) What is new?
Readers already familiar with the sociological systems and form theory of Luhmann / Baecker will find much that is familiar. However, the form-based theory of the observer is developed here in a remarkably profound and consistent way, in the following steps:

  • The book opens with a detailed presentation of the formal foundations.
  • Then the prehistory of the concept of the observer in the philosophy of the subject (Kant and the German Idealists) is recounted.
  • The third step consists in generalizing the observer beyond the classical subject, so that, besides consciousness and the social, other self-organizing dynamics of a cerebral, cellular, machine-based, and other biological kind (keyword: swarms) can now be studied as possible observers.
    Non-conscious and non-social observers have, of course, hardly been explored in form-theoretical terms so far, so Baecker extends an invitation here to attempt corresponding studies along the lines of [3].
  • The fourth step is a thorough treatment of the problem of negation (as implication, conflict, etc.) within the framework of the formal-abstract theory of the observer.
  • Finally, the concept of culture, which is often used only as a diffuse expression (see [8]), is characterized and applied in terms of form and observation theory.

(4) Conclusion
Beobachter unter sich: Eine Kulturtheorie is an ambitious, fascinating, and virtuoso reference text for anyone who wants to engage with Dirk Baecker's social calculus and his understanding of culture. The hurdles to understanding are at times quite high, so some readers will probably founder on the book; in that case, the detour via the reading aids mentioned above is worthwhile.

PS –
Should anyone notice that this review resembles a corresponding Amazon review of Baecker's book… In that case I have allowed myself to plagiarize myself a little. Honi soit… 🙂

(5) Editions
Hardcover and Kindle editions are available, for example, from Amazon at the following link:
Beobachter unter sich

(6) References
[1] Baecker, D. (2015c), Es gibt keine sozialen Systeme, Verhandlungen der Kongresse der Deutschen Gesellschaft für Soziologie, April 2015 (preprint: 2014).
URL of the preprint: Es gibt keine sozialen Systeme [accessed June 5, 2016].
[2] – (2015b), Working the Form: George Spencer-Brown and the Mark of Distinction.
URL (a German version is also available): Working the Form [accessed May 26, 2016].
[3] – (2014), Neurosoziologie: Ein Versuch, Berlin: edition unseld.
[4] – (ed.) (1993), Kalkül der Form, Frankfurt / M.: Suhrkamp.
[5] Lau, F. (2012), Die Form der Paradoxie. Eine Einführung in die Mathematik und Philosophie der “Laws of Form” von George Spencer Brown, Heidelberg: Carl Auer.
[6] Schönwälder-Kuntze, T. / Wille, K. / Hölscher, T. (2009, 2nd ed.), George Spencer Brown. Eine Einführung in die “Laws of Form”, Wiesbaden: VS Verlag für Sozialwissenschaften.
[7] Luhmann, N. (2011, 6th ed.), Einführung in die Systemtheorie, Heidelberg: Carl Auer.
[8] Fuchs, P. (2016), Was fangen wir nur mit Kultur an?,
URL: Was fangen wir nur mit Kultur an? [accessed June 5, 2016].


CQuickie: Dealing with Complex Systems – The “God complex” and Other Biases

Note:
ComplexiQuickies, or CQuickies for short, are quickly written idea sketches. They are related to the subject of complexity but, in contrast to regular blog posts on complexity research, they draw on little or no scientific literature for support.


In this first CQuickie on dealing with complex systems, I'd like to reflect on what Tim Harford says about the God complex (Archie Cochrane) in his brilliant 2011 TED talk Trial, error and the God complex.
The God complex is the conviction that one is infallible. Or, as Tim Harford describes it in plain English:

In my own little world, I´m a god. I understand everything. I do not want to have my opinions challenged. I do not want to have my conclusions tested.

So, what happens when this attitude is applied to complex systems?
Before we try to answer this question, let´s recall some of the reasons (= features of complexity understood as non-essentialist problem concepts, see The Scalability Problem | Valonqua) why complex systems tend to overwhelm language-based observers:

(1) Lack of information
Complex systems can't be fully understood. And even when there are (less complex) models, interpretations, etc., these artifacts may be coupled in ambivalent or conflicting ways.

(2) Self-reference
Self-referential qualities (recursions / feedback loops) can be observed in complex systems. This leads to indeterminacy based on recursive causality.

(3) Nonlinearity
Linear cause-and-effect attributions fail because there are too many causes, too many effects, and too many recursions. So, small causes can have large effects or the other way around (see the toy example after this list).

(4) Complex ≠ complicated (for more details, see one of the subsequent CQuickies)
Complex systems aren't the same as complicated systems. For the latter, there are usually more or less simple / simplified and replicable solutions. The former lack such solutions, although there are exceptions to this rule (see, for example, relatively simple swarm optimization algorithms that are able to simulate complex collective behavior of natural / artificial swarms [Bonabeau / Dorigo / Theraulaz 1999; Dorigo / Birattari 2007]).

(5) Interactional and evolutionary dynamics
Complex systems aren't static, but exhibit rich and dynamic interactions. Moreover, they are subject to evolutionary processes, that is, to variation (mutation), selection, and retention.

(6) Unpredictability
Because of their self-referential indeterminacy, nonlinearity, and dynamic qualities, complex systems are observed as unpredictable – which is not the same as chaotic in the sense of deterministic chaos (see Wikipedia 2016o).

(7) Emergence (for more details, see the subsequent regular blog post)
Complex systems are characterized by emergent properties that can't be reduced to the individual qualities of their elements. If such a reduction turns out to be possible after all, the supposedly emergent properties merely reflected a lack of knowledge. So, emergence is an observer-dependent phenomenon that can vary with the degree of knowledge of language-based observers.

(8) Memory-based adaptation / learning (for more details, see one of the subsequent regular blog posts)
Memory functions enable complex systems to learn and adapt. But the more sophisticated these memory functions become in more highly developed complex systems, the less such systems have to adapt immediately to perturbations in their environments. These systems then have essentially three options:
Option 1: Learn by adapting to the memory-based constructions that react to perturbations attributed to the environment or to systems in the environment.
Option 2: Wait and see what happens next.
Option 3: Don´t learn or adapt. Instead, expect the environment or the systems in the environment to change.

(9) Other aspects of complex systems
Distributedness instead of central control or centralized coordination, flexibility, robustness, etc.
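
To make features (3) and (6) a bit more tangible, here is a minimal sketch (my illustration, not taken from the literature cited in this post) using the logistic map, a textbook example of a simple nonlinear recursion: two starting values that differ by one part in a billion end up on completely different trajectories after a few dozen steps.

```python
# Minimal sketch: sensitive dependence in a simple nonlinear recursion (logistic map).
# Two trajectories that start almost identically diverge completely, which illustrates
# "small causes, large effects" and the practical unpredictability of such dynamics.

def logistic(x, r=4.0):
    """One step of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1.0 - x)

x, y = 0.200000000, 0.200000001   # initial conditions differing by 1e-9
for step in range(1, 51):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: x = {x:.6f}, y = {y:.6f}, |x - y| = {abs(x - y):.6f}")
```

Note that the logistic map at r = 4.0 is an instance of deterministic chaos; the point here is only the sensitivity of a simple nonlinear recursion to tiny differences, not a claim that every complex system is chaotic in this sense.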

Given these characteristics of complex – especially artificial, biological, psychic, and social – systems, the God complex is often a sure recipe for failure or even disaster when dealing with complex systems and situations. But there's more to it, because the God complex with its claim to infallibility is just the tip of the iceberg of informational distortions. Underneath it, there are many biases that can distort information-processing and decision-making activities [see Wikipedia 2016p]. For example:

  • Confirmation bias: the selective processing of information so that one's preconceptions are confirmed, combined with the discarding of information that might contradict them [see, for instance, Nickerson 1998 on the pervasiveness of this kind of bias].
  • Overconfidence bias:

    […] a well-established bias in which a person’s subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. [Wikipedia 2016q]

One main benefit of many of these biases might be that they act as complexity-reducing mechanisms: they can reduce uncertainty, ambivalence, etc. and, by doing so, calm the mind (-> less anxiety, worry, uneasiness, etc.).
But the big disadvantage of these biases is that they can also foster feasibility manias and interventionist illusions of control. As a result, projects, policies, etc. regarding complex systems are likely to fail – and, in a worst-case scenario, people get killed.

These cognitive biases are quite pervasive. And no one is exempt from them. So, what can we do?

  • Being humble vis-à-vis complex systems and situations is a good start. But, it´s not enough.
  • Being aware that such biases exist is a good start, too. But, it´s not enough.
  • A more systematic debiasing training in the handling of complex systems and situations is probably necessary (planning games, simulations, fail-fast and fail-safe experiments, etc.; see the toy sketch after this list) so as to minimize or even prevent the harm done by systematic biases and inadequate strategies of complexity management.
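
As a toy illustration of what such a training exercise might look like (a sketch under made-up assumptions, not a description of an actual training program), the following simulation shows the overconfidence effect quoted above: a forecaster who claims "90% confidence" but states intervals that are too narrow is hit far less often than claimed.

```python
# Toy sketch of a calibration exercise (a simple form of debiasing training).
# All numbers are illustrative assumptions, not empirical values.
import random

random.seed(42)
TRIALS = 10_000
error_sd = 10.0          # assumed spread of the forecaster's estimation error
stated_halfwidth = 8.0   # stated "90%" interval; calibration would need ~1.645 * error_sd ≈ 16.4

hits = 0
for _ in range(TRIALS):
    truth = random.gauss(0.0, 25.0)                  # the unknown quantity
    estimate = truth + random.gauss(0.0, error_sd)   # a noisy point estimate
    if abs(truth - estimate) <= stated_halfwidth:
        hits += 1

print(f"claimed confidence: 90%, actual hit rate: {hits / TRIALS:.1%}")
```

Running the sketch yields a hit rate of roughly 58% against the claimed 90% – exactly the kind of gap that calibration feedback is meant to make visible.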

So, what are your thoughts on this?

For some more adequate strategies of complexity management, see the subsequent CQuickie Dealing with Complex Systems – Possible Strategies.

References 

[Bonabeau / Dorigo / Theraulaz 1999] Bonabeau, E. / Dorigo, M. / Theraulaz, G. (1999), Swarm Intelligence. From Natural to Artificial Systems, New York / Oxford: Oxford University Press.

[Dorigo / Birattari 2007] Dorigo, M. / Birattari, M. (2007), Swarm intelligence,  in: Scholarpedia, 2(9):1462.
URL: http://www.scholarpedia.org/article/Swarm_intelligence  [accessed May 31, 2016].

[Nickerson 1998] Nickerson, R.S. (1998), Confirmation Bias: A Ubiquitous Phenomenon in Many Guises, in: Review of General Psychology (1998), vol. 2, no. 2, 175-220.
URL: Confirmation Bias [accessed June 1, 2016].

[Wikipedia 2016o] Wikipedia (2016o), Chaos theory,
URL: Chaos theory [accessed May 31, 2016].

[Wikipedia 2016p] – (2016p), List of cognitive biases,
URL: List of Cognitive Biases   [accessed June 1, 2016].

[Wikipedia 2016q] – (2016q), Overconfidence effect,
URL: Overconfidence Bias [accessed June 1, 2016].

Subproject 1: The Paradigm of (Social) Complexity – Part II-2d: System Formation and Maintenance – The Distinction “Element / Relation”

System / environment and element / relation have largely supplanted the much older guiding distinction whole / parts when discussing the formation of a complex system.
I´ve presented the first of these more modern distinctions in the previous blog post [The Distinction “System / Environment” | Valonqua]. The second guiding distinction, element / relation, is the subject of this blog post.

(1) Bottom-up or top-down?

(1a) A bottom-up perspective seems to be prevalent in complexity, esp. complex adaptive systems (CAS) research where simple or even complex elements are sometimes seen as (ontologically) pre-given. The (dynamic) interactions / relations between these elements  are then crucial for a complex system to pop up (see the subsequent blog post on emergence):

Social agents, whether they are bees or people or robots, find themselves enmeshed in a web of connections with one another and, through a variety of adaptive  processes, they must successfully navigate through their world. Social agents interact with one another via connections. These connections can be relatively simple and stable, such as those that bind together a family, or complicated and ever changing, such as those that link traders in a marketplace.  [Miller / Page 2007, 10]

(1b) In contrast, from a difference-based perspective [see, for instance, Cilliers 1998; Luhmann 1995], the assumption of elements as (ontologically or not) preexistent no longer makes sense. The main reason is that elements and relations are seen as differences:

Just as there are no systems without environments or environments without systems, there are no elements without relational connections or relations without elements. In both cases the difference is a unity (in fact, we say “the” difference), but it operates only as a difference. Only as a difference can it connect processes of information processing. [Luhmann 1995, 20]

But, this perspective can be specified in various ways. For example:

  • as co-constitution in a connectionist network [Cilliers 1998] or
  • from a top-down perspective [Luhmann 1995], as constructs of the emerging (in this case: social) system. In other words, it´s up to the emerging (social) system to determine what is and what isn´t an element / a relation of the system:

If one were to ask what elements (e. g., atoms, cells, or actions) “are,” one would always come upon highly complex facts that must be attributed to the system’s environment. Then an element would be what functions for a system as a unity that cannot be further dissolved (even if, viewed microscopically, it is a highly complex compound). When one says “cannot be further dissolved,” this also means that a system can constitute and change itself only by interrelating its elements, not by dissolving and reorganizing them.  [Luhmann 1995, 22 – my emphasis]

So, the main difference between these bottom-up and top-down perspectives regarding the conceptualization of the social (here understood as the coordination of human behavior) is:

  • In a  bottom-up approach à la CAS, people (humans, individuals, agents, etc.) represent (social) components of an emerging complex adaptive system such as an organization.
  • In Luhmann´s systems theoretical top-down approach, a human individual is interpreted  as  a kind of agglomeration of various (biological, neural, cerebral, etc.) systems. Each of these systems follows its own operational mode. But, some of them are structurally coupled so that they are able  to perturb each other in a co-evolutionary drift.
    The  structural coupling by means of nonverbal, oral, written, etc. media forms (= perturbation mechanisms) is then seen as crucial for the co-evolution of two or more consciousness systems processing thoughts and the social systems processing communications.

In this context, the social dimension refers to human behavior, action, and communication. But, the underlying coordination problem is much more general and has to be solved by

  • artificial entities such as virtual agents in simulations, artificial neural networks, robot swarms, etc.

and

  • all kinds of biological entities [biological neural networks, fungi (Witzany 2012), plants (McGowan 2013; Witzany / Baluska 2012), bacteria (perhaps even viruses, see Hamzelou 2010), social insects, and more complex animals (Witzany 2014), including humans].

 

As this is an important topic, I'd like to dedicate several blog posts to this problem and its entity- and media-specific communication solutions.

(2) Large number of elements with rich and dynamic interactions
According to [Villiers-Botha / Cilliers 2010, pp. 27-28;  Cilliers 1998, pp. 3-4], complex systems consist of a large number of elements.
But, such a vast number isn´t enough. The quality, i.e. the dynamics and richness, of the relations / interactions between these elements is crucial, too. That is:

  • Dynamic means that there are operations [such as self-organizational (order-from-noise or order-from-chaos) processes, autopoiesis, conditioned coproduction, etc.] going on that affect the elements of the complex system. In contrast, a static system in which nothing happens (for example, a mathematical set of polygons, see Wikipedia 2016h) isn't a complex system.
  • Rich refers to the fact that elements can somehow (physically, informationally, etc.) influence many other, but not all elements of a complex system.

This perspective of elements and relations / interactions implies a few important points:

(2a) Various types of complexity: 

  • If each and every element of a system with a vast number of elements were connected to every other element, the result would be an incomprehensible complexity that overwhelms any observer of the system (which could be the system itself). This kind of complexity resembles a theoretical horizon that a viable complex system can never reach (for a small combinatorial illustration, see the sketch after this list).
    If we combine this theoretical horizon with the distinction system / environment, we can differentiate between an incomprehensible system complexity and an incomprehensible environmental complexity. These are limit concepts that can't be further specified. The only thing we can deduce is that there's a complexity differential between system and environmental complexity, because the former is always less than the latter. And this complexity differential applies to incomprehensible complexity, too.
  • A reduced level of complexity is necessary for the formation of a viable complex system. That is, some elements of the system are connected to many other, but not to all, elements of the system:

Complexity, in this sense, means being forced to select; being forced to select means contingency; and contingency means risk. [Luhmann 1995, 25]

In short, system formation equals organized complexity (Warren Weaver) or structured complexity (Niklas Luhmann) where the following equation applies:

reduction = selection = contingency (i.e., neither impossibility nor necessity) = risk

  • Note:
    – This understanding of structured or organized complexity refers to both system (internal) and environmental (external) complexity. And this means that the structured environmental complexity, which exceeds the complexity of the system, is a construction of the system itself.
    – Further, a selection can be interpreted as an operation that results from the differential of structured environmental and structured system complexity.
    This applies especially to complex systems that operate under extreme time pressure, such as human interaction systems, where actions and reactions usually occur in the blink of an eye. If it took hours or days to coordinate human behavior in face-to-face encounters, such a social system would collapse sooner rather than later.
    In a strict sense, complexity can´t be directly observed – it´s incomprehensible. So, the formation of a complex system is always already engaged in the process of reducing internal and external complexities, which leads to system-dependent structured system and environmental  complexities.
    But, reduction processes by means of selective and contingent abstractions / specializations, functional or other system differentiations, etc. are only one side of the coin.
    The other side of the coin is the possible increase of system-dependent internal and external complexities. Example: When a (human) organization grows in size (that is, it has more specialists, more functional departments, more divisional units, more branches, etc.), it constructs a more specific system complexity. By doing so, it can also handle more structured environmental complexity. This means, for example, more contacts with its public, its customers, or members of other organizations.
    In sum: It´s not only about the reduction of complexity, but also about the increase of a more specific internal and external (structured) complexity.
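
As a purely arithmetical aside (my illustration, not part of the cited texts), the combinatorics behind the first bullet point show why full connectivity is a theoretical horizon rather than an achievable state: the number of possible pairwise relations among n elements grows quadratically, and the number of possible relational patterns grows exponentially in that number,

\[
\text{pairwise relations: } \binom{n}{2} = \frac{n(n-1)}{2},
\qquad
\text{possible relational patterns: } 2^{\binom{n}{2}} .
\]

Already for n = 100 elements this gives 4950 possible links and \(2^{4950} \approx 10^{1490}\) possible patterns – far beyond what any observer (including the system itself) could process.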

(2b) Other qualities of dynamic and rich interactions

  • Self-reference / recursion / feedback loops lead to unpredictability

[…] there are systems that have the ability to establish relations with themselves and to differentiate these relations from relations with their environment. [Luhmann 1995, 15]

The effect of any activity can feed back onto itself, sometimes directly, sometimes after a number of intervening stages. This feedback can be positive (enhancing, stimulating) or negative (detracting, inhibiting). Both kinds are necessary. [Cilliers 1998, 4]

Or, to put it a bit differently: An observer can assume self-referential operations (recursions or negative / positive feedback loops) in some complex systems (for a minimal numerical illustration of negative vs. positive feedback, see the sketch after this list).
Example: The re-entrant use of the distinction system / environment within consciousness systems or social systems. This re-entry of the distinction on the side of the system creates a paradoxical indeterminacy for tertium-non-datur observers, i.e. observers operating with binary logical principles (the laws of identity, non-contradiction, and excluded middle, see Wikipedia 2016m), because the system is, at the same time, what it is (system = system) and what it is not (system = environment).
This means, for instance,  in the case of:
– consciousness systems: thinking (thinking as self-referential operation / thinking about something as other-referential operation),
– communication (i.e. social) systems such as organizations: communicating (communicating as self-referential operation / communicating about something as other-referential operation).
Alternative general wordings could be:
– self-reference = self-reference and self-reference = other-reference, in short: self-reference (self-reference / other-reference)
or
– inside (inside / outside)
or, in a very general differential sense: the paradoxical re-entry of the two-sided form (marked state / unmarked state) within itself [see Baecker 2012a, 2015b; Wikipedia 2016n].
The main consequence of such self-referential or recursive somersaults is: unpredictability.

  • Nonlinearity
    Elements usually influence their immediate neighbors so that interactions are often short-range. But, mediated by other elements that can modify the influences in various ways (suppression, intensification, etc.), long-range influences are possible, too [see Villiers-Botha / Cilliers 2010, 28].
    Because of this and the recursive causality mentioned above, small causes can have large effects and vice versa. This means for complex systems that can be observed as self-referential: There are too many causes, too many effects, and too many recursions so that simple causal attributions fail! 
  • Conditioning of relations

Systems are not merely relations (in the plural!) among elements. The connections among relations must also somehow be regulated. This regulation employs the basic form of conditioning. That is to say, a determinate relation among elements is realized only under the condition that something else is or is not the case. […]
Conditioning can also concern the availability of specific elements, the presence of catalytic agents, or the realization of higher-level relations among relations. [Luhmann 1995, 23 – my emphasis]

If such contingent conditionings are successful, they act as constraints , i.e. as restrictions and enabling conditions [cf. Luhmann 1995, 23].
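
As a minimal numerical illustration of the feedback quote above (my sketch, not taken from the cited literature), consider the simplest possible recursion in which the effect of an activity feeds back onto itself: a negative (detracting, inhibiting) gain pulls the value back toward zero, while a positive (enhancing, stimulating) gain drives it away.

```python
# Minimal sketch: negative vs. positive feedback in the recursion x_{t+1} = x_t + gain * x_t.
def run(gain, x0=1.0, steps=6):
    xs, x = [x0], x0
    for _ in range(steps):
        x = x + gain * x       # the effect of x feeds back onto x itself
        xs.append(round(x, 3))
    return xs

print("negative feedback (gain = -0.5):", run(-0.5))  # damps toward 0
print("positive feedback (gain = +0.5):", run(+0.5))  # amplifies without bound
```

Real complex systems combine many such loops, which is precisely why their overall behavior becomes unpredictable even though each single loop is trivial.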

(2c) Complexity understood as lack of information

When the number is relatively small, the behaviour of the elements can often be given a formal description in conventional terms. However, when the number becomes sufficiently large, conventional means (e.g. a system of differential equations) not only become impractical, they also cease  to assist in any understanding of the system. [Cilliers 1998, 3]

In other words, an observer, who could be the system itself, lacks the information to fully understand a complex system, i.e. structured system complexity,  or its environment, i.e. structured environmental complexity. As a result, an observer is always already overwhelmed by these complexities.
But, language-based observers are able to problematize their lack of knowledge and re-introduce it in language-based systems (families, organizations, etc.), for example, as a concept, as an unknown quantity, as uncertainty, risk, anxiety, etc. [see Luhmann 1995, 28].

On a final note, I'd like to mention that the following blog posts won't continue this series of complexity features. Rather, they're blog posts hors série called ComplexiQuickies. These quickies are related to the subject of complexity, but they act as quickly written and open idea sketches.
So, the first ComplexiQuickie discusses the following question: How can we deal with complex systems or situations? And the next regular blog post regarding features of complexity discusses the problem of emergence.

References

[Allen / Maguire / McKelvey 2011]  Allen, P.  / Maguire, S. / McKelvey, B. (eds.) (2011), The SAGE Handbook of Complexity and Management, Los Angeles et al.: SAGE.

[Baecker 2012a] Baecker, D. (2012a),  Aristotle and George Spencer-Brown, URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2073361 [accessed Sept 18, 2015].

[Baecker 2015b] – (2015b),  Working the Form: George Spencer-Brown and the Mark of Distinction*.
URL: Working the Form [accessed May 26, 2016].

[Cilliers 2010] Cilliers, P. (2010), Difference, Identity and Complexity, in: [Cilliers / Preiser 2010], 3-18.

[Cilliers 1998] – (1998), Complexity and Postmodernism: Understanding Complex Systems, London / New York: Routledge.

[Cilliers / Preiser 2010] – / R. Preiser (eds.) (2010), Complexity, Difference and Identity. An Ethical Perspective, Dordrecht et al.: Springer.

[Hamzelou  2010] Hamzelou, J.(2010), Viruses use ‘hive intelligence’ to focus their attack, in: New Scientist (Jan. 21, 2010),
URL:  https://www.newscientist.com/article/dn18423-viruses-use-hive-intelligence-to-focus-their-attack  [accessed May 24, 2016].

[Luhmann 1995] Luhmann, N. (1995), Social Systems, Stanford California: Stanford University Press.

[McGowan 2013] McGowan, K. (2013), How plants secretly talk to each other, in: Wired (Dec. 2013).
URL: http://www.wired.com/2013/12/secret-language-of-plants/ [accessed May 26, 2016].

[Mesjasz 2010]  Mesjasz, C.  (2010), Complexity of Social Systems, in: Acta Physica Polonica (2010), vol. 117, no. 4, 706-715.
URL: http://przyrbwn.icm.edu.pl/APP/PDF/117/a117z468.pdf [accessed March 20, 2016].

[Miller / Page 2007] Miller, J.H. / Page, S.E. (2007),  Complex Adaptive Systems. An Introduction to Computational Models of Social Life, Princeton / Oxford: Princeton University Press.

[Mitleton-Kelly 2003] Mitleton-Kelly, E.(2003), Ten principles of complexity and enabling infrastructures, in: id.  (ed.) Complex Systems and Evolutionary Perspectives on Organisations: the Application of Complexity Theory to Organisations, Oxford, UK: Elsevier, 3-20.
URL:Ten principles of complexity and enabling infrastructures [accessed April 28, 2016].

[Villiers-Botha / Cilliers 2010] Villiers-Botha, T. de / Cilliers, P. (2010),  The Complex “I”: The Formation of Identity in Complex Systems, in: [Cilliers / Preiser 2010], 19-38.

[Wikipedia 2016h] Wikipedia (2016h), Set (mathematics),
URL:https://en.wikipedia.org/wiki/Set_(mathematics) [accessed May 23, 2016].

[Wikipedia 2016l] – (2016l), Self-organization,
URL: https://en.wikipedia.org/wiki/Self-organization [accessed May 23, 2016].

[Wikipedia 2016m] – (2016m), Law of thought,
URL: Law of thought [accessed May 26, 2016].

[Wikipedia 2016n] – (2016n), Laws of Form,
URL: Laws of Form [accessed May 26, 2016].

[Witzany 2014] Witzany, G. (ed.) (2014), Biocommunication of Animals, Dordrecht et al.: Springer.

[Witzany 2012] – (ed.) (2012), Biocommunication of Fungi, Dordrecht: Springer.

[Witzany / Baluska 2012] – / Baluska, F. (eds.) (2012), Biocommunication of Plants, Berlin / Heidelberg: Springer.

Subproject 1: The Paradigm of (Social) Complexity – Part II-2c: System Formation and Maintenance – The Distinction “System / Environment”

As we saw in the previous blog post (The Scalability Problem | Valonqua ), there are several non-essentialist approaches that can be used to replace an essentialist conceptual strategy regarding features of complexity.
One of these non-essentialist approaches is to couple difference-based how-questions (How – that is, by means of which differences – is something, here: a feature XY, created?) with an equivalence functionalist observational schema in which various functionally equivalent solutions are related to a problem [see Knudsen 2010; Luhmann 1995].
As a result, a list of complexity features (elements with rich and dynamic interactions, emergence, nonlinearity, etc.), previously interpreted as essential for an abstract complex system and its empirical manifestations, is transformed into an open-ended list of difference- and problem-based questions – or, in short: into a list of features-as-problem-concepts.
In this and the following blog post/-s, I´d like to discuss some of these features.

Feature System Formation and Maintenance

How is the formation and maintenance of a (complex) system possible?
As I intend to write several blog posts on the idea of systems as differences, the conceptualization of the social dimension, types of (social) systems, etc., I give here just a few hints:
Whole / parts is a  possible distinction (synonym here: difference) to designate a system. But, it´s a very traditional distinction that  can be traced back to Greek antiquity [see σύστημα – Wiktionary].
Although it's still useful in certain, especially technical, contexts, it is often replaced by two more modern distinctions in complexity research: system / environment and element / relation.
The guiding distinction system / environment involves a few crucial questions:

  • System formation
    How is a system able to differentiate itself from its environment?
    – Or, to bring in the observer explicitly: Which explanatory mechanisms can a language-based observer come up  with to account for the formation of a discipline-specific and concrete system?
  • Boundary determination and maintenance
    How can a boundary be determined?

Given the abstract concept of boundary, the concept of the difference between system and environment, one cannot decide whether the boundary belongs to the system or to the environment. Viewed logically, the difference itself is something third. If one includes the problem of the difference in degree of complexity as an aid to interpretation, however, then one can relate boundaries to the function of stabilizing this difference in degree, for which only the system, not the environment, can develop strategies. Viewed from the system’s perspective, they are “self-generated boundaries” –membranes, skins, walls and doors, boundary posts and points of contact.  [Luhmann 1995, p. 29]

Note: A fuzzy boundary, or no boundary at all, won't do, because then there would be no (complex) system whatsoever. And if, for example, the immune system of your body is no longer able to differentiate between the inside and the outside, you'll probably get sick [-> (autoimmune) disease] or even die.
For Luhmann, the determination of the boundary of a complex system is an empirical and not simply an analytical question of an external and language-based observer [see Luhmann 1995, p. 30]:

Boundaries count as adequately determined if problems concerning their location or the assignment of events as being inside or outside of them can be solved using the system’s own means – for example, if an immune system can use its own modes of operation to discriminate, in effect, between internal and external, or if the societal system, which is composed of communications, can decide by communication whether something is communication or not. For a (scientific) observer, where the boundaries lie may still remain analytically unclear, but this does not justify viewing the bounding of systems as a purely analytical determination. (The situation is quite different, naturally, if it is a question of bounding research objects!) An observer interested in reality remains dependent here on the system’s operative possibilities of determination.

How is the boundary dynamically maintained [see Bailey 2008]?
This means that a boundary of a complex system isn´t established once and for all. Instead, such a system has to operate in time. And, while doing so it continually reproduces the boundary between inside and outside, system and environment.
Note: If complex systems are characterized by operational and temporal dynamics, it doesn't make sense to conceptualize media such as language [Cilliers 1998] or technology [Kevin Kelly's concept of the technium comes to mind, see Kelly 2010] as such systems, because they aren't able to operate by themselves or reproduce themselves.
But, this might change as soon as truly self-replicating machines are in (widespread) use [see Wikipedia 2016j; see also the interesting Replicating Rapid-Prototyper  (RepRap) project: http://www.reprap.org/].
Here's a picture of RepRap version 1.0, named Darwin [image: Reprap_Darwin by CharlesC, licensed under CC BY-SA 3.0].

  • And boundary determination / maintenance refers to the differentiation of a (sub-)system (see below) as well:

Next to systems’ constituting their own elements, boundary determination is the most important requirement of system differentiation [Luhmann 1995, 29].

  • Openness and closure of a complex system
    How does a complex system regulate the relationship of openness and closure regarding its environment?
    – A complex system can't be completely open (some kind of maintained boundary is necessary). But it can't be completely hermetic either. So, is a complex system always open for energy, matter, and information? [For the generic options available, see Bailey 2008.]

[Image: OpenSystemRepresentation by Krauss, licensed under CC BY-SA 4.0]

Or, are there, for instance, complex systems that are open for energy and matter, but closed for information? And if such a complex system creates and processes information only within the system, which perturbation mechanisms (besides causal relationships) can be conceptualized so that the environment can somehow influence the system?

  • (Sub-)System differentiation
    How can the difference system / environment be used within the system to form subsystems (for example: science -> scientific disciplines -> scientific subdisciplines -> scientific subsub…disciplines)?
  • Characteristics of the environment of a complex system
    – The environment of a complex system can´t be completely arbitrary or chaotic because such highly unstable conditions would undermine the viability of a complex system.
    – We can distinguish (at least) three types of environment of a complex system:
    -> Environment 1: On this very basic operational level, the environment represents what the system is not. This resembles a kind of flat difference where the environment is undefined or unmarked (George Spencer Brown / Dirk Baecker).
    -> Environment 2: On this second observational level, the environment is a construction of the complex system.
    Example: A complex, esp. language-based system such as a consciousness system or a social system (family, organization, etc.) uses the difference system / environment within itself.
    To put it differently: The difference system / environment is used by means of a self-referential salto within the system. Or, as Bielefeld systems theorists would prefer to say: The difference system / environment re-enters on the side of the system: system (system / environment).
    Note: We´ll come back to this subject when we discuss another feature-as-problem-concept of complex systems, recursivity / self-reference / feedback loops.
    -> Environment 3: Other (complex) systems in the environment of a complex system. This refers to an external observer able to observe the complex system and the (complex) systems (with their respective environments) in the environment of the first system to be studied.
  • Two types of complexity
    With the distinction system / environment, it´s possible to differentiate two types of complexity. One type refers to the environment. Therefore, it`s called environmental complexity.
    This type of complexity has two subtypes:
    -> An unspecified or disorganized environmental complexity that resembles an overwhelming world complexity.
    ->  A specified and organized environmental complexity.
    The first subtype is a hypothetical negative correlate of the complex system. The second subtype is a construct of the system.
    The other (specified and organized) type of complexity is called system complexity. And W. Ross Ashby's Law of Requisite Variety [see Wikipedia 2016k] is applicable to it (a compact statement of the law follows below).
    In this context, two other points are of interest:
    First, there´s an asymmetrical relation between environmental and system complexity. In short, there´s a complexity differential. This means that the environment is always more complex than the complex system itself.

In other words, the difference between system and environment stabilizes the difference in relative degrees of complexity. The relation between system and environment is necessarily asymmetrical. The difference in degree of relative complexity goes in one direction and cannot be reversed. Every system must maintain itself against the overwhelming complexity of its environment, and any success, any permanence, any reproduction makes the environment of all other systems more complex. Given many systems, each evolutionary success increases the difference in complexity for other systems in relation to their environments and thus works selectively on what then remains possible. [Luhmann 1995, 182].

    Second, the internal complexity of a system can be both reduced and increased.
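
For readers who want the law in compact form: a common way of stating Ashby's Law of Requisite Variety (my summary of the standard formulation, see Ashby 1958 and Wikipedia 2016k, not a quotation from Luhmann) is that the variety of outcomes V_E a regulator can still admit is bounded from below by the variety of the disturbances V_D divided by the variety of the regulator V_R:

\[
V_E \;\ge\; \frac{V_D}{V_R}
\qquad\Longleftrightarrow\qquad
\log V_E \;\ge\; \log V_D - \log V_R .
\]

In Ashby's slogan, only variety can destroy variety: a system can only cope with as much structured environmental complexity as its own (structured) system complexity allows.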

For more details on complexity, see the subsequent blog post in which I discuss the other guiding distinction that is relevant in the context of system formation: element / relation.

References

[Allen / Maguire / McKelvey 2011]  Allen, P.  / Maguire, S. / McKelvey, B. (eds.) (2011), The SAGE Handbook of Complexity and Management, Los Angeles et al.: SAGE.

[Ashby 1958] Ashby, R.W. (1958), Requisite variety and its implications for the control of complex systems, in: Cybernetica 1:2, 83-99, republished on the web by Heylighen, F. – Principia Cybernetica Project.
URL: http://pespmc1.vub.ac.be/Books/AshbyReqVar.pdf [accessed May 26, 2016].

[Bailey 2008] Bailey, K. D. (2008), Boundary maintenance in living systems theory and social entropy theory, in: Systems Research and Behavioral Science (2008), vol. 25, issue 6, 587–597.
URL: Boundary maintenance in living systems theory and social entropy theory [accessed  May 10, 2016].

[Cilliers 2010] Cilliers, P. (2010), Difference, Identity and Complexity, in: [Cilliers / Preiser 2010], 3-18.

[Cilliers 1998] – (1998), Complexity and Postmodernism: Understanding Complex Systems, London / New York: Routledge.

[Cilliers / Preiser 2010] – / R. Preiser (eds.) (2010), Complexity, Difference and Identity. An Ethical Perspective, Dordrecht et al.: Springer.

[Kelly 2010] Kelly, K. (2010), What Technology Wants, New York et al.: Penguin.

[Knudsen 2010] Knudsen, M. (2010), Surprised by Method—Functional Method and Systems Theory, in: Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 11(3), Art. 12.
URL: http://www.qualitative-research.net/index.php/fqs/article/view/1556/3067  [accessed April 27, 2016].

[Luhmann 1995] Luhmann, N. (1995), Social Systems, Stanford, Cal.: Stanford University Press.

[Mesjasz 2010]  Mesjasz, C.  (2010), Complexity of Social Systems, in: Acta Physica Polonica (2010), vol. 117, no. 4, 706-715.
URL: http://przyrbwn.icm.edu.pl/APP/PDF/117/a117z468.pdf [accessed March 20, 2016].

[Miller / Page 2007] Miller, J.H. / Page, S.E. (2007),  Complex Adaptive Systems. An Introduction to Computational Models of Social Life, Princeton / Oxford: Princeton University Press.

[Mitleton-Kelly 2003] Mitleton-Kelly, E.(2003), Ten principles of complexity and enabling infrastructures, in: id.  (ed.) Complex Systems and Evolutionary Perspectives on Organisations: the Application of Complexity Theory to Organisations, Oxford, UK: Elsevier, 3-20.
URL:Ten principles of complexity and enabling infrastructures [accessed April 28, 2016].

[Wikipedia 2016i] Wikipedia (2016i), Open system (systems theory),
URL: https://en.wikipedia.org/wiki/Open_system_(systems_theory) [accessed May 4, 2016].

[Wikipedia 2016j] – (2016j), Self-replicating machine,
URL: https://en.wikipedia.org/wiki/Self-replicating_machine  [accessed May 10, 2016].

[Wikipedia 2016k] – (2016k), Variety (cybernetics),
URL: https://en.wikipedia.org/wiki/Variety_(cybernetics) [accessed May 10, 2016].

Subproject 1: The Paradigm of (Social) Complexity – Part II-2b: Features of Complexity – The Scalability Problem

The fact that multiple, perhaps even conflicting, conceptualizations of complex systems are inevitable (see Intro “Complex Systems” | Valonqua) challenges a basic assumption of many complexity researchers [for example, Cilliers 2010; Mitleton-Kelly 2003; Sporns 2007]. Namely, the assumption that a(n) (open-ended) list of general complexity features is applicable to complex systems on different scales and across various scientific disciplines.

Why is this assumption problematic? There are two main problems involved with this assumption:

Problem 1 – The essentialist relationship between the abstract and its concrete manifestations
When we assume that an abstract complex system consisting of some key characteristics (such as rich interactions, nonlinearity, unpredictability, etc.) forms the basis of discipline-specific realizations of complex systems, we follow an essentialist conceptual strategy.
Essentialist often refers to the (traditional) attempt to determine stable identities (that is, the truth, the real meaning, or the essence of something). But it can also refer to the relationship between the transcendental conditions of possibility of something (here: the features of an abstract complex system) and its empirical realizations (here: the concrete complex systems in the natural, technical, and social sciences).
Underlying the essentialist relationship between the transcendental and the empirical is a deterministic view. This means that the transcendental conditions of possibility are able to determine all their empirical (concrete) realizations. Consequently, new concrete phenomena aren't really new, but just controlled variations of what is already known through the transcendental determinations.
This means further that the concrete realizations don´t affect the transcendental conditions of possibility. In short: The transcendental affects the empirical, but not vice versa – at least in a traditional perspective.
Such essentialist conceptual strategies [for more details see, for instance, Bormann 2012, 2003c; Hofstadter 1985] aren´t plausible any more and have been widely criticized since the 1950s. Instead, alternate non-essentialist strategies have been proposed. Examples:

  • Ludwig Wittgenstein´s concept of family resemblance:

It argues that things which could be thought to be connected by one essential common feature may in fact be connected by a series of overlapping similarities, where no one feature is common to all. [Wikipedia 2016e]

This could mean in our context:
First, we have to deal with an open-ended series of complexity features which may be characteristic for some concrete complex systems, but not for others.
Second, we always compare concrete complex systems with each other, but not some abstract entity (the transcendental) with its concrete actualizations (the empirical phenomena).
Example: Instead of presupposing a proto-form (the essence) of character k, we only compare concrete variations of this character, as illustrated by the following figure from [Hofstadter 1985, 275]:

[Figure: variations of the character k, from Hofstadter 1985, 275]

As you can see, these variations are similar in some, but not in other regards. And there isn't a single feature that is common to all variations of the character k.

  • Jacques Derrida´s open-ended series of interrelated quasi-transcendental infrastructures [see Gasché 1986]  such as différance [Wikipedia 2016f] or itérabilité where the relationship between the transcendental and the empirical is intertwined.
    That is: The transcendental underlies the empirical, but is, at the same time, affected by the latter.  Therefore, this non-essentialist relationship becomes unstable, complex,  and circular.
    This means further: In such an open-ended series of quasi-transcendentalities (différance, itérabilité, etc.), we no longer have to deal with essentialist conditions of possibility (simple origins, essences, etc.). Rather, each of these non-concepts can play the role of an empirico-transcendental indecidability – in certain contexts [see Gasché 1986].
    This could mean for complexity research: We can deconstruct some texts on complex systems and see if such empirico-transcendental indecidabilities emerge.
  • Similar to Derrida's deconstructive approach, sociological systems theory (Luhmann et al.) would give the (traditional) guiding distinction transcendental / empirical a self-referential twist.
    That is: Instead of assuming a simple and linear founding principle (here: the transcendental conditions of possibility), the distinction transcendental / empirical is interpreted as re-entering on either side of the distinction: transcendental (transcendental / empirical) or empirical (transcendental / empirical). And this leads to a similar empirico-transcendental indecidability (a paradox for observers using binary logic) as in the deconstructive case.
    In more general terms: Paradoxes are the non-essentialist (non-)beginnings of current universalist approaches. And these indecidabilities have to be unfolded by using various strategies of deparadoxation.
    I´ll talk more about this subject in some of the subsequent blog posts on sociological systems theory.
  • Another option is an equivalence functionalist approach, which uses the observational schema of a problem and several functionally equivalent solutions [see Knudsen 2010; Luhmann 1995]. This means:

If one wants to check the fruitfulness of  generalizations, one must position the concepts used at the most general level of analysis, not as concepts describing possibilities but as concepts formulating problems. Thus general systems theory does not fix the essential features to be found in all systems. Instead, it is formulated in the language of problems and their solutions and at the same time makes clear that there can be different, functionally equivalent solutions for specific problems. Thus a functional abstraction is built into the abstraction of generic forms that guides comparison of different system types  [Luhmann 1995, 15, my emphasis]

This is the non-essentialist strategy I´d like to follow when discussing the open-ended list of (key) features of complex systems.
To be more precise: If we connect this equivalence functionalist schema with a difference-based way of asking questions (What-questions are replaced by how-questions, that is: how – by means of which differences – is feature XY constituted?) then the list of key features (previously seen as essential) is transformed into a list of difference- and problem-based questions (see next blog post).

Problem 2 – The narrowing of the solution space
This means in our context that a specific solution is overgeneralized or represented as the only solution. Examples:

  • A bottom-up perspective of system formation (that is, a system emerges from the interplay of its elements) seems to dominate in complexity, esp. complex adaptive systems (CAS) research. But, at least, for the emergence of social systems, a top-down perspective is plausible, too [see Luhmann 1995].
  • Complex (adaptive) systems that are interpreted as open systems interacting with their environment [see, for instance, various authors in Allen / Maguire / McKelvey 2011; Cilliers 1998, 4; Wikipedia 2016g].
    The problem isn´t simply that this perspective belongs to an older tradition of systems theoretical research. The problem is rather that there might be different solutions to the problem of openness / closure of a (complex) system:
    – Within the same scientific domain: It depends on the ingenuity of the scientific observers to come up with plausible explanatory mechanisms of how discipline-specific complex systems might regulate their openness and closure.
    – Between different scientific domains: Different complex systems (chemico-physical dissipative structures, biological cells, swarms of biological or artificial agents, human organizations, etc.) might have developed different solutions to the openness-closure-problem.
    In short: There might be more than one mode of boundary (as system) maintenance for complex (adaptive) systems!
  • Mechanisms for coordinating behavior and actions: Coordination mechanisms are necessary for evolved complex (adaptive) systems such as collections of cells, biological and artificial swarms, animal and human interactions, human organizations, etc.
    But, it´s a mistake to believe that exchange- or transmission-based information models (esp. variations of the sender-receiver-model of communication) are the only relevant approaches in this context. These approaches are well suited for modeling technical communication processes (i.e., data and signal exchanges). They might be less suited for conceptualizing biological communication processes. And these technical approaches are probably not at all suited for the conceptualization of human communication [see Baecker 2013; Derrida 1971; Luhmann 1992a].
    Again: For different types of complex systems, we have to expect various mechanisms that are able to solve the problem of behavioral coordination.

With these two problems at the back of our minds, we can formulate a(n open-ended) list of (complexity-features related) questions that are based on two perspectives:

  • A perspective that focuses on which differences are used to conceptualize a feature XY (How-questions)
  • The equivalence functionalist problem-solutions-schema mentioned above.

And this list of questions is the subject of the subsequent blog posts.

References 

[Baecker 2013] Baecker, D. (2013), Systemic Theories of Communication, in: Cobley, P. / Schulz, P.J. (eds.) (2013), Handbooks of Communication Sciences, vol. 1, Berlin: de Gruyter Mouton, 85-100.
URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1865641 [accessed Sept 20, 2015].

[Bormann 2012] Bormann, P. (2012), Einige Überlegungen zu essentialistischen Strategien.
URL: https://kybernetiks.wordpress.com/2012/06/09/kurze-uberlegungen-zu-essentialistischen-strategien/  [accessed April 27, 2016].

[Bormann 2003 c] – (2003 c), Zur Problematik des “Possibilismus”. Der mögliche Rückfall der Systemtheorie in den Essentialismus.
URL:  http://www.fen.ch/texte/gast_bormann_possibilismus.pdf   [accessed April 27, 2016].

[Cilliers 2010] Cilliers, P. (2010), Difference, Identity and Complexity, in: – / R. Preiser (eds.) (2010), Complexity, Difference and Identity. An Ethical Perspective, Dordrecht et al.: Springer, 3-18.

[Derrida 1971] Derrida, J. (1971), Excerpt from “Signature, Event, Context”. URL: http://hydra.humanities.uci.edu/derrida/sec.html  [accessed Febr 25, 2016].

[Gasché 1986] Gasché, R. (1986), The Tain of the Mirror: Derrida and the Philosophy of Reflection, Cambridge, Mass. / London: Harvard University Press.

[Hofstadter 1985] Hofstadter, D.R. (1985), Metafont, Metamathematics, and Metaphysics: Comments on Donald Knuth’s Article “The Concept of a Meta-Font”, in:  – (1985), Metamagical Themas: Questing for the Essence of Mind and Pattern, Basic Books, 260-296.

[Knudsen 2010] Knudsen, M. (2010), Surprised by Method—Functional Method and Systems Theory, in: Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 11(3), Art. 12.
URL: http://www.qualitative-research.net/index.php/fqs/article/view/1556/3067  [accessed April 27, 2016].

[Luhmann 1995] Luhmann, N. (1995), Social Systems, Stanford, Cal.: Stanford University Press.

[Luhmann 1992a] – (1992a), What is Communication?, in: Communication Theory (1992), 2: 251–259. URL: http://www.scribd.com/doc/46278964/Luhmann-What-is-Communication#scribd [accessed Sept 18, 2015].

[Mitleton-Kelly 2003] Mitleton-Kelly, E.(2003), Ten principles of complexity and enabling infrastructures, in: id.  (ed.) Complex Systems and Evolutionary Perspectives on Organisations: the Application of Complexity Theory to Organisations, Oxford, UK: Elsevier, 3-20.
URL:Ten principles of complexity and enabling infrastructures [accessed April 28, 2016].

[Sporns 2007] Sporns, O. (2007), Complexity, in: Scholarpedia, 2(10):1623.
URL: http://www.scholarpedia.org/article/Complexity  [accessed April 25, 2016].

[Wikipedia 2016e] Wikipedia (2016e), Family resemblance,
URL: https://en.wikipedia.org/wiki/Family_resemblance [accessed April 25, 2016].

[Wikipedia 2016f] – (2016f), Différance,
URL: https://en.wikipedia.org/wiki/Diff%C3%A9rance [accessed April 25, 2016].

[Wikipedia 2016g] – (2016g), Complex adaptive systems,
URL:https://en.wikipedia.org/wiki/Complex_adaptive_systems [accessed April 28, 2016].

Subproject 1: The Paradigm of (Social) Complexity – Part II-2a: Intro “Complex Systems”

(2a) Intro Complex Systems

The features of complexity such as unpredictability, nonlinearity, etc. that I'd like to discuss in a series of blog posts refer to organized complexity in the sense of [Weaver 1948]. And organized complexity is just another expression for the formation of an entity that is composed of elements (synonyms: parts, components, or agents) and their relations (as dynamic interactions), and that escapes simple causal logic as well as other scientific approaches such as probability theory and statistics.

Such an entity is usually called a (complex) system (synonyms in certain contexts: network, swarm, etc.).
However, this is not an essentialist definition of a system, but rather a problem concept. So, the question is no longer what a (complex) system really is (its true meaning, nature, or essence), but:

  • How do we conceptualize a complex system in a specific (physico-chemical, biological, psychological, sociological, etc.) domain?

And this is equivalent to the following two questions:

  • Which differences are or can be used to conceptualize such a system?
  • How is a complex system possible? That is, what are the explanatory mechanisms, especially regarding the openness / closure and boundary maintenance of such a system?

This non-essentialist or deontologized perspective refers to three positions.

Position 1 – Differences:
Differences are constitutive for creating something as something on an information (not a material!) level. Examples:

  • Foreground / background as a basic difference for perceptual phenomena. See, for instance, this glider gun from Conway's Game of Life [Gol-gun by Toxic~frwiki, licensed under CC BY-SA 3.0]; a short code sketch of the underlying update rule follows after this list of examples.
  • The word father instead of equivalent expressions such as son, daughter, mother, grandfather, etc. that are equally plausible, for example, in family-related contexts. So, the meaning of the language form father isn't an intrinsic quality, but depends on what it is not (namely, son, daughter, mother, etc.).
    Or, to put it in more general terms following the Swiss linguist Ferdinand de Saussure, whose insights were later radicalized by difference-based approaches such as deconstruction (Jacques Derrida), form theory (George Spencer Brown / Niklas Luhmann / Dirk Baecker), or discourse theory (Ernesto Laclau / Chantal Mouffe):

The sign is determined by the way in which it differs from all the other signs in the system – “in language there are only differences without positive terms”. The sign is a node in a network of relationships.
The relationships are not determined by the sign, rather, the sign is the result of interacting relationships. [Cilliers 2010, p. 6, referring to Saussure].
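To make the foreground / background intuition a bit more concrete, here is a minimal sketch in Python (my own illustration, not the code behind the animation above) of the Game of Life update rule: every visible pattern, including the glider gun, is nothing but the difference between live and dead cells, processed step by step.

```python
# Minimal sketch of Conway's Game of Life (assumption: a set of live-cell
# coordinates is enough; live cells are the "foreground", everything else
# is the "background").
from collections import Counter

def step(live_cells):
    """Compute the next generation from the current set of live (x, y) cells."""
    # Count how many live neighbours every candidate cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive in the next step if it has 3 live neighbours,
    # or if it is currently alive and has exactly 2 live neighbours.
    return {
        cell
        for cell, count in neighbour_counts.items()
        if count == 3 or (count == 2 and cell in live_cells)
    }

# Example: a "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))  # {(1, 0), (1, 1), (1, 2)} (set order may vary)
```

The point is not the code itself, but that the "objects" we perceive (glider, gun, blinker) exist only as processed differences on an information level; materially there are just uniform cells.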

Position 2 – Observer dependency:
The expression observer is used here in a general or abstract sense: it refers to every processor of differences. The traditional observer as a processor of perceptions is therefore just a special case of a system capable of processing differences. Other possible observers are, for example (a small interface sketch follows after this list):

  • immune systems
  • neural systems / networks,
  • brains (cerebral systems),
  • psychic systems (as processors of various perceptions),
  • consciousness systems (as processors of thoughts), or
  • social systems (families, organizations, etc.).
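Purely as an illustration (my own sketch, not part of the original post), such a generalized observer can be modelled as anything that implements a "process a difference" interface; the systems listed above would then be very different implementations of the same abstract capability.

```python
# Illustrative sketch only (assumption: a "difference" can be represented as a
# two-sided distinction plus a rule that selects one side).
from typing import Callable, Generic, Tuple, TypeVar

T = TypeVar("T")        # whatever the observer processes (stimuli, words, ...)
Side = TypeVar("Side")  # the two sides of a distinction (e.g. "self" / "non-self")

class Distinction(Generic[T, Side]):
    """A two-sided difference together with a rule for applying it."""

    def __init__(self, sides: Tuple[Side, Side], indicate: Callable[[T], bool]):
        self.sides = sides        # e.g. ("foreground", "background")
        self.indicate = indicate  # rule that marks one side

    def observe(self, item: T) -> Side:
        """Process the difference: assign the item to one side of the distinction."""
        return self.sides[0] if self.indicate(item) else self.sides[1]

# A (toy) immune-system-like observer distinguishing self / non-self proteins:
known_self = {"albumin", "collagen"}
immune_observer = Distinction(("self", "non-self"), lambda protein: protein in known_self)
print(immune_observer.observe("albumin"))     # self
print(immune_observer.observe("viral coat"))  # non-self
```

The sketch only shows that the concept can be stated without any reference to perception or consciousness; whether a given system "really" observes in this sense is exactly the kind of essentialist question this post tries to avoid.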

Position 3 – (Operative) Constructivism:
If every phenomenon depends on media-specific differences used by a specific observer in a particular context, then these phenomena can be interpreted as difference-based and observer-dependent (operative) constructions.
Consequences: 

  • Observer relationalism:
    An operative constructivist position isn't equivalent to a position of relativistic indifference. It is rather a position of observer relationalism. So, the crucial question in this context is: Which observing system uses which differences in a particular context to construct this (and not a different) reality?
  • Multiperspectivity (synonym sometimes: polycontexturality):
    Observer relationalism often includes a plurality of possible observers (= multiple constructions) when observing a scientific, economic, political, etc. phenomenon. And this irreducible plurality of observing systems supplants the traditional guiding difference of subject / object.
  • Reality as reality construction:
    The expression reality is just the short version of: difference-based and observer-dependent (operative) reality construction on an information level.
  • Deconstruction:
    Every (realist) description can be deconstructed by asking: Which observing system uses which context-relative differences in this description?

In sum:
We can conceptualize a complex system in more than one way. For example, as

  • a complex adaptive system (CAS) or a swarm-intelligence system emerging from the dynamic interactions of its elements (agents) (CAS theory; see, for example, Miller / Page 2007, or Dorigo / Birattari 2007 for research on swarm intelligence). Examples: flocks of birds, swarms of insects, etc. A minimal agent-based sketch follows after this list.

[Videos embedded in the original post: Traffic in Hanoi as an example of a CAS; BBC – Example of a swarm intelligence system; Igor Nikolic – Complex adaptive systems (TED Talk 2010)]

  • An autopoietic (living) system (Humberto Maturana / Francisco Varela, see Wikipedia 2016d).
    Example: A biological cell [3D-SIM-4 Anaphase 3 color (3D representation of two mouse daughter nuclei in a late stage of nuclear division (Telophase)) by Lothar Schermelleh, licensed under CC BY-SA 3.0].
  • A self-referential system where the observing system distinguishes between itself and its environment by processing the difference system / environment dynamically within the system (Bielefeld School of social systems theory: Niklas Luhmann, Dirk Baecker, Peter Fuchs, etc.).
    Examples: Consciousness systems and communication-based social systems such as interactions, organizations, and society.
  • A biological, neural, cerebral, psychic, conscious, social, etc. system as the construction or attribution of a language-based observer according to the German sociologist Dirk Baecker.
    It's then the responsibility of this language-based observer to elaborate on the explanatory mechanisms of such a constructed system (its mode of boundary maintenance, its possible recursivity / self-reference, etc.).
  • A discourse with empty signifiers (Essex School of discourse analysis: Ernesto Laclau, Chantal Mouffe, etc. – see Wikipedia 2016e).
  • A difference (or rather: différance)-based connectionist network (Cilliers 1998). Cilliers' example: language.

    etc.
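For the CAS / swarm reading in the first bullet, here is a minimal agent-based sketch in Python (my own, heavily simplified variant of boids-style rules, not taken from Miller / Page 2007 or Dorigo / Birattari 2007): each agent reacts only to its local neighbours, yet a loosely coordinated "flock" emerges without any central controller.

```python
# Minimal agent-based sketch of flocking-like coordination (illustration only;
# agents are updated in place, so a fully synchronous update is omitted for brevity).
import random

class Agent:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

    def update(self, neighbours, cohesion=0.01, alignment=0.05):
        """Adjust velocity using only local information from nearby agents."""
        if neighbours:
            # Cohesion: steer slightly towards the neighbours' centre.
            cx = sum(n.x for n in neighbours) / len(neighbours)
            cy = sum(n.y for n in neighbours) / len(neighbours)
            self.vx += (cx - self.x) * cohesion
            self.vy += (cy - self.y) * cohesion
            # Alignment: partly match the neighbours' average velocity.
            avx = sum(n.vx for n in neighbours) / len(neighbours)
            avy = sum(n.vy for n in neighbours) / len(neighbours)
            self.vx += (avx - self.vx) * alignment
            self.vy += (avy - self.vy) * alignment
        self.x += self.vx
        self.y += self.vy

def step(agents, radius=15.0):
    """One global step; coordination emerges purely from local neighbourhood rules."""
    for a in agents:
        neighbours = [b for b in agents
                      if b is not a
                      and (b.x - a.x) ** 2 + (b.y - a.y) ** 2 < radius ** 2]
        a.update(neighbours)

agents = [Agent() for _ in range(50)]
for _ in range(100):
    step(agents)
print(agents[0].x, agents[0].y)  # position of one agent after 100 steps
```

The decision to describe the resulting movement as one flock (rather than as fifty separate agents) is, again, a difference drawn by an observer.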

This deontologized view of concepts as unresolved problems for which several functionally equivalent solutions can be conceived has consequences for our (open-ended) list of features of complexity referring to complex systems, too.
As this is an important point which challenges a basic assumption of complexity research (namely, the assumption that such general features can simply be applied to concrete and discipline-specific complex systems on all scales – see, for example, Cilliers 2010; Mitleton-Kelly 2003; Sporns 2007), I'd like to dedicate a whole blog post to it (see the next post, called Features of Complexity: The Scalability Problem).

Note:
While you're waiting for the next blog post to be published, it's perhaps a good idea to have a look at Melanie Mitchell's video Complexity – A Guided Tour on YouTube. This video gives some nice examples and shows why the new paradigm of (social) complexity is so intriguing.

If you'd like further information on complex systems research, these introductory texts (suited for non-specialists) are good reads:

  • Füllsack 2011 (an excellent text that is, unfortunately, only available in German)
  • Mitchell 2009 (in English)

 

References

[Cilliers 2010] Cilliers, P. (2010), Difference, Identity and Complexity, in: [Cilliers / Preiser 2010], 3-18.

[Cilliers 1998] – (1998), Complexity and Postmodernism: Understanding Complex Systems, London / New York: Routledge.

[Cilliers / Preiser 2010] – / Preiser, R. (eds.) (2010), Complexity, Difference and Identity. An Ethical Perspective, Dordrecht et al.: Springer.

[Dorigo / Birattari 2007] Dorigo, M. / Birattari, M. (2007), Swarm intelligence,  in: Scholarpedia, 2(9):1462.
URL: http://www.scholarpedia.org/article/Swarm_intelligence  [accessed March 23, 2016].

[Füllsack 2011] Füllsack, M. (2011),  Gleichzeitige Ungleichzeitigkeiten. Eine Einführung in die Komplexitätsforschung, Wiesbaden: VS Verlag für Sozialwissenschaften / Springer Fachmedien.

[Luhmann 2006] Luhmann, N. (2006), System as Difference, in: Organization (2006), 13 (1), 37-57. URL: https://steffenroth.files.wordpress.com/2012/03/systems-as-difference.pdf [accessed Sept 18, 2015].

[Luhmann 1995] – (1995), Social Systems, Stanford California: Stanford University Press.

[Mesjasz 2010]  Mesjasz, C.  (2010), Complexity of Social Systems, in: Acta Physica Polonica (2010), vol. 117, no. 4, 706-715.
URL: http://przyrbwn.icm.edu.pl/APP/PDF/117/a117z468.pdf [accessed March 20, 2016].

[Miller / Page 2007] Miller, J.H. / Page, S.E. (2007),  Complex Adaptive Systems. An Introduction to Computational Models of Social Life, Princeton / Oxford: Princeton University Press.

[Mitchell 2009] Mitchell, M. (2009), Complexity. A Guided Tour, Oxford et al.: Oxford University Press.

[Mitleton-Kelly 2003] Mitleton-Kelly, E. (2003), Ten principles of complexity and enabling infrastructures, in: id. (ed.), Complex Systems and Evolutionary Perspectives on Organisations: the Application of Complexity Theory to Organisations, Oxford, UK: Elsevier, 3-20.
URL: Ten principles of complexity and enabling infrastructures [accessed April 28, 2016].

[Nicolis / Rouvas-Nicolis 2007] Nicolis, G. / Rouvas-Nicolis, C. (2007), Complex Systems, in:  Scholarpedia, 2(11):1473.
URL: http://www.scholarpedia.org/article/Complex_systems [accessed March 23, 2016].

[Sporns 2007] Sporns, O. (2007), Complexity, in: Scholarpedia, 2(10):1623.
URL: http://www.scholarpedia.org/article/Complexity  [accessed March 23, 2016].

[Weaver 1948] Weaver, W. (1948), Science and Complexity, in: American Scientist, vol. 36 (4), 536-544. URL: http://people.physics.anu.edu.au/~tas110/Teaching/Lectures/L1/Material/WEAVER1947.pdf [accessed March 22, 2016].

[Villiers-Botha / Cilliers 2010] Villiers-Botha, T. de / – (2010), The Complex “I”: The Formation of Identity in Complex Systems, in: [Cilliers / Preiser 2010], 19-38.

[Wikipedia 2016a] Wikipedia (2016a), Complex Systems, URL: https://en.wikipedia.org/wiki/Complex_systems [accessed Febr 24, 2016].

[Wikipedia 2016b] Wikipedia (2016b), Social Complexity, URL: https://en.wikipedia.org/wiki/Social_complexity [accessed  Febr 24, 2016].

[Wikipedia 2016c] Wikipedia (2016c), Complexity, URL: https://en.wikipedia.org/wiki/Complexity [accessed Febr 25, 2016].

[Wikipedia 2016d] Wikipedia (2016d), Autopoiesis, URL: https://en.wikipedia.org/wiki/Autopoiesis  [accessed Febr 25, 2016].

[Wikipedia 2016e] Wikipedia (2016e), Essex School of discourse analysis, URL: https://en.wikipedia.org/wiki/Essex_School_of_discourse_analysis [accessed March 23, 2016].

Subproject 1: The Paradigm of (Social) Complexity – Part II-1: Complexity – The Definition Problem

(1) Complexity – The Definition Problem

There is no absolute definition of what complexity means; the only consensus among researchers is that there is no agreement about the specific definition of complexity. [Wikipedia 2016c]

Wikipedia is right. But the search for an essentialist definition (the ultimate identity, the essence, or the true nature) of any media form (sign, mark, etc.) is futile – right from the start.

Why?

Well, as Jacques Derrida's deconstruction has taught us since the late 1960s, any media form depends on permanent de- and recontextualizations, so that the ultimate (essential) meaning of such a form can't be determined once and for all.

Or, to put it differently: If the meaning of a sign depends on its context, and if this context can't be closed once and for all – otherwise the sign couldn't be used in different contexts anymore (= the collapse of any medium!) – then the meaning of a sign is always provisional and somehow incomplete (that is, radically context-dependent = non-essentialist) [cf. Derrida 1971].

Accordingly, we can only choose a provisional interpretation among other possible interpretations – or, in this case, a provisional definition among other possible definitions. But it's impossible to know what the essence or true nature of complexity (or any other phenomenon) really is. Therefore, the more scientific (sub-)disciplines and approaches are involved in trying to define complexity, the more (sometimes even incompatible) interpretations / definitions have to be expected.

And that's the reason why difference-based approaches such as social systems theory (Luhmann et al.) opt for replacing the notorious What is xy? question by the question How (that is, by means of which differences) is xy constructed?
In other words: In order to understand a phenomenon xy, we can study the network of differences being used to specify xy.

So, our question could be formulated as follows: Which differences are used to specify the concept of complexity? Possible answers would be (a small toy sketch follows after this list):

  • organized / disorganized [Wikipedia 2016c],
  • decomposable (complicated) / nondecomposable (complex) [Le Moigne 1990, p. 25],
  • simple (reduced to simple entities) / implex (reduced complexity) [ibid.],
    etc.
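Purely as an illustration of this "network of differences" idea (a toy example of my own; the three distinctions come from the list above, while the classified example and its classification are invented), such a specification can be written down as a small data structure: characterizing something as complex then means choosing one side of each distinction, not stating its essence.

```python
# Toy illustration: specifying "complexity" via a set of two-sided distinctions
# (distinctions taken from the list above; the example system is invented).
distinctions = {
    "organization": ("organized", "disorganized"),
    "decomposability": ("decomposable (complicated)", "nondecomposable (complex)"),
    "reduction": ("simple (reduced to simple entities)", "implex (reduced complexity)"),
}

# A "specification" of a candidate system is just a choice of one side per
# distinction -- a provisional, observer-dependent description, not an essence.
ant_colony = {
    "organization": "organized",
    "decomposability": "nondecomposable (complex)",
    "reduction": "implex (reduced complexity)",
}

for aspect, choice in ant_colony.items():
    assert choice in distinctions[aspect]  # each choice must be one side of a distinction
    print(f"{aspect}: {choice}")
```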

As those differences often refer to characteristics of complexity, it's better to jump right into the next section (Part II-2), where I discuss some important features of complex systems.

References