system design

»If you think of purpose, you must also think of coincidence and folly.«
Nietzsche

Besides coincidences and errors, purposes are the essential design factors of a system. Everything that exists is supported by its environment; otherwise, something else would have prevailed. This support usually rests on local utility considerations, but not infrequently also on disinformation: alternatives that remain unknown can only be adopted by chance.

In this context, Stafford Beer coined the acronym POSIWID (“the purpose of a system is what it does”): “It stands for bald fact, which makes a better starting point in seeking understanding than the familiar attributions of good intention, prejudices about expectations, moral judgment or sheer ignorance of circumstances.”

While true coincidence can hardly be controlled, an organization’s knowledge base has considerable influence on its purposes, on those coincidences that are not truly random, and, last but not least, on its “organizational intelligence”.
The fundamental problem of designing more intelligent organizations cannot be solved in a targeted way without a pragmatic treatment of organizational knowledge quality. Yet as long as an organization is not under acute threat, there is typically no pressing incentive to do so.

On the contrary, it is usually in the best interest of organizational structures not to question their own performance. Each area of responsibility must normally be regarded as efficient; otherwise, it would not be fulfilling its responsibility. (This marks an essential difference between project and process organization: operations thrive on stability, projects on change; they rarely coexist without friction.)

Avoiding true improvement can even be advantageous from a systemic perspective: liability risks in decision-making are judged against market conventions, which are not absolutely fixed. It is therefore “system-rational” (in Luhmann’s sense) to stay within the corridor of common practice in order to avoid sanctions. Exceeding this corridor can increase pressure on the system as a whole, not to mention obstacles driven by individual interests, such as injelitance.

If common practice is exceeded over a longer period, this can even trigger “arms races” and destabilize long-established dynamic balances, a regular side effect of successful disruptive competition.

In the absence of acute disruptive threats, the path of least resistance is therefore to remain calm and to adapt as well as possible to the system environment. As a result, the demand for truly fundamental transformation is generally low. Meanwhile, the exponential development of information technology opens up fundamentally new possibilities for organizational transformation, yet these are regularly underutilized because of the usual resistance to change.

This creates a major advantage for disruptors: new competitors face fewer system-rational legacy problems and can focus directly on the most efficient solutions.

Effectively closing this competitive gap ultimately requires a consensual break with system rationality, meaning a fundamental problematization of areas that so far could not, or should not, be addressed as problems.
Here, the phenomenon of Passive (or Qualitative) Disinformation offers the simplest point of access, and thus new, pragmatic and empirically valid opportunities for tackling the corresponding challenges in system design.


© 2020-2025 Dr. Thomas R. Glueck, Munich, Germany. All rights reserved.