System theories
"Well-done! Very clean and essentialized." "Accurate and easy-to-follow... highly recommended."
The ISO 42010 standard on the architectural description of systems takes no position on what a system is. "Users of the Standard are free to employ whatever system theory they choose." It turns out there are several theories, and so several ways to describe an entity, object or enterprise as a system. It is a mistake to assume one is the true way, or to think that different views can readily be integrated.
This article discusses several of these theories and how they differ.
Contents: Classification systems. Dynamic systems. Cybernetics, System Dynamics, Event-driven activity systems. Scaling up to social systems thinking.
By the way, though this article is relevant to my day job (teaching classes in enterprise, solution and software architecture) and might be of passing interest to enterprise and software architects, it won't help you get any certificate in those topics.
Systems in general
System: a collection of elements related in a pattern or by rules.
This definition is so generalized it is of little help to people teaching and applying particular system theories. Different schools of systems thinking bring different elements to the fore.
Given that the "system of interest" is a pattern that relates elements, everything outside that is outside the system. Moreover, even elements inside its boundary are only parts of the system in so far as they are related in the system's pattern.
Further reading
One way to introduce the broad scope of systems thinking is to skim through the vocabulary used in discussion of systems, which you can find in this other article.
Before discussing the dynamic systems of interest to most systems thinkers, I want to mention a kind of system that is passive or inactive.
Classification systems
Description is a biological tool. The ability to describe reality evolved in organisms because having or forming reasonably accurate models of the world proved useful.
No material, physical things were described until biological entities evolved bio-chemical ways to detect, remember and recognise them. The social brain hypothesis ties the evolution of human intelligence to the evolution of the human brain and to the origin of language. Eventually, humankind evolved ways to describe things in words and graphical models, and we devised classification systems.
A classification system categorizes things as instances of types and relates those types in some kind of knowledge structure, including but not only taxonomical hierarchies and ontological networks.
To put it another way, as soon as you have described one thing, you can envisage other things of the same type. For example, having described one unicorn, you can envisage others of the same type. And having described one universe, some physicists envision parallel universes.
To avoid ambiguity in discussing one domain of knowledge, we use a controlled vocabulary and relate types in a taxonomical hierarchy or an ontological network. Inevitably, when a discussion spans more than one domain or system of interest, ambiguities arise, and we have difficulties with "semantic interoperation".
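As a minimal sketch of the idea (the type names and the is_a helper are invented purely for illustration), a controlled vocabulary can be expressed as types related in a taxonomical hierarchy:

```python
# A minimal sketch of a classification system: things are classified as
# instances of types, and types are related in a taxonomical hierarchy.
# The type names are invented purely for illustration.

taxonomy = {
    "Animal": None,         # root type
    "Mammal": "Animal",     # each type points to its supertype
    "Horse": "Mammal",
    "Unicorn": "Horse",     # an envisaged type: described, though never observed
}

def is_a(subtype, supertype):
    """True if subtype is the supertype or a descendant of it."""
    while subtype is not None:
        if subtype == supertype:
            return True
        subtype = taxonomy.get(subtype)
    return False

print(is_a("Unicorn", "Animal"))  # True: a Unicorn is classified as an Animal
```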
Time as state change
Classification systems are passive structures. In practice, most of the systems of interest to systems thinkers are dynamic, meaning they change state over time.
A cosmologist's view
To measure time is to measure state changes in things we observe in space. If there was a big bang, there was no time before it, because time requires space. Things cannot move or change unless there is space for them to be in and change in.
The second law of thermodynamics (entropy increases) tells us that to arrange things in a pattern (or in a system of interest) requires the input of energy. Ashby wrote that in the cybernetic view of systems (see below), a sufficient supply of energy is normally taken for granted.
Albert Einstein said time is an illusion that moves relative to an observer. The physicist Julian Barbour wrote a book on the illusion of time, saying change is real, but time is not; it is only a reflection of change.
A psychologist's view
State changes happened in space, over time, before and regardless of our observation of them. To pick up Einstein's point: to observe state changes in the world (such as the hands of a clock moving) is to experience parallel state changes in the brain. As time flows forward, our brains consume energy to lay down memories.
Our psychological impression of how fast time passes is a different thing. In a period when changes happen in close succession, time appears to pass quickly. In a period when no change is happening, time appears to pass slowly. But paradoxically, in retrospect, the former period is remembered as substantial, and the latter is not remembered.
A quantum physicist's view
In 2022, a team of physicists published a paper suggesting quantum systems (or single particles) can move both forward and backward in time. Read Time Can Actually Flow Backward, Physicists Say. But if time were to flow backward at a higher level, we humans could not detect or describe it, since our thinking would be reversed and our memories erased.
Dynamic systems - overview
A dynamic system is a pattern of behavior in which elements interact in rule-bound ways to advance the state of the system and produce effects one element cannot produce on its own.
Below, in associating names with different approaches to thinking about dynamic systems, I do not mean to imply the named person is the only authority.
In cybernetics (after Wiener, Ashby and others), state variables interact in rule-bound ways to advance the macro state of a system. Two varieties of cybernetics can be identified.
In event-driven activity systems, actors/components interact, in ways governed by rules, to advance the state of an open system and/or produce outputs.
After talking to different observers with different perspectives, users of Checkland's method abstract different soft systems from a business (or other purposeful social entity), each being a human activity system that transforms inputs into required outputs.
Organic systems (Bertalanffy, Maturana)
Ludwig von Bertalanffy was a biologist who promoted the idea of a general system theory that includes the concepts of system state (which advances over time), inputs and outputs (from and to the wider environment), feedback from output to input, and the hierarchical composition of smaller parts into larger wholes.
Since the cells of an organism share the same DNA, interact within one body, and follow the rules of their roles to maintain and advance the state of the entity, that entity is reasonably viewed as a system. And the concept of hierarchical composition is readily seen in how biologists see the human body.
Smaller is different
If you successively decompose the material universe into ever smaller components of different kinds (from the universe to galaxies to solar systems, all the way down to atomic particles) you will several times cross the boundary from one domain of knowledge to another.
Similarly, if you successively decompose an organism into ever smaller components of different kinds (from organs to cells to organelles to bio-chemicals, all the way down to atomic particles) you will several times cross the boundary from one domain of knowledge to another.
There is no hope of integrating a discussion of coarse-grained components (say, organs of the human body) with a discussion of atomic particles. Even though the systems defined at each level might be seen as homomorphic - descriptions of the same structure - they are incompatible (a point to be picked up later).
See the discussion of Anderson's hierarchy in System theories and EA
Aside: When people want to understand or manage a large number of elements that are connected in a complex network, they often impose a hierarchy on it. Note that the successive decomposition of an organism into ever smaller parts of different kinds is one thing; and the successive decomposition of an organization into ever smaller elements of the same kind is another.
Emergence, complexity and consciousness
Consciousness may be explained as a side effect of biological evolution.
Emergence occurs in the simplest of systems, even in one with only two elements. For example, forward motion emerges from the interaction between a sail and a wind. After billions of years of evolution, an intelligent animal is now an extremely large and complex system, containing billions of interacting elements. The human brain is said to be the most complex organ of all.
If consciousness is the ability to compare memories of the past with perceptions of the current, and envisagings of the future, and to position the self in all of them, then it seems reasonable to assume consciousness emerges from, and requires, a thinking machine of the extraordinary complexity we see in those few animal species that demonstrate forethought and self-awareness.
Does consciousness imply human decision making is not a deterministic process? Perhaps, but I see no reason to assume that. And how complex systems are built from simpler ones is a topic addressed in cybernetics.
Cybernetics (Wiener, Ashby)
In cybernetics (the science of steering how a system behaves) the elements of a system are state variables that interact in a pattern of behavior to advance the macro state of the system.
In my reading of Ashby's books ("Design for a Brain" and "Introduction to Cybernetics"), his system is a set of interrelated variables, along with the rules that determine how variable values change, that are abstracted from observing the behavior of a real-world entity.
Ashby referred to a physical entity (regardless of any observer, and with countless definable variables) as a real machine. Be it mechanical, organic or social, I call that an entity.
Ashby referred to the variables an observer selects as being of interest to them as a system. He went on to include all the variables needed to render the whole system "regular". I call a set of related state variables that advance in a rule-bound way an abstract system.
Ashby sometimes referred to the realization of selected variables by a physical entity as a machine without quotes. I call that a real system, and relate the concepts in this triad.
We often speak of an entity and a real system as if they were the same thing. A clock is an entity or system whose state advances under its own internal drive. An organism is an entity or system that is stimulated to act in response to input events, and is a manifestation of its DNA.
However, Ashby's system is an abstraction from whatever physical entity we observe it in.
Abstracting variables of interest from a physical entity
DfB 2/3. "The first step is to record the behaviours of the machine's individual parts. To do this we identify any number of suitable variables. A grandfather clock, for instance, might provide the following variables ..."
DfB 3/11 "At this point we must be clear about how a 'system' is to be defined. The real pendulum, for instance, has not only length and position; it has also ..."
"Any suggestion that we should study âallâ the facts is unrealistic, and actually the attempt is never made. What is try [true?] is that we should pick out and study the facts that are relevant to some main interest that is already given.â
DfB 7/1 "one ceases to think of the real physical object with its manifold properties, and selects that variable in which one happens to be interested."
In other words, we abstract an observed system from the physical entity in which it is observed. If one orchestra (a well-nigh infinitely complex thing) were to be replaced on a concert platform by another orchestra, as long as it follows the same symphony score, the system of interest to me will be unchanged.
Abstracting rules that govern changes to variable values
ItC 2.1 introduces the basic terms and concepts of cybernetics, which include rule-bound state changes. Let me distil them.
ItC 2/2 "a transformation defines/determines for each state variable value (each operand) in the system of interest what the next value will be (its transform)."
Ashby's transformation defines the rules that determine a state change, akin to how a composer specifies how one note leads to the next on a sheet of music.
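To make this concrete, here is a minimal sketch (the states and the rule are invented for illustration) of a transformation as a rule that maps each operand to its transform:

```python
# A minimal sketch of Ashby's "transformation": a rule that maps each operand
# (a current state) to its transform (the next state). The states and the rule
# here are invented for illustration.

transformation = {"a": "b", "b": "c", "c": "a"}  # a closed, single-valued transformation

def run(initial_state, steps):
    """Apply the transformation repeatedly, yielding a trajectory of states."""
    trajectory = [initial_state]
    for _ in range(steps):
        trajectory.append(transformation[trajectory[-1]])
    return trajectory

print(run("a", 5))  # ['a', 'b', 'c', 'a', 'b', 'c'] - a cyclic trajectory
```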
System change
ItC 4/1 "the word 'change' if applied to such a machine can refer to two very different things. There is ..."
"The distinction is fundamental and must on no account be slighted.â
In my words, Ashby distinguished a change of state within a given system (its variable values advancing under given rules) from a change to the system itself (a change to its variables or rules).
For example, contrast the note-by-note state changes in the performance of a symphony with a change to a symphony score made by the composer. Or else, contrast progress in the life history of an animal, with the creation of a different animal via sexual reproduction.
As I see it, three kinds of mutation may be distinguished:
For some, all three are changes to a given system. For me, a mutation produces a new system. When the change is small, we call it a new system version or generation. And when the mutation is sufficiently large, we rename the system.
In my view, not being clear about the distinctions above bedevils the use of the term "system" in discussions of social systems thinking.
Representing system state changes on a graph
An abstract system (a description) generalizes (classifies, typifies) the elements and workings of a real system. In cybernetics, the essential elements are a set of state variables, and a set of rules that determine how variables advance from any given state to the next.
The trajectory of quantitative variables can be projected on a graph with the y axis showing variable values going up and down over time (as in System Dynamics).
The trajectory might be cyclical or chaotic. The trajectory of 2 or 3 variables can be drawn as a 2 or 3 dimensional shape, and this structure may be fractal (an avenue I have not explored).
Aside: Ashby uses the term behavior for a state change trajectory; some (instead or as well) use the term behavior for the rules of a system.
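As a minimal sketch of such a trajectory (the update rule and starting values are invented for illustration), two quantitative variables can be advanced in a rule-bound way and their successive values collected for plotting:

```python
# A minimal sketch: tracing the trajectory of two quantitative state variables
# whose values advance in a rule-bound way at each time step. The update rule
# and starting values are invented for illustration.

x, y = 1.0, 0.0
trajectory = [(x, y)]
for t in range(100):
    # each variable's next value depends on the current values of both
    x, y = x + 0.1 * y, y - 0.1 * x
    trajectory.append((x, y))

# Plotting the (x, y) pairs would show a roughly circular (slowly spiralling) shape;
# plotting x against time would show it rising and falling cyclically.
print(trajectory[:3])
```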
Ashby's famous theorem and law
A regulator must know the current state of any target entity that the system regulates - be it a machine or an animal. Just as a thermostat must know the current temperature of the air in the room it regulates, and a brain knows the current salinity of the body it maintains.
Ashby is known for two principles that apply to the controller or regulator of a system whose desirable state is represented by a set of state variable values: the law of requisite variety (a regulator must have at least as much variety as the disturbances it must counter) and the good regulator theorem (every good regulator of a system must be a model of that system).
Note for enterprise and software architects: only the "requisite" variety is needed. Capturing more variable values than you need, in the hope they might help you in future, will likely turn your data lake into a septic lagoon.
In a simple feedback loop, a regulator responds to feedback by adjusting the state of what it regulates. But note the earlier discussion of system change.
The general idea is that regulators use models to represent the state of things they monitor and direct. This can be represented in a triadic graphic.
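A minimal sketch of such a feedback loop, assuming a crude on/off thermostat (the temperatures and rates are invented for illustration):

```python
# A minimal sketch of a simple feedback loop: a regulator (a crude on/off
# thermostat) reads the current state of the thing it regulates (room
# temperature) and acts to hold it near a desired value. The temperatures
# and rates are invented for illustration.

desired_temp = 20.0
room_temp = 15.0

for minute in range(20):
    error = desired_temp - room_temp         # the regulator must know the current state
    heating_on = error > 0                   # a crude on/off response to feedback
    room_temp += 1.0 if heating_on else 0.0  # the heater warms the room
    room_temp -= 0.2                         # heat also leaks to the environment
    print(f"minute {minute}: temperature {room_temp:.1f}")
```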
In "Design for a Brain" Ashby presented the brain as a regulator that monitors and maintains the state of the body, and seeks to change a pattern of behavior when it is not working.
Second-order cybernetics is discussed in a later article. A better-known variant of cybernetics is System Dynamics, capitalized here because that is the name given to it by its leading authority, Forrester.
System Dynamics (Forrester, Meadows)
The interacting elements of a system in System Dynamics are quantities (stocks) that interact in cause-effect relationships (flows). The pattern of behavior can be represented in a Causal Loop Diagram (CLD) like the one below, which shows how every change in one stock changes another stock, in the same or opposite direction.
When such a model is animated over time, the quantities of the stocks may increase or decrease in surprising or problematic ways.
People often draw a causal loop diagram to tell a story, or present a political position. But as Donella Meadows pointed out, they rarely verify the model by completing a System Dynamics model, quantifying how the system behaves over time, and comparing that with reality. Challenges include how to ...
Perhaps the biggest issue for a systems thinker wanting to use a model to explain or predict real-world behaviour is the impact on its usefulness of a) variables not included in the model and b) actors changing how they respond to events when conditions change.
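To illustrate what completing and animating such a model involves, here is a minimal stock-and-flow sketch; the stock names, flows and rates are invented for illustration, and a real System Dynamics model would be calibrated against observed data before being used to explain or predict anything:

```python
# A minimal sketch of a System Dynamics style model: two stocks linked by
# flows, stepped forward through time. The stock names, flows and rates are
# invented for illustration.

population = 1000.0   # stock
resources = 5000.0    # stock

history = []
for year in range(200):
    births = 0.05 * population                               # flow increasing population
    deaths = (0.03 if resources > 0 else 0.50) * population  # death rate soars when resources run out
    consumption = 0.02 * population                          # flow depleting resources
    population += births - deaths
    resources = max(resources - consumption, 0.0)
    history.append((year, round(population), round(resources)))

peak = max(history, key=lambda h: h[1])
print("peak:", peak)          # population peaks, then collapses once resources are exhausted
print("final:", history[-1])  # overshoot-and-collapse is a classic System Dynamics story
```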
Event-driven activity systems
The interacting elements in an event-driven activity system are actors/components that act in rule-bound ways to advance the state of the system and/or produce outputs. The system is an open one, encapsulated as discussed below.
Encapsulation
An open system is encapsulated within a wider environment, and interacts with it by consuming inputs and producing outputs.
Some systems, like the organs of a human body, are naturally and physically bounded. Other systems, like a business, are bounded only when an observer has decided what to declare as inputs and outputs of interest. Since the boundary is purely logical, different observers may take different views of what counts as inputs and outputs, and draw different boundaries.
Inside an open system's boundary, actors/components interact to produce effects that one element cannot produce on its own. They interact in regular activities, using resources, to meet aims, by advancing the state of the system and transforming inputs into required outputs.
The atomic activities the system can perform may be defined as discrete events, as discussed next.
The granularity of events
A discrete event is an atomic and indivisible process, definable by the states of the affected system (pre and post conditions) before and after the event, regardless of what happens in between. Several such discrete acts (aka operations, methods or transactions) may be listed in an interface or role definition that publicises what a system has the capability to do.
The granularity of a discrete event depends on the granularity of the system of interest. A large system can be decomposed into smaller subsystems that are coupled to each other. Similarly, a long process can be decomposed into shorter processes that are connected in sequence. When defining a business activity system, we may define both finer-grained events (such as room cleaning), and coarser-grained events (such as hotel room occupancy).
Simulating continuous systems
An event-driven activity system can simulate a continuous (or analogue) system by recording and reacting to events frequently enough to ensure any invariant condition of interest is always true. For example, if you look frequently enough at your child, and react swiftly enough, you can ensure they never pick their nose. (Will Harwood tells me this is the domain of "sampling theory".)
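A minimal sketch of that sampling idea, with the drifting quantity and rates invented for illustration: the more frequently the discrete system samples and corrects, the smaller the drift it ever allows.

```python
# A minimal sketch of sampling: a discrete, event-driven system can keep a
# continuously drifting quantity within bounds by checking it often enough
# and correcting any drift it finds. The quantity and rates are invented
# for illustration.

def worst_drift(check_every, steps=1000):
    level, worst = 0.0, 0.0
    for t in range(steps):
        level += 0.1                 # the real world changes continuously
        if t % check_every == 0:     # the system observes only at discrete events
            level = 0.0              # and corrects whatever drift it sees
        worst = max(worst, level)
    return worst

print(worst_drift(check_every=1))    # 0.0: drift is corrected before it accumulates
print(worst_drift(check_every=50))   # about 4.9: drift builds up between checks
```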
Software and business activity systems
Both software and human activity systems can be defined as open systems - as black boxes that consume inputs and produce outputs. Where we choose to draw the boundary of the system is a choice we make. It does not always coincide with a physical boundary, like a wall or a skin.
To encapsulate an open system of this kind, to define its interface, we define the inputs, and for each input we define the rules (pre and post conditions) that govern changes to state variable values inside the system, and any output that is produced in response to the input.
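As a minimal sketch of such an interface (the system, event names and state variable are invented for illustration), each input event is governed by pre and post conditions on the system's state, and may produce an output:

```python
# A minimal sketch of encapsulating an open system behind an interface: the
# interface lists the input events the system accepts, and for each one the
# rules (pre and post conditions) that govern internal state changes and any
# output produced. The system, events and state variable are invented for
# illustration.

class HotelBookingSystem:
    """An open system seen as a black box: inputs in, state changes, outputs out."""

    def __init__(self, rooms):
        self.rooms_free = rooms            # internal state variable

    def book_room(self, guest):
        # precondition: at least one room is free
        if self.rooms_free == 0:
            return f"rejected: no rooms free for {guest}"
        # postcondition: one fewer room free; a confirmation is output
        self.rooms_free -= 1
        return f"confirmed: room booked for {guest}"

    def cancel_booking(self, guest):
        # postcondition: one more room free; a cancellation notice is output
        self.rooms_free += 1
        return f"cancelled: booking released for {guest}"

system = HotelBookingSystem(rooms=1)
print(system.book_room("Alice"))   # confirmed: state changes, output produced
print(system.book_room("Bob"))     # rejected: the precondition is not met
```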
Software activity systems
Software systems are a special kind of event-driven activity system that maintain records of entities and events of interest in databases. Some use digital twins to represent structures and behaviors that they simulate.
Business activity systems
Businesses depend on human and computer actors interacting in regular activities to create and use data that represents entities and events (customers, products, orders etc.) the business wants to monitor and direct.
Such a business activity system is event-driven. The events trigger actors to perform activities, using resources, to advance the state of the system and transform inputs into required outputs, to meet given aims. The concept graph below is an informal representation of the systems of interest to enterprise architects.
The activities in such a system:
However, a big issue for those who see a business as a system is that human actors are not limited to playing roles and performing activities in any definable system. They have lives outside of the business, and within it, they can make decisions, ignore rules given to them and perform actions that nobody could predict or model - as will be discussed in related articles.
Scaling up to social systems thinking
Ashby envisaged cybernetics could be scaled up from his small examples to large and complex organic and social entities.
It is true that very large and complex systems may be designed using Ashby's principles. However, system theories have their limits, especially when it comes to human social systems.
The slow evolution of biological organisms (changing the phenotype of a species, in tiny ways, randomly, and discarding most changes as more harmful than beneficial) is not a good model for the rapid evolution of business organizations.
For the latter, a regulator/controller needs a) awareness of what may change in the environment, b) the ability to detect changes, and c) the know-how to change the regulated/controlled system accordingly.
For a more sociological perspective on systems thinking, read the two articles on social systems below.
Related articles
When is an entity not well-called a system? When we have applied no system theory to it. When we have no description or model of it as a system, and no prospect of completing one. So, to speak of it as a system leaves the listener none the wiser.
If you want to read this article in the context of a book, watch this space. Related articles include:
Note: for those who want to research further, Ashby's books can be found on the internet.
In "Design for a Brain" chapters 2 and 14, and "Introduction to Cybernetics" chapters 2, 3, 4 and 6), Ashby's use of words was not consistent. You might interpret his words differently from how I do.
Ashby referred to a physical entity (regardless of any observer, and with countless definable variables) as a real machine, or real system, and sometimes a 'machine' with quotes. Be it mechanical, organic or social, I call that an entity.
Ashby referred to particular variables, descriptive of an entity, and selected by an observer as being of interest, as a system. He went on to speak of a system as a regular system that includes both the variables of interest, and any other variables needed to render the whole system "regular". By regular, I believe he meant rule-bound. I call a set of state variables that advance in a rule-bound way an abstract system.
Ashby sometimes referred to the realization of variables by a physical entity as a machine without quotes. I call the realization of an abstract system a real system.