Because of feedback delays within complex systems, by the time a problem becomes apparent it may be unnecessarily difficult to solve.
Words and sentences must, by necessity, come only one at a time in linear, logical order. Systems happen all at once.
Tags: [[systems]]
A system is an interconnected set of elements that is coherently organized in a way that achieves something.
Tags: [[systems]]
A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.
Tags: [[systems]]
It’s easier to learn about a system’s elements than about its interconnections.
Tags: [[systems]]
Many of the interconnections in systems operate through the flow of information. Information holds systems together and plays a great role in determining how they operate.
Purposes are deduced from behavior, not from rhetoric or stated goals.
One of the most frustrating aspects of systems is that the purposes of subunits may add up to an overall behavior that no one wants.
Tags: [[systems]]
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
Changing interconnections in a system can change it dramatically.
A stock is the foundation of any system. Stocks are the elements of the system that you can see, feel, count, or measure at any given time.
Tags: [[systems]]
A stock is the memory of the history of changing flows within the system.
Tags: [[systems]]
If you understand the dynamics of stocks and flows—their behavior over time—you understand a good deal about the behavior of complex systems.
Tags: [[decision-making]] [[systems]]
The human mind seems to focus more easily on stocks than on flows. On top of that, when we do focus on flows, we tend to focus on inflows more easily than on outflows.
Stocks generally change slowly, even when the flows into or out of them change suddenly. Therefore, stocks act as delays or buffers or shock absorbers in systems.
The time lags that come from slowly changing stocks can cause problems in systems, but they also can be sources of stability.
Tags: [[systems]]
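A minimal sketch of the stock-as-buffer idea (my own illustration, not from the book): the stock integrates its flows, so even a sudden step change in the inflow shows up in the stock only gradually.

```python
# Bathtub-style stock: the stock integrates (inflow - outflow) each step.
def simulate(steps=20, dt=1.0):
    stock = 100.0                            # e.g. water in a tub, items in a warehouse
    outflow = 5.0                            # constant drain per step
    history = []
    for t in range(steps):
        inflow = 5.0 if t < 10 else 10.0     # inflow suddenly doubles at t = 10
        stock += (inflow - outflow) * dt     # the stock integrates the net flow
        history.append(round(stock, 1))
    return history

print(simulate())
# The stock holds at 100 until t = 10, then ramps up by 5 per step rather
# than jumping -- that lag is the buffer / shock-absorber behavior.
```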
The time lags imposed by stocks allow room to maneuver, to experiment, and to revise policies that aren’t working.
Stocks allow inflows and outflows to be decoupled and to be independent and temporarily out of balance with each other.
Most individual and institutional decisions are designed to regulate the levels in stocks.
Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows.
Tags: [[systems]]
A feedback loop is a closed chain of causal connections from a stock, through a set of decisions or rules or physical laws or actions that are dependent on the level of the stock, and back again through a flow to change the stock.
Balancing feedback loops are goal-seeking or stability-seeking. Each tries to keep a stock at a given value or within a range of values.
Balancing feedback loops are equilibrating or goal-seeking structures in systems and are both sources of stability and sources of resistance to change.
Tags: [[systems]]
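A sketch of a balancing loop (mine, not the book's; names and numbers invented): the gap between the stock and its goal drives a corrective flow, so the stock homes in on the goal.

```python
def balancing_loop(stock=10.0, goal=50.0, adjustment_rate=0.25, steps=20):
    trajectory = [stock]
    for _ in range(steps):
        gap = goal - stock               # discrepancy the loop works to close
        stock += adjustment_rate * gap   # corrective flow proportional to the gap
        trajectory.append(round(stock, 2))
    return trajectory

print(balancing_loop())
# The stock climbs toward 50 and flattens out -- stability-seeking behavior,
# and the same structure resists any attempt to push the stock elsewhere.
```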
Reinforcing feedback loops are self-enhancing, leading to exponential growth or to runaway collapses over time. They are found whenever a stock has the capacity to reinforce or reproduce itself.
Tags: [[systems]]
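A matching sketch of a reinforcing loop (illustration only): the stock feeds its own inflow, so growth compounds the way a savings account or a breeding population does.

```python
def reinforcing_loop(stock=100.0, growth_rate=0.10, steps=10):
    trajectory = [stock]
    for _ in range(steps):
        stock += growth_rate * stock     # the inflow is proportional to the stock itself
        trajectory.append(round(stock, 1))
    return trajectory

print(reinforcing_loop())
# Each step adds 10% of an ever-larger stock: exponential growth. The same
# structure can also drive collapse, e.g. a shrinking stock that shrinks
# its own inflow.
```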
The concept of feedback opens up the idea that a system can cause its own behavior.
Tags: [[systems]]
The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.
A stock-maintaining balancing feedback loop must have its goal set appropriately to compensate for draining or inflowing processes that affect that stock. Otherwise, the feedback process will fall short of or exceed the target for the stock.
Tags: [[systems]]
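To make that point concrete, a toy thermostat in a leaky room (my own numbers): unless the goal is set high enough to compensate for the leak, the balancing loop settles short of the target.

```python
def leaky_thermostat(goal, steps=40):
    temp = 10.0
    for _ in range(steps):
        heating = 0.5 * (goal - temp)    # balancing loop: heat toward the goal
        leak = 0.1 * temp                # draining process: heat escapes to 0 C outdoors
        temp += heating - leak
    return round(temp, 1)

print(leaky_thermostat(goal=20.0))   # settles near 16.7 -- short of 20
print(leaky_thermostat(goal=24.0))   # a goal raised to offset the leak settles near 20
```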
Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
A delay in a balancing feedback loop makes a system likely to oscillate.
Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may (or may not, depending on the type of delay and the relative lengths of other delays) make a large change in the behavior of a system.
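A sketch of why a delayed balancing loop oscillates (illustration only): decisions respond to where the stock was a few steps ago, so corrections keep arriving late.

```python
def delayed_balancing(goal=100.0, delay=2, adjustment_rate=0.5, steps=25):
    stock = 50.0
    history = [stock] * (delay + 1)           # remembered past values of the stock
    trajectory = []
    for _ in range(steps):
        perceived = history[-(delay + 1)]     # the decision sees old information
        stock += adjustment_rate * (goal - perceived)
        history.append(stock)
        trajectory.append(round(stock, 1))
    return trajectory

print(delayed_balancing(delay=0))   # smooth approach to 100
print(delayed_balancing(delay=2))   # overshoots, then rings around 100
# Lengthening the delay (or strengthening the correction) can turn the
# damped ringing into swings that grow instead of fading.
```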
In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.
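A sketch of that pairing (illustration only): a reinforcing loop drives the growth, and a balancing loop, here the shrinking unused capacity, eventually constrains it.

```python
def constrained_growth(stock=1.0, growth_rate=0.5, capacity=100.0, steps=20):
    trajectory = []
    for _ in range(steps):
        # reinforcing loop: growth proportional to the stock
        # balancing loop: growth throttled as the stock approaches capacity
        stock += growth_rate * stock * (1 - stock / capacity)
        trajectory.append(round(stock, 1))
    return trajectory

print(constrained_growth())
# Early on the reinforcing loop dominates (near-exponential growth); as the
# stock nears capacity the balancing loop takes over and growth levels off.
# The S-curve is a simple case of shifting loop dominance.
```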
Nonrenewable resources are stock-limited. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource.
Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
Tags: [[systems]]
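A sketch of the renewable case (numbers invented): harvest below what regeneration can sustain persists indefinitely; harvest above it drives the stock past a threshold from which it cannot recover.

```python
def fishery(harvest, stock=1000.0, regen_rate=0.2, capacity=1000.0,
            threshold=50.0, years=60):
    for year in range(years):
        regeneration = regen_rate * stock * (1 - stock / capacity)  # flow-limited renewal
        stock = max(stock + regeneration - harvest, 0.0)
        if stock < threshold:
            return f"collapsed in year {year}"
    return f"still viable after {years} years"

print(fishery(harvest=40))   # below peak regeneration: settles at a sustainable level
print(fishery(harvest=60))   # above anything regeneration can match: collapses
```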
Resilience is a measure of a system’s ability to survive and persist within a variable environment.
Because resilience may not be obvious without a whole-system view, people often sacrifice resilience for stability, or for productivity, or for some other more immediately recognizable system property.
Systems need to be managed not only for productivity or stability but also for resilience—the ability to recover from perturbation, to restore or repair themselves.
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder.
Systems often have the property of self-organization—the ability to structure themselves, to create new structure, to learn, diversify, and complexify. Even complex forms of self-organization may arise from relatively simple organizing rules—or may not.
Tags: [[systems]]
In hierarchical systems relationships within each subsystem are denser and stronger than relationships between subsystems. Everything is still connected to everything else, but not equally strongly.
Tags: [[organization-design]] [[systems]]
Hierarchical systems evolve from the bottom up. The purpose of the upper layers of the hierarchy is to serve the purposes of the lower layers.
Everything we think we know about the world is a model. Our models do have a strong congruence with the world, but they fall far short of representing it fully.
Tags: [[systems]]
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
Many relationships in systems are nonlinear. Their relative strengths shift in disproportionate amounts as the stocks in the system shift. Nonlinearities in feedback systems produce shifting dominance of loops and many complexities in system behavior.
Everything, as they say, is connected to everything else, and not neatly. There is no clearly determinable boundary between the sea and the land, between sociology and anthropology, between an automobile’s exhaust and your nose. There are only boundaries of word, thought, perception, and social agreement—artificial, mental-model boundaries.
Tags: [[systems]]
The greatest complexities arise exactly at boundaries. There are Czechs on the German side of the border and Germans on the Czech side of the border. Forest species extend beyond the edge of the forest into the field; field species penetrate partway into the forest. Disorderly, mixed-up borders are sources of diversity and creativity.
There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion—the questions we want to ask.
Tags: [[systems]] [[favorite]]
It’s a challenge to stay creative enough to drop the boundaries that worked for the last problem and to find the most appropriate set of boundaries for the next question. It’s also a necessity, if problems are to be solved well.
At any given time, the input that is most important to a system is the one that is most limiting.
Tags: [[systems]]
There always will be limits to growth. They can be self-imposed. If they aren’t, they will be system-imposed.
When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
Tags: [[systems]] [[decision-making]]
The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole.
Tags: [[decision-making]] [[systems]]
Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
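A sketch of that drift (my own, with an invented 2% pessimism bias): the standard follows perceived performance, perception discounts the good news, and effort aims only at the sagging standard.

```python
def eroding_goals(goal=100.0, performance=100.0, bias=0.98, years=20):
    for _ in range(years):
        perceived = bias * performance              # past performance, seen a bit too darkly
        goal = 0.5 * goal + 0.5 * perceived         # the standard drifts toward perception
        performance += 0.5 * (goal - performance)   # effort aims only at the (lower) goal
    return round(goal, 1), round(performance, 1)

print(eroding_goals())
# Goal and performance ratchet each other downward, even though every single
# adjustment looks reasonable in isolation.
```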
If the winners of a competition are systematically rewarded with the means to win again, a reinforcing feedback loop is created by which, if it is allowed to proceed uninhibited, the winners eventually take all, while the losers are eliminated.
Tags: [[systems]] [[economy]]
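A deliberately simplified sketch of that loop (mine, not the book's): the leader's share advantage is reinvested into gaining a bit more share each round.

```python
def success_to_the_successful(share_a=0.52, reinvestment=0.1, rounds=25):
    # zero-sum market shares: the leader's edge buys a slightly bigger edge
    for _ in range(rounds):
        share_a += reinvestment * (share_a - 0.5)   # advantage compounds
        share_a = min(max(share_a, 0.0), 1.0)
    return round(share_a, 2), round(1 - share_a, 2)

print(success_to_the_successful())
# A 52/48 split widens round after round (to roughly 72/28 here); left
# uninhibited, the loop runs on until the leader holds essentially everything.
```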
Rule beating is usually a response of the lower levels in a hierarchy to overrigid, deleterious, unworkable, or ill-defined rules from above.
System behavior is particularly sensitive to the goals of feedback loops. If the goals—the indicators of satisfaction of the rules—are defined inaccurately or incompletely, the system may obediently work to produce a result that is not really intended or wanted.
You can often stabilize a system by increasing the capacity of a buffer. But if a buffer is too big, the system gets inflexible.
Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place.
Tags: [[systems]]
As we try to imagine restructured rules and what our behavior would be under them, we come to understand the power of rules. They are high leverage points. Power over the rules is real power.
We can’t control systems or figure them out. But we can dance with them!
Before you disturb the system in any way, watch how it behaves.
It’s especially interesting to watch how the various elements in the system do or do not vary together. Watching what really happens, instead of listening to people’s theories of what happens, can explode many careless causal hypotheses.
Tags: [[systems]]
Starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution.
Tags: [[systems]]
You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
Tags: [[systems]]
The language and information systems of an organization are not an objective means of describing an outside reality—they fundamentally structure the perceptions and actions of its members.
My impression is that we have seen, for perhaps a hundred and fifty years, a gradual increase in language that is either meaningless or destructive of meaning. And I believe that this increasing unreliability of language parallels the increasing disintegration, over the same period, of persons and communities.…
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers.