Systems thinking is a way of seeing the world as a series of interconnected and interdependent systems rather than lots of independent parts. As a thinking tool, it seeks to oppose the reductionist view — the idea that a system can be understood by the sum of its isolated parts — and replace it with expansionism, the view that everything is part of a larger whole and that the connections between all elements are critical.
Systems are essentially networks made up of nodes or agents that are linked in varied and diverse ways. In systems thinking, we aim to identify and understand these relationships as part of exploring the larger systems at play. Everything is interconnected: every system is made up of many subsystems and is itself part of larger systems. Seeing things this way creates a more flexible view of the world and the way it works, and it illuminates opportunities for addressing some of its existing and evolving problems.
Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity - our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality. In one of the chapters of her famous book, Thinking in Systems, Donella Meadows summarized the most general "systems wisdoms" she had absorbed from modeling complex systems and from hanging out with modelers. These are the take-home lessons, the concepts and practices that penetrate the discipline of systems so deeply that one begins, however imperfectly, to practice them not just in one's profession, but in all of life. They are the behavioral consequences of a worldview based on the ideas of feedback, nonlinearity, and systems responsible for their own behavior. After spending the last couple of days finishing the book, I want to share these 15 systems wisdoms/guidelines, as I believe everyone can benefit from adopting them.
1 - Get the beat of the system
Before you disturb the system in any way, watch how it behaves. If it’s a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it’s a social system, watch it work. Learn its history. Ask people who’ve been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system - people’s memories are not always reliable when it comes to timing.
This guideline is deceptively simple. Until you make it a practice, you won’t believe how many wrong turns it helps you avoid. Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others.
Starting with the behavior of the system directs one’s thoughts to dynamic, not static, analysis - not only to “What’s wrong?” but also to “How did we get there?” “What other behavior modes are possible?” “If we don’t change direction, where are we going to end up?” And looking to the strengths of the system, one can ask “What’s working well here?” Starting with the history of several variables plotted together begins to suggest not only what elements are in the system, but how they might be interconnected.
And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system’s actual behavior, but by the lack of our favorite solution. Listen to any discussion, in your family or a committee meeting at work or among the pundits in the media, and watch people leap to solutions, usually solutions in “predict, control, or impose your will” mode, without having paid any attention to what the system is doing and why it’s doing it.
2 - Expose your mental models to the light of day
When we draw structural diagrams and then write equations, we are forced to make our assumptions visible and to express them with rigor. We have to put every one of our assumptions about the system out where others can see them. Our models have to be complete, and they have to add up, and they have to be consistent. Our assumptions can no longer slide around (mental models are very slippery), assuming one thing for purposes of one discussion and something else contradictory for purposes of the next discussion.
You don’t have to put forth your mental model with diagrams and equations, although doing so is a good practice. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility - the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure - is a necessity when you live in a world of flexible systems.
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them to be plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption, which may otherwise have become entangled with your own identity.
Getting models out into the light of day, making them as rigorous as possible, testing them against the evidence, and being willing to scuttle them if they are no longer supported is nothing more than practicing the scientific method - something that is done too seldom even in science, and is done hardly at all in social science or management or government or everyday life.
3 - Honor, respect, and distribute information
You’ve seen how information holds systems together and how delayed, biased, scattered, or missing information can make feedback loops malfunction. Decision makers can’t respond to information they don’t have, can’t respond accurately to information that is inaccurate, and can’t respond in a timely way to information that is late. I would guess that most of what goes wrong in systems goes wrong because of biased, late, or missing information.
If I could, I would add an 11th commandment to the first 10: Thou shalt not distort, delay, or withhold information. You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
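The malfunction that delayed information causes is easy to demonstrate with a toy model. This is my own sketch, not an example from the book, and the function name, gain, and delay values are all invented for illustration: a decision maker adjusts a stock toward a target, but only ever sees the stock level as it stood several steps ago.

```python
# Toy model (invented for illustration): a stock is nudged toward a target
# each step, but the correction is based on the stock level as it was
# perceived `delay` steps ago rather than as it is now.

def run(steps, delay, gain=0.5, target=100.0, start=20.0):
    history = [start]
    for _ in range(steps):
        # Decide using possibly stale information.
        perceived = history[max(0, len(history) - 1 - delay)]
        history.append(history[-1] + gain * (target - perceived))
    return history

timely = run(steps=40, delay=0)  # converges smoothly toward the target
late = run(steps=40, delay=4)    # overshoots the target and oscillates
```

With timely information the stock settles on the target; with the same decision rule fed four-step-old information, the system overshoots and swings back and forth - the "drive a system crazy" effect in miniature.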
Information is power. Anyone interested in power grasps that idea very quickly. The media, the public relations people, the politicians, and advertisers who regulate much of the public flow of information have far more power than most people realize. They filter and channel information. Often they do so for short-term, self-interested purposes. It’s no wonder that our social systems so often run amok.
4 - Use language with care and enrich it with systems concepts
Our information streams are composed primarily of language. Our mental models are mostly verbal. Honoring information means above all avoiding language pollution - making the cleanest possible use we can of language. Second, it means expanding our language so we can talk about complexity.
A society that talks incessantly about “productivity” but that hardly understands, much less uses, the word “resilience” is going to become productive and not resilient. A society that doesn’t understand or use the term “carrying capacity” will exceed its carrying capacity. A society that talks about “creating jobs” as if that’s something only companies can do will not inspire the great majority of its people to create jobs, for themselves or anyone else. Nor will it appreciate its workers for their role in “creating profits.” And of course a society that talks about a “Peacekeeper” missile or “collateral damage,” a “Final Solution” or “ethnic cleansing,” is speaking “tyrannies.”
The first step in respecting language is keeping it as concrete, meaningful, and truthful as possible - part of the job of keeping information streams clear. The second step is to enlarge language to make it consistent with our enlarged understanding of systems. If the Eskimos have so many words for snow, it’s because they have studied and learned how to use snow. They have turned snow into a resource, a system with which they can dance. The industrial society is just beginning to pay attention to and use complexity. Carrying capacity, structure, diversity, and even system are old words that are coming to have richer and more precise meanings. New words are having to be invented.
5 - Pay attention to what is important, not just what is quantifiable
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure. Think about that for a minute. It means that we make quantity more important than quality. If quantity forms the goals of our feedback loops, if quantity is the center of our attention and language and institutions, if we motivate ourselves, rate ourselves, and reward ourselves on our ability to produce quantity, then quantity will be the result. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live.
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models. You’ve already seen the system trap that comes from setting goals around what is easily measured, rather than around what is important. So don’t fall into that trap. Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector. Be a walking, noisy Geiger counter that registers the presence or absence of quality.
If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don’t let it pass. Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy. No one can define or measure justice, democracy, security, freedom, truth, or love. No one can define or measure any value. But if no one speaks up for them, if systems aren’t designed to produce them, if we don’t speak about them and point toward their presence or absence, they will cease to exist.
6 - Make feedback policies for feedback systems
You can imagine why a dynamic, self-adjusting feedback system cannot be governed by a static, unbending policy. It’s easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops - loops that alter, correct, and expand loops. These are policies that design learning into the management process.
An example was the historic Montreal Protocol to protect the ozone layer of the stratosphere. In 1987, when that protocol was signed, there was no certainty about the danger to the ozone layer, about the rate at which it was degrading, or about the specific effect of different chemicals. The protocol set targets for how fast the manufacture of the most damaging chemicals should be decreased. But it also required monitoring the situation and reconvening an international congress to change the phase-out schedule, if the damage to the ozone layer turned out to be more or less than expected. Just 3 years later, in 1990, the schedule had to be hurried forward and more chemicals added to it, because the damage was turning out to be much greater than was foreseen in 1987.
That was feedback policy, structured for learning. We all hope that it worked in time.
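The structure of such a policy fits in a few lines of code. This is my own illustration, not something from the book; the function names, rates, and the tightening factor are invented. The point is only the shape: a static policy applies a fixed rule no matter what, while a feedback policy re-reads the state of the system each period and can revise itself - the meta-loop.

```python
# Sketch (names and numbers invented) contrasting a static phase-out
# schedule with a feedback policy that monitors damage and tightens itself.

def static_policy(emissions, cut_rate, periods):
    """Apply the same fractional cut every period, no matter what happens."""
    out = [emissions]
    for _ in range(periods):
        out.append(out[-1] * (1 - cut_rate))
    return out

def feedback_policy(emissions, cut_rate, periods,
                    measure_damage, expected, tighten=1.5):
    """Re-measure each period; tighten the cut when damage beats expectations."""
    out = [emissions]
    rate = cut_rate
    for t in range(periods):
        if measure_damage(t) > expected(t):   # monitoring loop
            rate = min(1.0, rate * tighten)   # meta-loop: revise the policy itself
        out.append(out[-1] * (1 - rate))
    return out

# Hypothetical monitoring data: damage turns out twice as bad as expected.
static = static_policy(100.0, cut_rate=0.1, periods=5)
adaptive = feedback_policy(100.0, cut_rate=0.1, periods=5,
                           measure_damage=lambda t: 2.0,
                           expected=lambda t: 1.0)
```

The adaptive schedule ends up phasing emissions out faster than the static one, roughly what the Montreal Protocol's reconvening mechanism accomplished in 1990.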
7 - Go for the good of the whole
Remember that hierarchies exist to serve the bottom layers, not the top. Don’t maximize parts of systems or subsystems while ignoring the whole. Don’t go to great trouble to optimize something that never should be done at all. Aim to enhance total systems properties, such as growth, stability, diversity, resilience, and sustainability - whether they are easily measured or not.
8 - Listen to the wisdom of the system
Aid and encourage the forces and structures that help the system run itself. Notice how many of those forces and structures are at the bottom of the hierarchy. Don’t be an unthinking intervenor and destroy the system’s own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what’s already there.
9 - Locate responsibility within the system
That’s a guideline both for analysis and design. In analysis, it means looking for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled. But sometimes they can’t. And sometimes blaming and trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers. Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically responsible. He/she will experience directly the consequences of his/her decisions.
10 - Stay humble - stay a learner
The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment, by trial and error. In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you are sure you are on course. Pretending you are in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
That’s hard. It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls “error-embracing.” It takes a lot of courage to embrace your errors.
11 - Celebrate complexity
The universe is messy, nonlinear, turbulent, and dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity and uniformity. That’s what makes the world interesting, that’s what makes it beautiful, and that’s what makes it work.
There’s something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery. But there is something else within us that has the opposite set of tendencies, since we ourselves evolved out of and are shaped by and structured as complex feedback systems. Only a part of us, a part that has emerged recently, designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes complexity on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.
12 - Expand time horizons
In a strict systems sense, there is no long-term, short-term distinction. Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago. The couplings between very fast processes and very slow ones are sometimes strong, sometimes weak. When the slow ones dominate, nothing seems to be happening; when the fast ones take over, things happen with breathtaking speed. Systems are always coupling and uncoupling the large and the small, the fast and the slow.
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool to just peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term - the whole system.
13 - Defy the disciplines
In spite of what you majored in, or what the textbooks say, or what you think you’re an expert at, follow a system wherever it leads. It will be sure to lead across traditional disciplinary lines. To understand that system, you will have to be able to learn from - while not being limited by - economists and chemists and psychologists and theologians. You will have to penetrate their jargons, integrate what they tell you, recognize what they can honestly see through their particular lenses, and discard the distortions that come from the narrowness and incompleteness of their lenses. They won’t make it easy for you.
Seeing systems whole requires more than being “interdisciplinary”, if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other. Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode. They will have to admit ignorance and be willing to be taught, by each other and by the system.
14 - Expand the boundary of caring
Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all, it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, then systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
As with everything else about systems, most people already know about the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe that which they know.
15 - Don’t erode the goal of goodness
The most damaging example of the systems archetype called “drift to low performance” is the process by which modern industrial culture has eroded the goal of morality. The workings of the trap have been classic, and awful to behold.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. This is just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are “not news.” They are exceptions. Must have been a saint. Can’t expect everyone to behave like that.
We know what to do about the drift to low performance. Don’t weigh the bad news more heavily than the good. And keep standards absolute.
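The arithmetic of this trap, and of its escape, also fits in a few lines. This is my own toy model, not one from the book, and every parameter name and value is invented. In the eroding case the standard is allowed to sag toward pessimistically perceived performance; in the absolute case it is held fixed.

```python
# Toy model (invented for illustration) of the "drift to low performance"
# archetype: performance adjusts toward the standard, but the standard is
# allowed to drift toward recent, pessimistically perceived performance.

def simulate(periods, standard=100.0, performance=100.0,
             absolute_standard=False, pessimism=0.9, adjust=0.5, erosion=0.2):
    for _ in range(periods):
        # Bad news weighed more heavily than good: performance is
        # perceived as somewhat worse than it really is.
        perceived = pessimism * performance
        if not absolute_standard:
            # The eroding goal: the standard sags toward perceived performance.
            standard += erosion * (perceived - standard)
        # People then work toward whatever the current standard is.
        performance += adjust * (standard - performance)
    return standard, performance

eroded = simulate(40)                        # standard and performance sag together
held = simulate(40, absolute_standard=True)  # performance holds at the fixed standard
```

The two rules of escape map directly onto the two parameters: not weighing bad news more heavily means keeping perception honest, and keeping standards absolute means shutting off the erosion loop entirely.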
Systems thinking can only tell us to do that. It can’t do it. We’re back to the gap between understanding and implementation. Systems thinking by itself cannot bridge the gap, but it can lead us to the edge of what analysis can do and then point beyond - to what can and must be done by the human spirit.
If you enjoyed this piece, I’d love it if you could share it on social media so others might stumble upon it. You can sign up for my newsletter in the footer section below to receive my newest articles once a week.