“... when a machine is observed to be (improbably) moving at a constant rate, even under varying load, we shall look for restraints – e.g., for a circuit which will be activated by changes in rate and which, when activated, will operate upon some variable (e.g., the fuel supply) in such a way as to diminish the change in rate.” [1]

“Work expands so as to fill the time available for its completion.” Parkinson’s Law

We’ll start with a puzzle: why is software slow? And why does the same device seem to get slower and slower as it ages?

This state of affairs should be surprising: the phone you probably carry in your pocket is orders of magnitude faster (and roughly ten times cheaper) than the desktops of two decades ago, yet both accomplish similarly mundane tasks.

The obvious answer to the puzzle is that the software is trying to do too much with the resources available to it. In addition to being obvious, this answer is also mostly wrong.[2] In this third installment in the Gregory Bateson sequence (parts one and two) we’ll take a look at the puzzle of slow software through the lens of Bateson’s particular brand of thinking, but first, a detour into anthropology.

Patterns in Bateson’s Anthropology

Gregory Bateson’s earliest anthropological work was a study of the Iatmul tribe of Papua New Guinea, published in his 1936 book Naven. He identified behaviors that led to what he termed schismogenesis, a tendency of groups to subdivide by forming internal boundaries much like differentiating cells in an embryo.

In interpersonal conflict among the Iatmul, boasting was met with boasting. This would lead to symmetrically escalating tensions akin to an arms race. (Bateson also observed asymmetrically escalating behaviors, e.g. dominance met with submission that would lead to increasing dominance.) Minor conflicts among groups would increase to a breaking point and the smaller group in a conflict would go off to form their own community.[3]

Later, when he began to study the Balinese, he noticed that these escalating sequences were almost entirely absent, in stark contrast to the Iatmul. He describes the exception that proves the rule:

“The most important exception to [the absence of escalating behaviors] occurs in the relationship between adults (especially parents) and children. Typically, the mother will ... excite the child, and for a few moments cumulative interaction will occur. Then just as the child, approaching some small climax, flings its arms around the mother’s neck, her attention wanders. At this point the child will typically start an alternative cumulative interaction, building up toward temper tantrum. The mother will either play a spectator’s role, enjoying the child’s tantrum, or, if the child actually attacks her, will brush off his attack with no show of anger on her part. These sequences can be seen either as an expression of the mother’s distaste for this type of personal involvement or as context in which the child acquires a deep distrust of such involvement. The perhaps basically human tendency towards cumulative personal interaction is thus muted. It is possible that some sort of continuing plateau of intensity is substituted for climax as the child becomes more fully adjusted to Balinese life. This cannot at present be clearly documented for sexual relations, but there are indications that a plateau type of sequence is characteristic for trance and for quarrels.”

Bateson continues with more examples: conflicts were frozen, formalized, and tended to plateau rather than spiral out of control. Furthermore, Balinese culture mirrored these dynamics in its values and attitudes towards balance and stability. Culture and its values, it seems, influence the dynamics that play out in a society and in turn are influenced by them.

Cybernetics: the Study of Control

Far from the familiar associations with science-fiction cyborgs, cyberspace, and internet chat room cybersex, the cyber- of cybernetics originally had rather different connotations. From Ancient Greek κυβερνητικός (“good at steering”) via the French cybernétique (“the art of governing”), cybernetics was the study of communication and control in animals and in mechanical devices (thermostats, steam engine governors) that regulate crucial variables (temperature, fuel supply) without outside intervention – and despite outside disturbances. This qualitative discipline eventually birthed the more quantitative fields of modern control theory and system dynamics.

Qualitative cybernetics is still relevant, and in fact I argue that a good intuitive understanding of systems with feedback loops, what Jay Forrester calls “systems thinking,” is essential for navigating our increasingly dynamic and shifting world. Everything from household budgets, health, and relationships to inventory management and company morale can be understood in these terms: there are quantities (prices, account balances, emotional states) that affect and are affected by other quantities (or themselves), and predictable dynamics take hold.

It’s worth emphasizing that the intuitions of systems thinking are only a starting point. Experience, quantitative analysis, and empirical observation are still necessary for making good decisions.

To hone our intuitions, let’s enumerate the common possibilities and form a preliminary taxonomy of cybernetic behaviors.

Exponential Growth and Decay

Imagine a cluster of 100 bacterial cells in a Petri dish, and suppose that each cell divides once every hour. After an hour there will be 200 cells; after two hours, 400. Twenty-four hours later, assuming no cells die and they don’t run out of food, the bacteria colony will have exploded into a population of 1,677,721,600. That’s a lot of cells!
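
Spelled out, with $N_0 = 100$ starting cells and one doubling per hour:

$$N(t) = N_0 \cdot 2^t, \qquad N(24) = 100 \cdot 2^{24} = 1{,}677{,}721{,}600$$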

When a quantity *A* leads to more of *A* in the future, *A* will grow exponentially.[4] This is also true of circular chains of quantities *A*, *B*, *C*, ..., *A*. Some more examples:

  • A savings account with compound interest
  • That “feedback” sound when a microphone gets too close to a speaker
  • An anxiety attack
  • Arms races
  • A nuclear chain reaction in a critical mass of fissile material
  • Moore’s Law: the number of transistors on a chip doubling every two years

These processes eventually run out of steam: external factors take over and exponential growth slows, stops, or reverses. The food supply runs out or the source of power reaches inherent limits. Growth processes that seem to violate this rule (e.g. the economy and arguably Moore’s Law) are a continual source of fascination and speculation.

When a quantity *A* now causes less of *A* in the future, *A* decays exponentially. Again, this is true of circular chains where the net sign around the loop is negative. The above examples change from growth to decay when damping is introduced (e.g. control rods in a nuclear reactor) or sensitivity decreases (e.g. turning down the gain of the amplifier).
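
In standard notation (a gloss consistent with the description above, not Bateson’s own), both cases are the same law with different signs:

$$\frac{dA}{dt} = kA \quad\Longrightarrow\quad A(t) = A_0 e^{kt}$$

with $k > 0$ giving exponential growth and $k < 0$ exponential decay; for a circular chain $A \to B \to \cdots \to A$, the effective $k$ takes the net sign of the loop.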

Growth curves are often sigmoidal overall, initially approximating exponential growth, followed by exponential decay.[5]
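
One standard model of sigmoidal growth (an illustrative formula, not one used in the text) is the logistic equation:

$$\frac{dA}{dt} = kA\left(1 - \frac{A}{K}\right)$$

which behaves like pure exponential growth while $A \ll K$, then slows as $A$ approaches the carrying capacity $K$ – exactly the asymptotic approach described in the footnote.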

Stable Equilibrium

The above section covers “dumb” systems, those that don’t change their behavior in response to external conditions, or are at most governed by one or two factors. Interesting things start to happen when the rate of change of a quantity depends in more complex ways on the quantity itself.

Imagine a car with a safety mechanism whereby, if the speed of the car exceeds, say, 112 mph, the mechanism prevents the engine from producing additional power. This effectively prohibits the car from accelerating beyond 112 mph. This special case of non-linear response is called negative feedback – note that in this context negative refers to the direction of response and has nothing to do with negative emotions or criticism. (Although sometimes criticism is a valuable form of negative feedback in this sense and can have quite positive effects!)
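
As a minimal sketch (names and numbers are illustrative, not a real automotive API), the mechanism amounts to a rule like this:

```python
def available_power(speed_mph: float, requested_power: float) -> float:
    """Crude negative feedback: the response opposes the change.

    Below the cutoff the engine delivers what is asked; at or above it,
    no additional power is produced, so the car cannot keep accelerating.
    """
    SPEED_CUTOFF = 112.0  # mph, from the example above
    if speed_mph >= SPEED_CUTOFF:
        return 0.0
    return requested_power
```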

In his 1967 essay “Cybernetic Explanation” Bateson envisions the phenomena of the universe linked together in “complexly branching and interconnecting chains of causation,” some of which form closed circuits as in the examples above.

“Consider a variable in the circuit at any position and suppose this variable subject to random change in value (the change perhaps being imposed by impact of some event external to the circuit). We now ask how this change will affect the value of this variable at that later time when the sequence of effects has come around the circuit. Clearly the answer to this last question will depend upon the characteristics of the circuit and will, therefore, be not random. In principle, then, a causal circuit will generate a non-random response to a random event at that position in the circuit at which the random event occurred.”[6] (emphasis added)

These circuits, though composed of simple, inert parts, begin to feel alive. They “push back” against outside influences, and this can lead to stability or homeostasis. Homeostatic systems are typically composed of several negative feedback loops.

A control system in its simplest form consists of an input measurement, an internally determined target value often called a set point, and an output actuator that causes the input to move closer to the target value, thus gaining homeostatic control of its environment. Some examples:

  • An HVAC system with a thermostat keeps the inside temperature within a narrow range despite arbitrary variation in outside temperature.[7]
  • Drivers on the highway can generally stay within their lane without precise knowledge of the curvature of the road or the strength of crosswinds. (Estimation of the road’s curvature does help – see the models discussion in the next section.)
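
A minimal sketch of the thermostat example as a bang-bang controller (all names hypothetical; real HVAC controllers add hysteresis, minimum run times, and so on):

```python
SET_POINT = 20.0  # internally determined target temperature, °C

def control_step(measured_temp: float) -> str:
    """One pass around the loop: measure, compare to the set point, actuate."""
    error = SET_POINT - measured_temp
    if error > 0.5:    # too cold: push the measured input back up
        return "heat"
    if error < -0.5:   # too warm: push it back down
        return "cool"
    return "idle"      # already within the narrow range
```

Whatever the outside temperature does, the loop keeps nudging the measurement back toward the set point – Bateson’s non-random response to a random event.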

The set point is usually a static property of the system’s configuration, but is not always easy to specify or predict. Occasionally multiple actors implicitly agree on a comfortable shared set point: the speed of two people walking next to each other, or of a flock of birds or a school of fish; the frequency of rhythmic clapping or chanting at a rally, or the entrainment of cricket chirps to a single, synchronized beat.

Oscillation, Overshoot, and Collapse

Control systems are not always well-tuned. Too large an output response or too much delay in the feedback loop will produce increasing oscillations and a loss of control, as in the (rather tasteless) video below (warning: footage of falling humans and possible minor injuries).

The frustration of limited control will be familiar to anyone who’s tried taking a shower in a house with old pipes, when there is a long delay between turning a knob and the resultant change in water temperature. This is also evident in the movements of a person who’s had too much alcohol: they either sloppily miss targets or exhibit overly deliberate movement to compensate. Interestingly, well-practiced movements (dancing, say) are still smooth, and this can be dangerous: people usually feel so comfortable about the mechanics of driving that they don’t notice their impairment.
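
The shower scenario is easy to simulate. A sketch under illustrative assumptions – a five-step pipe delay and an impatient bather who corrects hard against the delayed reading:

```python
from collections import deque

TARGET = 38.0    # desired water temperature, °C
GAIN = 0.8       # how aggressively we correct per degree of error
DELAY = 5        # what we feel now left the valve five steps ago

knob = 20.0                     # temperature at the valve
pipe = deque([knob] * DELAY)    # water already in transit

for step in range(30):
    felt = pipe.popleft()             # the delayed measurement
    knob += GAIN * (TARGET - felt)    # correct against stale information
    pipe.append(knob)
    print(f"step {step:2d}: felt {felt:7.1f} °C")
```

With this much delay and gain, the felt temperature swings ever more wildly around the target; shrink the delay or the gain and the same loop settles smoothly.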

Some measures of environmental quality (soil fertility, availability of fresh water, impact of pollutants, etc.) also exhibit feedback delays of this type, and these delays are further exacerbated by data and model quality problems. In their seminal – and controversial – book Limits to Growth, Meadows, Meadows, and Randers warn against a potential overshoot and collapse in global resource stocks due to political, informational, and technological-infrastructural delays.

A control system’s performance can be improved by augmenting it with an internal representation or model of the environment. The model makes it possible to respond to environmental conditions in real time, trading off measurement accuracy against timeliness. People and animals do this implicitly, and we only notice how well it normally works when it leads to errors: think of quickly picking up a full carton of milk and then realizing that it is actually empty!
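
A sketch of what a model buys, reusing the lane-keeping example above (a hypothetical controller; the gains are made up): the feedforward term responds immediately to what the road is about to do, while the feedback term cleans up whatever the model gets wrong.

```python
def steering_command(road_curvature: float, lane_error: float) -> float:
    """Model-augmented control: feedforward from an internal model of
    the road ahead, plus feedback from the measured lane position."""
    feedforward = 2.0 * road_curvature  # act now, on the model's prediction
    feedback = 0.5 * lane_error         # correct later, from measurement
    return feedforward + feedback
```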

Anti-Inductive Processes

A final note: some feedback systems are inherently unpredictable because predictions affect the future of the system in ways that make the predictions wrong. Efficient markets are the classic example, but there are many others. The market case is of course an oversimplification: market movement is influenced by a whole host of feedback loops, both positive (e.g. irrational exuberance) and negative (e.g. incorporation of real-world signals that affect underlying asset value).

Biological Control and The Self

The human body is full of homeostatic control systems regulating everything from body temperature, blood oxygenation, and pH to emotional state and neurochemical concentrations. Bateson refers to this kind of regulation as somatic to distinguish it from the genetic adaptation that occurs at much longer timescales. These somatic control systems are typically conservative, redundant, and robust:

“Among higher organisms it is not unusual to find that there is what we may call a “defense in depth” against environmental demands. If a man is moved from sea level to 10,000 feet, he may begin to pant and his heart may race. But these first changes are swiftly reversible: if he descends the same day, they will disappear immediately. If, however, he remains at the high altitude, a second line of defense appears. He will become slowly acclimated as a result of complex physiological changes. His heart will cease to race, and he will no longer pant unless he undertakes some special exertion. If now he returns to sea level, the characteristics of the second line of defense will disappear rather slowly and he may even experience some discomfort.”[8]

Certain pathologies of emotional control share characteristics of poorly tuned mechanical systems. The oscillations of bipolar disorder, for example, suggest that the pathology might be caused by the equivalent of underdamping, by a delayed response to signals, or both. And this suggests that mental training could be developed to mitigate the underlying cause – the interventions of cognitive behavioral therapy can be thought of in this way. Bateson considers similar dynamics in the development of schizophrenia in Part III of Steps.

Self-Control

As an illustration of this kind of control, hold out your hand. Notice how your hand is still; it neither rises nor falls. Place a small weight in your hand: it barely moves. Decide to move your hand two inches higher: your muscles adjust and your hand promptly moves to a higher location.

It might occur to you to ask: if there’s a location to which my muscles will just move my hand, what chose that particular location? Well, you say, obviously I chose it. OK, now imagine that your hands are cold and you’re holding them over a fire. Are you still sure you chose how high to hold your hand? Again, set points are sometimes difficult to predict.

Bateson points out that much, if not all, of what we feel we control is in fact governed by the circumstances, by the things we come in direct contact with. He asks us to imagine a woodsman chopping down a tree. Does the woodsman control precisely where the axe lands?

“Each stroke of the axe is modified or corrected, according to the shape of the cut face of the tree left by the previous stroke. This self-corrective (i.e., mental) process is brought about by a total system, tree-eyes-brain-muscles-axe-stroke-tree; and it is this total system that has the characteristics of immanent mind.”[9]

Here Bateson’s use of “mental” and “mind” refers to the special, surprising, unpredictable behavior of cybernetic systems. This is the Ecology of Mind of the book’s title.

We are usually unaware of how dependent we are on these tiny, quick feedback loops until they are removed. Try touch-typing on a flat surface or in mid-air to appreciate the tactile feedback of a computer keyboard. Imagine executing a left turn without touching a steering wheel. Undoubtedly the effectiveness of visualization training – vividly playing through a purely mental version of a tennis serve or a Beethoven sonata – derives in part from this discrepancy between the immediate tactile feedback of the “real” scenario and the merely imagined feedback of the visualized one.

Solving Parkinson’s Law

Let’s return to the puzzle of slow software and this time let’s identify the feedback loops involved. Software development is an iterative process:

  1. There is a problem to be solved which is usually codified as a set of requirements.
  2. The programmer comes up with a solution.
  3. The solution is then applied to the problem and its results are compared to the requirements.
  4. These results invariably fall short of perfection and the programmer tries again, adjusting for the new information.

A basic requirement of all software is to deliver results in a timely and responsive way, and these requirements are fundamentally human ones. Conventional wisdom states that a 100-ms delay at the start of an animation feels laggy, and furthermore achieving the smooth, jitter-free standard of 60 frames per second requires limiting each animation frame to a mere 16-ms compute budget. Once a timing requirement is achieved in a piece of software, no further work is needed: indeed, it would be a waste of development effort to turn a 100-ms delay into a 10-ms delay when users can’t tell the difference.
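
The 16-ms figure is just the frame rate inverted:

$$\frac{1000\ \text{ms/s}}{60\ \text{frames/s}} \approx 16.7\ \text{ms per frame}$$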

The final piece of the puzzle is that newer hardware is faster (see again Moore’s Law) and software makers target hardware that is relatively new. If you have an iPhone 5 and you’re running an app built for iPhone 6, there’s a good chance the app will feel sluggish.

The same implicit process is at work with the original Parkinson’s Law, the adage that work expands to fill the time allowed. Perhaps this process is familiar:

  • Do I have time to accomplish Thing X before the deadline?
    • If not, figure out which parts of Thing X are most important and do only those
    • If so, either slack off (the procrastinator’s way) or figure out what would make Thing X even better and do that (the overachiever’s way)

Intuitive Systems Thinking

“Isn’t this just differential equations?” you ask. “The quants can worry about the numbers, and I’ll just go on making my decisions intuitively, the way I normally do.”

But intuition is built on the capacity to imagine what is possible, and systems thinking affords a new way of imagining not only the way things behave but why. Bateson points out in an early essay, “Experiments in Thinking About Observed Ethnological Material,” that progress is made through a combination of loose and strict thinking. And if your loose mode is stuck anthropomorphizing and extrapolating linearly you’ll miss opportunities and fail to maintain your relationships as well as you could.

For me the biggest benefit to this way of thinking is knowing what a realistic solution looks like. When you want stability you start to look for opposing forces and negative feedback; you start to see yourself as operating within the system rather than manipulating the system from the outside. When you want to change the course of large, complex systems of multiple actors you begin to look for opportunities for greater leverage not by acquiring more raw power but by identifying the rules that govern incentives and determine set points. Above all, you now know to turn the handlebars to the right to initiate a turn to the left.

A final word of warning from Bateson:

“In extreme cases, change will precipitate or permit some runaway or slippage along the potentially exponential curves of the underlying regenerative circuits. This may occur without total destruction of the system. The slippage along exponential curves will, of course, always be limited, in extreme cases, by breakdown of the system. Short of this disaster, other factors may limit the slippage. It is important, however, to note that there is a danger of reaching levels at which the limit is imposed by factors which are in themselves deleterious. [English zoologist] Wynne-Edwards has pointed out – what every farmer knows – that a population of healthy individuals cannot be directly limited by the available food supply. If starvation is the method of getting rid of the excess population, then the survivors will suffer if not death at least severe dietary deficiency, while the food supply itself will be reduced, perhaps irreversibly, by overgrazing. In principle, the homeostatic controls of biological systems must be activated by variables which are not in themselves harmful. The reflexes of respiration are activated not by oxygen deficiency but by relatively harmless CO2 excess. The diver who learns to ignore the signals of CO2 excess and continues his dive to approach oxygen deficiency runs serious risks.”

When dealing with feedback systems of real consequence (our bodies, our organizations, our infrastructure and global ecosystem) we must be careful not to train away feedback controls without understanding the failure modes against which they protect.

[1] Gregory Bateson, Steps to an Ecology of Mind, “Cybernetic Explanation,” p. 410

[2] Or to be more precise, the obvious answer is true, but doesn’t tell you anything about why we always try to do too much.

[3] Steps, “Experiments in Thinking About Observed Ethnological Material,” p. 77

[4] This is true both colloquially and in a technical sense. For details, see exponential growth.

[5] Strictly speaking in sigmoidal growth the quantity itself does not decay exponentially; rather the difference between the quantity and a final value decays, leading the quantity to asymptotically approach that final value.

[6] Steps, “Cybernetic Explanation,” p. 410

[7] Control systems are not omnipotent. An HVAC system can only deal with temperature differentials up to a certain point, and this is why BTUs and insulation are important.

[8] Steps, “The Role of Somatic Change in Evolution,” p. 351

[9] Steps, “The Cybernetics of ‘Self’: A Theory of Alcoholism,” p. 317

