Interaction reveals when a collection has become a system
A desk covered with books, receipts, mugs, and cables can be messy. It can even feel like a system, especially on a bad morning. For the question we are asking here, the objects mostly wait there. A receipt can move while the mug remains a mug. A cable can shift while the book stays closed. Unless we ask about workflow, access, or safety, little about this collection depends on mutual adjustment among its parts. Nothing is listening to anything else.
A crowded hallway is different. One person slows near a doorway. The person behind slows too. A third takes a wider path, and that wider path narrows the space for someone approaching from the other side. Within a few seconds the shape of the whole crowd has changed. No one planned the pattern. Each person responded locally, and the responses became a shared motion that none of them intended and all of them produced.
This difference is model-relative. The desk has physical interactions: gravity, friction, and support are all present. For the question we are asking, those interactions do not produce the behavior of interest. In the hallway, the interaction layer is exactly where the behavior comes from.
Consider a parking lot: at midnight the cars sit untouched in their spaces; at 5 PM every driver tries to leave at once. Which of these is a complex system rather than a mere collection?
The parked cars sit in their spaces and do not change one another's conditions: a collection. At 5 PM, each driver's exit path depends on what the other drivers are doing, and those mutual adjustments produce traffic patterns no one planned: a complex system.
That hallway gives the first visible ingredient of complexity: mutual influence. When the parts change one another's conditions, a collection starts behaving like a system. Scott E. Page offers a compact frame: complex systems involve diverse, connected, interdependent, often adaptive parts whose interactions produce patterns that can be hard to explain or predict from the parts alone. [1, 2]
To make that observation precise, we need shared vocabulary for what systems are made of.

Components, interactions, and boundaries
A system's components are the entities a model treats as its working units. In a traffic model, the components might be vehicles. In an ecosystem model, species or populations. In an economy, firms, households, or regulators. The choice is a modeling decision: what you call a component depends on the question you are trying to answer. A hospital can be one component in a regional healthcare model and a system of thousands of interacting components when studied on its own. [13, 15]
Components become interesting when they interact. Interaction means that what one component does changes the conditions another component faces. Cars in a parking lot at midnight sit independently: each occupies its space regardless of the others. Cars at a busy intersection interact continuously, because each driver's acceleration, braking, and lane choice reshapes the options available to nearby drivers.
When a model gives a component conditional behavior, the ability to respond differently depending on its current situation, that component is called an agent. Later modules develop this idea into agent-based modeling. For the primer, the relevant distinction is that components range from passive (a rock in a streambed deflecting water) to highly responsive (an investor adjusting strategy after a market shock). The more responsive the components, the richer the system's possible behavior. [16]
A system boundary is the line drawn around what counts as part of the system for the question being asked. Boundaries are accountable to the question. A hallway crowd can be studied as people and doors. If the question concerns a fire drill, alarms, exits, signage, and staff instructions belong inside the boundary too. A narrow boundary can make the mechanism visible. A boundary that is too narrow can hide the cause.
Consider a shared fishing ground. In the stripped-down tragedy-of-the-commons model, the boundary includes the resource and the users, leaving out the institutions that shape use. Under those assumptions, each fisher captures the full benefit of taking more while the cost of depletion spreads across everyone, so the model predicts collapse.
Elinor Ostrom widened the boundary. Studying fisheries, forests, and irrigation systems across dozens of countries, she documented what communities actually build: harvest rules, monitoring that makes cheating visible, graduated sanctions, collective-choice arrangements, and trust from repeated interaction. Those institutions change the model's causal logic. Monitoring alters what users know. Sanctions change the payoff for cheating. Reputation becomes an enforcement mechanism. The predicted outcome reverses: the narrow model predicts inevitable depletion; the wider model predicts conditional self-governance. The same resource, the same users, opposite conclusions. The boundary determined which causal story the model could tell. [10]
Ostrom grew up in modest circumstances during the Depression and went on to become the first woman to win the Nobel Memorial Prize in Economic Sciences. She spent decades studying how communities actually solve collective-action problems: fishers in Turkey and Canada, farmers managing irrigation in the Philippines and Spain, forest users in Japan and Switzerland. Her empirical approach forced both theorists and modelers to take institutional context, local monitoring, and graduated sanctions seriously as mechanisms of distributed coordination. [10]
You are studying why one coffee shop is always crowded and the one next door is empty. You draw a narrow boundary (just the shop) and a wide boundary (the whole block). Which captures more about the shop's internal operations?
The narrow boundary (the shop itself: layout, menu, service speed) explains internal efficiency but misses foot traffic, neighborhood demographics, and competitor pricing. The wide boundary (the block, transit access, rental costs, online reviews) explains location advantage but obscures what happens inside. The boundary follows the question you are trying to answer, and changing the boundary changes what you can explain.
With components, interactions, and boundaries in hand, we can ask what kinds of system these ingredients produce.
Five kinds of system
In 1948, a science administrator named Warren Weaver identified a gap that existing scientific frameworks handled awkwardly. Science, he argued, had developed powerful tools for two kinds of problem. Problems of simplicity involve a few variables linked by equations you can solve exactly: a single pendulum, a falling ball, two bodies orbiting each other. Problems of disorganized complexity involve so many independent variables that statistical averages become powerful: the pressure of a gas in a container, the expected yield of a large insurance pool. Between these two extremes lay a vast, underdeveloped territory Weaver called organized complexity: problems where many variables interact in structured ways, so the arrangement of relationships becomes part of the explanation. The hallway crowd from the opening section sits in that territory: the pattern of crowd movement depends on how the people are positioned relative to one another, and the structured interactions among them carry the explanation. [3]
Weaver's interest in organized systems traces back to a one-dollar electric motor powered by a dry cell battery, which he took apart, rewired, and rebuilt as a child. That hands-on curiosity traveled with him through a career in mathematics and into the Rockefeller Foundation, where he used his position to push biology toward quantitative, cross-disciplinary methods. His 1948 essay on organized complexity grew from decades of watching scientists struggle with problems that occupied the space between the solvable few-variable case and the statistically tractable many-variable case. [4]
Weaver's three regions are a starting point. For this primer, we separate five diagnostic zones. These are modeling categories: ways a system can behave under a given question, scale, and timescale. The same real-world process can shift between zones when the question changes. Keeping the five zones distinct prevents the rest of this course from becoming vague. [13, 15, 21]
A simple system has few variables with relationships you can write down and solve. A pendulum swings with a period determined by its length and local gravity. A lever multiplies force according to the ratio of its arms. Predict the next state from the current state using an exact equation: the explanatory power of reduction is complete here, because knowing the parts and their arrangement tells you everything about the whole.
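The pendulum case can be checked in a few lines. This is a minimal sketch of the small-angle relationship T = 2π√(L/g); the function name and the value of g are illustrative choices, not part of the primer.

```python
import math

def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum: T = 2 * pi * sqrt(L / g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Knowing L and g fixes the period exactly; nothing else about the system matters.
one_meter = pendulum_period(1.0)   # about 2.01 seconds
two_meter = pendulum_period(2.0)   # longer by a factor of sqrt(2), not 2
```

The whole behavior follows from the parts and their arrangement, which is what makes the system simple in the sense above.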
A complicated system has many precisely arranged parts with stable, engineered couplings. A jet engine contains thousands of components, each with a specified role, tolerance, and failure mode. A Swiss watch has hundreds of gears whose interactions are designed to be repeatable. You can take a complicated system apart, study each subsystem, and reassemble your understanding into a full explanation. The difficulty is logistical and managerial, not conceptual.
A complex system has interacting parts whose relationships generate behavior that no single part controls. A city's traffic patterns, an immune system's response to a novel pathogen, a stock market's reaction to unexpected news: in each case, the behavior of interest lives in the interactions. You cannot fully explain a traffic jam by disassembling cars the way you can explain many jet-engine failures by disassembling engine subsystems. The jam is a relationship among drivers under specific conditions: spacing, braking, timing, route choices, and road geometry. [14, 15]
Complex System: A set of interacting parts whose relationships create a pattern at the level of the whole. Listing parts is insufficient: each part changes the conditions others face, and the phenomenon lives in their mutual adjustment.
A chaotic system follows deterministic rules yet remains practically unpredictable because tiny differences in starting conditions grow exponentially over time. Weather is the classic case: the atmosphere obeys known physical laws, and short-range forecasts work well, yet small measurement uncertainties compound until prediction fails within days. A double pendulum traces a path governed entirely by Newtonian mechanics, yet two runs started a fraction of a degree apart will diverge into visibly different trajectories within seconds. Chaos is sensitivity to initial conditions under deterministic rules. [13]
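Simulating a double pendulum takes some machinery, but the same sensitivity shows up in the logistic map, a standard one-line chaotic system; this sketch uses it as a stand-in, with illustrative starting points.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map; at r = 4 it is chaotic on (0, 1)."""
    return r * x * (1 - x)

# Two starting points differing by one part in a million.
a, b = 0.300000, 0.300001
max_gap = 0.0
for _ in range(40):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))
# The gap grows roughly exponentially until the two trajectories are unrelated.
```

Both runs follow the same deterministic rule; only the starting point differs, yet within a few dozen steps the difference is of order one. That is sensitivity to initial conditions made concrete.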
A random system has no exploitable structure at the level of individual events. The decay of a single radioactive atom follows no pattern you can use to predict when it will happen. A well-shuffled deck carries no exploitable memory of its previous arrangement, assuming the shuffle has made that order practically unavailable. When events are genuinely random, the useful description is statistical: averages, distributions, and probabilities over many events. This is Weaver's disorganized complexity, the zone where large numbers and independence make statistics powerful.
These categories overlap at their edges. Weather is chaotic on a two-week horizon and statistical on a climate timescale. A market can be complicated when studied as a regulatory infrastructure and complex when studied as a pricing system. The category depends on the question and the model, not on the object alone. Keeping the five zones distinct is a diagnostic discipline: when someone calls a problem "complex," this primer asks which kind of system they mean, and what evidence supports the label.
Classify these four cases: a jet engine, a double pendulum, a well-shuffled deck, and an immune response to a new pathogen. Which mapping best fits the five-system typology?
A jet engine has many stable, engineered couplings, so it is complicated. A double pendulum follows deterministic mechanics while tiny initial differences quickly grow, so it is chaotic. A well-shuffled deck has no exploitable event-level structure, so it is random. An immune response involves many interacting cells, signals, thresholds, and feedback paths, so the behavior lives in relationships across components: complex.
Complex and complicated differ by decomposability
A watch is complicated because it has many precisely arranged parts whose interactions are stable and engineered. If a gear breaks, you can replace it; each gear keeps the same role throughout. A city commute is complex because drivers, signals, route choices, delays, expectations, and bottlenecks continually change the conditions faced by other parts of the system. The relationships in a complicated system can be catalogued and held fixed. The relationships in a complex system help create the behavior being studied. [21]
Herbert Simon's parable of two watchmakers sharpens the distinction. Both Hora and Tempus build watches of a thousand parts. Tempus assembles each watch as a single unit; every interruption means starting over. Hora builds stable subassemblies of ten, combines those into larger subassemblies, and works upward. When interrupted, Hora loses at most a small module. Hierarchical, modular organization is easier to evolve, easier to repair, and easier to analyze. Simon called this architecture "near-decomposability": tight coupling inside each module, looser coupling between them. [22]
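Simon's arithmetic is easy to reproduce. Assuming an illustrative 1-in-100 chance of interruption per part added (the probability is a made-up value, not Simon's), the two strategies differ enormously:

```python
p = 0.01  # illustrative probability of an interruption per part added

# Tempus must place all 1000 parts without a single interruption, or start over.
tempus_success = (1 - p) ** 1000        # on the order of a few in 100,000

# Hora only ever has one 10-part subassembly at risk at a time.
hora_module_success = (1 - p) ** 10     # roughly 0.9 per module
```

An interruption costs Tempus an entire watch but costs Hora at most nine parts of work, so Hora finishes watches at a vastly higher rate. The modular architecture, not the skill of the watchmaker, carries the advantage.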
Simon earned a PhD in political science, won the Nobel Prize in Economics, and made foundational contributions to artificial intelligence, cognitive psychology, and organizational theory, often within the same decade. His breadth reflected a sustained investigation of one question: how do bounded agents make decisions in environments too complex to optimize? That question led him to satisficing, to the architecture of nearly decomposable systems, and to some of the earliest computer programs that modeled human problem-solving. [22]
In a nearly decomposable system, subsystems interact intensely within themselves and more weakly across their boundaries. A hospital's emergency department illustrates the pattern. Inside the department, every triage decision reshapes the workload: a trauma arrival pulls nurses from other patients, delays cascade through bed assignments, and the attending physician adjusts priorities minute by minute. Those internal interactions are intense and fast. The department's coupling to the rest of the hospital, through the lab processing blood work, the pharmacy dispensing medications, and admissions finding inpatient beds, is real but slower and looser. On a typical shift, the ER can largely be understood on its own terms. When the lab backs up or admissions freezes, the cross-boundary coupling tightens and the departments can no longer be analyzed separately. The question is always whether the boundary between subsystems is loose enough to permit local analysis and tight enough to hold on the timescale of your question. [13, 22]
A factory has three production lines, each with its own quality-control loop. The lines share one raw-material supplier. When the supplier has a shortage, all three lines slow down together. Is this factory decomposable, nearly decomposable, or not decomposable?
Each line runs its own quality-control loop (strong internal coupling), and under normal conditions the lines can be analyzed separately. The shared supplier introduces cross-line coupling that shows up during shortages. This is near-decomposability: local independence in the short run, global coupling on a longer timescale or under stress.
Decomposability: A system is decomposable when studying its parts separately still reconstructs the whole explanation. The parts must keep stable roles when isolated. A gear has the same job on the bench and in the watch; a traffic jam vanishes once the cars are separated from their spacing and braking relations.
Reduction is useful and incomplete
Reduction works by taking a system apart and explaining the whole in terms of its components. For simple systems this is complete: knowing the mass and length of a pendulum gives you its period. For complicated systems it is powerful: knowing how each subsystem of a jet engine works, and how the subsystems connect, gives you the performance of the whole engine. Reduction has explained everything from planetary motion to gene expression by taking systems apart. Recognizing where it reaches a boundary improves how you use it. [9]
The boundary appears when the behavior of interest lives in the interactions rather than in the components. A traffic jam is made of ordinary cars with ordinary engines, tires, and steering systems. Studying each car in isolation tells you everything about how cars work and nothing about why the jam formed. The jam is a relationship among cars: a pattern of spacing, timing, and mutual adjustment that exists only when the cars interact under specific conditions. Take the system apart and the phenomenon vanishes, because the explanation was relational. [13, 15]
Anderson's 1972 argument made the same point for physics. The laws governing each level of organization are consistent with the laws governing the level below: solid-state physics does not violate particle physics, biology does not violate chemistry. Yet each level can display regularities that require their own concepts. Superconductivity is a case in point: certain materials, when cooled below a critical temperature, lose all electrical resistance. The phenomenon obeys quantum mechanics at every step, yet a single-electron description leaves the phenomenon out of reach. Zero resistance appears only when billions of electrons interact within a crystal lattice at the right temperature, producing a collective state that no individual electron possesses. The logic mirrors the traffic jam: the phenomenon lives in relationships among components, and disassembling the system to study parts in isolation makes it vanish. [9]
The lesson for the rest of this course: reduction tells you what the components can do. It may not tell you what the organized components will do when their relationships become the source of the phenomenon. Complexity science studies those cases.
The next three sections describe three ways that organized interactions produce their effects.
Nonlinearity makes effects state-dependent
In a linear system, effects add cleanly: double the input, double the output. In a nonlinear system, the same small change can have almost no effect in one state and a dramatic effect in another.
One person pausing in an empty hallway barely matters. One person pausing at a crowded doorway can create a bottleneck that persists for minutes. One extra claim in a trusting conversation can open a useful distinction. The same claim in an exhausted room can harden every response that follows.
A rumor-spread model makes this precise. Let I be the number of people who know and N the total population. New transmissions follow \Delta I = \beta \, I \, (1 - I/N), where \beta is the spread rate. Growth depends on the product of two quantities: how many can spread (I) and what fraction remain to hear it (1 - I/N). Early on, few spreaders means slow growth. In the middle, many spreaders meet many unaware people, and the same person-to-person mechanism produces rapid change. Near saturation, almost everyone already knows, so (1 - I/N) shrinks and spread slows. The mechanism stays constant; its effect is state-dependent.
A heater raises a room's temperature by 1 degree per minute. A rumor spreads through a school, where new transmissions depend on both how many people already know it and how many remain unaware. Which statement is correct?
The heater is linear in this idealized example: it adds a fixed increment per minute regardless of the current temperature. The rumor process is nonlinear because the number of new transmissions depends on the current state of the system in a multiplicative way. A simple form is I_{t+1} = I_t + \beta \, I_t (1 - I_t / N), where I_t is the number who know the rumor, N is the total population, and \beta is the spread rate. The term I_t means more spreaders create more spread. The term (1 - I_t / N) means spread slows as fewer unaware people remain. Each step's size depends on where the system currently sits.
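The update rule above can be iterated directly. This sketch assumes illustrative values for N, \beta, and the starting count; the slow-fast-slow shape is the point, not the particular numbers.

```python
def rumor_series(n_people: int, beta: float, i0: float, steps: int) -> list[float]:
    """Iterate I_{t+1} = I_t + beta * I_t * (1 - I_t / N)."""
    series = [float(i0)]
    for _ in range(steps):
        i = series[-1]
        series.append(i + beta * i * (1 - i / n_people))
    return series

# One person starts a rumor in a school of 1000 (illustrative parameters).
series = rumor_series(n_people=1000, beta=0.5, i0=1, steps=40)
gains = [later - earlier for earlier, later in zip(series, series[1:])]
# Per-step gains start small, peak when about half the school knows, then shrink.
```

The same mechanism runs at every step; only the state changes, and the state determines the size of each step's effect.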
Nonlinearity gives a precise reason prediction can become hard. The effect of a change depends on the state, constraints, and relationships already present. More data may still leave the response dependent on when, where, and under what conditions an intervention arrives. [13, 15]
Nonlinearity: An input whose effect depends on the system's current state, constraints, and relationships. The same small change can be negligible in one configuration and transformative in another.
Collective behavior arises from local rules
Each component acts on what it can sense from its immediate surroundings, with no access to the state of the whole. In a hallway, the rule might be "keep moving toward the exit while avoiding collisions with whoever is directly ahead." In a cell, one protein may increase or suppress the activity of another protein it physically contacts. In a market, each trader enters bids and offers based on local information, recent prices, and personal risk tolerance.
No single component sees the whole, yet the whole develops coherent structure. A line forms. A debate polarizes or converges. A cell settles into a stable identity. A market price rises as many separate bids and offers interact.
200 people in a large room each follow one rule: 'Stand exactly 2 meters from the nearest person.' What global pattern forms?
The equal-distance rule produces roughly even spacing. Now consider a different rule: "Stand 2 meters from exactly 2 people." That rule produces chains and loops instead, because each person needs exactly two neighbors at the right distance. A small change in the local rule transforms the global pattern entirely.
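A one-dimensional sketch shows the mechanism. Four people stand on a line, and each one repeatedly steps toward being 2 meters from whoever is currently nearest. The step size, starting positions, and the restriction to a line are all illustrative simplifications of the room scenario above.

```python
def adjust(positions: list[float], rate: float = 0.3) -> list[float]:
    """Each person moves part of the way toward being 2 m from the nearest person."""
    updated = []
    for i, x in enumerate(positions):
        others = [p for j, p in enumerate(positions) if j != i]
        nearest = min(others, key=lambda p: abs(p - x))
        direction = 1.0 if nearest > x else -1.0
        gap = abs(nearest - x)
        # gap > 2: step toward the neighbor; gap < 2: step away from them
        updated.append(x + rate * (gap - 2.0) * direction)
    return updated

people = [0.0, 0.5, 1.0, 5.0]  # unevenly bunched to start
for _ in range(400):
    people = adjust(people)
# Each person's nearest-neighbor gap settles near 2 m, though no one sees the whole line.
```

No person knows the global spacing; each responds to one neighbor at a time, yet the line as a whole spreads out into the pattern the rule implies.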
The global pattern is distributed across relationships. Repeated local responses can make the whole system look coordinated even though the coordination is spread through interactions. Each ant carries local information. Each neuron contributes local signals. Each trader enters partial bids and offers. The pattern lives in the relationships. [13, 14]
Deborah Gordon's work on harvester ant colonies at Stanford makes this vivid. Consider foraging. A forager returning with food passes other ants near the nest entrance, each meeting a brief antennal touch lasting less than a second. When she encounters many returning foragers in quick succession, the high contact rate carries a signal: food is plentiful. More ants leave to forage. When returning foragers are sparse, fewer go out. No ant needs a global count of the colony's state, and the queen sends no one. The contact rate is itself the information, and the colony's foraging effort adjusts as that rate shifts through the day. What looks from the outside like purposeful collective action is, from the inside, thousands of local responses running in parallel, each one shaped by the interaction that just happened. [8]
Emergence names a pattern with a mechanism
Emergence means that a pattern appears at the level of the whole system because parts interact. A queue emerges when people adjust position and speed around a shared bottleneck. A price emerges when many bids, offers, expectations, and constraints meet. A stable cell identity emerges from gene regulation: a skin cell and a liver cell carry the same DNA, yet each expresses a different set of genes, and networks of proteins that activate and suppress one another hold each cell's expression pattern in place, returning it to its characteristic configuration after small disturbances. [13, 15]
Philip Anderson gave this pattern a name in 1972: "more is different." Once many components interact, new properties and regularities can appear at the collective level that the properties of isolated components alone would not predict. The queue's wait time, the market's price stability, the cell's maintained identity: each is a property of the organized whole that vanishes when you study the components in isolation. Anderson argued that this holds across every level of scientific organization, and that each level can require its own explanatory principles even when the lower level's laws are fully known. [9]
This primer treats emergence as a traceable relationship: the pattern is tied to interactions, rules, constraints, and boundary conditions that can be studied or simulated. A good emergence claim specifies three things: what is interacting, what guides the interaction, and what whole-system pattern results. If you cannot name all three, you have an observation waiting for a mechanism. Emergence is a modeling relationship, a statement about what the interactions produce, and the explanatory work lives in specifying that relationship precisely. [13, 15]
Traffic jams offer a crisp example. Vehicles interact through spacing and speed adjustments. The interaction rule is local: slow when the car ahead is too close, accelerate when space opens. The emergent pattern is a backward-propagating wave of congestion that moves against the direction of traffic, even though every car moves forward. Experimental work has demonstrated that this jam can form on a circular track with no bottleneck, no accident, and no on-ramp: the interaction rule alone generates the wave. [25]
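The ring-road result can be sketched with the Nagel-Schreckenberg cellular-automaton model, a standard minimal traffic model. This is an illustration, not the cited experiment: the road length, car count, and slowdown probability are assumed values chosen to put the system above the jamming density.

```python
import random

def nasch_step(pos, vel, road_len, v_max=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg ring-road model."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])   # cars in track order
    new_vel = list(vel)
    for rank, i in enumerate(order):
        ahead = order[(rank + 1) % n]
        gap = (pos[ahead] - pos[i]) % road_len       # cells to the next car
        v = min(vel[i] + 1, v_max)                   # try to accelerate
        v = min(v, gap - 1)                          # never hit the car ahead
        if v > 0 and rng.random() < p_slow:
            v -= 1                                   # random hesitation
        new_vel[i] = max(v, 0)
    new_pos = [(pos[i] + new_vel[i]) % road_len for i in range(n)]
    return new_pos, new_vel

rng = random.Random(0)
road, n_cars = 100, 35
pos = sorted(rng.sample(range(road), n_cars))
vel = [0] * n_cars
for _ in range(200):
    pos, vel = nasch_step(pos, vel, road, rng=rng)
# At this density, stop-and-go waves typically form and drift backward along
# the ring, with no bottleneck, accident, or on-ramp anywhere on the track.
```

Every rule is local (a car sees only the gap ahead), yet the congestion wave is a property of the whole ring: the interaction rule alone generates it.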
Pick one emergent pattern you encountered this week, such as a traffic jam, a conversation dynamic, or a line at a store. Can you name (a) what was interacting, (b) what guided the interaction, and (c) what whole-system pattern resulted?
Emergence: A whole-system pattern or property that appears because parts interact under rules and constraints. A good emergence claim specifies three things: what interacts, what guides the interaction, and what whole-system pattern results.
Mutual influence is the first diagnostic
A good first model asks how diverse components interact, what feedback carries forward, what state changes, and which system-level patterns those local responses produce.
The next module examines coupling: how the strength and structure of interactions shape system behavior. The feedback module explains why causes become circular through time. State space will give a way to map possible configurations. Networks will show how connection patterns shape what can spread or stabilize. Adaptation will ask how the components themselves change under pressure. Each module adds one mechanism to this toolkit; together they explain why interacting parts produce behavior that none of them planned.
When the parts change one another, you have found the interaction layer. The fuller complexity question is how diversity, feedback, state, and boundary shape the pattern that follows.
Complexity has structure. It lives in relationships, feedback, state, boundaries, and aggregate patterns.
Try this on one ordinary situation
Picture a busy sidewalk, the kind where you have to navigate around oncoming people every few steps.
Watch the mutual adjustments. You angle left to avoid someone coming toward you. That shifts you closer to a person walking your direction, who speeds up slightly, opening a gap behind them that someone else steps into. The person you initially avoided has adjusted their path too, partly because of you. These adjustments are all happening at the same time: everyone is responding to everyone else, and each response changes the conditions the others face.
Find the pattern no one designed. After a block, a rough structure appears. People heading the same direction cluster into informal lanes. Gaps open and close. The crowd has a shape that nobody decided on and nobody controls. Two blocks with the same number of people can produce entirely different crowd patterns, depending on sidewalk width, where people entered the flow, and the state of the crowd when each person arrived.
Draw a boundary. Your analysis included people, walking paths, sidewalk width, and timing. It left out each person's destination, their life history, and the city's zoning code. Those omissions serve the question: you are asking how the crowd pattern forms. If you changed the question to why one block is always more congested than the next, the boundary would expand to include building entrances, bus stops, and signal timing. The boundary follows the question you are trying to answer.
