Self-Organizing Systems: A Tutorial in Complexity

Ethan H. Decker
Department of Biology
University of New Mexico
Albuquerque, NM 87106, USA


This is a tutorial on the processes and patterns of organization in complex natural systems. No technical details are included in describing the models or theories used. Instead, I focus on the concepts of self-organization, complexity, complex adaptive systems, criticality, the edge of chaos and evolution as they pertain to the formation of coherent pattern and structure in nature.


A burning question in physics, chemistry and biology is ``Where does order come from?'' Following the general laws of thermodynamics, physical and chemical systems follow the path of least resistance to dissipate any energy in the system. Eventually the system finds a low energy state, a dead calm, and remains at equilibrium there until some obvious perturbation increases its internal energy. For example, a pot of steaming sugar water will give off matter (water vapor) and energy (heat) until it reaches equilibrium with its environment. Cooling, evaporation and crystallization, governed by simple physical and chemical laws, will drive the system to a point of least energy, and the final resting state is rock candy in the bottom of a dry pot.

Yet the world abounds with systems and organisms that maintain a high internal energy and organization in seeming defiance of the laws of physics. As a bar of iron cools, ferromagnetic particles magnetically align themselves with their neighbors until the entire bar is highly organized. Water particles suspended in air form clouds. An ant grows from a single-celled zygote into a complex multicellular organism, and then participates in a structured hive society. What is so fascinating is that the organization seems to emerge spontaneously from disordered conditions, and it doesn't appear to be driven solely by known physical laws. Somehow, the order arises from the multitude of interactions among the simple parts. The laws that may govern this self-organizing behavior are not well understood, if they exist at all. It is clear, though, that the process is nonlinear, using positive and negative feedback loops among components at the lowest level of the system, and between them and the structures that form at higher levels. It is hoped that the study of such self-organizing systems (SOS) will reveal fundamental laws of organization that affect everything from whirlpools to stock markets.

The study of landscape ecology provides an example of how an SOS perspective differs from standard approaches. Ecologists are interested in how spatial and temporal patterns such as patches, boundaries, cycles, and succession arise in complex, heterogeneous communities. Early models of pattern formation use a `top-down' approach, meaning the parameters describe the higher hierarchical levels of the system. For instance, individual trees are not described explicitly, but patches of trees are. Or predators are modeled as a homogeneous population that uniformly impacts a homogeneous prey population. In this way, the population dynamics are defined at the higher level of the population, rather than being the result of activity at the lower level of the individual.

The problem with this top-down approach is that it violates two basic features of biological (and many physicochemical) phenomena: individuality and locality. By modeling a rodent population as a mass of rodents with some growth and behavior parameters, we erase any differences that might exist between individual rodents. Some are big, some are small, some reproduce more, some get eaten more. These small differences can lead to larger differences - such as changes in the population gene frequencies, individual body size, or population densities - that might have cascading effects at still higher levels.

The tenet of locality means that every event or interaction has some location and some range of effect [Kawata and Toquenaga, 1994]. For example, when a tree falls in the tropics, it often drags down several other trees and leaves a gap in the canopy. The ecological changes that follow are strongly constrained by the location and the size of the gap. Obviously, not every seed in the forest has an equal chance of germinating in the gap, but mathematical models often assume that seeds are uniformly distributed throughout the forest, and that the major influence on germination success is a species' relative abundance in the soil. Ignoring locality obscures the factors that might contribute to spatial and temporal dynamics. For instance, seedlings located on a high water table might grow better than those located on dry soil, and as they grow they might increase the moisture-holding capacity of that area, creating new landscape patterns. This is a simple illustration of the ecological principle that pattern affects process [Watt, 1947] [Huffaker, 1958].

To say that a system is self-organized is to say it is not entirely directed by top-down rules, although there might be global constraints on the system. Instead, the local actions and interactions of individuals generate ordered structures at higher levels with recognizable dynamics. Since the origins of order in SOS are the subtle differences among components and the interactions among them, system dynamics cannot usually be understood by decomposing the system into its constituent parts. Thus the study of SOS is synthetic rather than analytic. Several research institutes now focus on this topic, often from the perspective of one scientific discipline. Others, such as the Santa Fe Institute and the Center for Complex Systems Research at the University of Illinois, were formed specifically to tackle this subject with a multidisciplinary approach.

This tutorial follows the development of key concepts in the study of self-organizing systems. I begin with a summary of the apparent mechanisms of self-organization: a flow of energy into the system (i.e., the system is thermodynamically open), many parts with local interactions, nonlinear dynamics (e.g., feedback loops), and the emergence of new phenomena at higher levels of organization. Because SOS are often described as `complex systems', I then address the concept of complexity and how it has been recognized in one case as a dynamic state between order and chaos. This state is associated with intriguing computational properties that suggest a model for the behavior and properties of living systems, which are often perceived as the most complex of all.

I describe the theory of self-organized criticality, in which an SOS drives itself naturally to, and maintains itself indefinitely at, a `critical' state in which complex phenomena appear. Reasons why biological or abiotic systems might evolve towards self-organized criticality are explored. Computer simulations known as individual-based models are described that might offer the best approach to investigate SOS. Finally, I draw some conclusions about self-organization, complexity, and the potential for these very general theories of organization to describe or explain natural systems.

Mechanisms of Self-organization

Several mechanisms and preconditions are necessary for systems to self-organize [Nicolis and Prigogine, 1989] [Forrest and Jones, 1994]. It is not clear whether these conditions are sufficient to permit self-organization, but they are useful intuitive indicators of the potential for self-organization:

Thermodynamically Open

First, the system (however defined) must be exchanging energy and/or mass with its environment. Adding heat to a pot of water or food to a fish tank are examples of energy flows. A system must be thermodynamically open because otherwise it would use up all the usable energy in the system, maximizing entropy, and reach what is known as heat death. A nicer name for this is thermodynamic equilibrium. It is often said that SOS are ``far from'' thermodynamic equilibrium, but that is not necessarily the case. They need only be far enough to avoid collapsing into a local equilibrium condition. Sometimes that is not very far.

If a system is not at equilibrium, then it is dynamic, meaning it is undergoing continuous change of some sort. One of the most basic kinds of change for SOS is to import usable energy from the environment and export entropy back to it. The idea of ``exporting entropy'' is a technical way of saying that the system is not violating the second law of thermodynamics because it can be seen as a larger system-environment unit. For example, an ant ingests complex sugar molecules that are used to perform metabolic work. This results in an increase in the internal organization of the ant (e.g., growth). However, the sugars have been broken down to simpler molecules that embody less energy; entropy has increased in the ant temporarily. When the ant defecates, these lower-energy, higher-entropy molecules are returned to the environment, permitting the ant to maintain its local increase in order despite - and more precisely, because of - the global increase in entropy. This entropy-exporting dynamic is the fundamental feature of what chemists and physicists call dissipative structures. Nobel Laureate Ilya Prigogine believes dissipation is the defining feature of SOS.

Many Parts with Local Interaction

Since the magic of self-organization lies in the connections and interactions among the parts of the system, it is clear that SOS must have a large number of parts. Cells, living tissue, the immune system, brains, populations, hives, communities, economies, and climates all contain hundreds to trillions of parts. These parts are often called agents because they have the basic properties of information transfer, storage and processing. An agent could be a ferromagnetic particle in a spin glass, a neuron in a brain, or a firm in an economy. Models that assign agency at this level are known as individual-based models, such as ECHO [Hraber and Fraser, 1998]. They use computer simulations to observe how local, nonlinear interactions of many agents can develop into complex patterns. In contrast, traditional system models place agency at the group level by using group-level parameters such as population growth rates [Huston et al, 1988]. Sometimes this is effective, as with the ideal gas law, which accurately models extremely large ensembles of atoms in a gas (on the order of Avogadro's number, or about 10^23 atoms). In this regard, SOS are ``middle-number'' systems, where the number of components lies between 3 and 10^23.

Nonlinear Dynamics

Self-organization can occur when feedback loops exist among component parts and between the parts and the structures that emerge at higher hierarchical levels. For example, in a draining tub of water, individual molecules are releasing potential energy by following a path of least resistance towards the drain. Feedback among molecules eventually results in patches of molecules that move together as a unit. Eventually, feedback among patches results in the appearance of a whirlpool just above the drain at the macroscopic level. Finally, the behavior of the whirlpool at the macroscopic level will affect the microscopic behavior of molecules approaching the drain as they join the whirlpool.

A system with positive and negative feedback loops is modeled with nonlinear equations (see the FAQ of the sci.nonlinear newsgroup). Consequently, the technical term for feedback in SOS is that they are nonlinear systems. Nonlinearity is ubiquitous in SOS and is a fundamental aspect of living systems. In biochemistry, when an enzyme catalyzes reactions that encourage the production of more of itself, the process is called auto-catalysis. It is possible that auto-catalysis was the primary behavior of proto-biotic systems during the origins of life [de Groot, 1995].
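The interplay of positive and negative feedback can be sketched numerically. The following is a minimal illustrative sketch, not from the original text: an autocatalytic product grows in proportion to its own abundance (positive feedback) while a finite substrate pool slows further production (negative feedback), giving logistic dynamics. The function name and parameter values are invented for illustration.

```python
def simulate_autocatalysis(x0=0.01, k=1.0, cap=1.0, dt=0.01, steps=2000):
    """Forward-Euler integration of dx/dt = k * x * (1 - x / cap):
    production is proportional to x (positive feedback) and shuts off
    as x approaches the substrate limit cap (negative feedback)."""
    x = x0
    history = [x]
    for _ in range(steps):
        x += dt * k * x * (1.0 - x / cap)
        history.append(x)
    return history

traj = simulate_autocatalysis()
```

Early on the positive loop dominates and growth accelerates; as the substrate limit is approached the negative loop takes over and the trajectory saturates.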


Emergence

Probably the most nebulous concept of the group, the theory of emergence is a recasting of the phrase ``the whole is greater than the sum of the parts'' [Crutchfield, 1994]. More precisely, the whole exhibits patterns and structures that arise spontaneously from the behavior of the parts. Emergence indicates there is no code for a higher-level dynamic in the constituent, lower-level parts [Green, 1993]. Convection currents, eddies, cellular dynamics, the `mind,' and forest patches are examples of emergent phenomena. Some believe emergence is nothing more than a trick of perception, when the observer's attention shifts from the micro-level of the agents to the macro-level of the system. However, emergence fits well into hierarchy theory [Allen and Hoekstra, 1992] as a way of describing how each hierarchical level in a system can follow discrete rule sets. For example, the behavior of individual atoms in an ideal gas can be described using one set of equations, while the behavior of the bulk gas can be described using another set. Further, ideal gases in a turbulent flow can be described by a third set of equations that model the eddies and currents in the flow. Prigogine and Stengers (1984) have argued that macro-scale emergent order is a way for a system to dissipate micro-scale entropy creation caused by energy flux. In other words, a whirlpool spontaneously forms in a draining bath tub because it is a better way to dissipate the potential energy of the standing water than either smooth (laminar) or gurgling, chaotic (turbulent) flow.

Even knowing that self-organization can occur in systems with these qualities, it is not inevitable, and it is still not clear why it sometimes does. In other words, no one yet knows the necessary and sufficient conditions for self-organization.

Complexity at the Edge of Chaos

SOS often display a highly complex kind of organization. Hives have obvious patterns and regularities, but they are not simple geometric structures. Certainly stochastic (random) elements affect the structure and dynamics of a hive, but it is not likely that the patterns of a completely deterministic hive would be simple. Likewise, clouds, weather patterns, ocean circulation, community assemblages, economies and societies all exhibit complex forms of self-organization. If so many SOS are characterized by complexity, it is fair to ask, ``What is complexity?''

There is no good general definition of complexity, though there are many definitions [Edmonds, 1997]. Intuitively, complexity lies somewhere between order and disorder, between the glassy-calm surface of a lake and the messy, misty turbulence in gale-force winds. Complexity has been measured by logical depth, metric entropy, information content, fluctuation complexity, and many other techniques. These measures are well-suited to specific physical or chemical applications, but none describe the general features of SOS. Instead, we must settle for the dictionary definition which pulls relative intractability (i.e. we can't understand it yet) and intricate patterning (i.e. we won't be able to understand it ever) into a conceptual taffy. Obviously, the lack of a definition of complexity does not prevent researchers from using the term.
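One of the measures named above, information content, can be computed in a few lines. This is a hedged sketch (the helper name is invented) of the Shannon entropy of a symbol string, and it illustrates why such measures fall short: per-symbol entropy is really a disorder measure, low for uniform strings and high for random ones, so it cannot by itself single out the complex middle ground between order and chaos.

```python
import math
import random
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy, in bits per symbol, of a string's symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

ordered = "AB" * 500                                      # perfectly periodic
rng = random.Random(0)
noisy = "".join(rng.choice("ABCD") for _ in range(1000))  # near-random
```

Note that the perfectly periodic string scores exactly 1 bit per symbol, the same as a random coin-flip sequence over two symbols: frequency-based entropy is blind to ordering, and richer measures (such as metric entropy over blocks) are needed to see the difference. This is one reason no single number has captured complexity.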

One way to talk about complexity is to describe the boundary between order and chaos - where complexity would feasibly reside - as the edge of chaos [Packard, 1988]; [Langton, 1990]; [Kauffman, 1991, 1993]. Langton (1990) searched for the edge of chaos in a cellular automaton (CA). He attempted to find out under what conditions a simple CA could support ``computational primitives.'' A computational primitive is capable of transmission, storage, and modification of information.

In his experiment, a one-dimensional CA is composed of 128 cells arranged in a circle. Each cell is capable of four possible internal states (A, B, C, D). Each cell takes as its input the states of itself and the two neighbors on each side. These five cells are its neighborhood. The cell's internal state at the next time step is determined by the state of its neighborhood and some transition function which describes which internal state it should move to given its input (that is, the state of the neighborhood). All cells are updated simultaneously. For example, one transition function may be to change a cell from D to C when the neighborhood is ``BCDAB'' - this rule could be called ``BCDAB -> C''. Thus the neighborhood state is associated with transmission, the automaton's internal state with storage, and the transition function with modification of information.

To examine how order and chaos affect computation, Langton formulates a value lambda that describes the probability that a neighborhood state will lead the cell to some state other than an unchanging stable state, called the ``quiescent state.'' When lambda = 0, all neighborhood states move a cell to the quiescent state, and the entire CA quickly becomes completely ordered, or ``frozen.'' For example, the transition rule ``all neighborhoods -> C'' will immediately freeze the CA in a ring of C's. When lambda = 1, no neighborhood state moves a cell to the quiescent state, and the CA will continue to fluctuate wildly.

When 0 < lambda < 1, the fun begins. As lambda increases from zero, the CA exhibits longer and larger streams of cascading cell transitions called transients. For example, the transition rules may converge on a rule pair which says ``ABAAA -> B'' and ``all others -> A''. This rule pair would cause the B to flow indefinitely around a continuous ring of A's. As lambda approaches one, the transients break down and disperse. Transients are interpreted as the CA's ability to compute. The patterns that transients exhibit, particularly when the CA is graphed as a two-dimensional time series, also hint at that elusive quality: complexity. Thus, computation seems to be possible at the edge of chaos.
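Langton's setup can be sketched as a small simulation. The code below is an illustrative approximation, not Langton's original implementation, and all names are invented: it builds a random transition table in which each five-cell neighborhood maps to a non-quiescent state with probability lambda (and to the quiescent state otherwise), then updates a 128-cell ring synchronously.

```python
import itertools
import random

K = 4    # internal states per cell; state 0 plays the role of the quiescent state
N = 128  # cells arranged in a ring
R = 2    # two neighbors on each side -> a five-cell neighborhood

def random_rule_table(lam, rng):
    """Map every five-cell neighborhood to a next state: a random
    non-quiescent state with probability lam, quiescent (0) otherwise."""
    table = {}
    for nbhd in itertools.product(range(K), repeat=2 * R + 1):
        table[nbhd] = rng.randrange(1, K) if rng.random() < lam else 0
    return table

def step(cells, table):
    """Synchronous update: every cell reads its five-cell neighborhood."""
    n = len(cells)
    return [table[tuple(cells[(i + d) % n] for d in range(-R, R + 1))]
            for i in range(n)]

rng = random.Random(1)
cells = [rng.randrange(K) for _ in range(N)]

frozen = step(cells, random_rule_table(0.0, rng))  # lambda = 0: everything quiesces
active = step(cells, random_rule_table(1.0, rng))  # lambda = 1: nothing quiesces
```

At lambda = 0 every cell freezes into the quiescent state after one step, and at lambda = 1 no cell ever quiesces; intermediate values of lambda are where the transients described above appear.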

Langton claims that as lambda is increased, the CA undergoes a phase transition - an abrupt jump from an ordered phase to a chaotic phase. This transition occurs at a critical value of lambda = 0.50. Further, at and very near the critical value, average transient length becomes extremely large compared to below or above 0.50. Langton shows that the average mutual information (a kind of complexity measure) of the CA is also maximized at lambda = 0.50. If lambda exceeds the critical value, the average mutual information decays as the system becomes more chaotic. Langton suggests that because computation is associated with this critical value at the phase transition, an SOS will need to maintain itself at the ``edge of chaos'' in order to compute its own organization:

One of the most exciting implications of
this point of view is that life had its
origin in just these kinds of extended
transient dynamics.... In order to
survive, the early extended transient
systems that were the precursors of life,
as we now know it, had to gain control
over their own dynamical state. They
had to learn to maintain themselves on
these extended transients in the face of
fluctuating environmental parameters,
and to steer a delicate course between
too much order and too much chaos, the
Scylla and Charybdis of dynamical systems.

- Langton (1990)

Phase transitions are physically and mathematically interesting. The transition from cold to hot temperatures for a liquid is smooth; the liquid gradually heats up. But the transition from boiling-temperature liquid to boiling-temperature gas occupies a small space between the two phases (some particular pressure-temperature combination). There is an abrupt change to the gas phase, and the two phases are clearly distinct, separated by the boundary at the phase transition conditions. Such boundaries are very useful for predicting the properties of a system or substance in different conditions. Phase transitions are often the location of interesting dynamics that do not appear in the phase regions. For instance, a simple solid will absorb much more energy per unit mass and will break chemical bonds at the critical melting point. Also, many properties of critical systems will exhibit power law, or scaling, distributions around the critical point. For example, the distribution of transient lengths L in Langton's CA goes as L^(-alpha), where alpha is the power, or scaling exponent, of the distribution.

Phase transitions also occur in large networks when the connectedness between cells reaches a critical value. In a two dimensional lattice of cells, the number of connections among cells determines the probability that a patch of connected cells spans the entire lattice. When such a lattice-spanning patch exists, it is said that the system percolates [Stauffer and Aharony, 1985]. The boundary between sparse and percolating networks is well-known to be a phase transition: with a very large number of runs or a very large lattice, the boundary region becomes so thin it is approximated by a point. Percolation allows for long-range correlations between cells, so that distant cells are linked through the highly-connected network.
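The percolation test itself is easy to state in code. The sketch below (names invented) occupies each site of a square lattice independently with probability p, then asks, via a flood fill from the top row, whether a connected patch of occupied sites spans to the bottom row.

```python
import random
from collections import deque

def percolates(grid):
    """True if occupied sites connect the top row to the bottom row
    through 4-neighbor adjacency (site percolation)."""
    n = len(grid)
    queue = deque((0, j) for j in range(n) if grid[0][j])
    seen = set(queue)
    while queue:
        i, j = queue.popleft()
        if i == n - 1:
            return True  # a patch of occupied sites spans the lattice
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                queue.append((ni, nj))
    return False

def random_grid(n, p, rng):
    """Occupy each site independently with probability p."""
    return [[rng.random() < p for _ in range(n)] for _ in range(n)]

rng = random.Random(42)
# Far below the 2-D site threshold (about 0.593) spanning patches are
# vanishingly rare; far above it they are almost guaranteed.
sparse_spans = percolates(random_grid(60, 0.3, rng))
dense_spans = percolates(random_grid(60, 0.8, rng))
```

Sweeping p over many random grids traces out the sharp transition near the two-dimensional site-percolation threshold, the thin boundary described above.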

Phase transitions and percolation occur frequently in nature. For example, the ranges of two tree-dwelling squirrel species in New Mexico are divided by the phase-transition boundary between forest patches whose canopies are disjointed and those which are contiguous. Langton's work on phase transitions is compelling because it hints at ways to measure and perceive the special conditions under which self-organization might be possible.

Langton's CA is also a classic example of the Goldilocks concept of complexity. In this traditional fairy tale, a lost girl explores the home of papa, mama and baby bear while they are out on a walk. She finds their porridge and begins to eat it. Papa bear's is too hot, mama bear's is too cold, but baby bear's is just right. Their chairs are likewise too hard, too soft, or just right. Finally, their beds (in which she is finally caught) are either too tall, too short, or just right. In a similar fashion, Langton tunes lambda to the critical value so that the CA is not too ordered and not too chaotic, but just right for exhibiting complex behavior.

Self-organized Criticality

So far we have described the general mechanisms and features of SOS. We have discussed Langton's experiment, which demonstrates a thin region of complexity between order and chaos at which self-organization might be possible. Before continuing, it is important to note that these hypotheses are not proven, and in fact are under intense scrutiny because of the many assumptions the models make and the many profound conclusions drawn from them [Mitchell et al, 1993]; [Horgan, 1995]; [Sigmund, 1995]. Research in this area continues, though, because of the appeal of a theory of self-organization that could help explain the origins of order and life, and perhaps the process of evolution as well.

Bak et al (1988) studied the behavior of dynamical systems using computer simulations of their `sandpile' model. In this model, sand is poured onto a table in a continuous stream until miniature avalanches occur. When the slope of the pile is low, avalanches will always be small. When the slope is extremely steep, avalanches will always be large and span the entire system. However, low slopes are only obtained by perturbing the system (e.g., shaking the table), and high slopes by heavily loading the system (e.g., getting the sand a little wet so that it sticks, as in sand castles).

As sand is added slowly and continuously, the pile grows until it reaches a critical slope, at which point two things can happen: the pile maintains itself at this critical slope, and avalanches of all sizes occur, resulting in a scaling distribution of avalanche size. Thus, they note, the system self-organizes to this critical state without any fine tuning of the model. They name this process self-organized criticality (SOC), and they speculate that SOC ``might be the underlying concept for temporal and spatial scaling in dissipative nonequilibrium systems'' [Bak et al, 1988]. Turning the relationship on its head, Bak and others propose that in many dynamical systems, power law distributions of phenomena (e.g., avalanches of all sizes) indicate that a system is SOC.
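The sandpile dynamics can be sketched in a few dozen lines. This is an illustrative version of the rule, not the authors' code, and all names are invented: grains are dropped on random cells of a grid, any cell holding four or more grains topples and sheds one grain to each of its four neighbors, and grains crossing the edge are lost (the dissipation). The size of each avalanche is the number of topplings it triggers.

```python
import random

def drive_sandpile(n=20, grains=5000, seed=0):
    """Drop grains one at a time at random cells of an n x n table.
    A cell holding 4 or more grains topples: it loses 4 grains and
    sends 1 to each neighbor (grains beyond the edge are lost).
    Returns (avalanche sizes, final heights)."""
    rng = random.Random(seed)
    height = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(grains):
        i, j = rng.randrange(n), rng.randrange(n)
        height[i][j] += 1
        unstable = [(i, j)] if height[i][j] >= 4 else []
        topples = 0
        while unstable:
            x, y = unstable.pop()
            if height[x][y] < 4:
                continue  # already relaxed by an earlier toppling
            height[x][y] -= 4
            topples += 1
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < n and 0 <= ny < n:
                    height[nx][ny] += 1
                    if height[nx][ny] >= 4:
                        unstable.append((nx, ny))
        sizes.append(topples)
    return sizes, height

sizes, height = drive_sandpile()
```

After an initial transient the pile hovers at its critical state, and a log-log histogram of the recorded avalanche sizes approximates the scaling distribution described above; no parameter tuning is required, which is the point of the model.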

Beyond those mechanisms outlined for self-organization, one essential criterion must be met for systems to exhibit SOC-type behavior: there must be a separation of scales between driving and dissipating processes [Jensen, 1998]. In the case of the sandpile, this means that the rate of sand addition (the driver) must be much slower than the speed of avalanches (the dissipation). If these rates are too similar, sand piles up and slips off in a continuous stream that does not exhibit power law statistics. Similarly, in a forest fire model, the rate of tree growth must be much slower than the rate of fire spread, or else the fire is continuously fueled and fire size does not scale. Further, if the rate of fire ignition is too slow relative to the rate of tree growth, the forest completely fills in between fires and burns down completely every time.

As an empirical test of SOC, Sole and Manrubia (1995) examine the distribution of treefall gaps in the Barro Colorado Island rainforest in Panama. Knowing that treefall and gap formation are vital to rainforest dynamics, they assert that the distribution and abundance of gaps are indicative of the organizational state of the forest. Gaps thus represent the dissipation of energy in the system. They hypothesize that gaps in the forest canopy will show a scaling distribution. In both the empirical data and a simulation, their hypothesis is supported. Further, in the simulation, biomass also shows scaling properties.

Knowing that in the simulation the system starts with an arbitrary set of trees, Sole and Manrubia conclude that the forest self-organizes to a state characterized by a scaling distribution of gaps and biomass. This scaling distribution is also called self-similar, or fractal, because the properties of large regions are identical to properties of small regions. In the case of treefall gaps, a small area has a statistically similar pattern as the entire forest. Self-similarity and fractal geometry abound in nature [Mandelbrot, 1977]. Clouds have similar roughness, lumpiness, or smoothness at all scales, as do mountains, coastlines and storm systems. A small part of a lung embodies the same geometry as the whole. Fractal geometry was developed, in a sense, because Euclidean dimensions are insufficient to describe natural objects: the intermittent pattern of noise on a telephone line (between 0-D, a point, and 1-D, a line); the ruggedness of a coastline (between 1-D and 2-D, a plane); or the surface area of a tree canopy (between 2-D and 3-D, a solid). Because the distribution of treefall gaps in the forest exhibits fractal self-similarity, as measured by power laws, Sole and Manrubia conclude that this is suspiciously akin to Bak's critical state, and might indicate that the forest has evolved to SOC.
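The fractal (box-counting) dimension behind these claims can be estimated directly: cover the object with boxes of shrinking size and measure how the number of occupied boxes grows. The sketch below (names invented) recovers the familiar integer dimensions as a sanity check; applied to a genuinely fractal set, the same slope comes out non-integer.

```python
import math

def box_count(points, box):
    """Number of box x box cells that contain at least one point."""
    return len({(x // box, y // box) for x, y in points})

def box_dimension(points, boxes=(1, 2, 4, 8)):
    """Least-squares slope of log(count) against log(1/box size)."""
    xs = [math.log(1.0 / b) for b in boxes]
    ys = [math.log(box_count(points, b)) for b in boxes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

side = 64
filled = [(x, y) for x in range(side) for y in range(side)]  # a solid square
line = [(x, 0) for x in range(side)]                         # a straight line
```

The solid square yields a slope of exactly 2 and the line exactly 1; a Sierpinski triangle treated the same way gives roughly 1.585, between a line and a plane, which is the sense in which coastlines and canopies have fractional dimension.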

Sole and Miramontes (1995) followed a similar approach with Leptothorax ant colonies, testing whether actual and simulated ant colonies would exhibit self-similar structure with scaling phenomena at a critical value. They find that some properties do scale at a critical density of ants at which the connections between individuals allow for maximum information capacity of the colony. When the critical density is reached, the colony shows pulses of activity that exhibit self-similarity. Sole and Miramontes point out that the key parameter in determining the critical value of density is simply the number of individual agents in their model. This is corroborated by empirical observation: in actual colonies, when the number of ants increases substantially (towards a critical density), ants change the colony boundaries to achieve the critical density value for the new colony size. Thus Bak's SOC appears to be evident again in a biotic system. However, the driving and dissipating factors are not obvious.

Evolution to the Edge

The previous section explored self-organized criticality as a place on the edge of chaos where complexity and computation are possible, and where self-similar fractal structure is evident. It remains to be seen why and how a system would move itself to that state from some other state in the order-chaos spectrum.

For biotic systems, one important addition to the list of mechanisms and conditions for SOS is the ability of agents to adapt. This means agents in a system are capable of changing their internal information processing functions. Such a complex adaptive system (CAS, Forrest and Jones, 1994) is thus nonstationary, since the rule base of the agents will change through time. Living organisms adapt fundamentally by genetic recombination and mutation. In Langton's model, cells would be able to change their transition rules. If there were also some selection criterion by which some rules became more common in the CAS, it would be possible for the CAS to tune itself along the order-chaos spectrum (i.e., to tune lambda). A series of questions about CAS arise: what are the mechanisms of adaptation? Under what conditions are they possible? By what rules do systems move among their adaptive choices (e.g., optimization or selection)? And do adaptive systems always move towards SOC?

While living systems clearly adapt, it is not obvious that they adapt towards a critical state. If a critical state is associated with greater long term survival and reproduction (i.e., fitness), a population might evolve towards a critical state because natural selection removes variants that are farther from the critical state and thus less fit. However, the link between a self-organized critical state - as evident by scaling laws and self-similarity - and fitness has not been established yet. Nonetheless, some scientists (notably more physicists than biologists) believe that SOC might provide a statistical theory of evolution.

Stuart Kauffman (1991, 1993) has developed a system called a coupled fitness landscape to model CAS. In his model (which is a Boolean network), N cells, each capable of A states, have K connections to other cells. The N-cell system is mapped into a K-dimensional ``landscape'' which topographically expresses all possible system states. Kauffman assigns fitness values to each of the A states, so that when system states are calculated throughout the K connections among the N agents, fitness ``peaks'' appear in the landscape which represent optimal system states. This landscape can represent an agent in an adaptive system: a genome, a population, a niche type. Kauffman links several of these landscapes together to study coevolution. One of his findings is that the connectivity K is a major determinant of how orderly or chaotic the system dynamics are. Just as with Langton's CA, a Goldilocks situation occurs. If K is too low, the system is ``frozen'' in its current state and is not very dynamic. If K is too high, the system is chaotic. But if K is just right, the system is able to climb up very high fitness peaks. Kauffman asserts that the connectivity among agents or parts of a CAS is an important factor in determining whether the system can or will evolve to higher fitness peaks. He suggests, then, that CAS such as genomes, brains, populations and communities may evolve towards the edge of chaos.
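Kauffman's model can be sketched in miniature. This is an illustrative NK-style landscape, not Kauffman's code, and all names and parameter values are invented: each of N binary loci draws a random fitness contribution that depends on its own state and the states of K neighbors, and an adaptive walk flips one locus at a time, keeping changes that do not lower fitness.

```python
import random

def make_nk(n, k, seed=0):
    """Random NK-style landscape: locus i's fitness contribution depends
    on its own allele and those of its k right-hand neighbors (circular).
    Contributions are drawn lazily and cached so fitness is consistent."""
    rng = random.Random(seed)
    tables = [{} for _ in range(n)]

    def fitness(genome):
        total = 0.0
        for i, table in enumerate(tables):
            key = tuple(genome[(i + d) % n] for d in range(k + 1))
            if key not in table:
                table[key] = rng.random()
            total += table[key]
        return total / n

    return fitness

def hill_climb(fitness, genome, steps=500, seed=1):
    """One-mutant adaptive walk: flip a random locus, keep the flip
    whenever fitness does not decrease."""
    rng = random.Random(seed)
    best = fitness(genome)
    for _ in range(steps):
        i = rng.randrange(len(genome))
        trial = genome[:]
        trial[i] ^= 1
        f = fitness(trial)
        if f >= best:
            genome, best = trial, f
    return genome, best

n_loci = 16
fit = make_nk(n_loci, k=2)
start = [0] * n_loci
peak, peak_fitness = hill_climb(fit, start)
```

Raising k couples more loci together and makes the landscape more rugged, so adaptive walks tend to stall sooner on lower local peaks - the frozen-versus-chaotic trade-off that Kauffman describes.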

David Green at ANU also believes connectivity is a major structuring phenomenon; however, he studies connectivity in landscapes (1994). In highly-connected landscapes, information travels very quickly and the system becomes more chaotic, whereas in sparsely-connected ones the system quickly settles into a stable or periodic state. He suggests that the number of connections to each unit in a self-organizing system might be the sole parameter that determines the self-organizing dynamics of the system.

A few other hypotheses exist for why SOS may move towards critical states, such as the law of maximum entropy production [Swenson, 1989] and perpetual disequilibration [Ito and Gunji, 1994], but these have yet to move beyond conjecture. Thus, the two most likely mechanisms remain natural selection and physical laws such as the interplay between friction and gravity in Bak's sandpile.
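Bak's sandpile is simple enough to state in code. The sketch below follows the standard Bak-Tang-Wiesenfeld rules [Bak et al, 1988] on a small 2-D grid (the grid size and random drop sites are arbitrary choices here): a site holding four or more grains topples, giving one grain to each neighbor, and a single dropped grain can trigger an avalanche of any size.

```python
import random

def sandpile_avalanches(n=15, grains=2000, seed=0):
    """Bak-Tang-Wiesenfeld sandpile on an n-by-n grid: drop grains at random
    sites; any site holding 4 or more grains topples, sending one grain to
    each of its neighbors (grains falling off the edge are lost). Returns
    the avalanche size (total topplings) caused by each dropped grain."""
    rng = random.Random(seed)
    grid = [[0] * n for _ in range(n)]
    sizes = []
    for _ in range(grains):
        grid[rng.randrange(n)][rng.randrange(n)] += 1
        topples = 0
        unstable = [(i, j) for i in range(n) for j in range(n) if grid[i][j] >= 4]
        while unstable:
            nxt = []
            for i, j in unstable:
                while grid[i][j] >= 4:
                    grid[i][j] -= 4
                    topples += 1
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        a, b = i + di, j + dj
                        if 0 <= a < n and 0 <= b < n:
                            grid[a][b] += 1
                            if grid[a][b] >= 4:
                                nxt.append((a, b))
            unstable = nxt
        sizes.append(topples)
    return sizes

sizes = sandpile_avalanches()
# once the pile reaches the critical state, avalanches of many sizes occur
print(max(sizes), sizes.count(0))
```

In the critical state the avalanche sizes have no characteristic scale; a histogram of `sizes` approximates the scaling distributions discussed above.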

Several mechanisms have been identified as possible sources of self-organization and complexity in biological and ecological systems: many parts; local, spatially-explicit interactions; thermodynamic flux; multiscale effects; nonlinear dynamics; and adaptation [Brown, 1994]; [Judson, 1994]. Food webs, forest resiliency, anthropogenic deforestation, and host-parasitoid relationships have all been proposed as SOS [Perry, 1995]. SOC is intuitively appealing to some ecologists who have become frustrated with traditional analytical techniques that fail to capture the intricate dynamics of systems as a whole.

While traditional models are good predictors of general ecological dynamics and structures, they are often inadequate at describing or predicting complex phenomena such as intricate habitat patch mosaics, temporal changes in community structure, or convoluted species distribution boundaries [Johnson et al, 1992]; [Judson, 1994]. Further, traditional models often fail to explain the mechanisms which give rise to such patterns [Huston et al, 1988]; [Judson, 1994]; [Kawata and Toquenaga, 1994]. One class of agent-based (or object-oriented) models has been used with some success for exploring the possible mechanisms of self-organization in natural systems: individual-based models.

Individual-based Models

If most natural systems are self-organized, and biological systems are CAS, it is valid to question the explanatory usefulness of traditional models (such as coupled differential equations) whose assumptions exclude the factors above [Huston et al, 1988]; [Keitt and Johnson, 1995]. Believing this might be the case, scientists have built individual-based computer models which incorporate local interactions, multiple agents, and feedback loops in order to study the mechanisms of complex phenomena. Individual-based models explicitly track individual agents or components in a system, and system states and dynamics can be displayed and analyzed at any point during a run [Forrest and Jones, 1994]; [Hiebeler, 1994]. This class of models is currently the best means of quantitatively and qualitatively studying SOS.

As an example, several types of individual-based models have been built to study ecological systems, including cellular automata [Caswell and Cohen, 1991]; [Langton, 1992], artificial life (A-Life) simulations (see the review by [Kawata and Toquenaga, 1994]), and gap models (see the review by [Shugart et al, 1992]). If CA and ecological systems are analogous, statistical properties of CA might apply to ecological systems (e.g., Sole's Forest Game) and could supply explanations for certain ecological patterns. The possible correlations between the two systems are being investigated [Langton, 1994].

Gap models (e.g., FORET, FORSKA, and ZELIG) simulate the spatial variation found in forest stands due to the interactions of developing trees [Shugart et al, 1992]. Though gap models are essentially individual-based models, the limits of growth for individual organisms are imposed globally, and these limits appear to be based more on allometric data than on physiological mechanisms [Shugart et al, 1992]. Thus they may not help generate realistic mechanistic explanations of large-scale patterns.

A-Life simulations such as ECHO [Hraber and Fraser, 1998], BOIDS [Reynolds, 1999], and SWARM most closely resemble biological systems. Individual agents have their own copies of behavioral code (``genomes'') that let them individually perceive the local environment, evaluate the input, and choose how to act [Forrest and Jones, 1994]; [Hiebeler, 1994]. A-Life worlds employ system-level constraints (e.g., limited resources or spatial structure) but incorporate no global rules governing individual behavior. Thus A-Life simulations offer the best opportunity to study CAS. Further, A-Life environments and agent rules can be based respectively on landscape ecology and plant physiology data (e.g., [Smith and Huston, 1989]; [McCauley et al, 1993]), allowing ecologists to model biological systems without the simplifying assumptions of traditional models.
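A toy illustration of this style of model (a deliberately minimal invention for this tutorial, not ECHO or SWARM themselves): each agent below carries a ``genome'' mapping what it perceives in neighboring cells to a movement, and agents that find enough food reproduce with mutation. There are no global behavioral rules, only a system-level resource constraint.

```python
import random

rng = random.Random(0)
SIZE = 20
MOVES = [(0, 1), (0, -1), (1, 0), (-1, 0)]

def random_genome():
    # one preferred move for each of the 16 possible food-perception patterns
    return [rng.randrange(4) for _ in range(16)]

def step(agents, food):
    """One tick: each agent senses food in the 4 neighboring cells, consults
    its genome for a move, eats if it lands on food, pays a metabolic cost,
    and reproduces (with one mutation) once it stores enough energy."""
    newborn = []
    for agent in agents:
        x, y, energy, genome = agent
        pattern = 0
        for bit, (dx, dy) in enumerate(MOVES):
            if ((x + dx) % SIZE, (y + dy) % SIZE) in food:
                pattern |= 1 << bit
        dx, dy = MOVES[genome[pattern]]
        agent[0], agent[1] = (x + dx) % SIZE, (y + dy) % SIZE
        if (agent[0], agent[1]) in food:
            food.discard((agent[0], agent[1]))
            agent[2] += 4          # energy gained from eating
        agent[2] -= 1              # metabolic cost of one tick
        if agent[2] >= 10:         # reproduction threshold
            agent[2] -= 5
            child = [agent[0], agent[1], 5, list(genome)]
            child[3][rng.randrange(16)] = rng.randrange(4)  # point mutation
            newborn.append(child)
    agents.extend(newborn)
    agents[:] = [a for a in agents if a[2] > 0]  # starved agents die

food = {(rng.randrange(SIZE), rng.randrange(SIZE)) for _ in range(150)}
agents = [[rng.randrange(SIZE), rng.randrange(SIZE), 5, random_genome()]
          for _ in range(30)]
for _ in range(50):
    food.add((rng.randrange(SIZE), rng.randrange(SIZE)))  # resources trickle in
    step(agents, food)
print(len(agents), len(food))
```

All of the specific numbers (grid size, energy budget, mutation rate) are arbitrary; the point is that population-level dynamics emerge solely from local perception, individual genomes, and resource feedback.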

Excitement about these models raises the question of whether they will be more useful than the generally simpler traditional models. Even if they can simulate a system in more detail, do we need such a fine level of resolution in our models? More importantly, do individual-based models help reveal mechanisms of complexity, or are they simply fancy descriptions?


This tutorial attempts to describe some of the developments in the field of complex systems, from self-organization, through complexity at the edge of chaos and self-organized criticality, to complex adaptive systems. Common aspects of these models include multiple parts, local interactions and connectivity, feedback loops, hierarchic organization and emergence, and energy flux. Such a list applies equally well to whirlpools and weather, lightning bolts, brains, biotic populations and communities, economic markets, air and road traffic systems, and the internet. That such simple models can produce such rich and complex phenomena provides hope that there are general and fundamental laws of organization that apply to all nonequilibrium systems.

Self-organization, complexity and criticality may prove to be extremely relevant to computer science and the study of artificial intelligence (AI). Computers excel at linear tasks for which algorithmic processes, many iterations and brute force are appropriate, such as long series of mathematical calculations. As computers are challenged to do more difficult tasks such as speech or visual pattern recognition, translation, and `learning', researchers will likely look to natural systems for clues as to how these tasks are accomplished in other systems. Some of this information will come directly from computer models that are developed specifically to study biotic systems. For example, an interdisciplinary group at the University of New Mexico, USA, is studying how immune systems distinguish `self' from `other', defend self from other when other is harmful, and adapt as either self or other changes. This biology research feeds directly into computer science research on computer security, in which biotic immune systems serve as a model for how computer security systems may work in the future. (See their web site at immsec.)

Similarly, AI research will benefit from models of biotic intelligence, many of which directly or indirectly embody the concepts discussed in this tutorial. For example, if it is true that `computational primitives' only exist at the edge of chaos in self-organized systems, and if computational primitives are necessary for systems like brains to exhibit intelligence, then AI will only evolve in computer systems that are SOC or are complex adaptive systems. It is very likely that the study of AI and research in biocomplexity and basic biology will be complementary efforts.

Nonetheless, substantial work remains to be done. First, more rigorous criteria must be developed for testing SOC in empirical systems. Currently, self-similarity and scaling distributions of system phenomena are perceived as diagnostic of SOC. However, a scaling distribution is not sufficient to conclude that a system is SOC; a power law can be produced by a host of processes (A. Allen, pers comm). It must also be shown that empirical systems are exact analogs of SOC models. For example, in one recent model of long-term biotic evolution, species that have similar fitness are classified as members of the same genus [Manrubia and Paczuski, 1998]. This is an incorrect representation of species, as genera are defined strictly by relatedness and ancestry, not by similarities in fitness.

Second, power-law behavior must be rigorously tested in empirical studies. It is well known, for instance, that a lognormal with large variance will approximate a power law over several orders of magnitude [Montroll and Schlesinger, 1982]. Appeals to finite-size effects or corrections to scaling will have to be fully justified when arguing that a power law is a more appropriate model than a lognormal for a given set of data.
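This pitfall is easy to demonstrate. The sketch below draws samples from a lognormal with large variance and from a genuine power law (a Pareto distribution), then measures how straight each rank-size plot is on log-log axes; the correlation statistic used here is only an illustrative stand-in for a proper goodness-of-fit test:

```python
import math
import random

def loglog_linearity(samples):
    """Pearson correlation between log(rank) and log(value) for a rank-size
    plot; values near -1 mean the plot looks like a straight line on log-log
    axes, the usual visual evidence offered for a power law."""
    xs = sorted(samples, reverse=True)
    lx = [math.log(x) for x in xs]
    lr = [math.log(rank + 1) for rank in range(len(xs))]
    n = len(xs)
    mx, mr = sum(lx) / n, sum(lr) / n
    cov = sum((a - mx) * (b - mr) for a, b in zip(lx, lr))
    sx = math.sqrt(sum((a - mx) ** 2 for a in lx))
    sr = math.sqrt(sum((b - mr) ** 2 for b in lr))
    return cov / (sx * sr)

rng = random.Random(0)
# a genuine power law (Pareto) versus a lognormal with large variance
pareto = [rng.paretovariate(1.0) for _ in range(20000)]
lognormal = [rng.lognormvariate(0.0, 3.0) for _ in range(20000)]
print(round(loglog_linearity(pareto), 3), round(loglog_linearity(lognormal), 3))
```

Both samples yield strongly negative correlations, so a roughly straight log-log plot alone cannot distinguish a power law from a broad lognormal; the distinction requires an explicit model comparison.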

Despite these weaknesses (as well as others), SOC and the edge of chaos remain tantalizing ideas for those who seek to understand complex systems and explain the origins of order in natural systems. Although the concept of complexity may never be nailed down, it serves as an umbrella term for a host of tools, approaches and models that are being used at the frontiers of interdisciplinary research.


Overviews for the nonspecialist

[Bak, 1996]
Bak P. How Nature Works, Springer-Verlag, 1996.

[Gell-Mann, 1994]
Gell-Mann M. The Quark and the Jaguar, WH Freeman, 1994.

[Kauffman, 1995]
Kauffman SA. At Home in the Universe, Oxford Univ. Press, 1995.

[Waldrop, 1992]
Waldrop MM. Complexity. Simon & Schuster, 1992.

[Yates, 1987]
Yates FE (ed.). Self-Organizing Systems, Plenum, 1987.

Literature cited

[Allen and Hoekstra, 1992]
TFH Allen and TW Hoekstra. Toward a Unified Ecology, Columbia Univ. Press, 1992.

[Bak et al, 1988]
P Bak, C Tang and K Wiesenfeld. Self-organized Criticality. Physical Review A, Vol. 38, pp 364-374, 1988.

[Brown, 1994]
JH Brown. Complex Ecological Systems. In G Cowan, D Pines, and D Meltzer (eds.), Complexity: metaphors, models, and reality. SFI studies in the sciences of complexity, Proc. Vol. XIX, Addison-Wesley, 1994.

[Caswell and Cohen, 1991]
H Caswell and JE Cohen. Communities in Patchy Environments: A Model of Disturbance, Competition, and Heterogeneity. In J Kolasa and STA Pickett (eds.), Ecological Heterogeneity, Springer, 1991.

[Crutchfield, 1994]
Crutchfield JP. Is anything ever new? In G Cowan, D Pines and D Meltzer (eds.), Complexity: metaphors, models, and reality. SFI studies in the sciences of complexity, Proc. Vol. XIX, Addison-Wesley, 1994.

[de Groot, 1995]
de Groot M. Primordial Soup, 1995.

[Edmonds, 1997]
Edmonds B. Hypertext Bibliography of Measures of Complexity, 1997. bruce/combib/

[Forrest and Jones, 1994]
Forrest S and Jones T. Modeling Complex Adaptive Systems with Echo. In RJ Stoner and XH Yu (eds.), Complex Systems: mechanisms of adaptation, IOS Press, 1994.

[Green, 1993]
Green DG. Emergent behaviour in biological systems. In DG Green and TJ Bossomaier (eds.), Complex Systems - From Biology to Computation, IOS Press, 1993.

[Green, 1994]
Green DG. Connectivity and the evolution of biological systems. Journal of Biological Systems, Vol. 2, pp 91-103, 1994.

[Hiebeler, 1994]
Hiebeler D. The swarm simulation system and individual-based modeling. Santa Fe Institute working paper 94-12-065, 1994. hiebeler/swarm-paper.html

[Horgan, 1995]
Horgan J. From complexity to perplexity, Scientific American, pp 104-109, June 1995.

[Hraber and Fraser, 1998]
Hraber P and S Fraser. Echo, 1998.

[Huffaker, 1958]
Huffaker CB. Experimental studies on predation: dispersion factors and predator-prey oscillations, Hilgardia, Vol. 27, pp 343-383, 1958.

[Huston et al, 1988]
Huston M, DeAngelis D and Post W. New computer models unify ecological theory, Bioscience, Vol. 38, pp 682-691, 1988.

[Ito and Gunji, 1994]
Ito K and Gunji Y. Self-organisation of living systems towards criticality at the edge of chaos, BioSystems, Vol. 33, pp 17-24, 1994.

[Jensen, 1998]
Jensen HJ. Self-Organized Criticality, Cambridge Univ. Press, 1998.

[Johnson et al, 1992]
Johnson AR, Wiens JA, Milne BT and Crist TO. Animal movements and population dynamics in heterogeneous landscapes, Landscape Ecology, Vol. 7, pp 63-75, 1992.

[Judson, 1994]
Judson OP. The rise of the individual-based model in ecology, Trends in Ecology and Evolution, Vol. 9, pp 9-14, 1994.

[Kauffman, 1991]
Kauffman SA. Coevolution to the edge of chaos: coupled fitness landscapes, poised states, and coevolutionary avalanches. Journal of Theoretical Biology, Vol. 149, pp 467-505, 1991.

[Kauffman, 1993]
Kauffman SA. The Origins of Order: self-organization and selection in evolution, Oxford Univ. Press, 1993.

[Kawata and Toquenaga, 1994]
Kawata M and Toquenaga Y. Artificial individuals and global patterns. Trends in Ecology and Evolution, Vol. 9, pp 417-421, 1994.

[Keitt and Johnson, 1995]
Keitt TH and Johnson AR. Spatial heterogeneity and anomalous kinetics: emergent patterns in diffusion-limited predator-prey interaction. Journal of Theoretical Biology, Vol. 172, pp 127-139, 1995.

[Langton, 1990]
Langton CG. Computation at the edge of chaos: phase transitions and emergent computation. Physica D, Vol. 42, pp 12-37, 1990.

[Langton, 1994]
Langton CG (ed.). Artificial Life III, Addison-Wesley, 1994.

[Manrubia and Paczuski, 1998]
Manrubia SC and Paczuski M. A simple model of large scale organization in evolution. International Journal of Modern Physics C, Vol. 9, pp 1025-1032, 1998.

[McCauley et al, 1993]
McCauley E, Wilson WG and de Roos AM. Dynamics of age-structured and spatially structured predator-prey interactions: individual-based models and population-level formulations. American Naturalist, Vol. 142, pp 412-442, 1993.

[Mitchell et al, 1993]
Mitchell M, Hraber P and Crutchfield JP. Revisiting the edge of chaos: Evolving cellular automata to perform computations. Complex Systems, Vol. 7, pp 89-130, 1993.

[Montroll and Schlesinger, 1982]
Montroll EW and Schlesinger MF. On 1/f noise and other distributions with long tails. Proc. Natl. Acad. Sci. USA, Vol. 79, pp 3380-3383, 1982.

[Nicolis and Prigogine, 1989]
Nicolis G and Prigogine I. Exploring complexity, WH Freeman, 1989.

[Packard, 1988]
Packard NH. Adaptation toward the edge of chaos. In AJ Mandell, AS Kelso and MF Shlesinger (eds.), Dynamic Patterns in Complex Systems, World Scientific, 1988.

[Perry, 1995]
Perry DA. Self-organizing systems across scales. Trends in Ecology and Evolution, Vol. 10, pp 241-244, 1995.

[Prigogine and Stengers, 1984]
Prigogine I and Stengers I. Order Out of Chaos: Man's New Dialogue with Nature, Bantam, 1984.

[Reynolds, 1999]
Reynolds CW. Boids (flocks, herds and schools: a distributed behavioral model), 1999.

[Shugart et al, 1992]
Shugart HH, Smith TM and Post WM. The potential for application of individual-based simulation models for assessing the effects of global change. Annual Review of Ecology and Systematics, Vol. 23, pp 15-38, 1992.

[Sigmund, 1995]
Sigmund K. Review of: JH Holland, Hidden Order: how adaptation builds complexity. Nature, Vol. 378, p 453, 1995.

[Smith and Huston, 1989]
Smith T and Huston M. A theory of the spatial and temporal dynamics of plant communities, Vegetatio, Vol. 83, pp 49-69, 1989.

[Sole and Manrubia, 1995]
Sole RV and Manrubia SC. Are rainforests self-organized in a critical state? Journal of Theoretical Biology, Vol. 173, pp 31-40, 1995.

[Sole and Miramontes, 1995]
Sole RV and Miramontes O. Information at the edge of chaos in fluid neural networks, Physica D, Vol. 80, pp 171-180, 1995.

[Stauffer and Aharony, 1985]
Stauffer D and Aharony A. Introduction to Percolation Theory, Taylor & Francis, 1985.

[Swenson, 1989]
Swenson R. Emergent attractors and the law of maximum entropy production: foundations to a theory of general evolution. Systems Research, Vol. 6, pp 187-197, 1989.

[Watt, 1947]
Watt AS. Pattern and process in the plant community. Journal of Ecology Vol. 35, pp 1-22, 1947.


Ethan H Decker received a B.A. in Sociology, phi beta kappa, from Oberlin College in Ohio, USA. He then worked as an independent consultant on management, customer service and organization. He is currently a Ph.D. student in the Department of Biology at the University of New Mexico, USA, where he studies urban ecology and complex systems.
