It From Bit

Henry Louis Mencken [1917] once wrote that “[t]here is always an easy solution to every human problem — neat, plausible and wrong.” And neoclassical economics has indeed been wrong. Its main result, so far, has been to demonstrate the futility of trying to build a satisfactory bridge between formalistic-axiomatic deductivist models and real world target systems. Assuming, for example, perfect knowledge, instant market clearing and approximating aggregate behaviour with unrealistically heroic assumptions of representative actors just will not do. The assumptions made surreptitiously eliminate the very phenomena we want to study: uncertainty, disequilibrium, structural instability and problems of aggregation and coordination between different individuals and groups.

The punch line of this is that most of the problems that neoclassical economics is wrestling with issue from its attempts at formalistic modeling per se of social phenomena. Reducing microeconomics to refinements of hyper-rational Bayesian deductivist models is not a viable way forward. It will only sentence to irrelevance the most interesting real world economic problems. And as someone has so wisely remarked, murder is unfortunately the only way to reduce biology to chemistry — reducing macroeconomics to Walrasian general equilibrium microeconomics basically means committing the same crime.

Lars Pålsson Syll. On the use and misuse of theories and models in economics.

~ ~ ~

Emergence, some say, is merely a philosophical concept, unfit for scientific consumption. Or, others predict, when subjected to empirical testing it will turn out to be nothing more than shorthand for a whole batch of discrete phenomena involving novelty, which is, if you will, nothing novel. Perhaps science can study emergences, the critics continue, but not emergence as such. (Clayton 2004: 577)*

It’s too soon to tell. But certainly there is a place for those, such as the scientist to whom this volume is dedicated, who attempt to look ahead, trying to gauge what Nature’s broadest patterns are and hence where present scientific resources can best be invested. John Archibald Wheeler formulated an important motif of emergence in 1989:

Directly opposite the concept of universe as machine built on law is the vision of a world self-synthesized. On this view, the notes struck out on a piano by the observer-participants of all places and all times, bits though they are, in and by themselves constituted the great wide world of space and time and things.

(Wheeler 1999: 314)

Wheeler summarized his idea — the observer-participant who is both the result of an evolutionary process and, in some sense, the cause of his own emergence — in two ways: in the famous sketch given in Fig. 26.1 and in the maxim “It from bit.” In the attempt to summarize this chapter’s thesis with an equal economy of words I offer the corresponding maxim, “Us from it.” The maxim expresses the bold question that gives rise to the emergentist research program: Does nature, in its matter and its laws, manifest an inbuilt tendency to bring about increasing complexity? Is there an apparently inevitable process of complexification that runs from the periodic table of the elements through the explosive variations of evolutionary history to the unpredictable progress of human cultural history, and perhaps even beyond? (Clayton 2004: 577)

The emergence hypothesis requires that we proceed through at least four stages. The first stage involves rather straightforward physics — say, the emergence of classical phenomena from the quantum world (Zurek 1991, 2002) or the emergence of chemical properties through molecular structure (Earley 1981). In a second stage we move from the obvious cases of emergence in evolutionary history toward what may be the biology of the future: a new, law-based “general biology” (Kauffman 2000) that will uncover the laws of emergence underlying natural history. Stage three of the research program involves the study of “products of the brain” (perception, cognition, awareness), which the program attempts to understand not as unfathomable mysteries but as emergent phenomena that arise as natural products of the complex interactions of brain and central nervous system. Some add a fourth stage to the program, one that is more metaphysical in nature: the suggestion that the ultimate results, or the original causes, of natural emergence transcend or lie beyond Nature as a whole. Those who view stage-four theories with suspicion should note that the present chapter does not appeal to or rely on metaphysical speculations of this sort in making its case. (Clayton 2004: 578-579)

Defining terms and assumptions

The basic concept of emergence is not complicated, even if the empirical details of emergent processes are. We turn to Wheeler, again, for an opening formulation:

When you put enough elementary units together, you get something that is more than the sum of these units. A substance made of a great number of molecules, for instance, has properties such as pressure and temperature that no one molecule possesses. It may be a solid or a liquid or a gas, although no single molecule is solid or liquid or gas. (Wheeler 1998: 341)

Or, in the words of biochemist Arthur Peacocke, emergence takes place when “new forms of matter, and a hierarchy of organization of these forms … appear in the course of time” and “these new forms have new properties, behaviors, and networks of relations” that must be used to describe them (Peacocke 1993: 62).

Clearly, no one-size-fits-all theory of emergence will be adequate to the wide variety of emergent phenomena in the world. Consider the complex empirical differences that are reflected in these diverse senses of emergence:

• temporal or spatial emergence
• emergence in the progression from simple to complex
• emergence in increasingly complex levels of information processing
• the emergence of new properties (e.g., physical, biological, psychological)
• the emergence of new causal entities (atoms, molecules, cells, central nervous system)
• the emergence of new organizing principles or degrees of inner organization (feedback loops, autocatalysis, “autopoiesis”)
• emergence in the development of “subjectivity” (if one can draw a ladder from perception, through awareness, self-awareness, and self-consciousness, to rational intuition).

Despite the diversity, certain parameters do constrain the scientific study of emergence:

  1. Emergence studies will be scientific only if emergence can be explicated in terms that the relevant sciences can study, check, and incorporate into actual theories.
  2. Explanations concerning such phenomena must thus be given in terms of the structures and functions of stuff in the world. As Christopher Southgate writes, “An emergent property is one describing a higher level of organization of matter, where the description is not epistemologically reducible to lower-level concepts” (Southgate et al. 1999: 158).
  3. It also follows that all forms of dualism are disfavored. For example, only those research programs count as emergentist which refuse to accept an absolute break between neurophysiological properties and mental properties. “Substance dualisms,” such as the Cartesian delineation of reality into “matter” and “mind,” are generally avoided. Instead, research programs in emergence tend to combine sustained research into (in this case) the connections between brain and “mind,” on the one hand, with the expectation that emergent mental phenomena will not be fully explainable in terms of underlying causes on the other.
  4. By definition, emergence transcends any single scientific discipline. At a recent international consultation on emergence theory, each scientist was asked to define emergence, and each offered a definition of the term in his or her own specific field of inquiry: physicists made emergence a product of time-invariant natural laws; biologists presented emergence as a consequence of natural history; neuroscientists spoke primarily of “things that emerge from brains”; and engineers construed emergence in terms of new things that we can build or create. Each of these definitions contributes to, but none can be the sole source for, a genuinely comprehensive theory of emergence. (Clayton 2004: 579-580)

Physics to chemistry

(….) Things emerge in the development of complex physical systems that are understood by observation and cannot be derived from first principles, even given a complete knowledge of the antecedent states. One would not know about conductivity, for example, from a study of individual electrons alone; conductivity is a property that emerges only in complex solid state systems with huge numbers of electrons…. Such examples are convincing: physicists are familiar with a myriad of cases in which physical wholes cannot be predicted based on knowledge of their parts. Intuitions differ, though, on the significance of this unpredictability. (Clayton 2004: 580)

(….) [Such examples are] unpredictable even in principle — if the system-as-a-whole is really more than the sum of its parts.

Simulated Evolutionary Systems

Computer simulations study the processes whereby very simple rules give rise to complex emergent properties. John Conway’s program “Life,” which simulates cellular automata, is already widely known…. Yet even in as simple a system as Conway’s “Life,” predicting the movement of larger structures in terms of the simple parts alone turns out to be extremely complex. Thus in the messy real world of biology, behaviors of complex systems quickly become noncomputable in practice…. As a result — and, it now appears, necessarily — scientists rely on explanations given in terms of the emerging structures and their causal powers. Dreams of a final reduction “downwards” are fundamentally impossible. Recycled lower-level descriptions cannot do justice to the actual emergent complexity of the natural world as it has evolved. (Clayton 2004: 582)
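
As a concrete illustration of the rule-to-pattern relation in Conway’s “Life,” here is a minimal sketch in Python (my own toy implementation; the grid size, the glider seed, and the wrap-around boundary are choices made for the example, not details from Clayton’s text). Each cell follows only the local birth-and-survival rule, yet a five-cell “glider” drifts coherently across the grid, a behavior most naturally described at the level of the pattern rather than the individual cells.

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's Life on a wrap-around (toroidal) grid."""
    # Sum the eight neighbors of every cell.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(int)

# Seed a "glider": a five-cell pattern whose coherent diagonal drift is a
# property of the configuration, not of any single cell's rule.
grid = np.zeros((20, 20), dtype=int)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[y, x] = 1

for _ in range(8):                 # after 8 steps the glider has shifted 2 cells
    grid = life_step(grid)
print(np.argwhere(grid == 1))      # live-cell coordinates: same shape, new place
```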

Ant colony behavior

Neural network models of emergent phenomena can model … the emergence of ant colony behavior from simple behavioral “rules” that are genetically programmed into individual ants. (….) Even if the behavior of an ant colony were nothing more than an aggregate of the behaviors of the individual ants, whose behavior follows very simple rules, the result would be remarkable, for the behavior of the ant colony as a whole is extremely complex and highly adaptive to complex changes in its ecosystem. The complex adaptive potentials of the ant colony as a whole are emergent features of the aggregated system. The scientific task is to correctly describe and comprehend such emergent phenomena where the whole is more than the sum of the parts. (Clayton 2004: 586-587)
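
A toy agent-based sketch can make the aggregate point concrete (this is my own illustrative model, not the neural-network model Clayton refers to; the grid size, pheromone deposit, and evaporation rate are invented parameters). Each simulated ant follows one simple local rule, yet the colony-level pheromone field organizes into shared trails that no single rule mentions.

```python
import random

# Toy model: each ant steps to the neighboring cell with the most pheromone
# (ties broken at random) and deposits pheromone where it lands; the field
# slowly evaporates.
SIZE, ANTS, STEPS, EVAPORATION = 30, 60, 400, 0.02

pheromone = [[0.0] * SIZE for _ in range(SIZE)]
ants = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(ANTS)]

def neighbors(x, y):
    """Eight surrounding cells on a wrap-around grid."""
    return [((x + dx) % SIZE, (y + dy) % SIZE)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

for _ in range(STEPS):
    moved = []
    for x, y in ants:
        options = neighbors(x, y)
        best = max(pheromone[nx][ny] for nx, ny in options)
        nx, ny = random.choice([(a, b) for a, b in options
                                if pheromone[a][b] == best])  # the individual rule
        pheromone[nx][ny] += 1.0                              # local deposit
        moved.append((nx, ny))
    ants = moved
    pheromone = [[p * (1 - EVAPORATION) for p in row] for row in pheromone]

# The shared trail structure is a property of the colony-level field,
# not of any single ant's two-line rule.
print(max(max(row) for row in pheromone))
```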

Biochemistry

So far we have considered models of how nature could build highly complex and adaptive behaviors from relatively simple processing rules. Now we must consider actual cases in which significant order emerges out of (relative) chaos. The big question is how nature obtains order “out of nothing,” that is, when the order is not present in the initial conditions but is produced in the course of a system’s evolution. What are some of the mechanisms that nature in fact uses? We consider four examples. (Clayton 2004: 587)

Fluid convection

The Bénard instability is often cited as an example of a system far from thermodynamic equilibrium, where a stationary state becomes unstable and then manifests spontaneous organization (Peacocke 1994: 153). In the Bénard case, the lower surface of a horizontal layer of liquid is heated. This produces a heat flux from the bottom to the top of the liquid. When the temperature gradient reaches a certain threshold value, conduction no longer suffices to convey the heat upward. At that point convection cells form at right angles to the vertical heat flow. The liquid spontaneously organizes itself into these hexagonal structures or cells. (Clayton 2004: 587-588)

Differential equations describing the heat flow exhibit a bifurcation of the solutions. This bifurcation represents the spontaneous self-organization of large numbers of molecules, formerly in random motion, into convection cells. This represents a particularly clear case of the spontaneous appearance of order in a system. According to the emergence hypothesis, many cases of emergent order in biology are analogous. (Clayton 2004: 588)
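
The threshold behavior described above is usually stated in terms of a dimensionless control parameter; the following is a standard textbook formulation (not given in Clayton’s text) of when conduction gives way to Bénard convection cells.

```latex
% Rayleigh number for a fluid layer of depth d heated from below:
%   g = gravitational acceleration, \alpha = thermal expansion coefficient,
%   \Delta T = temperature difference across the layer,
%   \nu = kinematic viscosity, \kappa = thermal diffusivity.
\mathrm{Ra} \;=\; \frac{g\,\alpha\,\Delta T\,d^{3}}{\nu\,\kappa},
\qquad \text{convection cells appear once } \mathrm{Ra} > \mathrm{Ra}_{c} \approx 1708
% (the critical value for rigid upper and lower boundaries).
```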

Autocatalysis in biochemical metabolism

Autocatalytic processes play a role in some of the most fundamental examples of emergence in the biosphere. These are relatively simple chemical processes with catalytic steps, yet they well express the thermodynamics of the far-from-equilibrium chemical processes that lie at the base of biology. (….) Such loops play an important role in metabolic functions. (Clayton 2004: 588)
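
A minimal way to see what a “catalytic step” contributes, kinetically, is the simplest autocatalytic scheme (an illustrative textbook case; Clayton does not specify a particular reaction): a product X catalyzes its own formation from a feedstock A, so that a small fluctuation in X is amplified rather than damped.

```latex
% A + X -> 2X with rate constant k; if the far-from-equilibrium inflow holds [A]
% approximately constant, the rate law feeds back on X itself:
\frac{d[X]}{dt} \;=\; k\,[A]\,[X]
\quad\Longrightarrow\quad
[X](t) \;=\; [X]_{0}\,e^{\,k[A]\,t},
% i.e. exponential amplification of any initial fluctuation [X]_0 > 0.
```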

Belousov-Zhabotinsky reactions

The role of emergence becomes clearer as one considers more complex examples. Consider the famous Belousov-Zhabotinsky reaction (Prigogine 1984: 152). This reaction consists of the oxidation of an organic acid (malonic acid) by potassium bromate in the presence of a catalyst such as cerium, manganese, or ferroin. From the four inputs into the chemical reactor more than 30 products and intermediaries are produced. The Belousov-Zhabotinsky reaction provides an example of a biochemical process where a high level of disorder settles into a patterned state. (Clayton 2004: 589)
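
The full BZ mechanism involves dozens of species, but the qualitative point, namely sustained chemical oscillation far from equilibrium, can be sketched with Prigogine’s two-variable “Brusselator” model. The sketch below is offered as a stand-in for illustration only; the parameter values and the simple Euler integration are my choices, not details from Clayton or Prigogine.

```python
import numpy as np

# Brusselator reactions: A -> X, B + X -> Y + D, 2X + Y -> 3X (autocatalysis), X -> E.
A, B = 1.0, 3.0          # fixed feed concentrations; B > 1 + A**2 gives oscillation
dt, steps = 0.001, 60000

x, y = 1.0, 1.0          # concentrations of the intermediates X and Y
history = []
for _ in range(steps):
    dx = A + x * x * y - (B + 1.0) * x   # rate law for X
    dy = B * x - x * x * y               # rate law for Y
    x, y = x + dx * dt, y + dy * dt      # simple Euler step
    history.append(x)

# Instead of settling to a steady state, X keeps cycling: the late-time
# range of x stays wide, a temporal "dissipative structure."
late = history[steps // 2:]
print(round(min(late), 2), round(max(late), 2))
```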

(….) Put into philosophical terms, the data suggest that emergence is not merely epistemological but can also be ontological in nature. That is, it’s not just that we can’t predict emergent behaviors in these systems from a complete knowledge of the structures and energies of the parts. Instead, studying the systems suggests that structural features of the system — which are emergent features of the system as such and not properties pertaining to any of its parts — determine the overall state of the system, and hence as a result the behavior of individual particles within the system. (Clayton 2004: 589-590)

The role of emergent features of systems is increasingly evident as one moves from the very simple systems so far considered to the sorts of systems one actually encounters in the biosphere. (….) (Clayton 2004: 589-590)

The biochemistry of cell aggregation and differentiation

We move finally to processes where a random behavior or fluctuation gives rise to organized behavior between cells based on self-organization mechanisms. Consider the process of cell aggregation and differentiation in cellular slime molds (specifically, in Dictyostelium discoideum). The slime mold cycle begins when the environment becomes poor in nutrients and a population of isolated cells joins into a single mass on the order of 10^4 cells (Prigogine 1984: 156). The aggregate migrates until it finds a higher nutrient source. Differentiation then occurs: a stalk or “foot” forms out of about one-third of the cells and is soon covered with spores. The spores detach and spread, growing when they encounter suitable nutrients and eventually forming a new colony of amoebas. (Clayton 2004: 589-591) [See Levinton 2001: 166]

Note that this aggregation process is randomly initiated. Autocatalysis begins in a random cell within the colony, which then becomes the attractor center. It begins to produce cyclic adenosine monophosphate (cAMP). As cAMP is released in greater quantities into the extracellular medium, it catalyzes the same reaction in the other cells, amplifying the fluctuation and total output. Cells then move up the gradient to the source cell, and other cells in turn follow their cAMP trail toward the attractor center. (Clayton 2004: 589-591)
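
The relay logic, where a random initiating cell’s secretion recruits its neighbors, which then secrete in turn, can be sketched with a toy one-dimensional model (my own illustration; the threshold, secretion, diffusion, and decay values are invented, and real Dictyostelium signaling is far richer).

```python
import random

# Cells sit on a ring; one randomly chosen cell starts secreting cAMP, and any
# cell whose local cAMP passes a threshold switches on and secretes too,
# amplifying the initial fluctuation into a colony-wide signal.
N, STEPS = 60, 200
THRESHOLD, SECRETION, DIFFUSION, DECAY = 0.5, 1.0, 0.2, 0.05

camp = [0.0] * N
active = [False] * N
active[random.randrange(N)] = True          # the random initiating cell

for _ in range(STEPS):
    for i in range(N):                      # secretion by active cells
        if active[i]:
            camp[i] += SECRETION
    new = camp[:]                           # crude diffusion and decay
    for i in range(N):
        left, right = camp[(i - 1) % N], camp[(i + 1) % N]
        new[i] += DIFFUSION * (left + right - 2 * camp[i])
        new[i] *= (1 - DECAY)
    camp = new
    for i in range(N):                      # relay: above-threshold cells join in
        if camp[i] > THRESHOLD:
            active[i] = True

print(sum(active), "of", N, "cells recruited")  # typically the whole colony
```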

Biology

Ilya Prigogine did not follow the notion of “order out of chaos” up through the entire ladder of biological evolution. Stuart Kauffman (1995, 2000) and others (Gell-Mann 1994; Goodwin 2001; see also Cowan et al. 1994 and other works in the same series) have, however, recently traced the role of the same principles in living systems. Biological processes in general are the result of systems that create and maintain order (stasis) through massive energy input from their environment. In principle these types of processes could be the object of what Kauffman envisions as “a new general biology,” based on sets of still-to-be-determined laws of emergent ordering or self-complexification. Like the biosphere itself, these laws (if they indeed exist) are emergent: they depend on the underlying physical and chemical regularities but are not reducible to them. [Note, there is no place for mind as a causal source.] Kauffman (2000: 35) writes: (Clayton 2004: 592)

I wish to say that life is an expected, emergent property of complex chemical reaction networks. Under rather general conditions, as the diversity of molecular species in a reaction system increases, a phase transition is crossed beyond which the formation of collectively autocatalytic sets of molecules suddenly becomes almost inevitable. (Clayton 2004: 593)

Until a science has been developed that formulates and tests physics-like laws at the level of biology [evo-devo is the closest we have so far come], the “new general biology” remains an as-yet-unverified, though intriguing, hypothesis. Nevertheless, recent biology, driven by the genetic revolution on the one side and by the growth of the environmental sciences on the other, has made explosive advances in understanding the role of self-organizing complexity in the biosphere. Four factors in particular play a central role in biological emergence. (Clayton 2004: 593)

The role of scaling

As one moves up the ladder of complexity, macrostructures and macromechanisms emerge. In the formation of new structures, scale matters — or, better put, changes in scale matter. Nature continually evolves new structures and mechanisms as life forms move up the scale from molecules (c. 1 Ångstrom) to neurons (c. 100 micrometers) to the human central nervous system (c. 1 meter). As new structures are developed, new whole-part relations emerge. (Clayton 2004: 593)

John Holland argues that different sciences in the hierarchy of emergent complexity occur at jumps of roughly three orders of magnitude in scale. By the point at which systems have become too complex for predictions to be calculated, one is forced to “move the description ‘up a level’” (Holland 1998: 201). The “microlaws” still constrain outcomes, of course, but additional basic descriptive units must also be added. This pattern of introducing new explanatory levels iterates in a periodic fashion as one moves up the ladder of increasing complexity. To recognize the pattern is to make emergence an explicit feature of biological research. As of now, however, science possesses only a preliminary understanding of the principles underlying this periodicity. (Clayton 2004: 593)

The role of feedback loops

The role of feedback loops, examined above for biochemical processes, becomes increasingly important from the cellular level upwards. (….) (Clayton 2004: 593)

The role of local-global interactions

In complex dynamical systems the interlocked feedback loops can produce an emergent global structure. (….) In these cases, “the global property — [the] emergent behavior — feeds back to influence the behavior of the individuals … that produced it” (Lewin 1999). The global structure may have properties the local particles do not have. (Clayton 2004: 594)

(….) In contrast …, Kauffman insists that an ecosystem is in one sense “merely” a complex web of interactions. Yet consider a typical ecosystem of organisms of the sort that Kauffman (2000: 191) analyzes … Depending on one’s research interests, one can focus attention either on holistic features of such systems or on the interactions of the components within them. Thus Langton’s term “global” draws attention to system-level features and properties, whereas Kauffman’s “merely” emphasizes that no mysterious outside forces need to be introduced (such as, e.g., Rupert Sheldrake’s (1995) “morphic resonance”). Since the two dimensions are complementary, neither alone is scientifically adequate; the explosive complexity manifested in the evolutionary process involves the interplay of both systemic features and component interactions. (Clayton 2004: 595)

The role of nested hierarchies

A final layer of complexity is added in cases where the local-global structure forms a nested hierarchy. Such hierarchies are often represented using nested circles. Nesting is one of the basic forms of combinatorial explosion. Such forms appear extensively in natural biological systems (Wolfram 2002: 357ff.; see his index for dozens of further examples of nesting). Organisms achieve greater structural complexity, and hence increased chances of survival, as they incorporate discrete subsystems. Similarly, ecosystems complex enough to contain a number of discrete subsystems evidence greater plasticity in responding to destabilizing factors. (Clayton 2004: 595-596)

“Strong” versus “weak” emergence

The resulting interactions between parts and wholes mirror yet exceed the features of emergence that we observed in chemical processes. To the extent that the evolution of organisms and ecosystems evidences a “combinatorial explosion” (Morowitz 2002) based on factors such as the four just summarized, the hope of explaining entire living systems in terms of simple laws appears quixotic. Instead, natural systems made of interacting complex systems form a multileveled network of interdependency (cf. Gregersen 2003), and each level contributes distinct elements to the overall explanation. (Clayton 2004: 596-597)

Systems biology, the Siamese twin of genetics, has established many of the features of life’s “complexity pyramid” (Oltvai and Barabási 2002; cf. Barabási 2002). Construing cells as networks of genes and proteins, systems biologists distinguish four distinct levels: (1) the base functional organization (genome, transcriptome, proteome, and metabolome) [see below, Morowitz on the “dogma of molecular biology.”]; (2) the metabolic pathways built up out of these components; (3) larger functional modules responsible for major cell functions; and (4) the large-scale organization that arises from the nesting of the functional modules. Oltvai and Barabási (2002) conclude that “[the] integration of different organizational levels increasingly forces us to view cellular functions as distributed among groups of heterogeneous components that all interact within large networks.” Milo et al. (2002) have recently shown that a common set of “network motifs” occurs in complex networks in fields as diverse as biochemistry, neurobiology, and ecology. As they note, “similar motifs were found in networks that perform information processing, even though they describe elements as different as biomolecules within a cell and synaptic connections between neurons in Caenorhabditis elegans.” (Clayton 2004: 598)
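
To make the idea of a “network motif” concrete, here is a minimal sketch of the kind of census Milo et al. describe, counting feed-forward loops (X regulates Y, Y regulates Z, and X also regulates Z directly) in a small directed graph. The edge list is invented for illustration and is not data from their study.

```python
from itertools import permutations

# Toy directed graph given as a set of (source, target) edges.
edges = {("A", "B"), ("B", "C"), ("A", "C"),   # one feed-forward loop: A, B, C
         ("C", "D"), ("D", "E"), ("C", "E"),   # another: C, D, E
         ("E", "A")}
nodes = {n for e in edges for n in e}

def feed_forward_loops(edges, nodes):
    """Count ordered triples (x, y, z) with edges x->y, y->z, and x->z."""
    count = 0
    for x, y, z in permutations(nodes, 3):
        if (x, y) in edges and (y, z) in edges and (x, z) in edges:
            count += 1
    return count

print(feed_forward_loops(edges, nodes))  # -> 2
```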

Such compounding of complexity — the system-level features of networks, the nodes of which are themselves complex systems — is sometimes said to represent only a quantitative increase in complexity, in which nothing “really new” emerges. This view I have elsewhere labeled “weak emergence.” [This would be a form of philosophical materialism qua philosophical reductionism.] It is the view held by (among others) John Holland (1998) and Stephen Wolfram (2002). But, as Leon Kass (1999: 62) notes in the context of evolutionary biology, “it never occurred to Darwin that certain differences of degree — produced naturally, accumulated gradually (even incrementally), and inherited in an unbroken line of descent — might lead to a difference in kind …” Here Kass nicely formulates the principle involved. As long as nature’s process of compounding complex systems leads to irreducibly complex systems with structures and causal mechanisms of their own, then the natural world evidences not just weak emergence but also a more substantive change that we might label strong emergence. Cases of strong emergence are cases where the “downward causation” emphasized by George Ellis [see p. 607, True complexity and its associated ontology.] … is most in evidence. By contrast, in the relatively rare cases where rules relate the emergent system to its subvening system (in simulated systems, via algorithms; in natural systems, via “bridge laws”) a weak emergence interpretation suffices. In the majority of cases, however, such rules are not available; in these cases, especially where we have reason to think that such lower-level rules are impossible in principle, the strong emergence interpretation is suggested. (Clayton 2004: 597-598)

Neuroscience, qualia, and consciousness

Consciousness, many feel, is the most important instance of a clearly strong form of emergence. Here if anywhere, it seems, nature has produced something irreducible — no matter how strong the biological dependence of mental qualia (i.e., subjective experiences) on antecedent states of the central nervous system may be. To know everything there is to know about the progression of brain states is not to know what it’s like to be you, to experience your joy, your pain, or your insights. No human researcher can know, as Thomas Nagel (1980) so famously argued, “what it’s like to be a bat.” (Clayton 2004: 598)

Unfortunately consciousness, however intimately familiar we may be with it on a personal level, remains an almost total mystery from a scientific perspective. Indeed, as Jerry Fodor (1992) noted, “Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious. So much for the philosophy of consciousness.” (Clayton 2004: 598)

Given our lack of comprehension of the transition from brain states to consciousness, there is virtually no way to talk about the “C” word without sliding into the domain of philosophy. The slide begins if the emergence of consciousness is qualitatively different from other emergences; in fact, it begins even if consciousness is different from the neural correlates of consciousness. Much suggests that both differences obtain. How far can neuroscience go, even in principle, in explaining consciousness? (Clayton 2004: 598-599)

Science’s most powerful ally, I suggest, is emergence. As we’ve seen, emergence allows one to acknowledge the undeniable differences between mental properties and physical properties, while still insisting on the dependence of the entire mental life on the brain states that produce it. Consciousness, the thing to be explained, is different because it represents a new level of emergence; but brain states — understood both globally (as the state of the brain as a whole) and in terms of their microcomponents — are consciousness’s sine qua non. The emergentist framework allows science to identify the strongest possible analogies with complex systems elsewhere in the biosphere. So, for example, other complex adaptive systems also “learn,” as long as one defines learning as “a combination of exploration of the environment and improvement of performance through adaptive change” (Schuster 1994). Obviously, systems from primitive organisms to primate brains record information from their environment and use it to adjust future responses to that environment. (Clayton 2004: 599)
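
Schuster’s definition of learning, exploration of the environment plus improvement of performance through adaptive change, can be illustrated with a very small adaptive agent (a standard two-armed bandit sketch of my own; the payoff probabilities and exploration rate are invented). The agent records outcomes and adjusts its future responses, with no appeal to anything beyond feedback.

```python
import random

payoffs = {"left": 0.3, "right": 0.7}       # hidden reward probabilities
estimates = {"left": 0.0, "right": 0.0}
counts = {"left": 0, "right": 0}
EPSILON = 0.1                               # fraction of trials spent exploring

def choose():
    if random.random() < EPSILON:           # exploration of the environment
        return random.choice(list(payoffs))
    return max(estimates, key=estimates.get)  # exploitation of what was learned

for trial in range(5000):
    arm = choose()
    reward = 1.0 if random.random() < payoffs[arm] else 0.0
    counts[arm] += 1
    # incremental update: the estimate adapts toward the observed rewards
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print(estimates)   # approaches the hidden payoffs; "right" is chosen more often
```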

Even the representation of visual images in the brain, a classically mental phenomenon, can be parsed in this way. Consider Max Velmans’s (2000) schema … Here a cat-in-the-world and the neural representation of the cat are both parts of a natural system; no nonscientific mental “things” like ideas or forms are introduced. In principle, then, representation might be construed as merely a more complicated version of the feedback loop between a plant and its environment … Such is the “natural account of phenomenal consciousness” defended by (e.g.) Le Doux (1978). In a physicalist account of mind, no mental causes are introduced. Without emergence, the story of consciousness must be retold such that thoughts and intentions play no causal role. … If one limits the causal interactions to world and brains, mind must appear as a sort of thought-bubble outside the system. Yet it is counter to our empirical experience in the world, to say the least, to leave no causal role to thoughts and intentions. For example, it certainly seems that your intention to read this … is causally related to the physical fact of your presently holding this book [or browsing this web page, etc.] in your hands. (Clayton 2004: 599-600)

Arguments such as this force one to acknowledge the disanalogies between the emergence of consciousness and previous examples of emergence in complex systems. Consciousness confronts us with a “hard problem” different from those already considered (Chalmers 1995: 201):

The really hard problem of consciousness is the problem of experience. When we think and perceive, there is a whir of information-processing, but there is also a subjective aspect. As Nagel has put it, there is something it is like to be a conscious organism. This subjective aspect is experience. When we see, for example, we experience visual sensations: the felt quality of redness, the experience of dark and light, the quality of depth in a visual field. Other experiences go along with perception in different modalities: the sound of a clarinet, the smell of mothballs. Then there are bodily sensations, from pains to orgasms; mental images that are conjured up internally; the felt quality of emotion, and the experience of a stream of conscious thought. What unites all of these states is that there is something it is like to be in them. All of them are states of experience.

The distinct features of human cognition, it seems, depend on a quantitative increase in brain complexity vis-à-vis other higher primates. Yet, if Chalmers is right (as I fear he is), this particular quantitative increase gives rise to a qualitative change. Even if the development of conscious awareness occurs gradually over the course of primate evolution, the (present) end of that process confronts the scientist with conscious, symbol-using beings clearly distinct from those who preceded them (Deacon 1997). Understanding consciousness even as an emergent phenomenon in the natural world — that is, naturalistically — requires a theory of “felt qualities,” “subjective intentions,” and “states of experience.” It requires intention-based explanations and, it appears, a new set of sciences: the social or human sciences. By this point emergence has driven us to a level beyond the natural-science-based framework of the present book. New concepts, new testing mechanisms, and perhaps even new standards for knowledge are now required. From the perspective of physics the trail disappears into the clouds; we can follow it no further. (Clayton 2004: 600-601)

The five emergences

In the broader discussion the term “emergence” is used in multiple senses, some of which are incompatible with the scientific project. Clarity is required to avoid equivocation between five distinct levels on which the term may be applied: (Clayton 2004: 601)

• Let emergence-1 refer to occurrences of the term within the context of a specific scientific theory. Here it describes features of a specified physical or biological system of which we have some scientific understanding. Scientists who employ these theories claim that the term (in a theory-specific sense) is currently useful for describing features of the natural world. The preceding pages include various examples of theories in which this term occurs. At the level of emergence-1 alone there is no way to establish whether the term is used analogously across theories, or whether it really means something utterly distinct in each theory in which it appears. (Clayton 2004: 601-602)

• Emergence-2 draws attention to features of the world that may eventually become part of a unified scientific theory. Emergence in this sense expresses postulated connections or laws that may in the future become the basis for one or more branches of science. One thinks, for example, of the role of emergence in Stuart Kauffman’s notion of a new “general biology,” or in certain proposed theories of complexity or complexification. (Clayton 2004: 602)

• Emergence-3 is a meta-scientific term that points out a broad pattern across scientific theories. Used in this sense, the term is not drawn from a particular scientific theory; it is an observation about a significant pattern that connects a range of scientific theories. In the preceding pages I have often employed the term in this fashion. My purpose has been to draw attention to common features of the physical systems under discussion, as in (e.g.) the phenomena of autocatalysis, complexity, and self-organization. Each is scientifically understood, and each shares common features that are significant. Emergence draws attention to these features, whether or not the individual theories actually use the same label for the phenomena they describe. (Clayton 2004: 602)

Emergence-3 thus serves a heuristic function. It assists in the recognition of common features between theories. Recognizing such patterns can help to extend existing theories, to formulate insightful new hypotheses, or to launch new interdisciplinary research programs.[4] (Clayton 2004: 602)

• Emergence-4 expresses a feature in the movement between scientific disciplines, including some of the most controversial transition points. Current scientific work is being done, for example, to understand how chemical structures are formed, to reconstruct the biochemical dynamics underlying the origins of life, and to conceive how complicated neural processes produce cognitive phenomena such as memory, language, rationality, and creativity. Each involves efforts to understand diverse phenomena involving levels of self-organization within the natural world. Emergence-4 attempts to express what might be shared in common by these (and other) transition points. (Clayton 2004: 602)

Here, however, a clear limitation arises. A scientific theory that explains how chemical structures are formed is perhaps unlikely to explain the origins of life. Neither theory will explain how self-organizing neural nets encode memories. Thus emergence-4 stands closer to the philosophy of science than it does to actual scientific theory. Nonetheless, it is the sort of philosophy of science that should be helpful to scientists.[5] (Clayton 2004: 602)

• Emergence-5 is a metaphysical theory. It represents the view that the nature of the natural world is such that it produces continually more complex realities in a process of ongoing creativity. The present chapter does not comment on such metaphysical claims about emergence.[6] (Clayton 2004: 603)

Conclusion

(….) Since emergence is used as an integrative ordering concept across scientific fields … it remains, at least in part, a meta-scientific term. (Clayton 2004: 603)

Does the idea of distinct levels then conflict with “standard reductionist science”? No, one can believe that there are levels in Nature and corresponding levels of explanation while at the same time working to explain any given set of higher-order phenomena in terms of underlying laws and systems. In fact, isn’t the first task of science to whittle away at every apparent “break” in Nature, to make it smaller, to eliminate it if possible? Thus, for example, to study the visual perceptual system scientifically is to attempt to explain it fully in terms of the neural structures and electrochemical processes that produce it. The degree to which downward explanation is possible will be determined by long-term empirical research. At present we can only wager on one outcome or the other based on the evidence before us. (Clayton 2004: 603)

Notes:

[2] Gordon (2000) disputes this claim: “One lesson from ants is that to understand a system like theirs, it is not sufficient to take the system apart. The behavior of each unit is not encapsulated inside that unit but comes from its connections with the rest of the system.” I likewise break strongly with the aggregate model of emergence.

[3] Generally this seems to be a question that makes physicists uncomfortable (“Why, that’s impossible, of course!”), whereas biologists tend to recognize in it one of the core mysteries in the evolution of living systems.

[4] For this reason, emergence-3 stands closer to the philosophy of science than do the previous two senses. Yet it is a kind of philosophy of science that stands rather close to actual science and that seeks to be helpful to it. [The goal of all true “philosophy of science” is to seek critical clarification of ideas, concepts, and theoretical formulations; hence to be “helpful” to science and the quest for human knowledge.] By way of analogy one thinks of the work of philosophers of quantum physics such as Jeremy Butterfield or James Cushing, whose work can be and has actually been helpful to bench physicists. One thinks as well of the analogous work of certain philosophers in astrophysics (John Barrow) or in evolutionary biology (David Hull, Michael Ruse).

[5] This as opposed, for example, to the kind of philosophy of science currently popular in English departments and in journals like Critical Inquiry — the kind of philosophy of science that asserts that science is a text that needs to be deconstructed, or that science and literature are equally subjective, or that the worldview of Native Americans should be taught in science classes.

— Clayton, Philip D. Emergence: us from it. In Science and Ultimate Reality: Quantum Theory, Cosmology and Complexity (John D. Barrow, Paul W. Davies, and Charles L. Harper, Jr., ed.). Cambridge: Cambridge University Press; 2004; pp. 577-606.

~ ~ ~

* Emergence: us from it. In Science and Ultimate Reality: Quantum Theory, Cosmology and Complexity (John D. Barrow, Paul W. Davies, and Charles L. Harper, Jr., ed.)
