The mechanisms that give hierarchy its position as the central architectural principle of complexity can be understood at greater depth with reference to the signal/boundary framework of John H. Holland, a pioneer of genetic algorithms, who describes this model in “Emergence: From Chaos to Order” (2000) and “Signals and Boundaries: Building Blocks for Complex Adaptive Systems” (2014).
Recall that a system’s integrity results from a higher reaction rate inside than outside, but that the system boundary is continuous with its surroundings: ultimately, the elements of a system, whether biological molecules or human beings, whether “inside” or “outside”, move randomly, in a motion that can be visualized as balls rolling around a two-dimensional, frictionless billiard table.
Suppose a subsystem supports a superordinate relation, the way a cell participates in a tissue, through some kind of process: say, a metabolic pathway with a series of reactions. The reaction rate within the cell then depends on two things: the probability that a collision yields the product, and the proportions of the reactants. How are these factors controlled?
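To make these two factors concrete, here is a minimal Monte Carlo sketch (not from Holland; the probabilities and counts are invented for illustration) in which a reaction occurs only when two reactants happen to meet and the collision happens to succeed:

```python
import random

def reaction_rate(p_success, frac_a, frac_b, collisions=100_000, seed=0):
    """Toy estimate of how often random encounters yield a product.

    p_success      -- chance that an A-B collision actually reacts
                      (the factor an enzyme raises)
    frac_a, frac_b -- proportions of reactants A and B in the compartment
                      (the factor a selective membrane controls)
    """
    rng = random.Random(seed)
    products = 0
    for _ in range(collisions):
        met_a = rng.random() < frac_a
        met_b = rng.random() < frac_b
        if met_a and met_b and rng.random() < p_success:
            products += 1
    return products / collisions

# Dilute reactants and no catalyst: the rate is essentially zero.
print(reaction_rate(p_success=0.001, frac_a=0.05, frac_b=0.05))
# Concentrated reactants and a catalyst: roughly a fifth of encounters react.
print(reaction_rate(p_success=0.9, frac_a=0.5, frac_b=0.5))
```

The two knobs multiply: concentrating the reactants inside a compartment and raising the collision-success probability together account for why the inside of a cell reacts faster than the outside.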
In organisms, the probability that a collision results in a certain product is controlled by enzymes, without which the probability that an organic reaction occurs is negligible. Reactant distributions are controlled by selective membranes that admit molecules to varying degrees into a bounded compartment, such as that of a mitochondrion. In other words, cells use subsystems to constrain randomness and catalyze reactions. Consider fishing: if you choose to fish in a lake where fish are known to be plentiful, the likelihood of catching one is higher, but whether you get one is still sensitive to chance. The boundaries of the lakes hinder free diffusion, so that some lakes are richer than others. Another example is that of sense modalities: they have no membranes, but the differences in responsiveness among receptors can be said to impose a surface, so that photoreceptors let through information carried by light but ignore that carried by mechanical pressure, which is mediated by tactile receptors. Sense modalities subsequently participate in amodal cognitive activities, much as both the smell and the sound of a horse evoke the same representation in your mind. To constrain chance, subsystems are necessary. This should reinforce the impression that it’s turtles all the way down.
Both enzymes and receptors can be thought of as rules operating on data, determining whether an event takes place. In Holland’s terminology, both operate on the “tags” of signals: the structure of an enzyme’s active site makes it catalyze only certain reactions, and the polarity of a cell membrane lets in only molecules of a certain kind. Suppose a rule (say, an enzyme) produces products that carry a certain tag, and that another rule (another enzyme) operates only on signals carrying this tag. The result is a hierarchical arrangement of successive enclosures, where the exit condition of one becomes the entry condition of another, each enclosure specialized for a particular interaction. Thus hierarchy supports complexity and prevents the dynamics of an organism from becoming a chaotic witches’ brew.
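As a rough sketch of what “the exit condition of one becomes the entry condition of another” means computationally (the tags and rule names below are schematic inventions, not Holland’s), each rule accepts signals carrying one tag and emits signals carrying the next:

```python
# Hypothetical rules: each accepts signals carrying one tag and emits
# signals carrying another, so the output tag of one enclosure is the
# entry condition of the next.
RULES = [
    {"name": "enzyme_1", "accepts": "glucose",    "emits": "pyruvate"},
    {"name": "enzyme_2", "accepts": "pyruvate",   "emits": "acetyl-CoA"},
    {"name": "enzyme_3", "accepts": "acetyl-CoA", "emits": "citrate"},
]

def run_chain(tag, rules=RULES):
    """Follow a signal through whichever rules its tag successively unlocks.
    Assumes the chain of tags has no cycles."""
    path = [tag]
    progressed = True
    while progressed:
        progressed = False
        for rule in rules:
            if rule["accepts"] == tag:
                tag = rule["emits"]
                path.append(tag)
                progressed = True
                break
    return path

print(run_chain("glucose"))   # ['glucose', 'pyruvate', 'acetyl-CoA', 'citrate']
print(run_chain("pressure"))  # ['pressure'] -- no enclosure lets this signal in
```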
How did a hierarchical organization of reactions evolve? Consider each signal as a bit string of 1s and 0s, and each rule as an ON/OFF binary device controlled by a conditional, e.g. IF {100###} THEN {101###}, where # means “unspecified”, so that the condition {######} lets any signal activate the rule. Rules thus vary in their range and specificity. These rules are encoded in a linear program code, just as chromosomes carry the information for synthesizing enzymes, and this code is replicated imperfectly, through crossover (sexual recombination) and, at a much lower rate, copying error (mutation).
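The matching convention is easy to state in code. The sketch below sticks to what the text gives (bit-string signals, # as a wildcard, one-point crossover, rare mutation) and invents the example strings:

```python
import random

def matches(condition, signal):
    """True if a ternary condition such as '100###' accepts a bit string.
    '#' means 'unspecified', so '######' accepts every signal."""
    return len(condition) == len(signal) and all(
        c == "#" or c == s for c, s in zip(condition, signal))

def crossover(parent_a, parent_b, rng):
    """One-point crossover: the offspring gets a prefix of one parent string
    and the suffix of the other, occasionally splitting a rule apart."""
    point = rng.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def mutate(rule, rng, rate=0.01):
    """Rare copying errors rewrite individual positions."""
    return "".join(rng.choice("01#") if rng.random() < rate else c for c in rule)

rng = random.Random(1)
print(matches("100###", "100110"))   # True  -- the condition is satisfied
print(matches("100###", "110110"))   # False -- the second bit disagrees
print(matches("######", "011010"))   # True  -- the fully general condition
print(mutate(crossover("100###", "1#1#0#", rng), rng))
```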
Now suppose that a frog has a rule, R1, that says “IF {moving object} THEN {eat it}”, and that a new allele, R2, appears in some other frogs, which is more specific: “IF {moving object && looks like a fly} THEN {eat it}”. If R2 confers a competitive advantage, for instance by allowing a frog to distinguish flies from moving non-food, then this above-average string will spread rapidly through the population. This does not imply any “loss” for R1, which remains well established. An over-general rule is still better than a random one, and R1’s condition is satisfied more often than R2’s, so if flies change appearance, the general rule allows moving non-fly foods to be identified in their place. In other words, a general rule is stronger (activated more often), while more specific rules lose and gain strength more quickly.
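A toy simulation of the same point (all probabilities and payoffs are invented): while flies are plentiful the specific rule earns on nearly every firing, and when flies “change appearance” it stops firing altogether, while the general rule keeps catching whatever moving food remains:

```python
import random

def run_phase(fly_prob, trials=500, seed=2):
    """Accumulated payoff for a general and a specific feeding rule."""
    rng = random.Random(seed)
    gain = {"R1 general": 0.0, "R2 specific": 0.0}
    for _ in range(trials):
        fly_like = rng.random() < fly_prob        # does the mover look like a fly?
        edible = fly_like or rng.random() < 0.15  # non-fly movers are sometimes food
        payoff = 0.1 if edible else -0.01         # small cost for snapping at non-food
        gain["R1 general"] += payoff              # fires on every moving object
        if fly_like:
            gain["R2 specific"] += payoff         # fires only on fly-like movers
    return {name: round(value, 1) for name, value in gain.items()}

print("flies abundant:", run_phase(fly_prob=0.4))  # R2 earns on nearly every firing
print("flies changed: ", run_phase(fly_prob=0.0))  # R2 never fires; R1 still earns a little
```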
This is the tradeoff of specialization, and the reason why species maintain their boundaries despite overlapping habitats. Hybrids do not have poor-quality genes, just amateurish ones, which puts them at a competitive disadvantage in both parent habitats, since their particular phenotype performs relatively poorly in either.
If you think about it, you will find that falling back on the over-general rule is equivalent to Simon’s watchmaker falling back on the last stable intermediate subunit. Holland’s signal/boundary framework therefore accounts for the self-organization of hierarchical complexity without requiring a human agent. Holland also observes that the general, above-average rules are short and discovered early, or else crossover would have disrupted them, since alleles close to each other on the chromosome are likely to be passed on together to offspring. Evolution behaves as if it were intelligently employing a rule of thumb: a component that consistently appears in strong rules is a likely candidate for new rules. Combinations of short rules are thus what allow the hierarchy to become more elaborate.
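The observation that surviving building blocks are short can be made quantitative. In Holland’s own genetic-algorithm analysis (the schema theorem), the chance that one-point crossover splits up a combination of co-adapted positions grows with the distance between its outermost defined positions; a quick sketch with invented rule strings:

```python
def disruption_chance(rule):
    """Chance that a single random crossover point lands inside the span of a
    rule's defined (non-'#') positions, splitting the combination apart.
    This corresponds to the defining-length term d(H)/(L-1) in Holland's
    schema analysis."""
    defined = [i for i, c in enumerate(rule) if c != "#"]
    span = defined[-1] - defined[0]      # distance between outermost alleles
    return span / (len(rule) - 1)

print(disruption_chance("1#0###########"))   # tight, short block: ~0.15
print(disruption_chance("1############0"))   # spread-out block: 1.0
```

Short, tight blocks therefore survive recombination intact and get tested in many different contexts, which is what lets them be discovered early and reused in new rules.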
In this way, the genome “learns” the features of its environment across generations, favoring more and more specific rules while retaining a rule hierarchy: at the lowest level we have DNA sequences for basic translation processes, robust and reliable; at the top, precariously specific rules, sensitive to contextual change. Holland dubs this innate drive towards professionalization and complexity “the multiplier effect”, and we will now look into supplementary accounts of this phenomenon.