VII. Randomness, Probability, Compression and Redundancy

“Order” and “disorder”, we have seen, are observer-dependent categories of a dynamical system’s state space. What characterizes disordered states, relative to any observer, is that different disordered states do not differ in any meaningful way.
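This link between disorder and incompressibility can be illustrated with a sketch (an illustration of the idea, not an example from the text): a general-purpose compressor shrinks an "ordered", redundant string dramatically, but finds almost nothing to exploit in a random one, whose states differ in no pattern it can describe more briefly.

```python
import os
import zlib

# An "ordered" string: one short pattern repeated many times.
ordered = b"AB" * 5000          # 10,000 bytes of pure redundancy

# A "disordered" string: 10,000 bytes drawn at random.
disordered = os.urandom(10000)

# The compressor exploits the redundancy in the ordered string...
print(len(zlib.compress(ordered)))      # a few dozen bytes
# ...but can barely shorten the random one at all.
print(len(zlib.compress(disordered)))   # close to 10,000 bytes
```

The asymmetry is the point: redundancy is what a shorter description can latch onto, and a maximally disordered string has none to offer.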

VI. Thermodynamic Entropy

Shannon’s measure of information is also known as “entropy”, a word better known from thermodynamics, whose famous second law states that, in a closed system, entropy always increases toward a maximum.
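Shannon's measure, H = −Σ p·log₂(p), can be sketched in a few lines (an illustrative computation, not code from the text): a fair coin yields one bit of uncertainty per toss, a certain outcome yields none, and a biased coin falls in between.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit of uncertainty per toss...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...a certain outcome carries none...
print(shannon_entropy([1.0]))        # 0.0
# ...and a biased coin falls somewhere in between.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```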

V. Basic Information Theory

Most psychologists studying perception and cognition today argue that Gibson’s radio metaphor is flawed because a brain, unlike a radio, identifies a “signal” not directly, but in a memory-dependent way.

IV. A Patchwork of Metaphor

The language used so far may have seemed too removed from the number-crunching chores of experimental science to be anything other than a whimsical indulgence.

III. Umwelts, Affordances and Measurements

The idea that any system can be said to have its own ontology – its own distinctive way of carving up reality – means that its structure distinguishes certain “inputs” and couples them to certain “outputs”.