“Order” and “disorder”, we have seen, are observer-dependent categories of a dynamical system’s state space. What characterizes disordered states, relative to any observer, is that different disordered states do not differ in any meaningful way.
Shannon’s measure of information is, in fact, called “entropy”, a word better known from thermodynamics, whose famous second law states that in a closed system entropy always increases toward a maximum.
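To make Shannon's measure concrete: for a probability distribution over outcomes, the entropy is $H = -\sum_i p_i \log_2 p_i$, measured in bits. The sketch below illustrates this (the function name `shannon_entropy` is my own label for illustration, not something taken from the text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 contribute nothing, so they are skipped.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly one bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]) < 1.0)   # True

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))   # 0.0
```

The uniform distribution maximizes this quantity, which is what links Shannon's formula to the thermodynamic picture of disorder: the "most disordered" macrostate is the one compatible with the greatest number of equally likely microstates.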