Information theory introduced the image of system boundaries as implicitly projecting “input categories” onto reality, parsing it into “variables”, each with an associated space of possible outcomes.
Shannon’s measure of information is in fact known as “entropy”, a word better known from thermodynamics, whose famous second law states that, in a closed system, entropy always increases to a maximum.
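As a minimal sketch of what Shannon’s measure quantifies (an illustration added here, not part of the original text): for a discrete variable with outcome probabilities p_i, the entropy is H = −Σ p_i log₂ p_i, measured in bits. The Python snippet below computes it for a fair and a biased coin; the function name is hypothetical.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 are skipped, since lim p->0 of p*log2(p) is 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, so it carries less entropy.
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

The uniform distribution maximizes entropy over a fixed outcome space, which is what makes it a natural measure of uncertainty about a variable before it is observed.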