Entropy and Information
The term entropy is often avoided because it contains a certain complexity that cannot be argued away.
But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is.
Information is Always Relative
We believe that we can package information, just as we store bits in a storage medium. The bits seem to be objectively available there, like little beads on a chain that can say yes or no. For us, this is information. But this image is deceptive, even though we have become so accustomed to it that we can hardly imagine otherwise.
Of course, these bit-beads do not say ‘yes’ or ‘no’, nor 0 or 1, nor TRUE or FALSE, nor anything else in particular. Bits have no meaning at all unless you have defined this meaning from the outside. Then they can perfectly well say 1, TRUE, ‘I’m coming to dinner tonight’ or something else, but only together with their environment, their context.
This consideration makes it clear that information is relative. The bit only acquires its meaning from its placement. Depending on its context, it means 0 or 1, ‘true’ or ‘false’, etc. The bit is set in its place, but its meaning only comes from its place.
The place and its context must therefore be taken into account so that it becomes clear what the bit is supposed to mean. And of course, the meaning is relative, i.e. the same bit can have a completely different meaning in a different context, in a different place.
This relativity characterises not only the bit, but every type of information. Every piece of information only acquires its meaning through the context in which it is placed. It is therefore relative. Bits are just signals. What they mean only becomes clear when you interpret the signals from your perspective, when you look at them from your context.
Only then does the signal take on a meaning for you. This meaning is not absolute, because whenever we try to isolate it from its context, it is reduced to a mere signal. The meaning can only be found relatively, in the interplay between your expectation, the context and the position of the bit. There the bit is a switch that can be set to ON or OFF. However, ON and OFF only inform us about the position of the switch. Everything else lies in the surroundings.
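A small sketch may make this concrete. The same bit pattern, read in three different contexts, yields three different meanings; the byte value and the dinner convention below are, of course, arbitrarily chosen for illustration:

```python
# The same bit pattern acquires different meanings only through the
# context in which we choose to read it (values chosen arbitrarily).
raw = 0b01011001  # eight switches, each simply ON or OFF

# Context 1: read the byte as an unsigned number
as_number = raw                      # 89

# Context 2: read the same byte as an ASCII character
as_character = chr(raw)              # 'Y'

# Context 3: read the lowest bit as the answer to a question agreed on outside
comes_to_dinner = bool(raw & 1)      # True: 'I'm coming to dinner tonight'

print(as_number, as_character, comes_to_dinner)
```

Nothing in the eight switches themselves distinguishes these readings; the meaning comes entirely from the context in which the pattern is placed.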
Definition of Entropy
Considering how important information and information technologies are today, it is astonishing how little is known about the scientific definition of entropy, i.e. of information:
Entropy is a measure of the information that is
– known at the micro level
– but unknown at the macro level.
Entropy is therefore closely related to information at the micro and macro levels and can be seen as the ‘distance’ or difference between the information at the two levels.
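A toy example, under the idealising assumption that all micro-states are equally probable, may make this ‘distance’ tangible: the entropy of a macro state is then simply the number of bits that a closer look at the micro level would still reveal.

```python
import math
from itertools import product

# Toy system: 4 two-state elements; all micro-states assumed equally probable.
micro_states = list(product([0, 1], repeat=4))   # all 16 detailed configurations

def macro(state):
    """The macro view keeps only the number of 1s and discards every detail."""
    return sum(state)

# Micro-states compatible with the macro statement 'exactly two of the four are 1'
compatible = [s for s in micro_states if macro(s) == 2]

# Entropy of that macro state: the information a closer look would still reveal
entropy_bits = math.log2(len(compatible))
print(len(compatible), round(entropy_bits, 2))   # 6 micro-states, ~2.58 bits
```

The coarser the macro view, the more micro-states remain compatible with it, and the larger this information gap, i.e. the entropy, becomes.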
Micro and Macro Level Define Information
What is meant by this gap between the micro and macro levels? – When we look at an object, the micro level contains the details (i.e. a lot of information), and the macro level contains the overview (i.e. less, but more targeted information).
The distance between the two levels can be very small (as with the bit, where the micro level knows just two pieces of information: on or off) or huge, as with the temperature (macro level) of the coffee in a cup, for example, where the kinetic energies of the many molecules (micro level) determine the temperature of the coffee. The number of molecules in this case is of the order of Avogadro’s number, about 6 × 10²³, i.e. very large, and the entropy of the coffee in the cup is correspondingly high.
On the other hand, when the span between the micro and macro levels becomes very narrow, the information (entropy) is small and comes very close to that of a single bit (information content = 1). However, it always depends on the relation between the micro and macro levels. This relation – i.e. what is known at the micro level but not at the macro level – defines the information that you receive, namely the information that a closer look at the details reveals.
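The two extremes can be put side by side in a short formula sketch, again under the idealising assumption that all micro-states compatible with a macro state are equally probable:

```latex
% Information gap between macro and micro level, in bits, for \Omega
% equally probable micro-states compatible with the macro state:
H = \log_2 \Omega

% The single bit: \Omega = 2, hence H = \log_2 2 = 1 bit.

% The coffee cup: \Omega is fixed by the configurations of roughly
% N_A \approx 6 \times 10^{23} molecules; the thermodynamic entropy
S = k_B \ln \Omega
% then corresponds to an information gap on the order of 10^{23} bits.
```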
The Complexity of the Macro Level
A state on the macro level always contains less information than the corresponding micro state. The macro state is not complete; it can never contain all the information one could possibly obtain by a closer look, but in most cases it is a well-targeted and intended simplification of the information at the micro level.
This means that the same micro-state can supply different macro-states. For example, a certain individual (micro level) can belong to the collective macro groups of Swiss inhabitants, computer scientists, older men, contemporaries of the year 2024, etc., all at the same time.
The possibility of simultaneously drawing several macro-states out of one and the same micro-state is characteristic of the complexity of micro- and macro-states and thus also of entropy.
If we transfer the entropy considerations of thermodynamics to more complex networks, we must deal with their higher complexity, but the ideas of micro and macro states remain and help us to understand what is going on when information is gained and processed.
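A small sketch, with arbitrarily chosen attributes and groupings, shows how one and the same micro-state can belong to several macro states at once, depending on which details each macro view keeps and which it discards:

```python
# One micro-state (full detail about one person; attributes invented)
individual = {
    "country": "Switzerland",
    "profession": "computer scientist",
    "sex": "male",
    "birth_year": 1952,
    "alive_in": 2024,
}

# Several macro views, each a different coarse-graining of the same micro-state
macro_views = {
    "Swiss inhabitants": individual["country"] == "Switzerland",
    "computer scientists": individual["profession"] == "computer scientist",
    "older men": individual["sex"] == "male"
                 and 2024 - individual["birth_year"] >= 65,
    "contemporaries of 2024": individual["alive_in"] == 2024,
}

# The same micro-state is a member of all of these macro states simultaneously.
print([group for group, is_member in macro_views.items() if is_member])
```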
Translation: Juan Utzinger
Continued in Entropy, Part 2
See also:
– Paradoxes and Logic, Part 2
– Georg Spencer-Brown’s Distinction and the Bit
– What is Entropy?
– Five Preconceptions about Entropy