
Entropy between Micro- and Macro Level


Two Levels Define Entropy: Micro and Macro

The conventional physical definition of entropy characterises it as a difference between two levels: a detail level and an overview level.


Example Coffee Cup

Boltzmann’s thermal entropy is the classic example, usually illustrated with an ideal gas. The temperature (one value) is directly linked to the kinetic energies of the individual gas molecules (about 10²³ values). With certain adjustments, this applies to any material object, e.g. also to a coffee cup:

  1. Thermal macro state: temperature of the liquid in the cup.
  2. Thermal micro state: kinetic energies of all individual molecules in the cup.

The values of the two levels are directly connected. The heat energy of the liquid, which is expressed in the temperature of the coffee, is made up of the kinetic energies of the many (~10²³) individual molecules in the liquid. The faster the molecules move, the hotter the coffee.

The movement of the individual molecules is not constant, however. The molecules are constantly colliding, changing their speed and therefore their energy. Nevertheless, because of the conservation of energy, the energy of each molecule involved changes with every collision, while the total energy of all the molecules involved remains the same. Even if the coffee cools down slowly or the liquid is heated from the outside, the interdependence is maintained: the single overall value (the temperature) and the many detailed values (the movements) always determine each other.
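To make this link between the two levels concrete, here is a minimal sketch in Python (my own toy model: the number of molecules and the energy distribution are arbitrary assumptions, and the ideal-gas relation ⟨E_kin⟩ = 3/2·k_B·T serves only as a stand-in for the liquid):

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant in J/K

# Toy micro state: one kinetic energy per molecule. A real cup holds ~10^23
# molecules; one million keeps the toy model fast. The exponential distribution
# is only a stand-in, not the true Maxwell-Boltzmann energy distribution.
rng = np.random.default_rng(42)
kinetic_energies = rng.exponential(scale=6.8e-21, size=1_000_000)  # joules

# Macro state: a single number, derived from the many micro values
# via the ideal-gas relation <E_kin> = (3/2) * k_B * T.
temperature = 2.0 * kinetic_energies.mean() / (3.0 * K_B)
print(f"Macro state (temperature): {temperature - 273.15:.1f} °C")

# Collisions redistribute energy between the molecules, but as long as the
# total (and hence the mean) energy is unchanged, the temperature stays the same.
```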


Example Forest and Trees

The well-known proverb warns us against not seeing the wood for the trees. This is a helpful picture for the tension between the micro and the macro level.

Forest: macro level

Trees: micro level

On the micro level we see the details, on the macro level we recognise the big picture. So which view is better? The forest or the trees?

  • Both macro level and micro level are useful – depending on the task
  • Both refer to the same object.
  • They cannot both be seen at the same time
    → When you look at the forest, you can’t see the individual trees
    → When you look at the trees, you miss the forest

We generally believe that it is better to know all the details. But this is a delusion. We always need an overview. Otherwise we would get lost in the details.


So Where is the Entropy?

We can now enumerate all the details of the micro view and thus obtain the information content – e.g. in bits – of the micro state. In the macro state, however, we have a much smaller number of bits. The difference between the two amounts is the entropy, namely the information that is present in the micro state (trees) but missing in the macro state (forest).
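A toy example (my own, not part of the original argument) makes this difference countable: take ten coin flips. The micro state is the exact sequence, the macro state only the number of heads. The entropy of a macro state is then log₂ of the number of micro states it lumps together, i.e. exactly the bits present at the micro level but missing at the macro level:

```python
from math import comb, log2

N = 10  # coin flips

# Micro state: the exact sequence of heads/tails -> N bits of information.
print(f"micro state: {N} bits (the full sequence)")

# Macro state: only the number of heads k is known.
for k in range(N + 1):
    microstates = comb(N, k)      # sequences compatible with this macro state
    entropy = log2(microstates)   # bits known at the micro level, missing at the macro level
    print(f"macro state '{k} heads': {microstates:4d} micro states, "
          f"entropy = {entropy:5.2f} bits")
```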


Why isn’t the information content at the micro level the absolute entropy?

The information content at the micro level can be calculated in bits. Does this number of bits correspond to the entropy? If so, the information content at the macro level would simply be a loss of information. The actual information would then lie in the micro level of details.

This is the spontaneous expectation that I repeatedly encounter with dialogue partners. They assume that there is an absolute information content, and in their eyes, this is naturally the one with the greatest amount of detail.

A problem with this conception is that the ‘deepest’ micro level is not clearly defined. The trees are a lower level of information in relation to the forest – but this does not mean that the deepest level of detail has been reached. You can describe the trees in terms of their components – branches, twigs, leaves, roots, trunk, cells, etc. – which is undoubtedly a deeper level than just trees and would contain even more details. But even this level would not be deep enough. We can still go deeper into the details and describe the individual cells of the tree, the organelles in the cells, the molecules in the organelles and so on. We would then arrive at the quantum level. But is that the deepest level? Perhaps, but that is not certain. And the further we go into the details, the further we move away from the description of the forest. What interests us is the description of the forest, and the lowest level of detail is not necessary for this. The deeper down we search, the further we move away from the description of our object.

→ The deepest micro level is not unequivocally defined!

We therefore cannot assign a distinct, absolute entropy to our object. Because the micro level can be set at any depth, the entropy, i.e. the quantitative information content at this level, changes with it: the deeper the level, the more information and the higher the entropy.


Is There an Absolute Macro Level?

Like the micro level, the highest information level – e.g. of a forest – is not clearly defined either.

Is this macro level the image that represents an optical view of the forest as seen by a bird flying over it? Or is it the representation of the forest on a map? At what scale? 1:25,000 or 1:100,000? Obviously the amount of information of the respective macro state changes depending on the view.

What are we interested in when we describe the forest? The paths through the forest? The tree species? Are there deer and rabbits? How healthy is the forest?

In other words, the forest, like any object, can be described in very different ways.

There is no clear, absolute macro level. A different macro representation applies depending on the situation and requirements.


The Relativity of Micro and Macro Levels

At each level, there is a quantitative amount of information: the deeper, the richer; the higher, the clearer. It would be a mistake, however, to label a specific level, with its amount of information, as the lowest or the highest. Both are arbitrary. They are not set by the object, but by the observer.

The Difference is the Information

As soon as we accept that both the micro and the macro level can be set arbitrarily, we approach a more realistic concept of information. It suddenly makes sense to speak of a difference. The difference between the two levels defines the span of knowledge.

The information that I can gain is the information that I lack at the macro level, but which I find at the micro level. The difference between the two levels in terms of their entropy is the information that I can gain in this process.

Conversely, if I have the details of the micro level in front of me and want to gain an overview, I have to simplify this information of the micro level and reduce its number of bits. This reduction is the entropy, i.e. the information that I consciously relinquish.


The Information Paradox

If I want to extract the information that interests me from a jumble of details, i.e. if I want to get from a detailed description to useful information, then I have to ignore a lot of information at the micro level. I have to lose information in order to get the information I want. This paradox underlies every analytical process.

Information is Relative and Dynamic

What I am proposing is a relative concept of information. This does not correspond to the expectations of most people who have a static idea of the world. The world, however, is fundamentally dynamic. We live in this world – like all other living beings – as information-processing entities. The processing of information is an everyday process for all of us, for all biological entities, from plants to animals to humans.

The processing of information is an existential process for all living beings. This process always has a before and an after. Depending on this, we gain information when we analyse something in detail. And if we want to gain an overview or make a decision (!), then we have to simplify information. So we go from a macro-description to a micro-description and vice versa. Information is a dynamic quantity.

Entropy is the information that is missing at the macro level but can be found at the micro level.

And vice versa: entropy is the information that is present at the micro level but – to gain an overview – is ignored at the macro level.


Objects and their Micro and Macro Level

We can assume that a certain object can be described at different levels. According to current scientific findings, it is uncertain whether a deepest level of description can be found, but this is ultimately irrelevant to our information theory considerations. In the same way, it does not make sense to speak of a highest macro level. The macro levels depend on the task at hand.

What is relevant, however, is the distance, i.e. the information that can be gained in the macro state when deeper details are integrated into the view, or when they are discarded for the sake of a better overview. In both cases, there is a difference between two levels of description.

The illustration above visualises the number of detected bits in an object. At the top of the macro level, there are few, at the bottom of the micro level there are many. The object remains the same whether many or few details are taken into account and recognised.

The macro view brings a few bits, but their selection is not determined by the object alone, but rather by the interest behind the view of the observer.

The number of bits, i.e. the entropy, decreases from bottom to top. The height of the level, however, is not a property of the object of observation, but a property of the observation itself. Depending on my intention, I see the observed object differently, sometimes detailed and unclear, another time clear and simplified, i.e. sometimes with a lot of entropy and another time with less.

Information acquisition is the dynamic process that either:

a) gains more details: Macro → Micro
b) gains more overview: Micro → Macro

In both cases, the amount of information (entropy as the amount of bits) is changed. The bits gained or lost correspond to the difference in entropy between the micro and macro levels.

When I examine the object, it reveals more or less information depending on how I look at it. Information is always relative to prior knowledge and must be understood dynamically.

Translation: Juan Utzinger


 

Entropy and Information


The term entropy is often avoided because it carries a certain complexity that cannot be argued away. But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is.

Information is Always Relative

We believe that we can pack up information, just as we store bits in a storage medium. Bits seem to be objectively there, like little beads on a chain, each of which can say yes or no. For us, this is information. But this image is deceptive. We have become so accustomed to it that we cannot imagine anything else.

Of course, the bit-spherules do not say ‘yes’ or ‘no’, nor 0 or 1, nor TRUE or FALSE, nor anything else in particular. Bits have no meaning at all unless you have defined this meaning from the outside. Then they can perfectly well say 1, TRUE, ‘I’m coming to dinner tonight’ or something else, but only together with their environment, their context.

This consideration makes it clear that information is relative. The bit only acquires its meaning from its placement. Depending on its context, it means 0 or 1, ‘true’ or ‘false’, etc. The bit is set in its place, but its meaning only comes from its place.

The place and its context must therefore be taken into account so that it becomes clear what the bit is supposed to mean. And of course the meaning is relative, i.e. the same bit can have a completely different meaning in a different context, a different place.

This relativity characterises not only the bit, but every type of information. Every piece of information only acquires its meaning through the context in which it is placed. It is therefore relative. Bits are just signals. What they mean only becomes clear when you interpret the signals from your perspective, when you look at them from your context.

Only then does the signal take on a meaning for you. This meaning is not absolute, because whenever we try to isolate it from its context, it is reduced to a mere signal. The meaning can only be found relationally, in the interaction between your expectation, the context and the position of the bit. There, the bit is a switch that can be set to ON or OFF. However, ON and OFF only inform us about the position of the switch. Everything else lies in the surroundings.
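A small illustration (my own example, with invented contexts): the very same bit only ‘says’ something once a context supplies the interpretation.

```python
bit = 1  # the bare signal: a switch set to ON

# The same bit, read through different, externally defined contexts:
contexts = {
    "logic":        lambda b: bool(b),                          # True / False
    "number":       lambda b: int(b),                           # 0 / 1
    "dinner RSVP":  lambda b: "coming" if b else "not coming",
    "light switch": lambda b: "light on" if b else "light off",
}

for name, interpret in contexts.items():
    print(f"{name:13s}: {interpret(bit)}")

# The bit itself only reports the position of the switch (ON here);
# every meaning beyond that lives in the surrounding context.
```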

Definition of Entropy

Considering how important information and information technologies are today, it is astonishing how little is known about the scientific definition of entropy, i.e. information:

Entropy is a measure of the information that is
– known at the micro level
– but unknown at the macro level.

Entropy is therefore closely related to information at the micro and macro levels and can be seen as the ‘distance’ or difference between the information at the two information levels.
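Written out in Shannon’s terms (my own compact formulation of the definition above, with H standing for the information content of a description level):

H = − Σ pᵢ · log₂ pᵢ     (information content of a level, in bits)
S = H(micro) − H(macro)     (known at the micro level, unknown at the macro level)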

Micro and Macro Level Define Information

What is meant by this gap between the micro and macro levels?  –  When we look at an object, the micro level contains the details (i.e. a lot of information), and the macro level contains the overview (i.e. less, but more targeted information).

The distance between the two levels can be very small (as with the bit, where the micro level knows just two possible states: on or off) or huge, as with the temperature (macro level) of the coffee in the cup, where the kinetic energies of the many molecules (micro level) determine the temperature of the coffee. The number of molecules in this case is of the order of Avogadro’s number, about 10²³, i.e. very high, and the entropy of the coffee in the cup is correspondingly high.
On the other hand, when the span between micro and macro level becomes very narrow, the information (entropy) will be small and can come as close as the size of a single bit (information content = 1). However, it always depends on the relation between the micro and the macro level. This relation – i.e. what is known at the micro level but not at the macro level – defines the information that you receive, namely the information that a closer look at the details reveals.

The Complexity of the Macro Level

A state on the macro level always contains less information than one on the micro level. The macro state is not complete; it can never contain all the information one could possibly get by a closer look, but in most cases it is a well-targeted and intended simplification of the information at the micro level.

This means that the same micro-state can supply different macro-states. For example: a certain individual (micro level), can belong to the collective macro groups of Swiss inhabitants, computer scientists, older men, contemporaries of the year 2024, etc., all at the same time.

The possibility of simultaneously deriving several macro states from the same micro state is characteristic of the complexity of micro and macro states and thus also of entropy.

If we transfer the entropy consideration of thermodynamics to more complex networks, we must deal with their higher complexity, but the ideas of micro and macro state remain and help us to understand what is going on when information is gained and processed.

Translation: Juan Utzinger


Continued in Entropy, Part 2


See also:
– Paradoxes and Logic, Part 2
– Georg Spencer-Brown’s Distinction and the Bit
– What is Entropy?
– Five Preconceptions about Entropy

Five Preconceptions about Entropy

Which of these Preconceptions do you Share?

  1. Entropy is for nerds
  2. Entropy is incomprehensible
  3. Entropy is thermodynamics
  4. Entropy is noise
  5. Entropy is absolute


Details

1. Entropy is the Basis of our Daily Lives

Nerds like to be interested in complex topics and entropy fits in well, doesn’t it? It helps them to portray themselves as superior intellectuals. This is not your game and you might not see any practical reasons to occupy yourself with entropy. This attitude is very common and quite wrong. Entropy is not a nerdy topic, but has a fundamental impact on our lives, from elementary physics to practical everyday life.

Examples (according to W. Salm¹)

  • A hot coffee cup cools down over time
  • Water evaporates in an open container
  • Pendulums that have been set swinging come to a standstill after a while
  • Iron rusts
  • Magnets become weaker after some years
  • Lessons learnt are forgotten
  • Combed hair becomes dishevelled
  • White shirts become stained
  • Rocks crumble
  • Radioactive elements decay

So there are plenty of reasons to look into the phenomenon of entropy, which can be found everywhere in everyday life. But most people tend to avoid the term. Why is that? This is mainly due to the second preconception.


2. Entropy is a Perfectly Understandable and Indispensable Fundamental Concept

It is true that, at first glance, entropy is rather confusing. However, entropy is only difficult to understand because of persistent preconceptions (see points 4 and 5 below). These ubiquitous preconceptions are the obstacles that make the concept of entropy seem incomprehensible. Overcoming these thresholds not only helps to understand many real and practical phenomena, but also sheds light on the foundations that hold our world together.


3. Entropy Plays a Role Everywhere in Nature

The term entropy stems from thermodynamics. But we should not be misled by this. In reality, entropy is something that exists everywhere in physics, chemistry and biology, and also in art and music. It is a general and abstract concept, and it refers directly to the structure of things and the information they contain.

Historically, the term was introduced in thermodynamics not quite 200 years ago and was associated with the possibility of letting heat (energy) flow. It helped to understand the mode of operation of machines (combustion engines, refrigerators, heat pumps, etc.). The term is still taught in schools this way.

However, thermodynamics only shows a part of what entropy is. Its general nature was only described by C.E. Shannon² in 1948. The general form of entropy, also known as Shannon or information entropy, is the proper, i.e. the fundamental, form. Heat entropy is a special case.

Through its application to heat flows in thermodynamics, entropy as heat entropy was given a concrete physical dimension, namely J/K, i.e. energy per temperature. However, this is the special case of thermodynamics, which deals with energies (heat) and temperature. If entropy is understood in a very general and abstract way, it is dimensionless, a pure number.

As the discoverer of abstract and general information entropy, Shannon gave this number a name, the “bit”. For his work as an engineer at the Bell telephone company, Shannon used the dimensionless bit to calculate the flow of information in the telephone wires. His information entropy is dimensionless and applies not only in thermodynamics, but everywhere where information and flows play a role.
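The two pictures are linked by a fixed conversion factor: one bit corresponds to k_B·ln 2 ≈ 9.6·10⁻²⁴ J/K of thermodynamic entropy. A short sketch of the conversion (the cooling-coffee numbers are my own rough assumptions, purely for illustration):

```python
from math import log

K_B = 1.380649e-23              # Boltzmann constant, J/K
J_PER_K_PER_BIT = K_B * log(2)  # ≈ 9.57e-24 J/K per bit

def thermo_entropy_to_bits(delta_s: float) -> float:
    """Convert a thermodynamic entropy change (J/K) into dimensionless bits."""
    return delta_s / J_PER_K_PER_BIT

# Rough example: 0.2 kg of coffee cooling from 80 °C to 60 °C.
# dS = m * c * ln(T2/T1), with c ≈ 4186 J/(kg*K) for water.
m, c = 0.2, 4186.0
delta_s = m * c * log((60 + 273.15) / (80 + 273.15))   # negative: the cup loses entropy
print(f"Entropy change of the coffee: {delta_s:.1f} J/K")
print(f"... about {thermo_entropy_to_bits(abs(delta_s)):.1e} bits")
```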


4. Entropy is the Difference between not Knowing and Knowing

Many of us learnt at school that entropy is a measure of noise and chaos. Additionally, the second law of thermodynamics tells us that entropy can only ever increase. Thus, disorder should do nothing but increase. However, identifying entropy with noise or even chaos is misleading.

There are good reasons for this misleading idea: if you throw a sugar cube into the coffee, its well-defined crystal structure dissolves, the molecules disperse in the liquid in disorder, and the sugar undergoes a transition from ordered to disordered. This decay of order can be observed everywhere in nature. In physics, it is entropy that drives the decay of order according to the second law. And decay and chaos can hardly be equated with Shannon’s concept of information. Many scientists thought the same way and therefore equated information with negentropy (entropy with a negative sign). At first glance, this doesn’t seem to be a bad match. In this view, entropy is noise, and the absence of noise, i.e. negentropy, would then be information. Actually logical, isn’t it?

Not quite, because information is contained both in the sugar cube and in the dissolved sugar molecules floating in the coffee. In some ways, there is even more information in the individually floating molecules, because each has its own path. Their bustling movements contain information. For us coffee drinkers, however, the bustling movements of the many molecules in the cup contain no useful information and appear merely chaotic. Can this chaos be information?

The problem is our conventional idea of information. Our idea is too static. I suggest that we see entropy as something that denotes a flow, namely the flow between not knowing and knowing. This dynamic is characteristic of learning, of absorbing new information.

Every second, an incredible amount of things happen in the cosmos that could be known. The information in the entire world can only increase. This is what the second law says, and what increases is entropy, not negentropy. Wouldn’t it be much more obvious to put information in parallel with entropy and not with negentropy? More entropy would then mean more information and not more chaos.

Where can the information be found? In the noise or in the absence of noise? In entropy or in negentropy?


Two Levels

Well, the dilemma can be solved. The crucial step is to accept that entropy is the tension between two states: the overview state and the detail state. The overview does not need the details, but only sees the broad lines. C.F. Weizsäcker speaks of the macro level. The broad lines are the information that interests us. Details, on the other hand, appear to us as unimportant noise. But the details, i.e. the micro level, contain more information, usually a whole lot more. Just take the movements of the water molecules in the coffee cup (micro level), whose chaotic bustle contains far more information than the single indication of the temperature of the coffee (macro level). Both levels are connected, and their information depends on each other in a complex way. Entropy is the difference between the two amounts of information, and the amount is always greater at the detail level (micro level), because there is always more to know in the details than in the broad lines and therefore also more information.

But because the two levels refer to the same object, you as the observer can look at the details or the big picture. Both belong together. The gain in information about details describes the transition from the macro to the micro level, the gain in information about the overview describes the opposite direction.

So where does the real information lie? At the detailed level, where many details can be described, or at the overview level, where the information is summarised and simplified in a way that really interests us?

The answer is simple: both the macro and the micro level contain information. Entropy is the transition between the two levels and, depending on what interests us, we can make the transition in one direction or the other.


Example Coffee Cup

This is classically demonstrated in thermodynamics. The temperature of my coffee can be seen as a measure of the average kinetic energy of the individual liquid molecules in the coffee cup. The information contained in the many molecules is the micro state; the temperature is the macro state. Entropy is the knowledge that is missing in the macro state but is present in the micro state. But for me as a coffee drinker, only the knowledge of the macro state, the temperature of the coffee, is relevant. This is not present in the micro state insofar as it does not depend on the individual molecules, but rather statistically on the totality of all molecules. It is only in the macro state that knowledge about the temperature becomes tangible.

For us, only the macro state shows relevant information. But there is additional information in the noise of the details. How exactly the molecules move represents a lot of information, but these details don’t matter to me when I drink coffee; only their average speed determines the temperature of the coffee, and that is what matters to me.

The information-rich and constantly changing microstate has a complex relationship with the simple macroinformation of temperature. The macro state also influences the micro state, because the molecules have to move within the statistical framework set by the temperature. Both pieces of information depend on each other and are objectively present in the object at the same time. What differs is the level or scope of observation. The difference in the amount of information in the two levels determines the entropy.

These conditions have been well known since Shannon² and C.F. Weizsäcker. However, most schools still teach that entropy is a measure of noise. This is misleading. Entropy should always be understood as a delta, as a difference (distance) between the information in the overview (macro state) and the information in the details (micro state).


5. Entropy is a Relative Value

The fact that entropy is always a distance, a delta, i.e. mathematically a difference, also means that entropy is not an absolute value, but a relative one.

Example Coffee Cup
Let’s take the coffee cup as an example. How much entropy is in there? If we only look at the temperature, then the micro state consists of the kinetic energies of the individual molecules, and the macro state of their average, the temperature. But the coffee cup contains even more information: How strong is the coffee? How strongly sweetened? How acidic is it? What flavours does it contain?

Example School Building
Salm¹ gives the example of a lost door key that a teacher is looking for in a school building. If he knows which classroom the key is in, he has not yet found it. At this moment, the micro state only names the room. Where in the room is the key? Perhaps in a cupboard. In which one? At what height? In which drawer, in which box? The micro state varies depending on the depth of the question. It is a relative value.
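Staying with this example, the missing information can be counted in bits at each depth (the numbers of rooms, cupboards, drawers and boxes below are invented purely for illustration):

```python
from math import log2

# Equally likely hiding places at each depth of the chosen 'micro level'.
levels = [
    ("classroom only",             20),             # 20 classrooms
    ("... plus cupboard",          20 * 8),         # 8 cupboards per room
    ("... plus drawer",            20 * 8 * 5),     # 5 drawers per cupboard
    ("... plus box in the drawer", 20 * 8 * 5 * 4), # 4 boxes per drawer
]

for description, places in levels:
    print(f"{description:27s}: {places:5d} places, "
          f"entropy = {log2(places):5.2f} bits")

# The 'absolute' entropy of the search depends entirely on how deep
# we decide to let the micro level go.
```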

Because the information entropy is always a difference, the entropy, i.e. the span between overview and details, can always be extended to even more details.

Entropy is a relative value. If we specify it in absolute terms, we set – without explicitly declaring it – a lowest level (classroom, shelf or drawer). This is legitimate as long as we are aware that the seemingly absolute value only represents the distance to the assumed micro level.

Statics and Dynamics 

Energy and entropy are two complementary quantities that permeate the entire description of nature (physics, chemistry, biology). The two fundamental laws of physics each contain one of the two general quantities E (energy) and S (entropy):

  1. Law:  ∆E = 0    or:   dE/dt = 0
  2. Law:  ∆S ≥ 0    or:   dS/dt ≥ 0

Energy remains constant over time (in a closed system), while entropy can only increase. In other words: energy is a static quantity and shows what does not change, while entropy is essentially dynamic and shows flows, e.g. in heat engines, in Shannon’s information flow in telephone wires, and whenever our thoughts flow and we learn and think.

Entropy and Time

Entropy is essentially linked to the phenomenon of time by the second law (∆S ≥ 0). While energy remains constant in a closed system (Noether’s theorem), entropy changes over time and increases in a closed system. Entropy therefore knows time, not only heat entropy in particular, but also the much more general information entropy.


Conclusion

  • Entropy is a key concept in physics and information theory.
  • The term entropy comes from thermodynamics, but the concept of entropy refers to information in general.
  • The thermodynamical entropy is the special case, information entropy is the general concept.
  • Everything that happens physically, chemically and in information processing, whether technical or biological, has to do with entropy. In particular, everything that has to do with information flows and structures. In other words, everything that really interests us.
  • Entropy is always relative and refers to the distance between the macro and micro levels.
  • The macro level contains less information than the micro level
  • The macro level contains the information of interest.
  • Neither is absolute: the micro level can always be described in more detail. The macro level is defined from the outside: What is of interest? The temperature of the coffee? The concentration of sugar molecules? The acidity? The caffeine content …
  • Only the definition of the two states makes it possible to specify the entropy in seemingly absolute terms. However, what counts for entropy is the relative value, i.e. the delta between the two states. This delta, the entropy, determines the flow.
  • The flow happens in time.

(Translation: Juan Utzinger)


¹ Salm, W.: Entropie und Information – naturwissenschaftliche Schlüsselbegriffe, Aulis Verlag Deubner, Köln, 1997

² Shannon, C.E. and Weaver, W.: The Mathematical Theory of Communication, University of Illinois Press, Urbana, 1949


See also: What is Entropy?