
Entropy and Information

The term entropy is often avoided because it carries a complexity that cannot be argued away. But when we talk about information, we also have to talk about entropy, because entropy is the measure of the amount of information. We cannot understand what information is without understanding what entropy is.

Information is Always Relative

We believe that we can package information, just as we store bits in a storage medium. Bits seem objectively available, like little beads on a chain, each saying yes or no. For us, this is information. But this image is deceptive. We have become so accustomed to it that we cannot imagine otherwise.

Of course, the bit-beads do not say ‘yes’ or ‘no’, nor 0 or 1, nor TRUE or FALSE, nor anything else in particular. Bits have no meaning at all unless this meaning has been defined from the outside. Then they can perfectly well say 1, TRUE, ‘I’m coming to dinner tonight’ or something else, but only together with their environment, their context.

This consideration makes it clear that information is relative. The bit only acquires its meaning from its placement. Depending on its context, it means 0 or 1, ‘true’ or ‘false’, etc. The bit is set in its place, but its meaning comes only from that place.

The place and its context must therefore be taken into account so that it becomes clear what the bit is supposed to mean. And of course the meaning is relative: the same bit can have a completely different meaning in a different context, a different place.

This relativity characterises not only the bit, but every type of information. Every piece of information only acquires its meaning through the context in which it is placed. It is therefore relative. Bits are just signals. What they mean only becomes clear when you interpret the signals from your perspective, when you look at them from your context.

Only then does the signal take on a meaning for you. This meaning is not absolute, because whenever we try to isolate it from its context, it is reduced to a mere signal. The meaning can only be found, relatively, in the interplay between your expectation, the context and the position of the bit. There the bit is a switch, which can be set to ON or OFF. However, ON and OFF only inform us about the position of the switch. Everything else lies in the surroundings.
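
To make this relativity concrete, here is a minimal sketch of how one and the same bit yields different meanings in different contexts. The contexts and phrasings are invented for illustration; they are not part of the original argument:

    # The raw signal: a switch position, with no meaning of its own.
    bit = 1

    # Meaning is defined entirely from the outside, by a context.
    contexts = {
        'logic':  {0: 'FALSE', 1: 'TRUE'},
        'number': {0: '0', 1: '1'},
        'dinner': {0: 'I am not coming tonight', 1: 'I am coming to dinner tonight'},
    }

    # The same bit, read against different contexts, means different things.
    for name, reading in contexts.items():
        print(f'{name}: {reading[bit]}')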

Definition of Entropy

Considering how important information and information technologies are today, it is astonishing how little is known about the scientific definition of entropy, i.e. of information:

Entropy is a measure of the information that is
– known at the micro level,
– but unknown at the macro level.

Entropy is therefore closely related to information at the micro and macro levels and can be seen as the ‘distance’ or difference between the information at the two information levels.
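
One standard way to make this ‘distance’ precise is to count micro states; the formulas below are Boltzmann’s and Shannon’s, added here as a sketch rather than taken from the text:

    H = \log_2 W \quad \text{bits}, \qquad S = k_B \ln W

Here W is the number of micro states compatible with the macro state: the more micro detail the macro view leaves open, the larger W and the larger the entropy. For a single bit, W = 2 and H = \log_2 2 = 1 bit; with unequal micro-state probabilities p_i, the count generalises to Shannon’s H = -\sum_i p_i \log_2 p_i.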

Micro and Macro Level Define Information

What is meant by this gap between the micro and macro levels? When we look at an object, the micro level contains the details (i.e. a lot of information), and the macro level contains the overview (i.e. less, but more targeted information).

The distance between the two levels can be very small (as with the bit, where the micro level knows just two pieces of information: on or off) or huge, as with the temperature (macro level) of the coffee in a cup, where the kinetic energies of the many molecules (micro level) determine the temperature. The number of molecules in this case is of the order of Avogadro’s number, about 6 × 10^23, i.e. quite high, and the entropy of the coffee in the cup is correspondingly high.

On the other hand, when the span between the micro and the macro level becomes very narrow, the information (entropy) is small and comes very close to the size of a bit (information content = 1). However, it always depends on the relation between the micro and the macro level. This relation – i.e. what is known at the micro level but not at the macro level – defines the information that you receive, namely the information that a closer look at the details reveals.
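
As a rough back-of-the-envelope sketch of these two extremes (the per-molecule figure is a deliberately crude illustration, not a physical calculation):

    import math

    # Extreme 1: a single bit. The micro level distinguishes exactly
    # two states (ON/OFF), so the information content is log2(2) = 1 bit.
    h_bit = math.log2(2)

    # Extreme 2: coffee in a cup. The macro level knows one number (the
    # temperature); the micro level holds the kinetic state of roughly
    # Avogadro's number of molecules. Even granting just one unknown bit
    # per molecule yields an astronomically large entropy.
    n_molecules = 6.022e23
    h_coffee = n_molecules * 1.0   # bits; a crude lower bound

    print(f'bit:    {h_bit:.0f} bit')
    print(f'coffee: about {h_coffee:.1e} bits or more')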

The Complexity of the Macro Level

A state on the macro level always contains less information than one on the micro level. The macro state is not complete; it can never contain all the information one could possibly obtain by a closer look, but in most cases it is a well-targeted and intended simplification of the information at the micro level.

This means that the same micro state can supply different macro states. For example: a certain individual (micro level) can belong to the collective macro groups of Swiss inhabitants, computer scientists, older men, contemporaries of the year 2024, etc., all at the same time.

The possibility of simultaneously deriving several macro states from the same micro state is characteristic of the complexity of micro and macro states and thus also of entropy.
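
A small sketch of this coarse-graining, using the article’s own example categories (the record fields and the age threshold are invented):

    # Micro state: one individual described in detail (an invented record).
    person = {'residence': 'Switzerland', 'profession': 'computer scientist',
              'sex': 'm', 'birth_year': 1950, 'alive_in': 2024}

    # Macro states: coarse views that each discard most of the micro detail.
    macro_states = {
        'Swiss inhabitants':      lambda p: p['residence'] == 'Switzerland',
        'computer scientists':    lambda p: p['profession'] == 'computer scientist',
        'older men':              lambda p: p['sex'] == 'm' and 2024 - p['birth_year'] >= 65,
        'contemporaries of 2024': lambda p: p['alive_in'] == 2024,
    }

    # The same micro state belongs to all of these macro states at once.
    print([name for name, contains in macro_states.items() if contains(person)])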

If we transfer the entropy considerations of thermodynamics to more complex networks, we must deal with their higher complexity, but the ideas of micro and macro state remain and help us to understand what is going on when information is gained and processed.

Translation: Juan Utzinger


Continued in Entropy, Part 2


See also:
– Paradoxes and Logic, Part 2
– Georg Spencer-Brown’s Distinction and the Bit
– What is Entropy?
– Five Preconceptions about Entropy

Semantics and Linguistics

What is semantics?

A simple and easily understandable answer is that semantics is the meaning of signals. The signals can exist in any form: as text, as an image, etc. The most frequently studied semantics is that of words.

This is a good reason to examine the relationship between linguistics and semantics. Can semantics be regarded as a subdiscipline of linguistics?

Linguistics and semantics

Linguistics, the science of language and languages, has always examined the structure (grammar, syntax) of languages. Once the syntax of a sentence has been understood, linguists see two further tasks: second, to examine the semantics of the sentence, and third, to examine its pragmatics. “Semantics” is about the meaning of sentences, “pragmatics” about the “why” of a statement.

The linguists’ three steps

In the linguists’ eyes, there are thus three steps in understanding language: syntax -> semantics -> pragmatics. These three fields are weighted very differently: a conventional textbook deals predominantly with syntax, whereas semantics and pragmatics play a marginal role – and always on the basis of the previously conducted syntactic analysis. The syntactic analysis thus already sets the course for what builds on it, namely semantics and pragmatics.

This is not ideal for semantics. When you deal with semantics in more detail, it becomes clear that the grammar and other properties of individual languages are externals which may circumscribe the core of the statements – their meaning – in an occasionally very elegant manner; but they merely circumscribe it and never represent it completely, let alone directly. A direct formal representation of what is meant by a text, however, would be the actual objective of a scientific semantics.

Can this objective be attained? First, we will have to clarify the relationship between words and concepts – words and concepts are not the same. Concepts are the basic elements of semantics and have a special, but not entirely simple relationship with the words of a language.

Word does not equal concept

One could flippantly assume that there is a one-to-one relationship between words and concepts, i.e. that behind every word there is a concept which summarises the meaning of the word. But this is precisely what is wrong. Words and concepts cannot be unequivocally mapped onto each other. Anybody who observes himself while reading, talking and thinking can recognise that this is the case.

It is obvious that a word can have several meanings depending on the context in which it is uttered. Occasionally, a word may even have no meaning at all, for instance if it is a technical term and I don’t know the specialist field. In such a case, I may be able to utter the word, but it remains devoid of meaning for me. Yet somebody who understands the specialist field will understand it.

Meaning has much to do with the addressee

Even perfectly normal words which we all know do not always have an unequivocal meaning but can evoke slightly different ideas (meanings) depending on the listener or the context. This does not only concern abstract words or words to which various values are attached, such as happiness, democracy, truth, etc.: absolutely concrete terms like leg, house and dog are interpreted differently by different people, too. The reception of words as meaningful concepts has much to do with the addressee, his situation and his expectations. There is definitely no 1:1 relation between words and concepts.
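
The absence of a 1:1 relation can be pictured as a many-to-many mapping between words and concepts. A minimal sketch; all entries are invented for illustration:

    # Each word points to several concepts, and the same concept can be
    # reached from several words.
    word_to_concepts = {
        'leg':   {'human limb', 'table leg', 'stage of a journey'},
        'house': {'building', 'household', 'dynasty'},
        'dog':   {'animal', 'contemptible person'},
    }

    # Invert the mapping: a concept may be reachable from several words.
    concept_to_words = {}
    for word, concepts in word_to_concepts.items():
        for concept in concepts:
            concept_to_words.setdefault(concept, set()).add(word)

    # Which reading a listener actually selects depends on context and
    # expectation; the mapping alone does not decide it.
    print(word_to_concepts['leg'])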

Meanings vary

Even within ourselves, there are quite different ideas for the same word: we associate different ideas with it depending on the situation and on the ever-changing state of our momentary knowledge of words and topics.

A dynamic process

The transition from one language to another shows that the link between words and concepts is a dynamic process in time which changes the meanings of words. The English word ‘brave’ is the same word as the Italian ‘bravo’, which we use when a musical performance inspires us. But the same word also exists in German (‘brav’), where today it means prissy or well-behaved – certainly not the same as the English ‘brave’, though it is the same word and once meant the same in German as in English.

Semantics examines the play of meanings

We have to accept that a word and a concept cannot be mapped onto each other just like that. Although in individual cases it may seem that there is precisely one concept (one semantics) behind every word, this idea is completely inappropriate in reality. And it is this idea which prevents the play of meanings from being understood correctly. Yet it is precisely this play of meanings which, in my view, constitutes semantics as a field of knowledge. In this field, it is possible to represent concepts formally in their own proper structure – which is completely independent of the formal representation of words.


Translation: Tony Häfliger and Vivien Blandford