What Information does a Bit Convey?


What does a bit mean?

The question may seem trivial – after all, everyone knows that a bit represents a choice between 0 and 1. Doesn’t it?

So, what’s the problem?

The problem is that 0 and 1 are not the only possible answers. They are just one pair of possible instantiations for a bit, especially useful when you are dealing with numbers, but there are many more possibilities.

Classic examples are:

– yes and no
– true and false
– positive and negative
– on and off

to name but a few. Other options include male/female, inside/outside, good/bad and any other binary pair you care to think of, as well as the associated inversions: for example, 1/0 as well as 0/1.

Bits select information. This information can be numbers, but information encompasses much more than numbers. Laymen think that bits are based on 0 and 1, but 0 and 1 are only one possible instantiation of something more general, namely information.

How does the bit know which two states it can represent?

You may think that the quality of the information (number, truth value, polarity, direction, etc.) is an inherent property of the bit itself, with one bit representing 0/1 and another representing, for example, true/false. This is not the case. Chip manufacturers do not assign individual properties to the bits within their chips in this way. From a technical point of view, all bits are identical, and it is precisely this simplicity and neutrality that makes binary, i.e. bit-based, technologies so attractive.

Only when the computer runs a program do the previously neutral bits take on individual value pairs such as 0/1, true/false, etc. It is thus the program that assigns a meaning to the two states of the bit.
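This can be made concrete with a small sketch in Python (the variable names and the three readings are my own illustrations, not part of any standard): the very same stored bit pattern is read as a number, as a character and as a row of on/off switches, and nothing in the byte itself decides which reading is the "right" one – the interpreting program does.

```python
# The same stored bit pattern, interpreted three ways.
# Nothing in the byte itself says which reading is correct:
# the interpreting code assigns the meaning.
raw = 0b01000001  # one byte: 0100 0001

as_number = raw                                      # read as an integer: 65
as_text = chr(raw)                                   # read as a character: 'A'
as_flags = [(raw >> i) & 1 == 1 for i in range(8)]   # read as eight on/off switches

print(as_number)  # 65
print(as_text)    # A
print(as_flags)   # [True, False, False, False, False, False, True, False]
```

The byte stays exactly the same in memory throughout; only the surrounding code changes, and with it the meaning.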

Of course, this has practical benefits because it means that the meaning assigned to a particular bit within a chip can be redefined time and time again, depending on what program happens to be running. But it also means we have to face the fact that this meaning is not an inherent property of the bit itself, but a function of the program that calls it up, and thus of different bits entirely – those of the program.

But where do these bits take their meaning from? Of course, these bits also have value pairs assigned to them from the outside in exactly the same way. And so it goes on, with the meaning of value pairs always being assigned from the outside by other bits. In other words, this is an infinite regress in which every bit that explains another bit itself has to be explained by yet another.


Where does the chain end?

This search for the bits that explain other bits never ends; that is the nature of an infinite regress. But we do still have a chance of finding the end of the chain: the search is only hopeless as long as we remain within the confines of the computer. Human beings, however, can think beyond these limits. The program was written for a certain purpose and it is human beings – programmers and users – who decide what meaning they want the bits to have. This meaning, and thus the specific individual value pairs assigned to the bits, emerges offline at the end of the regress as an implicit understanding in the minds of human beings.

This has taken us beyond the world of bits, and I would argue that this is unavoidable: until we escape these bounds, we remain in a world that is precise but completely devoid of meaning. Individual bits only take on meaning when this is assigned from the outside, i.e. when we link them to information that is meaningful to us as human beings. This allows us to resolve the infinite regress.

Seen in isolation, the two states of the bit are completely neutral, which means we can assign them any meaning we like. Technically, that’s a stroke of genius. But we shouldn’t deceive ourselves that we can generate meaningful information from bits alone. There always needs to be an ‘outside’ that assigns meaning to the bits.


We thus have two types of information:

1. The isolated bit:

This indicates which of the two states of the bit is selected without describing the states themselves. It is the ‘technical’ bit of information theory.

2. The meaning assigned to the bit:

This tells us what the bit is all about, i.e. which two states it allows us to choose between. This is the qualitative information that can be expressed using the bit. Although we assign this information to the bit itself, it disappears as soon as we look at the bit in isolation.

These two types of information are fundamentally different.

Despite – or maybe precisely because of – this difference, they belong together. It is only by their combination that meaningful information is produced.
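The distinction can be sketched in a few lines of Python (the codebooks and names here are illustrative assumptions of mine): the selective part is the bare choice of state, while the descriptive part is a codebook, supplied from outside, that says what the two states stand for.

```python
# A minimal sketch: 'selected' carries the selective information
# (which of the two states is picked), while each codebook supplies
# the descriptive information (what the two states stand for).
selected = 1  # the bare, 'technical' bit: a choice between state 0 and state 1

codebooks = {
    "logic":  ("false", "true"),
    "switch": ("off", "on"),
    "vote":   ("no", "yes"),
}

# The same selection yields different meaningful information
# depending on which codebook is supplied from outside.
for name, (state0, state1) in codebooks.items():
    print(f"{name}: {state1 if selected else state0}")
```

Neither half is meaningful information on its own: the selection without a codebook is an empty choice, and a codebook without a selection tells us nothing about which state obtains.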

D. M. MacKay: Selective and descriptive information content

Following the information science pioneer D. M. MacKay, we can call the two types of information the selective and the descriptive information content.


 
