The Bit
The bit is the basic building block of information. But what information does this building block contain?
The bit has a great deal to do with entropy, and with the act of distinction in the sense of George Spencer-Brown:
Entropy
Entropy measures information, which is why the concept of entropy is indispensable for understanding the phenomenon of information. There is a separate overview page on this website for articles on entropy.
The term entropy originates in thermodynamics, where it complements the concept of energy. But entropy is much more than that: information entropy, as defined by Shannon, is likewise a measure of information. Despite its enormous importance in physics, in information theory (bits), and even in everyday life, the concept of entropy is barely understood.
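To make the connection between bits and entropy concrete, here is a minimal sketch in Python of Shannon's formula H = -Σ p·log₂(p); the function name `entropy` is my own choice for illustration. A fair bit, the simplest possible distinction, carries exactly one bit of entropy.

```python
import math

def entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.

    Outcomes with probability zero contribute nothing to the sum.
    """
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair bit (two equally likely states) carries exactly 1 bit of information.
print(entropy([0.5, 0.5]))  # 1.0

# A biased bit is more predictable, so it carries less information.
print(entropy([0.9, 0.1]))  # ~0.47

# A certain outcome distinguishes nothing and carries no information.
print(entropy([1.0]))       # 0.0
```

The fair bit marks the maximum: no two-state distribution can carry more than one bit, which is exactly why the bit serves as the unit of information.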
Machine Thinking
As a computer scientist, I am fascinated by thinking machines. But do machines really think for themselves? Or do they derive their intelligence from those who build, feed and control them? What is the difference between machine intelligence and biological intelligence?