Entropy is More than Thermodynamics
The term entropy originally came from thermodynamics, where it complements the concept of energy. However, entropy is much more than that. As the information entropy defined by Claude Shannon, it is also a measure of information, and its unit of measurement is the bit. Despite its enormous importance in physics, in information theory and in everyday life, the concept of entropy is considered nerdy and difficult to understand. It shouldn’t be.
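To make the unit concrete: Shannon’s entropy of a discrete probability distribution is H = -Σ pᵢ log₂ pᵢ, and taking the logarithm to base 2 gives the result in bits. The following small Python sketch is an illustration added here, not part of the original article series; it computes the entropy of a fair and of a biased coin.

```python
import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of entropy ...
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0
# ... while a heavily biased coin (90/10) carries less than half a bit,
# because its outcome is far less surprising.
print(shannon_entropy_bits([0.9, 0.1]))  # ~0.47
```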
Five widespread preconceptions complicate the understanding of this important concept in physics and in everyday life:
Entropy also plays a role in the difference between bits and distinctions in the sense of Spencer-Brown:
Information Reduction
The phenomenon of information reduction is closely linked to entropy.
Logical systems are always simplifications of a much more complex and detailed reality. In order to understand how we think, we have to come to terms with this necessary and unavoidable limitation of our knowledge of reality. In 2019, I wrote a short series of articles on this natural process of information reduction:
Many people are irritated by the topic of information reduction, but in doing so we are following in the footsteps of two of the most important European philosophers: William Ockham (Ockham’s Razor) and Socrates (“I know that I know nothing”).
Entropy is measured in bits and is one of the foundations of information theory -> Overview of Information Theory