Why Is “Information” Entropy?

Deb Prakash Chatterjee
5 min read · Apr 24, 2024


Today we work with more than 1e+9 bytes, or 1 GB, of data every day on our mobile phones and laptops. We watch movies, read our favorite blogs, code a fully working website, and make it live. All of these activities require a huge amount of information to be transferred from one machine to another.

Now, when I state that information is entropy, does that mean we are transferring entropy on a daily basis? 🤷🏻

We studied in high school that entropy is closely related to the second law of thermodynamics.

The second law of thermodynamics states that the entropy of a closed system never decreases.

So how do we get from there to the concept of information?

Let’s understand this!

We need to go back to 1867. James Clerk Maxwell is famous for the equations of electromagnetism, and infamous for an idea on how to violate the second law of thermodynamics. Lord Kelvin later gave this paradoxical thought experiment its name: Maxwell’s Demon. Let’s walk through the experiment first to get a better view of the problem.

Maxwell’s Demon Experiment

Maxwell’s Demon experiment requires one box, isolated from the outside environment, divided into two halves, with the temperature the same throughout. The two halves are filled with a mixture of high-speed and low-speed gas molecules. The wall between them has one small door, which can be opened and closed; when it is open, only one molecule at a time can pass through. Since both halves are at the same temperature, even after opening the door n times the system will still be in thermal equilibrium. This is the state of maximum entropy.

Here comes the demon.

Image: Freepik

He is a very intelligent being, and he can open and close the door according to the speed of the molecules. Only when a high-speed molecule approaches from right to left, or a slow-speed molecule approaches from left to right, does the demon open the door. Keep in mind that only one particle can travel from one box to the other at a given time.

As this process continues, all the high-speed molecules end up in the left box, and all the slow-speed molecules end up in the right box. The molecules have been sorted, and the overall entropy of the gas has decreased. The resulting temperature difference could, in principle, be used to do work, for example by running a heat engine.
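To make the sorting concrete, here is a minimal toy sketch in Python. It is not a real physics simulation; the speeds, the threshold between “fast” and “slow”, and the number of molecules are all arbitrary choices of mine. It only illustrates the demon’s rule: open the door for fast molecules moving right to left and for slow molecules moving left to right.

```python
import random

random.seed(0)
THRESHOLD = 1.0  # arbitrary speed separating "fast" from "slow"

# Both boxes start with a random mix of fast and slow molecules (speeds in [0, 2)).
left = [random.uniform(0, 2) for _ in range(50)]
right = [random.uniform(0, 2) for _ in range(50)]

for _ in range(10_000):
    # One molecule approaches the door from a random side.
    side = random.choice(["left", "right"])
    box = left if side == "left" else right
    if not box:
        continue
    i = random.randrange(len(box))
    speed = box[i]

    # The demon's rule: fast molecules may pass right -> left,
    # slow molecules may pass left -> right; otherwise the door stays shut.
    if side == "right" and speed > THRESHOLD:
        left.append(right.pop(i))
    elif side == "left" and speed <= THRESHOLD:
        right.append(left.pop(i))

print(f"fast molecules in left box:  {sum(s > THRESHOLD for s in left)}/{len(left)}")
print(f"slow molecules in right box: {sum(s <= THRESHOLD for s in right)}/{len(right)}")
```

After enough steps, the left box holds only fast molecules and the right box only slow ones, which is exactly the sorted, lower-entropy state described above.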

So it seems we have shown that, in a closed system, the entropy decreased, even though nothing inside interacted with the outside universe. This is strictly against the second law of thermodynamics.

Does it mean that the second law of thermodynamics itself has flaws?

The short answer is NO! The law is flawless, and it is high time we understood why. By understanding this, we will get to know why “information” is also referred to as entropy. Please remember that the following argument applies to every information-storing device, be it a human brain, a drive built from logic gates, or even a quantum state.

Let’s recap the experiment once again. We have two boxes filled with high-speed and low-speed gas molecules. The demon can pass one high-speed gas molecule from the right box to the left box, and one low-speed gas molecule from the left box to the right box. It is obvious that, to perform the door open-close operation, the demon needs to know the position and speed of the approaching molecules. Storing all of this information about the molecules results in an increase of entropy, which compensates for the entropy reduction in the gas.

The demon’s memory cannot be infinite. After some amount of time his memory becomes full, and when he tries to memorize a new molecule’s information (position and speed), he needs to erase some of the information about past molecules from his memory. Here comes the next entropy increase.

According to Landauer’s principle, the erasure of one bit of information has a minimum energy cost of

E = k · T · ln(2)

Here k = Boltzmann constant, and T = temperature of the thermal reservoir used in the process.
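To get a feel for the numbers, here is a quick back-of-the-envelope calculation of this limit. The choice of 300 K as a room-temperature reservoir is my own assumption.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, in J/K
T = 300.0            # assumed room-temperature reservoir, in K

# Landauer limit: minimum heat dissipated to erase one bit.
energy_per_bit = k_B * T * math.log(2)
print(f"energy to erase one bit at {T:.0f} K: {energy_per_bit:.2e} J")  # ~2.9e-21 J

# For scale: erasing one gigabyte (8 * 10^9 bits) at this limit.
print(f"energy to erase 1 GB: {energy_per_bit * 8e9:.2e} J")            # ~2.3e-11 J
```

The cost per bit is tiny, but it can never be zero, and that is what saves the second law.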

Erasing these bits from the demon’s memory dissipates heat and increases entropy. So we have entropy generated as the information is stored in the demon’s memory, plus entropy generated as the information is erased, and the overall entropy of the whole setup, gas plus demon, increases.
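Written as a simple balance (my notation; N is the number of bits the demon ultimately erases, k is the Boltzmann constant):

ΔS_total = ΔS_gas + ΔS_demon ≥ 0,  where ΔS_gas < 0 and ΔS_demon ≥ N · k · ln(2)

The decrease in the gas’s entropy is always paid for, at minimum, by the k·ln(2) of entropy generated per erased bit, so the total never goes negative.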

This is how the second law of thermodynamics is upheld even while this experiment is performed. It took physicists more than 100 years to resolve the paradox, but the resolution became one of the greatest insights connecting information theory and physics.

Honorable Mention:

Claude Shannon made significant contributions to information theory and its link to thermodynamics. One of his most important contributions was the formulation of the concept of entropy in information theory.

According to his theory, information and entropy are very similar in nature. Entropy is the measurement of the uncertainty or randomness in a system (in a message or signal, to be specific), whereas information is the measurement of the reduction of that uncertainty or randomness.

The greater the entropy, the more uncertain the message. And the greater the information, the greater the reduction of randomness, which is how information is extracted from a random message.
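To see Shannon’s definition in action, here is a small sketch that computes the entropy of a message in bits per symbol, H = −Σ p·log2(p). The example messages are just placeholders of mine.

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy of a message, in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A highly predictable message has low entropy: each new symbol tells us little.
print(shannon_entropy("aaaaaaaaab"))  # about 0.47 bits per symbol

# A message whose symbols are all equally likely is maximally uncertain.
print(shannon_entropy("abcdefgh"))    # exactly 3.0 bits per symbol
```

The more uniform (uncertain) the message, the more bits are needed to describe each symbol, which is exactly the sense in which entropy measures uncertainty.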

I hope that after reading this article, you now understand why we can say that information and entropy are the same.
