# By any means tenable or untenable

Thermodynamics is the branch of science dealing with energy, work, and the conversion of one form of energy into another. It cuts across physics, chemistry and engineering, so students in all these streams come face to face with the subject.

Its laws are precise enough to be cast as mathematical equations. The first law of thermodynamics concerns the conservation of energy: if two systems with energies A and B are connected, the energy of the resulting combined system shall be A + B.
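In symbols, the first law is usually written as a book-keeping statement for a single system: the change in its internal energy equals the heat supplied to it minus the work it does (sign conventions vary between textbooks):

$$ \Delta U = Q - W $$

For the two connected systems above, the conserved total is simply additive: $U_{total} = A + B$.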

But there is a catch: this law places no restriction on which system donates energy and which receives it. As far as conservation is concerned, we could heat water by extracting heat from ice (supposing we had such an extracting machine), since the total, the sum of the energies of the two systems, would still remain constant. Common experience, however, is contrary to this: heat always flows from a higher temperature to a lower one.
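The one-way character of heat flow is easy to see in a toy model (our own illustration, not from any particular textbook): let two bodies exchange heat at a rate proportional to their temperature difference. The total energy stays fixed (first law), while the temperatures can only converge, never diverge (second law).

```python
# Toy model: two bodies exchanging heat (all names and parameters are our own).
# The sum of energies is conserved at every step, yet heat only ever flows
# from the hotter body to the colder one.

def equilibrate(t_hot, t_cold, c_hot=1.0, c_cold=1.0, k=0.05, steps=200):
    """Simple Euler steps of Newtonian heat exchange; returns final temperatures."""
    for _ in range(steps):
        q = k * (t_hot - t_cold)   # heat flows hot -> cold
        t_hot -= q / c_hot
        t_cold += q / c_cold
    return t_hot, t_cold

th, tc = equilibrate(373.0, 273.0)
print(round(th + tc, 6))   # total energy (C*T, equal capacities) is unchanged: 646.0
print(th > tc)             # the hot body never drops below the cold one: True
```

Note that the temperature difference shrinks by a constant factor each step, so the two bodies approach a common temperature but the flow never reverses.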

To address this difficulty, the second law of thermodynamics was proposed, which takes the direction of natural phenomena into account. A new thermodynamic function called “entropy” was formulated as a measure of the randomness of a system. Left to themselves, all systems in nature tend towards chaos, or randomness.

# Entropy: A negative form of Information?

Entropy is the measure of the degree of randomness in a system; the greater the randomness, the more stable the system. Any reaction that creates a greater number of particles increases entropy, as does matter passing from the solid state to the liquid state, and from the liquid state to the gaseous state.
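The link between entropy and randomness is made precise in statistical mechanics by Boltzmann's relation, where $W$ is the number of microscopic arrangements (microstates) consistent with the macroscopic state and $k_B$ is Boltzmann's constant:

$$ S = k_B \ln W $$

Melting and boiling raise $W$ enormously, since the particles of a liquid or gas have far more arrangements available to them than those of a solid, and so they raise $S$.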

Entropy is related to heat, the form into which all other kinds of energy tend to degrade when used. Heat increases the randomness of the motion of the particles in matter; entropy is thus connected with chaos.

It is impossible to design a machine that converts all of its energy into useful work; the efficiency of any machine cannot be 100%, a limit formalized in Carnot's theorem. Where does the lost energy go? It is turned into heat, which is why the parts of a machine warm up during operation.
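Carnot's limit is simple to compute: for an ideal engine running between a hot reservoir at temperature T_hot and a cold one at T_cold (both in kelvin), the maximum possible efficiency is 1 − T_cold/T_hot. The function name below is our own.

```python
# Carnot's theorem: no heat engine between two reservoirs can beat 1 - Tc/Th.

def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat convertible to work between two reservoirs (kelvin)."""
    if t_cold_k <= 0 or t_hot_k <= t_cold_k:
        raise ValueError("need t_hot_k > t_cold_k > 0")
    return 1.0 - t_cold_k / t_hot_k

# An ideal engine running between boiling and freezing water:
eta = carnot_efficiency(373.15, 273.15)
print(f"{eta:.1%}")  # about 26.8%
```

Even this idealized engine wastes nearly three quarters of the heat it draws, which is why real machines run warm.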

It has been said that information is also a form of energy. Take, for example, the following highly non-random arrangement of some 125 letters.

“There is a tide in the affairs of men, Which, taken at the flood, leads on to fortune; Omitted, all the voyage of their life Is bound in shallows and in miseries.”

This short passage, from Julius Caesar, Act IV, Scene 3, is spoken by Brutus when he realizes that he must face Mark Antony's army. The lower the randomness, the more information-rich the content.

If we scramble these letters, the randomness increases and the information, the meaning, is lost; the entropy has increased. Thus we can say that entropy is a form of negative information.
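Shannon's measure of information entropy makes the scrambling argument quantitative (the sketch below is our own illustration; the sample text and names are arbitrary). One subtlety: shuffling letters leaves the letter frequencies, and hence the single-letter entropy, exactly unchanged. The meaning that is destroyed lives in the *order* of the letters, that is, in correlations between neighbours, which only show up when we measure entropy over letter pairs and longer blocks.

```python
import math
import random

def shannon_entropy(symbols):
    """Shannon entropy, in bits per symbol, of an observed sequence."""
    counts = {}
    for s in symbols:
        counts[s] = counts.get(s, 0) + 1
    n = len(symbols)
    # Sort the counts so the summation order (and rounding) is reproducible.
    return -sum(c / n * math.log2(c / n) for c in sorted(counts.values()))

text = ("there is a tide in the affairs of men "
        "which taken at the flood leads on to fortune")
letters = [c for c in text if c != " "]

random.seed(1)
scrambled = letters[:]
random.shuffle(scrambled)

# A shuffle preserves letter frequencies, so single-letter entropy is identical:
print(shannon_entropy(letters) == shannon_entropy(scrambled))  # True

# The structure (and the meaning) lives in the order; pair statistics differ:
pairs = list(zip(letters, letters[1:]))
scrambled_pairs = list(zip(scrambled, scrambled[1:]))
print(shannon_entropy(pairs), shannon_entropy(scrambled_pairs))
```

On typical English text the pair entropy of the shuffled version comes out higher, reflecting the constraints English spelling places on neighbouring letters; in that sense the orderly passage really does carry less entropy and more information.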