Entropy is a measure of the degree of randomness in a system. The greater the randomness, the more probable, and hence the more stable, the state of the system. Any reaction that creates a larger number of particles, and any change in which matter passes from the solid to the liquid state, or from the liquid to the gaseous state, results in an increase of entropy.
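One way to see why more particles, or a less ordered phase, means more entropy is Boltzmann's relation S = k ln W, which ties entropy to the number W of microscopic arrangements available to the system. Below is a minimal Python sketch of this counting, using a toy model in which each of N particles may sit in any of M positions; the particle and position counts are illustrative assumptions, not physical data.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def boltzmann_entropy(microstates: int) -> float:
    """S = k_B * ln(W): entropy from the number of microstates W."""
    return K_B * math.log(microstates)

# Toy model: N distinguishable particles, each free to sit in any of
# M positions, give W = M**N microstates.  More particles, or more
# accessible positions (solid -> liquid -> gas), means more entropy.
N = 100
print(boltzmann_entropy(4 ** N))     # "solid-like": 4 positions each
print(boltzmann_entropy(1000 ** N))  # "gas-like": 1000 positions each
```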
Entropy is related to heat, the form of energy into which all other forms tend to degrade when they are used. Heat increases the randomness of the motion of the particles of matter. Entropy is thus connected with chaos.
It is a fact that it is practically impossible to design a machine that converts all of its energy input into useful work. In other words, the efficiency of a machine can never be 100%; indeed, Carnot's theorem places a hard upper bound on the efficiency of any heat engine. Where does the lost energy go? It is turned into heat, and it is this energy that makes the parts of a machine warm up during operation.
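Carnot's theorem makes this limit concrete: a heat engine working between a hot reservoir at temperature T_h and a cold one at T_c (both in kelvin) can at best reach an efficiency of 1 - T_c/T_h. A short Python sketch follows; the reservoir temperatures are chosen purely for illustration.

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Carnot's theorem: no heat engine operating between a hot
    reservoir at t_hot_k and a cold one at t_cold_k (in kelvin)
    can exceed an efficiency of 1 - t_cold_k / t_hot_k."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative, assumed temperatures: steam at 800 K, surroundings
# at 300 K.  Even an ideal engine wastes the remaining 37.5% as heat.
print(f"{carnot_efficiency(800.0, 300.0):.1%}")  # 62.5% at the very best
```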
It has been said that information is also a form of energy. Take, for example, the following highly non-random arrangement of 125 letters.
“There is a tide in the affairs of men, Which, taken at the flood, leads on to fortune; Omitted, all the voyage of their life Is bound in shallows and in miseries.”
If we scramble these letters, the randomness increases and the information, that is, the meaning, is lost. The entropy has increased. Thus we can say that entropy is a form of negative information.
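This loss of meaning can even be given a rough number. One illustrative proxy, a sketch rather than a definitive measure, is the Shannon entropy of adjacent letter pairs: English repeats pairs such as "th" and "in", so its pair distribution is less uniform than that of the same letters in random order. The lower-cased quote string and the bigram measure below are assumptions made for the demonstration.

```python
import math
import random
from collections import Counter

def bigram_entropy(text: str) -> float:
    """Shannon entropy, in bits, of the distribution of adjacent
    character pairs (bigrams) in the text."""
    pairs = [text[i:i + 2] for i in range(len(text) - 1)]
    total = len(pairs)
    return -sum(c / total * math.log2(c / total)
                for c in Counter(pairs).values())

quote = ("thereisatideintheaffairsofmenwhichtakenattheflood"
         "leadsontofortuneomittedallthevoyageoftheirlife"
         "isboundinshallowsandinmiseries")

scrambled = "".join(random.sample(quote, len(quote)))

# Shuffling leaves the single-letter frequencies untouched, but it
# destroys the pair structure of English ("th", "in", ...), so the
# bigram entropy typically rises once the letters are scrambled.
print(f"original:  {bigram_entropy(quote):.3f} bits")
print(f"scrambled: {bigram_entropy(scrambled):.3f} bits")
```

A pair-level measure is used deliberately: since scrambling merely reorders the letters, any statistic based on single-letter frequencies alone would not change at all.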