What is the equation for entropy?


Entropy is often misunderstood as a kind of "disorder". But that picture falls short. Originally introduced to explain the limited efficiency of steam engines, the term is now also used in many other disciplines.

Hardly any other term in physics is used so often outside of physics, and so often in ways that deviate from its actual meaning, as entropy. Yet in physics the term has a precisely defined meaning. The Austrian physicist Ludwig Boltzmann gave this physical quantity a concrete definition in the second half of the 19th century. He focused on the microscopic behavior of a fluid, i.e. a gas or a liquid: decisive for his definition was that he interpreted the disordered motion of the atoms or molecules in such a fluid as heat.

Entropy in the bathtub

In a closed system with a fixed volume and a fixed number of particles, Boltzmann stated, the entropy is proportional to the logarithm of the number of microstates of the system. By microstates he meant all the ways in which the molecules or atoms of the enclosed fluid can arrange themselves. His formula thus defines entropy as a measure of the “freedom of arrangement” of the molecules and atoms: if the number of microstates that the particles can occupy increases, the entropy increases. If there are fewer ways in which the particles of the fluid can arrange themselves, the entropy is smaller.
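Expressed as a formula, and in notation the article itself does not spell out, Boltzmann's definition is usually written as

S = k_B \ln W

where S is the entropy, W the number of accessible microstates and k_B the Boltzmann constant.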

Entropy increase

Boltzmann's formula is often interpreted as if entropy were synonymous with “disorder”. However, this simplified picture is easily misleading. An example is the foam in a bathtub: when the bubbles burst and the surface of the water becomes smooth, the disorder appears to decrease. But the entropy does not! In fact, it increases, because after the foam has collapsed, the space available to the molecules of the liquid is no longer limited to the thin skins of the bubbles: the number of microstates that can be occupied has grown, and with it the entropy.
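A small worked example, not from the original text, shows how directly the formula ties the two together: if the number of accessible microstates doubles, the entropy grows by

\Delta S = k_B \ln(2W) - k_B \ln W = k_B \ln 2

so every additional way for the particles to arrange themselves adds to the entropy, however "ordered" or "disordered" the result may look to the eye.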

Boltzmann's definition captures one side of the term; entropy also has another, macroscopic side, which the German physicist Rudolf Clausius had already uncovered a few years earlier. The steam engine, a classic heat engine, was invented at the beginning of the 18th century. Heat engines convert a temperature difference into mechanical work. Back then, physicists tried to understand what principles these machines obey. The researchers were puzzled to find that only a few percent of the thermal energy could be converted into mechanical energy. The rest was somehow lost, without them understanding why.

Value of energy

The theory of thermodynamics seemed to lack a physical concept that takes into account the different qualities of energy and limits the convertibility of thermal energy into mechanical energy. The solution came in the form of entropy. In the middle of the 19th century, Clausius introduced the term as a thermodynamic quantity and defined it as a macroscopic measure of a property that limits the usability of energy.

According to Clausius, the change in the entropy of a system depends on the heat supplied and the temperature at which it is transferred. He concluded that entropy is always transferred together with heat. In addition, Clausius found that, unlike energy, entropy is not a conserved quantity in closed systems. This insight entered physics as the second law of thermodynamics:

"In a closed system, the entropy never decreases."

The entropy therefore always increases or remains constant. This introduces an arrow of time into the physics of closed systems, because as the entropy grows, thermodynamic processes in closed systems are irreversible, i.e. they cannot be undone.
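In formulas, again in notation not introduced in the article itself, Clausius's definition and the second law are usually written as

dS = \delta Q_{rev} / T  and  \Delta S \geq 0

where \delta Q_{rev} is the reversibly exchanged heat, T the absolute temperature at which it is transferred, and the inequality holds for closed systems.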

Heat engine

A process would only be reversible if the entropy remained constant. But that is only possible in theory; all real processes are irreversible. Following Boltzmann, one can also say: the number of accessible microstates never decreases. This microscopic interpretation extends Clausius's thermodynamic, macroscopic one. Entropy finally resolved the mystery of the energy that seemed to disappear in heat engines: because the entropy of a closed system must not decrease, part of the thermal energy always eludes mechanical use and is released again as heat.
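This limit can be made quantitative with the Carnot efficiency, which the article does not mention by name but which follows from exactly these considerations: a heat engine working between a hot reservoir at temperature T_{hot} and a cold one at T_{cold} can convert at most the fraction

\eta_{max} = 1 - T_{cold} / T_{hot}

of the supplied heat into mechanical work; the remainder must be given off as waste heat so that the total entropy does not decrease.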

Versatile use

Since the findings of Clausius and Boltzmann, entropy has also found its way into other areas of physics. It was even picked up outside of physics, at least as a mathematical concept. For example, the American mathematician and electrical engineer Claude Shannon introduced the so-called information entropy in 1948. He used this quantity to characterize the loss of information in transmissions over telephone lines.
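Shannon's definition, given here in the usual modern notation rather than in a form taken from the article, assigns to a message source with symbol probabilities p_i the entropy

H = - \sum_i p_i \log_2 p_i

measured in bits; it is largest when all symbols are equally likely, in close analogy to Boltzmann's counting of equally accessible microstates.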

Entropy also plays a role in chemistry and biology: in certain open systems, new structures can form if entropy is released to the environment. These must be so-called dissipative systems, in which energy is converted into thermal energy. This theory of structure formation goes back to the Belgian physical chemist Ilya Prigogine. To this day, papers are being published that add new facets to the physical scope of the concept.