Let’s take a side trail here and discuss what entropy is; later we will discuss entropy and the universe.
In the classical sense, entropy is a thermodynamic term associated with molecular disorder.
One common definition reads: “the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, the amount of entropy is also a measure of the molecular disorder, or randomness, of a system.”
Another puts it this way: “a measure of the unavailable energy in a closed thermodynamic system that is also usually considered to be a measure of the system’s disorder, that is a property of the system’s state, and that varies directly with any reversible change in heat in the system and inversely with the temperature of the system.”
In the general sense, entropy is applied to other disciplines and is associated with the amount of disorder in a system.
Entropy is a scientific concept as well as a measurable physical property that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.
The term entropy is now used in many other sciences (such as sociology), sometimes distant from physics or mathematics, where it no longer maintains its rigorous quantitative character. Usually, it roughly means disorder, chaos, decay of diversity or tendency toward uniform distribution of kinds.
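One field where the term does keep a rigorous quantitative meaning is information theory, mentioned above: Shannon entropy measures the uncertainty of a probability distribution, in bits. A minimal sketch in Python (the function name and the example distributions are illustrative, not from any particular library):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.47
```

The parallel with thermodynamics is more than a pun: both quantities count, in some sense, how many ways a system’s underlying details could be arranged while looking the same from the outside.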
Going back to thermodynamics: there are three laws.
The first law of thermodynamics states that, when energy passes into or out of a system (as work, heat, or matter), the system’s internal energy changes in accordance with the law of conservation of energy.
The second law of thermodynamics states that in a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems never decreases. A common corollary of the statement is that heat does not spontaneously pass from a colder body to a warmer body.
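That corollary can be checked with a little arithmetic. When heat Q flows from a hot body at temperature T_hot to a cold body at T_cold, the hot body loses entropy Q/T_hot while the cold body gains Q/T_cold; because T_cold is smaller, the gain outweighs the loss and total entropy increases. A sketch in Python (the temperatures and heat amount are made-up illustrative numbers):

```python
# Entropy bookkeeping for heat Q flowing from a hot body to a cold one.
# Illustrative numbers; assumes both bodies are large enough that their
# temperatures stay roughly constant during the transfer.
Q = 100.0       # joules of heat transferred
T_hot = 400.0   # kelvin
T_cold = 300.0  # kelvin

dS_hot = -Q / T_hot          # hot body loses entropy: -0.25 J/K
dS_cold = Q / T_cold         # cold body gains entropy: about +0.33 J/K
dS_total = dS_hot + dS_cold  # net change: about +0.083 J/K

print(dS_total > 0)  # True: total entropy increased, as the second law requires
```

Running the transfer the other way (cold to hot) would make dS_total negative, which is exactly what the second law forbids from happening spontaneously.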
The third law of thermodynamics states that a system’s entropy approaches a constant value as the temperature approaches absolute zero. With the exception of non-crystalline solids (glasses), the entropy of a system at absolute zero is typically close to zero.
The idea of entropy can be hard to grasp: what does it mean to measure disorder? There is no way to measure entropy empirically; an “entropy meter” does not exist, unlike a thermometer or a photometer. Entropy can be calculated, but not directly measured in a system.
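As an example of such a calculation: for a reversible process at constant temperature, the entropy change is the heat absorbed divided by the temperature. For ice melting at 0 °C, the molar heat of fusion of water is about 6,010 J/mol, giving roughly 22 J/(mol·K). A sketch in Python (the numbers are standard textbook values):

```python
# Entropy change for melting one mole of ice at its melting point,
# using delta_S = q_rev / T for a reversible, constant-temperature process.
q_fusion = 6010.0  # J/mol, approximate molar heat of fusion of water
T_melt = 273.15    # K, melting point of ice at 1 atm

delta_S = q_fusion / T_melt
print(round(delta_S, 1))  # ~22.0 J/(mol*K)
```

No meter was pointed at the ice: the entropy change comes out of measured heat and temperature, which is what “calculated, not measured” means in practice.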
Also, what does disorder mean at a molecular level? Associating disorder with entropy is conceptually confusing, so another way to look at it is energy dispersal.
The interpretation of entropy as a measure of energy dispersal has developed alongside the traditional view, introduced by Ludwig Boltzmann, of entropy as a quantitative measure of disorder. The energy dispersal approach avoids the ambiguous term ‘disorder’. An early advocate of the energy dispersal conception was Edward Armand Guggenheim, who in 1949 used the word ‘spread’.
In this alternative approach, entropy is a measure of energy dispersal or spread at a specific temperature. Changes in entropy can be quantitatively related to the distribution or the spreading out of the energy of a thermodynamic system, divided by its temperature.
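A standard illustration of entropy as spreading: when an ideal gas expands freely into a larger volume, its energy disperses over more space, and the entropy change is n·R·ln(V2/V1). A sketch in Python (one mole doubling in volume is an assumed example, not a value from the text):

```python
import math

# Entropy increase when n moles of ideal gas spread into a larger volume:
# delta_S = n * R * ln(V2 / V1). Nothing heats up or cools down; the
# energy simply disperses over more space, and entropy rises.
R = 8.314       # J/(mol*K), molar gas constant
n = 1.0         # moles (illustrative)
V_ratio = 2.0   # final volume / initial volume

delta_S = n * R * math.log(V_ratio)
print(round(delta_S, 2))  # ~5.76 J/K
```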
Some educators propose that the energy dispersal idea is easier to understand than the traditional approach. The concept has been used to facilitate teaching entropy to students beginning university chemistry and biology.