Entropy is confusing, and sometimes it seems to be made that way.
For instance, Rudolf Clausius (a German physicist) coined the term from the ancient Greek word for “transformation,” reasoning that a name taken from an ancient language would mean the same thing in every living tongue.
Leon Cooper (1972 Nobel Prize Laureate in Physics) commented that instead, “he succeeded in coining a word that meant the same thing to everybody: nothing.”
What entropy isn’t
We often hear entropy described as a measure of disorder. *Cue the messy room analogy.*
However, not only is that explanation misleading, it’s also incorrect.
The messy room analogy might offer an initial, relatable picture of entropy as “disorder,” but it doesn’t accurately represent what entropy means, scientifically and mathematically, in thermodynamics.
For example, an increase in entropy in a system does not always correspond to what we would consider an increase in disorder in everyday terms, and the opposite applies as well.
So, then, what exactly is entropy?
Let’s start with another question: why do we eat food?
Why food?
For nutrients? Protein? Carbs?
Energy?
We use energy for various functions such as movement, heat production, and maintaining bodily functions. However, when we consume food, we aren’t really “using up” energy but instead converting the chemical energy stored in the food into forms of energy — kinetic, potential, heat, etc.
When you consume food, the body metabolizes it, converting the organized energy (low entropy, e.g., glucose) stored in the chemical bonds of food molecules into less organized forms (high entropy, e.g., CO2 & H2O). This process releases energy that the body uses but also increases the overall entropy in the system.
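To make that concrete, here is the textbook net reaction for aerobic glucose metabolism (the many intermediate steps are left out):

C₆H₁₂O₆ + 6 O₂ → 6 CO₂ + 6 H₂O + energy

Seven reactant molecules become twelve smaller product molecules plus released heat, so the same energy ends up shared across many more, smaller pieces.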
The energy released is not destroyed but dispersed in a more disordered form.
Because energy is only useful to do work when it’s clumped together (e.g., in your food) and not when it’s spread out (e.g., heat radiating from your body), entropy is better thought of as a measure of how SPREAD OUT the energy is.
Now, we’re getting somewhere.
An improved definition
Britannica defines entropy as a “measure of a system’s thermal energy per unit temperature that is unavailable to do work.”
To explain what that means, let’s wind back in history to Clausius’ summary of the first two laws of thermodynamics (yes, the guy who came up with the confusing term):
- The energy of the universe is constant.
- The entropy of the universe strives toward a maximum.
In every energy transformation, no energy is created or destroyed, but the energy always ends up more spread out and less able to do useful work.
To quantify this idea, Clausius formulated a way to measure entropy. When an amount of heat Q flows into a large heat reservoir at a temperature T (above absolute zero), the entropy increase is given by:

ΔS = Q / T
When you divide the heat transfer (Q) by the temperature (T), you are essentially measuring how much energy is distributed within the system relative to its temperature. The resulting units, Joules/Kelvin, reflect that entropy measures the energy per unit temperature.
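As a quick back-of-the-envelope illustration (my own numbers, not from the article), here is the formula applied to an everyday case, an ice cube melting at 0 °C:

```python
# Rough sketch of Clausius' formula, ΔS = Q / T (illustrative numbers).

Q = 334.0 * 30.0   # heat needed to melt a 30 g ice cube, at roughly 334 J/g latent heat of fusion (Joules)
T = 273.15         # melting point of ice in Kelvin (the temperature stays fixed while the ice melts)

delta_S = Q / T    # entropy gained by the melting ice, in Joules per Kelvin
print(f"Q = {Q:.0f} J, T = {T} K, ΔS = {delta_S:.1f} J/K")
```

Roughly 10,000 J flow in at a fixed 273 K, so the melted water ends up with about 37 J/K more entropy than the ice had.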
Still trying to understand that? Let’s break the equation into simple terms: Q is the amount of heat flowing into the system, T is the absolute temperature at which it flows in, and the ratio Q/T tells you how much that heat expands the number of ways the system’s particles can arrange themselves. That last idea, the “number of ways,” deserves a closer look.
A microstate refers to a specific, unique arrangement of particles (atoms, molecules) within a system: a particular set of positions and momenta (speeds and directions of motion) of all the particles. In a gas, for example, each microstate is one specific distribution of particle positions and velocities.
A macrostate, on the other hand, is a description of the system’s macroscopic properties and can be represented by a set of macroscopic variables. This includes temperature (T), pressure (P), volume (V), energy (U), and entropy (S)!
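A toy model makes the distinction concrete. The sketch below is my own illustration (four coins standing in for particles): a microstate is the exact heads/tails pattern, while the macrostate only records the total number of heads.

```python
from itertools import product
from collections import Counter

# Toy system: 4 coins. A microstate is the exact H/T sequence;
# the macrostate only records how many heads there are.
microstates = list(product("HT", repeat=4))              # all 2**4 = 16 microstates
macrostates = Counter(m.count("H") for m in microstates)

for heads, count in sorted(macrostates.items()):
    print(f"macrostate '{heads} heads' is realized by {count} microstate(s)")
```

The “2 heads” macrostate is realized by 6 of the 16 microstates, while “all heads” is realized by just one; with 10²³ particles instead of 4 coins, that imbalance becomes astronomically lopsided.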
Adding heat (Q) to a system increases the energy of the macrostate, allowing it to access a larger number of microstates. And because temperature is a measure of the average kinetic energy of the particles, at a higher temperature it takes more heat than at a lower temperature to increase the system’s entropy by the same amount.
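That last claim is easy to check with Clausius’ formula; here is a minimal sketch with made-up numbers:

```python
# Add the same 1,000 J of heat to reservoirs held at two different temperatures.
Q = 1000.0                  # Joules of heat added (illustrative)
for T in (300.0, 600.0):    # Kelvin
    print(f"T = {T:.0f} K -> ΔS = {Q / T:.2f} J/K")
# At 600 K the same heat produces only half the entropy increase,
# so twice as much heat is needed to raise the entropy by the same amount.
```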
Entropy, more as we know it
The Boltzmann equation (which was actually first written in this form by Max Planck, not Ludwig Boltzmann) provides a statistical interpretation of entropy. For a large system in thermodynamic equilibrium, entropy S is proportional to the natural logarithm of a quantity Ω, the number of microscopic ways the macroscopic state corresponding to S can be arranged. The equation is:

S = k ln(Ω)

where k is the Boltzmann constant.
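To put toy numbers on it (a rough sketch of my own, not from the article): imagine 100 gas molecules that can each sit in the left or right half of a box, so Ω counts the arrangements compatible with a given left/right split.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, in Joules per Kelvin

def boltzmann_entropy(omega: int) -> float:
    """S = k_B * ln(Ω) for a macrostate with Ω microstates."""
    return k_B * math.log(omega)

N = 100                                  # toy gas of 100 molecules
omega_all_left = 1                       # only one way to put every molecule on the left
omega_even_split = math.comb(N, N // 2)  # number of ways to split the molecules 50/50

print(f"S(all on the left) = {boltzmann_entropy(omega_all_left):.2e} J/K")   # exactly zero
print(f"S(50/50 split)     = {boltzmann_entropy(omega_even_split):.2e} J/K")
```

The evenly spread macrostate has vastly more microstates, and therefore higher entropy, which is why you never find all the air in a room bunched up in one corner.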
The idea behind this equation is that certain events, like a system spontaneously falling into a low-entropy configuration, wouldn’t violate the law of energy conservation, and yet they don’t happen, because they are overwhelmingly improbable. For example:
If you dump a cup of water into the ocean and then scoop another cup up, it’s technically not impossible to scoop up the same water molecules in the same arrangement, but that doesn’t happen. This is because the probability of that happening is so tiny it wouldn’t happen in your lifetime. Or my lifetime. Or the Earth’s lifetime…
This equation emphasizes that entropy is not just about energy dispersal but also about the number of ways a system can be arranged while maintaining the same energy state. Thus, entropy is a central concept in understanding heat and energy flow, as well as the fundamental nature of spontaneous processes and the direction of time.
It encapsulates the idea that all spontaneous processes are irreversible, and since the universe’s entropy is constantly increasing, it creates an “arrow” of time: time moves in the direction of increased entropy.
But that doesn’t make sense…
Then how does anything become “ordered”?
Returning to the food example, don’t plants decrease entropy when they grow, using less ordered molecules to form a more ordered structure?
Yes… and no. Entropy can decrease locally, but only at the cost of a larger entropy increase elsewhere in the universe. In the case of the growing plant, its source of low entropy is the Sun’s light. The Sun, rather than being merely a source of energy, is more accurately the Earth’s source of low-entropy energy (and the Earth radiates dispersed thermal energy back into space)!
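A crude estimate shows what “low-entropy energy” means here (my own approximate numbers; it also ignores the extra 4/3 factor that applies to radiation entropy): the same Joule of energy arrives as roughly 5,800 K sunlight and leaves as roughly 255 K infrared.

```python
# Entropy carried by one Joule of energy, in from the Sun and back out to space,
# using the rough estimate S ≈ Q / T (approximate temperatures, illustrative only).
Q = 1.0            # one Joule of energy
T_sun = 5800.0     # effective temperature of sunlight, Kelvin
T_earth = 255.0    # effective temperature of Earth's outgoing thermal radiation, Kelvin

s_in = Q / T_sun
s_out = Q / T_earth
print(f"entropy in:  {s_in:.2e} J/K per Joule of sunlight")
print(f"entropy out: {s_out:.2e} J/K per Joule radiated back to space")
print(f"the energy leaves carrying about {s_out / s_in:.0f}x more entropy than it arrived with")
```

That gap is the budget that lets plants, and everything that eats them, build local order while the universe’s total entropy still goes up.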
But why does this matter anyway?
The universe’s tendency toward higher entropy is likely why humans and all other living things exist.
The role of life in accelerating entropy increase is a fascinating aspect of thermodynamics. Life processes involve numerous chemical reactions, often leading to faster energy dispersal than in non-living systems.
Let’s take the example of a forest: plants use sunlight to grow, animals feed on plants or other animals, and decomposers break down waste and dead organisms. This continuous cycle of growth, consumption, and decomposition increases entropy much faster than a non-living system, like barren land, does.
This shows up as a higher rate of energy transfer and transformation in a living ecosystem. The energy from the Sun is quickly dispersed through various life forms and eventually radiated back into space as less organized thermal energy. In contrast, barren land absorbs and re-radiates solar energy with far less transformation and dispersal.
The concept of entropy is crucial in understanding not just physical systems but also biological processes and the progress of society. The direction of energy flow and its transformation into different forms is the basis of all life and technological development.
Our ability to harness low-entropy energy sources, like fossil fuels and solar energy, and convert them into useful work drives our civilization. By understanding entropy, we gain a deeper appreciation of the delicate balance between order and disorder that characterizes our universe and our place within it.
Here are a few great resources to check out (that I used!):
[MOST USED] Cooper, Leon N. (1968). An Introduction to the Meaning and Structure of Physics. Harper. “On the Nature of Heat,” pp. 307–354.
Britannica, T. Editors of Encyclopaedia (2022, September 10). Boltzmann constant. Encyclopedia Britannica. https://www.britannica.com/science/Boltzmann-constant
Drake, G. W.F. (2023, November 10). entropy. Encyclopedia Britannica. https://www.britannica.com/science/entropy-physics
Veritasium on YouTube. The Most Misunderstood Concept in Physics. https://youtu.be/DxL2HoqLbyA?si=aApbv6tpywDhdN_i
Steve Mould on YouTube. A better description of entropy. https://youtu.be/w2iTCm0xpDc?si=12o5Ur4iUhkaifV7
Sean Carroll on YouTube. The Passage of Time and the Meaning of Life. https://youtu.be/-nTQi_LgIQ4?si=uFiZqrySEpbvl6_S