15.6 Entropy and the Second Law of Thermodynamics: Disorder and the Unavailability of Energy

3 min read • June 18, 2024

Entropy measures disorder in systems, from gas molecules to stacked papers. It's calculated differently for reversible and irreversible processes, and it always increases in isolated systems. Entropy changes depend only on the initial and final states, not the path taken.

The second law of thermodynamics states that entropy in isolated systems always increases. This explains why heat flows from hot to cold and why energy becomes less available for work over time. It has far-reaching implications for the universe's future.

Entropy and the Second Law of Thermodynamics

Entropy in thermodynamic processes

  • Entropy ($S$) measures the degree of disorder or randomness in a system
    • Higher entropy signifies greater disorder (gas molecules spread out in a room)
    • Lower entropy indicates more order (neatly stacked papers)
  • Entropy change ($\Delta S$) is calculated differently for reversible and irreversible processes (see the sketch after this list)
    • Reversible processes: $\Delta S = \int \frac{dQ}{T}$, where $dQ$ is the heat exchanged and $T$ is the absolute temperature (melting ice at 0°C)
    • Irreversible processes: $\Delta S > \int \frac{dQ}{T}$ (burning wood)
  • Entropy change depends on initial and final states, not the path taken
    • Second law of thermodynamics: the change in entropy of an isolated system is always greater than or equal to zero, $\Delta S \geq 0$ (isolated container of gas)
    • Entropy of the universe always increases in spontaneous processes (salt dissolving in water)
  • Entropy is related to the number of possible microstates in a system, as described by statistical mechanics through Boltzmann's formula $S = k_B \ln W$
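
Both formulas above can be made concrete with a short Python sketch (all numerical values are assumed textbook figures, not from the text): it evaluates $\Delta S = Q/T$ for ice melting reversibly at 0°C, where the integral reduces to $Q/T$ because the temperature stays constant, and then evaluates Boltzmann's $S = k_B \ln W$ for a toy system of 100 two-state particles.

```python
import math

# --- 1) Reversible phase change: melting ice at 0 °C, ΔS = Q / T ---
# (assumed: 1.00 kg of ice, latent heat of fusion 334 kJ/kg)
m = 1.0            # mass of ice, kg
L_f = 334e3        # latent heat of fusion for water, J/kg
T = 273.15         # melting point, K (0 °C)

Q = m * L_f                # heat absorbed reversibly, J
delta_S = Q / T            # entropy change, J/K
print(f"ΔS for melting 1 kg of ice: {delta_S:.0f} J/K")   # ≈ 1223 J/K

# --- 2) Statistical view: S = k_B ln(W), counting microstates ---
k_B = 1.38e-23     # Boltzmann's constant, J/K
N = 100            # number of two-state particles (assumed toy system)
W = 2 ** N         # number of equally likely microstates
S = k_B * math.log(W)
print(f"S = k_B ln(W) for {N} two-state particles: {S:.2e} J/K")
```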

Second law and energy availability

  • Second law of thermodynamics: total entropy of an isolated system always increases over time
    • Heat flows spontaneously from hot to cold object, increasing total entropy (coffee cooling to room temperature)
    • Reversible processes maintain total entropy, irreversible processes increase it (frictionless pendulum vs. pendulum with friction)
  • Energy becomes less available for useful work as entropy increases
    • Closed system: some energy always lost as heat, reducing usable energy (car engine)
    • Quality of energy decreases during conversion from one form to another (electrical to thermal energy in a light bulb)
  • Disorder increases as entropy increases
    • System naturally tends towards most probable, disordered state (messy room)
    • Ordered states less probable, require energy input to maintain (tidying a room)
  • The concept of free energy helps quantify the amount of useful work that can be extracted from a system; a numeric sketch of lost work follows below
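
Here is a minimal sketch of why spontaneous heat flow degrades energy quality, using assumed temperatures for hot coffee and a room: when heat $Q$ flows from hot to cold, the total entropy change $\Delta S = Q/T_{cold} - Q/T_{hot}$ is positive, and $T_{cold}\,\Delta S$ estimates the work potential that is destroyed.

```python
# Assumed example values: hot coffee vs. room temperature.
Q = 5000.0        # heat transferred, J
T_hot = 353.0     # hot coffee, K (~80 °C)
T_cold = 293.0    # room, K (~20 °C)

dS_hot = -Q / T_hot            # hot reservoir loses entropy
dS_cold = Q / T_cold           # cold reservoir gains more entropy
dS_total = dS_hot + dS_cold    # always > 0 for spontaneous heat flow

lost_work = T_cold * dS_total  # work potential destroyed, J

print(f"ΔS_total = {dS_total:.3f} J/K (positive, as the second law requires)")
print(f"Work no longer extractable: {lost_work:.0f} J of the original {Q:.0f} J")
```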

Long-term implications of entropy

  • Universe as a whole considered an isolated system
    • Second Law: total entropy of universe always increasing (expanding universe)
    • Universe moving towards state of maximum disorder, known as heat death (stars burning out)
  • Entropy explains many everyday phenomena
    1. Ice melts at room temperature: disordered liquid state has higher entropy
    2. Perfume diffuses throughout room: dispersed state has higher entropy than concentrated state
    3. Living organisms constantly input energy to maintain ordered structures against tendency towards disorder (eating food to maintain body)
  • Second Law imposes limitations on energy efficiency and spontaneity of processes
    • 100% efficiency impossible in real-world processes due to entropy increases (heat loss in engines; see the Carnot sketch after this list)
    • Spontaneous processes always involve increase in total entropy of system and surroundings (rusting of iron)
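
The efficiency limit can be made concrete with the Carnot bound $\eta_{max} = 1 - T_{cold}/T_{hot}$, which equals 1 only at the unattainable $T_{cold} = 0$ K. The reservoir temperatures below are assumed example values for a car engine.

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Upper bound on efficiency for any heat engine (temperatures in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

T_hot = 600.0    # combustion gases, K (assumed)
T_cold = 300.0   # exhaust/ambient, K (assumed)

eta_max = carnot_efficiency(T_hot, T_cold)
print(f"Carnot limit: {eta_max:.0%}")   # 50%; real engines do worse
```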

Statistical interpretation and thermodynamic equilibrium

  • Ludwig Boltzmann developed the statistical interpretation of entropy
  • Thermodynamic equilibrium is the state of maximum entropy for a given set of constraints (see the simulation sketch after this list)
  • The arrow of time is closely related to the increase of entropy in the universe
  • The heat capacity of a system affects how quickly it approaches equilibrium and how much its entropy changes
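
To see equilibrium as the most probable macrostate, here is a toy simulation (an assumed model, not from the text): N particles hop randomly between the two halves of a box, and the system drifts from an ordered all-left start toward the 50/50 split that maximizes the number of microstates, i.e., maximum entropy.

```python
import math
import random

N = 1000                    # number of particles (assumed)
left = N                    # start fully ordered: everything on the left

def entropy(n_left: int) -> float:
    """Boltzmann entropy S/k_B = ln C(N, n_left), computed via log-gamma."""
    return (math.lgamma(N + 1) - math.lgamma(n_left + 1)
            - math.lgamma(N - n_left + 1))

random.seed(0)
for step in range(5001):
    if step % 1000 == 0:
        print(f"step {step:5d}: left={left:4d}, S/k_B = {entropy(left):7.1f}")
    # pick a random particle; it hops left->right or right->left
    if random.random() < left / N:
        left -= 1
    else:
        left += 1
```

Starting from the zero-entropy ordered state, the count settles near 500/500, where S/k_B peaks; fluctuations back to the ordered state are astronomically improbable for large N.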

Key Terms to Review (22)

Arrow of Time: The arrow of time refers to the unidirectional nature of time, where events appear to flow in a specific direction from the past to the future. This concept is closely tied to the Second Law of Thermodynamics, which describes the natural tendency of entropy to increase over time, leading to the irreversibility of many physical processes.
Boltzmann's constant: Boltzmann's constant is a fundamental physical constant that relates the average kinetic energy of particles in a gas with the temperature of the gas, typically denoted as k or k_B. It plays a crucial role in statistical mechanics and thermodynamics by linking microscopic and macroscopic properties, ultimately influencing concepts like entropy and the unavailability of energy within a system. The value of Boltzmann's constant is approximately 1.38 x 10^-23 J/K, reflecting how energy is distributed among particles at a given temperature.
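A one-line use of this constant, with an assumed room temperature: the average translational kinetic energy per gas particle is $\langle KE \rangle = \tfrac{3}{2} k_B T$.

```python
k_B = 1.38e-23          # Boltzmann's constant, J/K
T = 300.0               # room temperature, K (assumed)

avg_ke = 1.5 * k_B * T  # average translational kinetic energy per particle, J
print(f"<KE> at {T:.0f} K: {avg_ke:.2e} J")   # ≈ 6.21e-21 J
```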
Change in entropy: Change in entropy is the measure of the disorder or randomness in a system as it undergoes a process. It quantifies the energy dispersal and unavailability for doing work.
Closed System: A closed system is a thermodynamic system that does not exchange matter with its surroundings, but may exchange energy. It is isolated from the transfer of matter but can interact with its environment through the transfer of energy, such as heat or work.
Entropy: Entropy is a measure of the disorder or randomness in a system. It represents the unavailability of a system's energy to do useful work and the natural tendency of the universe towards increased disorder and chaos. This concept is central to the understanding of thermodynamics and the second law of thermodynamics, which governs the flow of energy and heat in physical systems.
Free Energy: Free energy is a thermodynamic quantity that combines the concepts of energy and entropy to determine the spontaneity and feasibility of a chemical process. It represents the maximum amount of useful work that can be extracted from a system while in thermal equilibrium with its surroundings, accounting for both the energy released and the disorder or randomness created in the process.
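A small sketch of how free energy decides spontaneity via $\Delta G = \Delta H - T\Delta S$ (spontaneous when $\Delta G < 0$); the numbers are assumed values roughly matching ice melting near 0°C.

```python
dH = 6.01e3      # enthalpy change, J/mol (assumed: melting ice)
dS = 22.0        # entropy change, J/(mol·K) (assumed)

for T in (263.0, 273.15, 283.0):   # below, near, and above 0 °C
    dG = dH - T * dS               # ΔG ≈ 0 right at the melting point
    verdict = "spontaneous" if dG < 0 else "not spontaneous"
    print(f"T = {T:6.2f} K: ΔG = {dG:7.1f} J/mol -> {verdict}")
```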
Heat Capacity: Heat capacity is a measure of the amount of energy required to raise the temperature of a substance by a certain amount. It quantifies how much heat a material can absorb or release without undergoing a significant change in temperature. This concept is crucial in understanding the thermal properties of materials and their behavior during various thermodynamic processes.
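Heat capacity also connects directly to entropy: applying $\Delta S = \int dQ/T$ with $dQ = mc\,dT$ gives $\Delta S = mc \ln(T_2/T_1)$. A sketch with assumed values for heating 1 kg of water follows.

```python
import math

m = 1.0          # mass, kg (assumed)
c = 4186.0       # specific heat of water, J/(kg·K)
T1 = 293.15      # initial temperature, K (20 °C)
T2 = 353.15      # final temperature, K (80 °C)

Q = m * c * (T2 - T1)            # heat absorbed, J
dS = m * c * math.log(T2 / T1)   # entropy change, J/K

print(f"Heat absorbed: {Q:.0f} J, entropy change: {dS:.1f} J/K")
```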
Heat Death: Heat death is the hypothetical end-state of the universe, in which the universe has reached maximum entropy and is in a state of complete thermodynamic equilibrium, with no possibility of further energy transfer or work being done. It is a concept closely tied to the second law of thermodynamics and the inevitable increase in disorder and unavailability of energy over time.
Irreversible process: An irreversible process is a thermodynamic process that cannot return both the system and the surroundings to their original states. Irreversibility often results from factors like friction, unrestrained expansion, or heat transfer through a finite temperature difference.
Isolated system: An isolated system is a physical system that does not exchange matter or energy with its surroundings. This means that the total energy and momentum within the system remain constant over time, as there are no external forces or influences acting on it. Such systems are theoretical constructs that help in understanding the principles of conservation laws.
Joules per Kelvin: Joules per Kelvin (J/K) is the unit of measurement for entropy, which quantifies the amount of energy in a system that is not available to do work, often associated with the level of disorder within that system. This measurement connects energy transfer and temperature, highlighting how the unavailability of energy increases with greater disorder. The concept is essential for understanding thermodynamic processes, as it illustrates the relationship between energy, heat, and the natural tendency for systems to evolve towards greater entropy.
Ludwig Boltzmann: Ludwig Boltzmann was an Austrian physicist who made significant contributions to the field of statistical mechanics, particularly in the understanding of the relationship between the microscopic behavior of atoms and molecules and the macroscopic properties of matter, such as pressure, temperature, and entropy. His work laid the foundation for the statistical interpretation of thermodynamics and the kinetic theory of gases.
Macrostate: A macrostate is a description of a thermodynamic system that provides a high-level, statistical overview of the system's properties, without delving into the details of the individual microscopic states of the system's components. It represents the overall, observable characteristics of a system, rather than the specific configurations of its individual parts.
Microstate: A microstate is a fundamental concept in statistical mechanics that refers to the specific arrangement or configuration of the microscopic particles, such as atoms or molecules, that make up a thermodynamic system. It is a crucial component in understanding the behavior of systems at the molecular level, particularly in the context of entropy and the second law of thermodynamics.
Refrigerator: A refrigerator is a device that removes heat from a designated area to lower its temperature, commonly used to preserve food by keeping it cool. This process relies on the principles of thermodynamics, particularly the transfer of heat through the refrigeration cycle, where a refrigerant absorbs and expels heat. The efficiency of a refrigerator is often evaluated in terms of its coefficient of performance, which relates to its function as a heat pump and the concept of energy conservation.
Reversible Process: A reversible process is an idealized thermodynamic process that passes through a sequence of equilibrium states and can be reversed by an infinitesimally small change in conditions, restoring both the system and its surroundings to their initial states with no net change left behind.
S: In this chapter, the symbol S denotes entropy, a state function that quantifies the disorder or randomness of a system and the unavailability of its energy for doing useful work. It is measured in joules per kelvin (J/K), and its change between two states does not depend on the path taken.
Second Law of Thermodynamics: The Second Law of Thermodynamics is a fundamental principle that describes the natural tendency of energy to become less organized and more disordered over time. It establishes limits on the efficiency of energy conversion processes and the direction of heat transfer, with important implications for the operation of heat engines, heat pumps, and the overall entropy of the universe.
Statistical Mechanics: Statistical mechanics is a branch of physics that applies the principles of probability and statistics to understand the behavior of large systems composed of many interacting particles, such as gases, liquids, and solids. It provides a fundamental explanation for the macroscopic properties of materials and systems in terms of their microscopic constituents and interactions.
The second law of thermodynamics stated in terms of entropy: The second law of thermodynamics states that in any isolated system, the total entropy can only increase or remain constant over time. Entropy is a measure of disorder or randomness, and this law implies that energy transformations are never perfectly efficient.
Thermodynamic equilibrium: Thermodynamic equilibrium is a state in which a system's macroscopic properties, such as temperature, pressure, and volume, are uniform throughout and do not change over time. In this state, all parts of the system are balanced, and there are no net flows of energy or matter. This concept is crucial for understanding how systems evolve towards higher entropy and the implications for energy availability.
ΔS: ΔS, or change in entropy, is a fundamental concept in thermodynamics that quantifies the degree of disorder or randomness in a system. It is a measure of the unavailability of energy and is closely related to the Second Law of Thermodynamics, which states that the entropy of an isolated system not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium.