
Entropy Equation

Introduction

Entropy describes a broad range of properties of a thermodynamic system. It relates to the number Ω of microscopic configurations, also known as microstates, that are consistent with the macroscopic quantities that characterize the system, i.e. volume, pressure, and temperature.

History

Lazare Carnot, a French mathematician, suggested in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, which argued that in every heat engine, whenever heat (caloric) falls through a temperature difference, i.e. from a hot body to a cold body, motive power or work can be produced; he drew an analogy with the way water falls onto a water wheel. This was an early step towards the second law of thermodynamics. In the early 19th century, following the Newtonian hypothesis, Carnot took the view that both heat and light were indestructible forms of matter, attracted and repelled by other matter.

In 1843, James Joule arrived at the first law of thermodynamics through his experiments on heat and friction, expressing the concept of energy and its conservation in all processes. However, the first law alone cannot quantify the effects of friction and dissipation.

During the 1850s and 1860s, the German physicist Rudolf Clausius objected to the supposition that no change occurs in the working body, and he gave this change a mathematical description by interrogating the nature of the inherent loss of usable heat when work is done, for example the heat produced by friction. Clausius described entropy as the transformation-content, in contrast to an earlier view, based on the theories of Newton, that heat was an indestructible particle that has mass.

Later, entropy was given a statistical basis by several scientists: Ludwig Boltzmann, Josiah Willard Gibbs, and James Clerk Maxwell. In 1877 Boltzmann showed how to calculate the entropy of an ideal gas, explaining that the entropy is proportional to the natural logarithm of the number of microstates the gas can occupy. Since then, a crucial problem in statistical thermodynamics has been to describe the distribution of a given amount of energy E over N identical systems.

Definition

Entropy is a measure of the extent to which energy is dispersed.

Entropy has two equivalent definitions:

  1. The classical thermodynamic definition
  2. The statistical mechanics’ definition

The classical thermodynamic definition was developed first. From the classical point of view, the microscopic features of a system are not considered; instead, the behavior of a system is described in terms of empirically defined thermodynamic variables such as entropy, temperature, heat capacity, and pressure. The classical definition describes a state of equilibrium, although attempts have been made to develop useful definitions of entropy for non-equilibrium systems. In short, classical thermodynamics is the macroscopic approach to thermodynamics that does not require any knowledge of the behavior of individual particles.

The statistical definition of entropy and the other thermodynamic properties was developed later. Here the thermodynamic properties are defined in terms of the statistics of the motions of the microscopic constituents of a system. In short, statistical thermodynamics is a more elaborate approach based on the average behavior of large groups of individual particles.
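As a small illustration of the statistical approach (the Boltzmann relation mentioned in the History section, S = k ln Ω), here is a minimal Python sketch; the number of microstates used is purely illustrative, not taken from any real system.

import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    # S = k_B * ln(Omega), where Omega is the number of accessible microstates
    return k_B * math.log(omega)

# Purely illustrative: a hypothetical system with 1e23 accessible microstates
print(boltzmann_entropy(1e23))  # roughly 7.3e-22 J/K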

Entropy Equations

Entropy can be calculated using different equations:

  1. If the process is at a constant temperature, then the equation will be:
ΔS = q_rev / T

Where

ΔS indicates the change in entropy

q_rev is the heat transferred in a reversible process

T is the Kelvin temperature

  • If there is a known reaction, then ΔSrxn can be determined by using a table of standard entropy values.
ΔS°rxn = Σ S°(products) − Σ S°(reactants)
  • Gibbs free energy (ΔG) and the enthalpy (ΔH) can also be used to determine ΔS. The Gibbs entropy formula is:
ΔS = (ΔH − ΔG) / T
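To show how the three routes above are used in practice, here is a minimal Python sketch; all numerical inputs (heat, temperature, standard entropies, ΔH, ΔG) are invented placeholder values, not data from the text.

# Three ways to estimate an entropy change (all input numbers are placeholders)

T = 298.0                      # temperature, K

# 1. Constant-temperature process: dS = q_rev / T
q_rev = 500.0                  # reversible heat, J
dS_1 = q_rev / T               # J/K

# 2. Known reaction: dS_rxn = sum(S products) - sum(S reactants)
S_products = [188.8, 213.8]    # standard molar entropies, J/(mol*K)
S_reactants = [186.3, 205.2]   # standard molar entropies, J/(mol*K)
dS_rxn = sum(S_products) - sum(S_reactants)

# 3. From Gibbs free energy and enthalpy: dS = (dH - dG) / T
dH = -92000.0                  # J/mol
dG = -33000.0                  # J/mol
dS_3 = (dH - dG) / T

print(dS_1, dS_rxn, dS_3)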

The entropy of a system

Entropy is a fundamental function of state. It arises directly from the Carnot cycle and can also be described as the reversible heat divided by the temperature.

In a thermodynamic system, physical quantities such as pressure, density, and temperature tend to become uniform over time, because the equilibrium state has a higher probability than any other state.

For example, consider a glass of ice water placed in a room at room temperature. The difference in temperature between the ice water and the warm room starts to equalize as thermal energy spreads from the warm surroundings to the cooler ice water. Eventually the temperature of the room and of the glass contents becomes equal. The entropy of the room has decreased because some of its energy has been dispersed to the glass contents.

Treating the room and the glass contents together as an isolated system, the entropy of the glass contents has increased more than the entropy of the surrounding room has decreased. In an isolated system, the spreading of energy from warm to cool therefore results in a net increase in entropy. When the "universe" of the room and the glass contents reaches an equilibrium temperature, the entropy change from the initial state is at its maximum. The entropy of a thermodynamic system is thus a measure of how far this equalization has progressed.

The entropy of an isolated system never decreases. An increase in entropy accompanies an irreversible change in a system, because some of the energy is expended as waste heat, limiting the amount of work the system can perform. Entropy can only be calculated; it cannot be observed directly. For a given substance, entropy can be calculated as the standard molar entropy measured from absolute zero (also known as absolute entropy) or as a difference in entropy from some reference state defined as zero entropy.

Units of entropy:

Entropy has the dimension of energy divided by temperature, so in SI units it is measured in joules per kelvin (J/K). These are the same units as heat capacity, but of course the two concepts are distinct.

Note

Entropy is not a conserved quantity: for example, in an isolated system with a non-uniform temperature, heat may flow irreversibly until the temperature becomes uniform, and the entropy increases. The second law of thermodynamics states that the entropy of an isolated system remains constant or increases. Chemical reactions cause changes in entropy, and entropy plays an important role in determining the direction in which a chemical reaction proceeds spontaneously.

According to one dictionary definition, entropy is "a measure of thermal energy per unit temperature that is not available for useful work". At a uniform temperature, a substance is at maximum entropy and cannot drive a heat engine. At a non-uniform temperature, a substance has lower entropy, and some of the thermal energy can drive a heat engine.

A special case arises when two or more different substances are mixed: the entropy of mixing appears, and the entropy increases. If the substances are at the same temperature and pressure, there is no net exchange of heat or work; the entropy change comes entirely from the mixing of the different substances, which at the statistical-mechanical level results from the change in volume available per particle.

ΔS_mix = −nR Σ xᵢ ln xᵢ

where xᵢ is the mole fraction of component i.
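A minimal Python sketch of the ideal entropy of mixing written above, assuming ideal behaviour; the mole fractions used are illustrative.

import math

R = 8.314  # gas constant, J/(mol*K)

def entropy_of_mixing(n_total, mole_fractions):
    # dS_mix = -n * R * sum(x_i * ln(x_i)) for ideal mixing
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# Illustrative: mixing equal amounts of two ideal gases (x1 = x2 = 0.5)
print(entropy_of_mixing(1.0, [0.5, 0.5]))  # about 5.76 J/K per mole of mixture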

Experimental measurement of entropy:

The entropy of a substance can be measured, although only indirectly. The measurement uses the definition of temperature in terms of entropy, while limiting the energy exchange to heat (dU to dQ).

dS = dQ / T

The above relation shows how the entropy changes by dS when a small amount of energy dQ is introduced into the system at a given temperature T.

The measurement proceeds as follows. First, a sample of the substance is cooled as close to absolute zero as possible. At such temperatures the entropy approaches zero, owing to the definition of temperature. Small amounts of heat are then introduced into the sample and the temperature change is measured, until the desired temperature is reached, e.g. 25 °C. The data are put into the above equation, and the result is the absolute value of the entropy of the sample at the final temperature. This value is called the calorimetric entropy.
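To make the procedure concrete, the sketch below numerically accumulates dS = dQ/T ≈ Cp(T)/T dT from near absolute zero up to 298 K; the heat-capacity data points are invented placeholders, not real measurements.

# Estimate of calorimetric entropy: S(T_final) = integral of Cp(T)/T dT from ~0 K
# The (T, Cp) pairs below are invented placeholders, not measured data.
data = [(10, 0.5), (50, 8.0), (100, 15.0), (200, 22.0), (298, 26.0)]  # K, J/(mol*K)

S = 0.0
for (T1, C1), (T2, C2) in zip(data, data[1:]):
    # trapezoidal step on Cp/T between neighbouring temperatures
    S += 0.5 * (C1 / T1 + C2 / T2) * (T2 - T1)

print(S)  # estimated absolute (calorimetric) entropy, J/(mol*K)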

The second law of thermodynamics:

The second law of thermodynamics requires that the total entropy of any system can decrease only by increasing the entropy of some other system; hence the entropy of an isolated system tends not to decrease. It follows that heat cannot be transferred from a colder body to a hotter body without the application of work. It is also impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires a flow of heat from a hotter reservoir to a colder reservoir, or a single expanding reservoir undergoing adiabatic cooling, which performs adiabatic work. As a result, a perpetual motion machine is impossible.

Applications:

  1. The fundamental thermodynamic relation:

The entropy of a system depends on its internal energy and its external parameters, such as its volume. In the thermodynamic limit, this fact leads to an equation relating the change in internal energy U to changes in the entropy and the external parameters. This relation is known as the fundamental thermodynamic relation. If the external pressure p bears on the volume V as the only external parameter, the relation is:

dU = T dS – p dV

Many thermodynamic identities that are independent of the microscopic details of the system follow from this fundamental relation, for example the Maxwell relations and the relationships between heat capacities.
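As a rough numerical check of dU = T dS – p dV, the following Python sketch compares both sides for a small change of state of one mole of a monatomic ideal gas; the state values are illustrative, and the agreement is only approximate because the step is finite.

import math

R = 8.314      # J/(mol*K)
n = 1.0        # mol of monatomic ideal gas (assumed)
Cv = 1.5 * R   # constant-volume molar heat capacity

T1, V1 = 300.0, 0.0240   # initial state: K, m^3
T2, V2 = 300.3, 0.0241   # slightly changed state

dU = n * Cv * (T2 - T1)                                       # internal-energy change
dS = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)   # entropy change
p = n * R * T1 / V1                                           # pressure at the initial state

print(dU, T1 * dS - p * (V2 - V1))  # the two values should nearly agree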

  • Entropy in chemical thermodynamics:

Thermodynamic entropy plays a central role in chemical thermodynamics, enabling changes to be quantified and the outcomes of reactions to be predicted. The second law of thermodynamics states that the entropy of an isolated system, i.e. the combination of a subsystem under study and its surroundings, increases during all spontaneous chemical and physical processes. The Clausius equation introduces the measurement of entropy change, which states the direction and quantifies the magnitude of small changes, such as the spontaneous transfer of heat from a hot body to a cold body.

Entropy change equations for simple processes:

  1. Isothermal expansion or compression of an ideal gas:

For the expansion or compression of an ideal gas at constant temperature from an initial volume V0 and pressure P0 to a final volume V and pressure P, the change in entropy is given by:

ΔS = nR ln(V/V0) = −nR ln(P/P0)

Where

n indicates the number of moles of gas

R is the ideal gas constant.

This equation also applies to expansion into a finite vacuum or to a throttling process, where the temperature, internal energy, and enthalpy of an ideal gas remain constant.
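A minimal Python sketch of the isothermal formula above, for one mole of ideal gas doubling its volume (illustrative numbers):

import math

R = 8.314                # ideal gas constant, J/(mol*K)
n = 1.0                  # moles of gas (assumed)
V0, V = 0.0224, 0.0448   # initial and final volumes, m^3 (a doubling)

dS = n * R * math.log(V / V0)   # dS = nR ln(V/V0) = -nR ln(P/P0) at constant T
print(dS)                       # about +5.76 J/K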

  • Cooling and heating

The entropy change for heating or cooling of any system at constant pressure from an initial temperature T0 to a final temperature T is given by:

ΔS = n Cp ln(T/T0)

Where

Cp represents the constant-pressure molar heat capacity

provided that no phase change occurs in the temperature interval.

Similarly, at constant volume, the entropy change is given by:

ΔS = n Cv ln(T/T0)

Where Cv is the constant-volume molar heat capacity and there is no phase change.

At low temperatures approaching absolute zero, the heat capacities of solids drop rapidly toward zero, so the assumption of constant heat capacity no longer applies.
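A minimal numerical sketch of the constant-pressure formula above, using an approximate molar heat capacity for liquid water as an illustration:

import math

n = 1.0                # mol (assumed)
Cp = 75.3              # molar heat capacity of liquid water, J/(mol*K), approximate
T0, T = 298.0, 348.0   # heating from 25 C to 75 C, no phase change

dS = n * Cp * math.log(T / T0)
print(dS)  # about +11.7 J/K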

Since entropy is a state function, the entropy change of any process in which both temperature and volume vary is the same as for a path divided into two steps: heating at constant volume and expansion at constant temperature. For an ideal gas, the total entropy change is:

ΔS = n Cv ln(T/T0) + nR ln(V/V0)

Similarly, when both the temperature and pressure of an ideal gas vary, the entropy change is given by:

ΔS = n Cp ln(T/T0) − nR ln(P/P0)
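As a consistency check, the sketch below evaluates both combined formulas for the same change of state of one mole of monatomic ideal gas; the numbers are illustrative, and the two results should agree because entropy is a state function.

import math

R = 8.314
n = 1.0
Cv = 1.5 * R          # monatomic ideal gas (assumed)
Cp = Cv + R           # Cp - Cv = R for an ideal gas

T0, T = 300.0, 400.0          # temperatures, K
V0, V = 0.0249, 0.0300        # volumes, m^3
P0 = n * R * T0 / V0          # corresponding ideal-gas pressures
P = n * R * T / V

dS_TV = n * Cv * math.log(T / T0) + n * R * math.log(V / V0)
dS_TP = n * Cp * math.log(T / T0) - n * R * math.log(P / P0)
print(dS_TV, dS_TP)  # both routes give the same entropy change
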
  • Phase transitions:

A reversible phase transition occurs at constant temperature and pressure. The reversible heat is the enthalpy change of the transition, and the entropy change is the enthalpy change divided by the thermodynamic temperature. The entropy of fusion, for the melting of a solid to a liquid, is given by:

ΔS_fus = ΔH_fus / T_m

where T_m is the melting temperature.

Similarly, the entropy of vaporization for the vaporization of a liquid to gas is given as follows:

ΔS_vap = ΔH_vap / T_b

where T_b is the boiling temperature.
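As an example of the vaporization formula, the commonly quoted values for water (enthalpy of vaporization about 40.7 kJ/mol at the normal boiling point) give an entropy of vaporization of roughly 109 J/(mol·K), as the short sketch below shows.

dH_vap = 40700.0   # enthalpy of vaporization of water, J/mol (approximate)
T_b = 373.15       # normal boiling point of water, K

dS_vap = dH_vap / T_b
print(dS_vap)      # about 109 J/(mol*K)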


