Entropy


\LARGE S = k\ln\Omega \qquad \Delta S \geq 0



The Terms


S - Entropy of the system [Units: J K⁻¹]

k - Boltzmann constant [Value: 1.381 × 10⁻²³ J K⁻¹]

\Omega - The number of distinct microstates consistent with the given macrostate.




What Does It Mean?

The second law of thermodynamics states that entropy increases. Entropy is a macroscopic property of a thermodynamic system that measures its microscopic disorder. The Boltzmann definition, derived from probability theory, is given on the left; the expression on the right is the second law of thermodynamics, which states that the total entropy of a system will not decrease except by increasing the entropy of some other system.
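As a rough illustration, the Boltzmann formula can be evaluated numerically. The toy system below is an assumption chosen for the example (not from the original text): N two-state particles, where the macrostate "n particles are in the up state" has Ω equal to the binomial coefficient C(N, n).

```python
import math

# Boltzmann constant [J K^-1] (2019 SI exact value)
k_B = 1.380649e-23

def boltzmann_entropy(omega: int) -> float:
    """Entropy S = k ln(Omega) for a macrostate with Omega microstates."""
    return k_B * math.log(omega)

# Hypothetical toy system: N = 100 two-state particles, n = 50 "up".
# The number of microstates consistent with this macrostate is C(N, n).
N, n = 100, 50
omega = math.comb(N, n)

print(f"Omega = {omega}")
print(f"S = {boltzmann_entropy(omega):.3e} J/K")
```

Note that a macrostate with a single microstate (Ω = 1) has zero entropy, since ln 1 = 0, and that the most disordered macrostate (n = N/2 here) maximises Ω and hence S.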



Did You Know?

It is known theoretically, and has been confirmed experimentally, that the second law of thermodynamics can be violated for small systems and over short time intervals. In other words, there is a finite probability of observing processes that decrease, rather than increase, the entropy.

This is likely to be important in understanding the functioning of nano-machines and the molecular motors that underpin biological processes.


Further information at Warwick

Entropy is covered in the second year module "PX265 Thermal Physics II" and the third year module "PX366 Statistical Physics".

Ludwig Eduard Boltzmann (1844–1906)