Boltzmann entropy

In thermodynamics, specifically in statistical mechanics, the Boltzmann entropy is an approximation to the normal Gibbs entropy.

The Boltzmann entropy is obtained if one assumes that all the component particles of a thermodynamic system can be treated as statistically independent. The probability distribution of the system as a whole then factorises into the product of N separate identical terms, one term for each particle, and the Gibbs entropy simplifies to the Boltzmann entropy

:S_B = -N k_B \sum_i p_i \ln p_i ,

where the summation is taken over each possible state in the 6-dimensional phase space of a single particle (rather than the 6N-dimensional phase space of the system as a whole).
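As a minimal numerical sketch of the formula above (the four-state distribution and particle count here are invented purely for illustration), the sum can be evaluated directly:

```python
import math

# Boltzmann constant in J/K (2019 SI exact value)
K_B = 1.380649e-23

def boltzmann_entropy(p, n_particles):
    """S_B = -N k_B * sum_i p_i ln p_i over single-particle states.

    `p` is a probability distribution over the states of ONE particle;
    the assumed statistical independence lets us multiply by N.
    """
    assert abs(sum(p) - 1.0) < 1e-12, "probabilities must sum to 1"
    return -n_particles * K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical particle with 4 equally likely states:
# the sum reduces to ln 4, so S_B = N * k_B * ln 4.
p_uniform = [0.25, 0.25, 0.25, 0.25]
s = boltzmann_entropy(p_uniform, n_particles=1)
```

For a uniform distribution over W single-particle states this reduces to k_B ln W per particle, consistent with Boltzmann's entropy formula for equiprobable microstates.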

This reflects the original statistical entropy function introduced by Ludwig Boltzmann in 1872. For the special case of an ideal gas it exactly corresponds to the proper thermodynamic entropy.

However, for anything but the most dilute of real gases, it leads to increasingly wrong predictions of entropies and physical behaviours, by ignoring the interactions and correlations between different molecules. Instead one must follow Gibbs, and consider the ensemble of states of the system as a whole, rather than single particle states.
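The effect of ignoring correlations can be seen in a toy sketch (the two-particle joint distribution here is made up for illustration): for a correlated system, the Boltzmann entropy computed from single-particle marginals exceeds the Gibbs entropy computed from the joint distribution.

```python
import math

def shannon(p):
    """Dimensionless entropy -sum_i p_i ln p_i (factor of k_B omitted)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical joint distribution for two perfectly correlated
# two-state particles: both are always found in the same state.
joint = {(0, 0): 0.5, (1, 1): 0.5}

# Gibbs entropy uses the joint distribution of the whole system.
s_gibbs = shannon(joint.values())        # = ln 2

# Boltzmann entropy uses the single-particle marginal and assumes
# independence: N * H(marginal). Each particle alone looks 50/50.
marginal = [0.5, 0.5]
s_boltz = 2 * shannon(marginal)          # = 2 ln 2

# Ignoring the correlation overestimates the entropy.
assert s_boltz >= s_gibbs
```

The gap (here a full ln 2) is the mutual information between the particles, which the independence assumption discards; for real gases the interparticle correlations play an analogous role.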

= See also =

* Entropy (thermodynamics)
* Boltzmann's entropy formula
* Gibbs entropy

= References =
* Jaynes, E. T. (1965). [http://bayes.wustl.edu/etj/articles/gibbs.vs.boltzmann.pdf Gibbs vs Boltzmann entropies]. "American Journal of Physics", 33, 391–398.
