Thermodynamical Probability, Fundamental Postulates of Statistical Mechanics,
Summary
This note discusses thermodynamical probability, the fundamental postulates of statistical mechanics, the division of phase space into cells, and the relation between entropy and probability.
Things to Remember
- Probability is generally defined as the ratio of the number of favourable events to the total number of events.
- The equilibrium state of a gas corresponds to the macrostate of maximum probability.
- Entropy is an additive quantity.
- \(\Omega=\frac{N!}{n_1!\,n_2!\,n_3!\cdots}\)
- The number of microstates in a volume element of phase space \(=\frac{\Delta q_1\Delta q_2\cdots\Delta q_f\,\Delta p_1\Delta p_2\cdots\Delta p_f}{h^f}\)
- \(S=K\log\Omega\)

Thermodynamical Probability:
Probability is generally defined as the ratio of the number of favourable events to the total number of events. If x is the number of favourable events and y the number of unfavourable events, then the probability is given by \(\Omega =\frac{x}{x+y}\). In statistical thermodynamics, however, the thermodynamical probability of a macrostate is defined as the number of microstates corresponding to that macrostate.

Consider four molecules a, b, c and d, to be arranged in cell i and cell j. Let \(N_i\) be the number of phase points in the ith cell and \(N_j\) in the jth cell. Then the following are the possible macrostates:
\(N_i\) | 4 | 3 | 2 | 1 | 0 |
\(N_j\) | 0 | 1 | 2 | 3 | 4 |
This shows that the total number of macrostates is 5.
Now count the possible number of microstates for \(N_i=1\) and \(N_j=3\). Under this condition the molecules a, b, c and d can be arranged as follows:
\(N_i\) | a | b | c | d |
\(N_j\) | bcd | acd | abd | abc |
Thus for the macrostate \(N_i=1\) and \(N_j=3\) there are four possible microstates.
From this enumeration, the thermodynamical probability of the macrostate \(N_i=1\) and \(N_j=3\) is 4:
\(\Omega=4\)
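This counting can be reproduced by brute-force enumeration; a minimal Python sketch (the variable names `molecules`, `micro` and `macro` are illustrative):

```python
from itertools import product
from collections import Counter

# Each of the four molecules a, b, c, d can sit in cell i or cell j.
molecules = "abcd"
micro = list(product("ij", repeat=len(molecules)))  # 2**4 = 16 microstates

# Group microstates by macrostate (N_i, N_j).
macro = Counter((m.count("i"), m.count("j")) for m in micro)

print(sorted(macro))   # 5 macrostates: (0, 4), (1, 3), (2, 2), (3, 1), (4, 0)
print(macro[(1, 3)])   # thermodynamical probability of N_i=1, N_j=3 -> 4
```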
For N phase points the total number of permutations is N!. If \(n_1\) is the number of phase points in cell 1, \(n_2\) in cell 2, and so on, there are \(n_1!\) permutations within cell 1, \(n_2!\) within cell 2, and so on. Since permutations within a single cell do not produce a new microstate, the thermodynamical probability of such a system is
\(\Omega=\frac{N!}{n_1!\,n_2!\,n_3!\cdots}\)
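A short sketch of this formula (the helper name `omega` is an assumption), checked against the four-molecule example:

```python
from math import factorial, prod

def omega(ns):
    """Thermodynamical probability: Omega = N! / (n1! n2! ...)."""
    return factorial(sum(ns)) // prod(factorial(n) for n in ns)

print(omega([1, 3]))   # 4, matching the microstates counted for N_i=1, N_j=3
print(omega([2, 2]))   # 6

# Summing over all macrostates of 4 molecules in 2 cells recovers 2**4 = 16.
print(sum(omega([k, 4 - k]) for k in range(5)))   # 16
```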
Fundamental Postulates of Statistical Mechanics
(1) Any gas is composed of molecules, which are in motion and behave like very small elastic spheres.
(2) The size of each phase cell is same.
(3) All accessible microstates corresponding to the possible macrostates are equally probable.
(4) The total number of molecules is constant, \(N=\sum_i n_i\), and the total energy of the system is constant. If there are \(n_1\) molecules of energy \(\varepsilon_1\), \(n_2\) of energy \(\varepsilon_2\), and so on, then from the law of conservation of energy,
\( E=n_1\varepsilon_1+n_2\varepsilon_2+\cdots\implies E=\sum_{i}n_i\varepsilon_i\)
(5) The equilibrium state of a gas corresponds to the macrostate of maximum probability.
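As a small worked instance of postulate (4) (the occupation numbers and energy levels below are illustrative assumptions, not from the text):

```python
# Hypothetical occupation numbers n_i and energy levels eps_i (arbitrary units).
n   = [3, 2, 1]           # n_i molecules in level i
eps = [0.0, 1.0, 2.0]     # energies eps_i

N = sum(n)                                   # total number of molecules
E = sum(ni * ei for ni, ei in zip(n, eps))   # E = sum_i n_i * eps_i

print(N, E)   # 6 4.0
```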
Division of Phase Space into Cells:
Consider a 2f-dimensional phase space defined by the position coordinates \(q_1, q_2,\ldots,q_i,\ldots,q_f\) and the momentum coordinates \(p_1, p_2,\ldots,p_i,\ldots,p_f\).
An element of volume in this phase space will be represented by

(\(\Delta q_1\Delta q_2\cdots\Delta q_f\)) (\(\Delta p_1\Delta p_2\cdots\Delta p_f\))
Let us divide any finite volume of phase space into a large number of cells, the size of each elementary cell being \(h=\delta q_i\,\delta p_i\).
\(\therefore\) The number of microstates in this volume element \(=\frac{\Delta q_1\Delta q_2\cdots\Delta q_f\,\Delta p_1\Delta p_2\cdots\Delta p_f}{h^f}\)
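As a minimal numerical sketch of this cell counting (the ranges below are made-up illustrative values, not from the text):

```python
from math import prod

# Number of microstates in a volume element of phase space:
# (dq_1 ... dq_f * dp_1 ... dp_f) / h**f, sketched here for f = 2.
h  = 6.626e-34       # Planck's constant (J*s), taken as the cell size delta_q * delta_p
dq = [1e-9, 1e-9]    # illustrative position ranges delta_q_i (m)
dp = [1e-24, 1e-24]  # illustrative momentum ranges delta_p_i (kg*m/s)

f = len(dq)
n_microstates = prod(dq) * prod(dp) / h**f
print(n_microstates)
```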
Entropy and Probability
Boltzmann discovered a relation between entropy and probability. He stated that the equilibrium state of a system is the state at which the thermodynamic probability is maximum. From the thermodynamical point of view, the equilibrium state of a system is likewise the state of maximum entropy.

If two cells A and B have entropies \(S_1\) and \(S_2\), and the probability of finding a molecule in the region of cell A under the given restriction is \(\Omega_1\) while that of another molecule in cell B is \(\Omega_2\), then from the Boltzmann relation,
\(S_1=f(\Omega_1)\quad(1)\qquad\text{and}\qquad S_2=f(\Omega_2)\quad(2)\)
When the molecules of A and B are mixed together, both the entropy and the probability change.
Since entropy is an additive quantity, the entropy of the combined system is \(S=S_1+S_2\quad(3)\)
On the other hand, the probability of finding the molecules in the combined system is \(\Omega=\Omega_1\Omega_2\).\begin{align*}\therefore S=f(\Omega)=f(\Omega_1\Omega_2)\quad(4)\end{align*}Now from (1), (2), (3) and (4),\begin{align*}f(\Omega_1\Omega_2)=f(\Omega_1)+f(\Omega_2)\end{align*}Differentiating partially with respect to \(\Omega_1\) and \(\Omega_2\) respectively, we get\begin{align*}\Omega_2 f^{'}(\Omega_1\Omega_2)=f^{'}(\Omega_1)\quad\text{and}\quad\Omega_1 f^{'}(\Omega_1\Omega_2)=f^{'}(\Omega_2)\end{align*}Dividing one by the other,\begin{align*}\frac{\Omega_2}{\Omega_1}=\frac{f^{'}(\Omega_1)}{f^{'}(\Omega_2)}\implies\Omega_1 f^{'}(\Omega_1)=\Omega_2 f^{'}(\Omega_2)=K,\quad\text{where }K\text{ is a constant}\end{align*}\begin{align*}\implies f^{'}(\Omega_1)=\frac{K}{\Omega_1}\quad(5)\qquad\text{and}\qquad f^{'}(\Omega_2)=\frac{K}{\Omega_2}\quad(6)\end{align*}Integrating (5) and (6), we get\begin{align*}\int f^{'}(\Omega_1)\,d\Omega_1=\int \frac{K}{\Omega_1}\,d\Omega_1\implies f(\Omega_1)=K\log\Omega_1+C_1\quad(7)\end{align*}\begin{align*}\text{Similarly,}\quad f(\Omega_2)=K\log\Omega_2+C_2\quad(8)\end{align*}From experimental observation, \(K=1.38\times 10^{-23}\,\mathrm{JK^{-1}}\), called the Boltzmann constant. After applying the boundary conditions, the constants \(C_1\) and \(C_2\) vanish.
\(\therefore\) The general relation between entropy and probability is \(S=K\log\Omega\).
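The relation \(S=K\log\Omega\) and its additivity can be checked numerically; a minimal Python sketch (the helper name `S` is an assumption):

```python
from math import log, isclose

K = 1.380649e-23   # Boltzmann constant (J/K); the note quotes it to two figures

def S(omega):
    """Boltzmann entropy S = K log(Omega)."""
    return K * log(omega)

# Additivity: combining independent systems multiplies probabilities,
# so the entropies add: S(Omega1 * Omega2) = S(Omega1) + S(Omega2).
O1, O2 = 4.0, 6.0
print(isclose(S(O1 * O2), S(O1) + S(O2)))   # True
```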
Lesson
Classical Statistical Physics
Subject
Physics
Grade
Bachelor of Science