Basic Concept of Statistical Mechanics, Physics tutorial

Elementary Probability Theory:

Statistical mechanics is the branch of physics that applies probability theory, which provides mathematical tools for dealing with large populations, to the study of the thermodynamic behavior of systems comprised of a large number of particles. We invariably calculate the averages of physical quantities of interest and then set up the connection between these values and the experimentally observed values. It is therefore vital to be familiar with the fundamental concepts of probability theory.

Basic Terminology:

Assume that we toss two coins together. The possible outcomes can be listed as follows:


That is, there are four outcomes of this statistical experiment, which can be listed as: Ω = {(H, H), (H, T), (T, H), (T, T)}

The set of all possible outcomes is termed the sample space of the experiment, and each of the elements or individual outcomes, such as (HH, HT, TH, TT), that make up the sample space (Ω) is termed a sample point.

Therefore, there are four sample points in Ω. The number of sample points is termed the cardinality of Ω and is represented by n(Ω); that is, n(Ω) = 4. An event is a set of possible outcomes of a random experiment. It is therefore a subset of the sample space and is generally defined by a specified rule; for example, the event of getting an odd number in a throw of a die is {1, 3, 5}, whereas the event of getting identical faces in a throw of two coins is {HH, TT}.

E1 ∪ E2 → Either E1 or E2 occurs, or both occur (at least one of E1 or E2 occurs)

E1 ∩ E2 → Both E1 and E2 occur. If there are no sample points common to E1 and E2, then E1 ∩ E2 = Φ and the events are said to be disjoint or mutually exclusive.
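These set operations can be checked directly in Python; a minimal sketch, in which the die faces and event definitions are illustrative choices:

```python
# Sample space for a single throw of a die
omega = {1, 2, 3, 4, 5, 6}

E1 = {1, 3, 5}   # event: odd number
E2 = {2, 4, 6}   # event: even number
E3 = {1, 2, 3}   # event: number at most 3

print(E1 | E3)           # union: at least one of E1, E3 occurs -> {1, 2, 3, 5}
print(E1 & E3)           # intersection: both occur -> {1, 3}
print(E1 & E2 == set())  # E1 and E2 are disjoint (mutually exclusive) -> True
```

Python's built-in `set` type maps directly onto the sample-space notation: `|` is union and `&` is intersection.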

In general, if the distinct simple events are E1, ..., En, we have:

Ω = E1 ∪ E2 ∪ ... ∪ En = ∪(i=1 to n) Ei

Having introduced the notion of a sample space, we now define the probability of an event.

Let us suppose the simple case in which Ω consists of a finite number of points and all the outcomes are equally likely. Let 'A' be any subset of Ω. Then we define the probability of the event 'A' to be:

P(A) = n(A)/n(Ω)
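For the two-coin experiment above, this counting definition can be checked by brute-force enumeration; a minimal sketch:

```python
from itertools import product

# Sample space for tossing two coins: (H,H), (H,T), (T,H), (T,T)
omega = list(product("HT", repeat=2))

# Event A: both coins show the same face
A = [pt for pt in omega if pt[0] == pt[1]]

P_A = len(A) / len(omega)   # P(A) = n(A)/n(Ω)
print(P_A)                  # 2/4 = 0.5
```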

Elementary Combinatorics:

We start by stating the multiplication rule.

1) Multiplication Rule: If there are 'm' ways in which an event 'U' can occur and 'n' ways in which an independent event 'V' can occur, then there are 'mn' ways in which the two events can occur jointly. An alternative formulation of this result: if an operation can be performed in 'm' ways and, after it has been performed in any one of these ways, a second independent operation can be performed in 'n' ways, then the two operations can be carried out in mn ways.

2) Permutations: A permutation is an arrangement of a set of objects in a definite order. The number of permutations of 'n' elements taken 'r' at a time is n!/(n-r)!

This is represented by the symbol nPr.

3) Combinations: A combination is a selection of 'r' objects from 'n' distinct objects with no regard to order. The number of combinations of 'n' elements taken 'r' at a time is n!/(n-r)!r! This is represented by nCr.

These are simply the binomial coefficients as they appear in Newton's binomial expansion:

(x1 + x2)^n = x1^n + n x1^(n-1) x2 + .... + x2^n = Σ(r=0 to n) nCr x1^(n-r) x2^r

Here 'n' is a positive integer
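Python's standard library provides these counts directly; a quick check of nPr and nCr, followed by a numerical verification of the binomial expansion for a small case:

```python
from math import comb, perm

n, r = 5, 2
print(perm(n, r))   # nPr = 5!/3! = 20
print(comb(n, r))   # nCr = 5!/(3!*2!) = 10

# Binomial expansion check: (x1 + x2)^n == sum over r of nCr * x1^(n-r) * x2^r
x1, x2, n = 3.0, 2.0, 4
lhs = (x1 + x2) ** n
rhs = sum(comb(n, r) * x1 ** (n - r) * x2 ** r for r in range(n + 1))
print(lhs, rhs)     # both 625.0
```

`math.perm` and `math.comb` are available from Python 3.8 onward.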

Entropy and Probability: (A statistical view)

Entropy ~ a measure of the disorder of a system

  • A state of high order = low probability
  • A state of low order = high probability

In an irreversible process, the universe moves from a state of low probability to a state of higher probability.

We will illustrate these concepts by considering the free expansion of a gas from volume Vi to volume Vf. The gas always expands to fill the available space; it never spontaneously compresses itself back to the original volume.

First, two definitions:

Microstate: A description of a system that specifies the properties (position and/or momentum, and so on) of each individual particle.

Macrostate: A more generalized description of the system; it can be in terms of macroscopic quantities, like P and V, or in terms of the number of particles whose properties fall in a given range. In general, each macrostate includes a large number of microstates. Example: Imagine a gas consisting of just 2 molecules. We want to consider whether the molecules are in the left or right half of the container.

There are three macrostates: both molecules on the left, both on the right, and one on each side.

There are 4 microstates: LL, RR, LR and RL

How about 3 molecules? Now we have:

(All L):       LLL
(2L, 1R):      LLR, LRL, RLL
(1L, 2R):      LRR, RLR, RRL
(All R):       RRR

That is, 8 microstates and 4 macrostates.
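This counting generalises: N molecules give 2^N microstates and N + 1 macrostates, labelled by the number of molecules on the left. A short enumeration, using the same left/right labelling:

```python
from collections import Counter
from itertools import product

N = 3
microstates = list(product("LR", repeat=N))
print(len(microstates))   # 2**3 = 8 microstates

# Group microstates into macrostates by the number of molecules on the left
macrostates = Counter(state.count("L") for state in microstates)
print(macrostates)        # {3: 1, 2: 3, 1: 3, 0: 1} -> 4 macrostates
```

The counts 1, 3, 3, 1 are exactly the binomial coefficients nCr from the previous section: the macrostate with r molecules on the left contains NCr microstates.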

Basic Assumption of Statistical Mechanics

All microstates are equally probable.

Therefore, we can compute the likelihood of finding a given arrangement of molecules in the container.

Therefore, events like the spontaneous compression of a gas (or the spontaneous conduction of heat from a cold body to a hot body) are not impossible; however, they are so unlikely that they never take place. We can relate the number of microstates 'W' of a system to its entropy 'S' by considering the probability of a gas spontaneously compressing itself into a smaller volume. If the original volume is Vi, then the probability of finding all 'N' molecules in a smaller volume Vf is:

Probability = Wf/Wi = (Vf/Vi)^N

ln (Wf/Wi) = N ln (Vf/Vi) = n NA ln (Vf/Vi)

We saw for a free expansion that ΔS = nR ln (Vf/Vi)

Therefore, ΔS = (R/NA) ln (Wf/Wi) = k ln (Wf/Wi)

Sf - Si = k ln (Wf) - k ln (Wi)

Therefore, we arrive at an equation, first expressed by Ludwig Boltzmann, relating the entropy of a system to the number of microstates:

S = k ln (W)
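The chain of equalities above can be verified numerically: with k = R/NA, the statistical result ΔS = k ln(Wf/Wi) = kN ln(Vf/Vi) reproduces the thermodynamic ΔS = nR ln(Vf/Vi). A sketch for 1 mol of gas freely expanding to double its volume:

```python
import math

R = 8.314          # gas constant, J/(mol K)
NA = 6.022e23      # Avogadro's number, 1/mol
k = R / NA         # Boltzmann's constant, J/K

n = 1.0            # moles of gas
N = n * NA         # number of molecules
Vf_over_Vi = 2.0   # free expansion to double the volume

dS_thermo = n * R * math.log(Vf_over_Vi)   # thermodynamic: ΔS = nR ln(Vf/Vi)
dS_stat = k * N * math.log(Vf_over_Vi)     # statistical: ΔS = k ln(Wf/Wi) = kN ln(Vf/Vi)
print(dS_thermo, dS_stat)                  # both ≈ 5.76 J/K
```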

Concept of Statistical Mechanics:

Statistical mechanics provides a framework for relating the microscopic properties of individual atoms and molecules to the macroscopic bulk properties of materials observed in daily life, thus explaining thermodynamics as an outcome of the classical and quantum-mechanical descriptions of statistics and mechanics at the microscopic level.

Statistical mechanics gives a molecular-level interpretation of macroscopic thermodynamic quantities like work, heat, free energy and entropy. This enables the thermodynamic properties of bulk materials to be related to the spectroscopic data of individual molecules. This ability to make macroscopic predictions based on microscopic properties is the major benefit of statistical mechanics over classical thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy.

However, entropy in thermodynamics can only be determined empirically, whereas in statistical mechanics it is a function of the distribution of the system over its microstates.

The central problem in statistical thermodynamics is to compute the distribution of a given amount of energy 'E' over 'N' identical systems. The main goal of statistical thermodynamics is to understand and explain the properties of materials in terms of the properties of their constituent particles and the interactions among them.

This is accomplished by connecting thermodynamic functions to quantum-mechanical equations. The two central quantities in statistical thermodynamics are the Boltzmann factor and the partition function.

Lastly, and most significantly, the formal definition of the entropy of a thermodynamic system from a statistical viewpoint is termed the statistical entropy, and is stated as:

S = KB ln Ω

Here,

KB = Boltzmann's constant, 1.38066 x 10^-23 J K^-1, and Ω is the number of microstates corresponding to the observed thermodynamic macrostate. The equation above is valid only if every microstate is equally accessible (that is, every microstate has an equal probability of occurring).

In conclusion, the principles of statistical mechanics that are critically significant and underlie all other results, in order of dependence, are the following:

  • Conservation of energy
  • Equilibrium, Temperature and Entropy
  • The Boltzmann distribution
  • Multiplicity defies energy (or entropy attracts heat)

Statistical Ensembles:

The modern formulation of the statistical method is based on the description of a physical system by an ensemble that represents all possible configurations of the system and the probability of realizing each configuration.

Each ensemble is associated with a partition function that, with mathematical manipulation, can be employed to extract the values of thermodynamic properties of the system. According to the relationship of the system to the rest of the universe, one of three general kinds of ensemble may apply, in order of increasing complexity:

1) Microcanonical Ensemble: It describes a completely isolated system, having constant energy, as it exchanges neither energy nor mass with the rest of the universe.

2) Canonical Ensemble: It describes a system in thermal equilibrium with its environment. It may only exchange energy, in the form of heat, with the outside.

3) Grand Canonical Ensemble: Used for open systems that exchange both energy and mass with the outside.


Microcanonical Ensemble: In this kind of ensemble, N, V and E are fixed. As the second law of thermodynamics applies to isolated systems, the first case investigated corresponds to the microcanonical ensemble, which describes an isolated system.

The entropy of such a system can only increase, so that the maximum of its entropy corresponds to the equilibrium state of the system. As an isolated system keeps constant energy, the total energy of the system doesn't fluctuate. Therefore, the system can access only those of its microstates that correspond to a specific value 'E' of the energy. The internal energy of the system is then strictly equal to its energy.

Let Ω(E) be the number of microstates corresponding to this value of the system's energy. The macroscopic state of maximal entropy for the system is the one in which all microstates are equally likely to occur, with probability 1/Ω(E), during the system's fluctuations:

S = -KB Σ(i=1 to Ω(E)) {1/Ω(E) ln (1/Ω(E))}

S = KB ln (Ω(E))
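The reduction from the sum to S = KB ln(Ω(E)) is just the statement that a uniform distribution over Ω(E) equally likely microstates has entropy KB ln Ω(E). A numerical check, with the value of Ω chosen arbitrarily for illustration:

```python
import math

kB = 1.380649e-23   # Boltzmann's constant, J/K
Omega = 1000        # number of accessible microstates (illustrative)

p = 1.0 / Omega     # each microstate equally probable

# Sum form: S = -kB * sum over i of p * ln(p)
S_sum = -kB * sum(p * math.log(p) for _ in range(Omega))

# Closed form: S = kB * ln(Omega)
S_boltz = kB * math.log(Omega)

print(S_sum, S_boltz)   # equal (up to floating-point rounding)
```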

Here, 'S' is the system entropy and KB is the Boltzmann's constant.

Canonical Ensemble: In the canonical ensemble, N, V and T are fixed. Invoking the concept of the canonical ensemble, it is possible to derive the probability Pi that a macroscopic system in thermal equilibrium with its environment will be in a given microstate with energy Ei, according to the Boltzmann distribution:

Pi = e^(-βEi) / Σ(j=1 to jmax) e^(-βEj)

β = 1/(KB T)

The temperature 'T' arises from the fact that the system is in thermal equilibrium with its environment. The probabilities of the different microstates must add to one, and the normalization factor in the denominator is the canonical partition function:

Z = Σ(i=1 to imax) e^(-βEi)

Here Ei is the energy of the ith microstate of the system. The partition function is a measure of the number of states accessible to the system at a given temperature.
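A minimal sketch of the canonical distribution, using an arbitrary three-level energy spectrum chosen for illustration:

```python
import math

kB = 1.380649e-23      # Boltzmann's constant, J/K
T = 300.0              # temperature, K (illustrative)
beta = 1.0 / (kB * T)

# Illustrative microstate energies, in joules
energies = [0.0, 1e-21, 2e-21]

Z = sum(math.exp(-beta * E) for E in energies)        # partition function
probs = [math.exp(-beta * E) / Z for E in energies]   # Boltzmann probabilities Pi

print(probs)
print(sum(probs))   # normalisation: the probabilities add to 1
```

Lower-energy microstates receive higher probability, and raising T flattens the distribution, as the factor e^(-βEi) suggests.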

The distribution function:

Consider an ideal monoatomic gas made up of 'N' particles enclosed in a volume 'V' and having total internal energy 'U'. The state of the system at any time 't' is represented by a point in a 6N-dimensional phase space. Equivalently, each particle is associated with a six-dimensional phase space, also termed the μ space (μ represents the first letter of molecule). The particles move independently of each other, and the contributions of individual particles remain separate.

To provide a microscopic description of the system, we divide the μ-space into cells of volume h^3. Remember that in classical statistics we can choose 'h' as small as we like. Each particle will be found to occupy a cell in this network. Suppose the cells are numbered 1, 2, ..., and let the energy of a particle in the ith cell be represented by εi. Then we have:

N = Σi ni

U = Σi niεi

The macrostate (N, V, U) can be realized in a number of different ways. In order to proceed with our argument, we advance the hypothesis that all microstates are equally probable. In other words, equal phase elements in phase space are associated with equal probabilities; this corresponds to the assumption that the faces of a die are equally probable.

This hypothesis is termed the postulate of equal a priori probabilities. The thermodynamic probability 'W' is simply the number of ways of placing N distinguishable objects in cells such that there are n1 objects in the first cell, n2 in the second, and so forth. This number is given by:

W = N!/(n1! n2! ...) = N!/Π(i) ni!
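The formula can be checked against a direct enumeration of arrangements; a sketch for N = 4 labelled particles distributed over two cells, with the occupation numbers chosen for illustration:

```python
import math
from itertools import product

N = 4
occupancy = (3, 1)   # n1 particles in cell 1, n2 particles in cell 2

# Formula: W = N! / (n1! n2! ...)
W = math.factorial(N) // math.prod(math.factorial(n) for n in occupancy)

# Brute force: count assignments of the N labelled particles to the 2 cells
# that produce exactly these occupation numbers
count = sum(
    1
    for assignment in product(range(2), repeat=N)
    if tuple(assignment.count(c) for c in range(2)) == occupancy
)
print(W, count)   # both 4
```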
