Problem 1: Suppose you are making observations of a system that changes in time and can be in one of three states; each observation is independent of the previous ones. For simplicity we will call these states A, B and C. Call the probability P(A) = x, the probability P(B) = y and the probability P(C) = z. There is a useful function for measuring how random this process is, called the entropy of the probability distribution. This is the function

h(x, y, z) = -x ln(x) - y ln(y) - z ln(z)

The variables x, y and z represent probabilities, so they should sum to 1. To find the extrema of the entropy, we therefore subject it to the constraint x + y + z = 1; in addition, all probabilities are positive. Use Lagrange multipliers to find the extrema of h(x, y, z) subject to this constraint.
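Before working through the Lagrange-multiplier computation, a numerical check can suggest what to expect. The sketch below (plain Python; the grid resolution of 200 is an arbitrary choice) scans the simplex x + y + z = 1 and locates the point where the entropy is largest, which turns out to be the uniform distribution x = y = z = 1/3 with h = ln 3:

```python
import math

def entropy(x, y, z):
    # h(x, y, z) = -x ln(x) - y ln(y) - z ln(z)
    return -sum(p * math.log(p) for p in (x, y, z))

# Grid search over the interior of the simplex x + y + z = 1, x, y, z > 0.
best = max(
    ((entropy(x, y, 1 - x - y), x, y)
     for x in (i / 200 for i in range(1, 200))
     for y in (j / 200 for j in range(1, 200))
     if 1 - x - y > 0),
    key=lambda t: t[0],
)
print(best)  # maximum near h = ln 3 ~ 1.0986, at x = y = z = 1/3
```

This is only evidence, not a proof; the Lagrange-multiplier calculation is what actually establishes the extremum.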

Does entropy have a maximum, minimum or both?

Problem 2: Suppose we attach some significance to each state of the system, given by a weight (some scalar). If these are states of a physical system, the weight might be the energy of each state. Call the weights of A, B and C the constants a, b and c, respectively. A new quantity we can define is the free energy of the system

E(x, y, z) = h(x, y, z) + ax + by + cz

Most physical systems extremize the free energy, and the extremal value of the free energy is called the pressure of the system. Using Lagrange multipliers, show that the pressure is ln(e^a + e^b + e^c), which occurs when the probability is distributed as

x = e^a / (e^a + e^b + e^c);   y = e^b / (e^a + e^b + e^c);   z = e^c / (e^a + e^b + e^c).
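As a sanity check on the claimed extremum (not a substitute for the Lagrange-multiplier derivation), the sketch below picks arbitrary example weights a, b, c, builds the stated distribution, and verifies numerically that the free energy there equals ln(e^a + e^b + e^c) and only decreases under small perturbations along the constraint:

```python
import math

# Arbitrary example weights (assumed values, chosen only for this check).
a, b, c = 0.5, 1.0, -0.3

S = math.exp(a) + math.exp(b) + math.exp(c)
x, y, z = math.exp(a) / S, math.exp(b) / S, math.exp(c) / S

def free_energy(x, y, z):
    # E(x, y, z) = h(x, y, z) + ax + by + cz
    h = -sum(p * math.log(p) for p in (x, y, z))
    return h + a * x + b * y + c * z

E_star = free_energy(x, y, z)
print(E_star, math.log(S))  # the two values agree: the pressure is ln(e^a + e^b + e^c)

# Small moves along the constraint x + y + z = 1 only lower E,
# consistent with the stated distribution being a maximum.
eps = 1e-4
print(free_energy(x + eps, y - eps, z) < E_star)  # True
```

Because h is strictly concave and the weight term is linear, E is strictly concave on the simplex, so the critical point found by the multiplier method is the unique maximum.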
