Let P be the transition matrix of an ergodic Markov chain


1. Let P be the transition matrix of an ergodic Markov chain.  Let x be any column vector such that Px = x.  Let M be the maximum value of the components of x.  Assume that xi = M.  Show that if pij > 0 then xj = M.  Use this to prove that x must be a constant vector.

2. Let P be the transition matrix of an ergodic Markov chain.  Let w be a fixed probability vector (i.e., w is a row vector with wP = w).  Show that if wi = 0 and pji > 0 then wj = 0.  Use this to show that the fixed probability vector for an ergodic chain cannot have any 0 entries.

3. Find a Markov chain that is neither absorbing nor ergodic.  (A numerical sketch illustrating all three problems follows below.)
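
The following is a minimal numerical sketch in Python, assuming numpy is available. The specific matrices P and Q are assumed illustrative examples and are not part of the original problems; the sketch is a sanity check of the claimed properties, not a proof.

import numpy as np

# Problems 1 and 2: an ergodic chain (every state can reach every other state).
# This particular P is an assumed example, not taken from the problem set.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# Problem 1: any column vector x with Px = x should be constant.
vals, vecs = np.linalg.eig(P)                      # right eigenvectors of P
x = vecs[:, np.argmin(np.abs(vals - 1))].real      # eigenvector for eigenvalue 1
print("x (scaled):", x / x[0])                     # expect approximately (1, 1, 1)

# Problem 2: the fixed probability vector w (wP = w) should have no 0 entries.
vals_t, vecs_t = np.linalg.eig(P.T)                # left eigenvectors of P via P^T
w = vecs_t[:, np.argmin(np.abs(vals_t - 1))].real
w = w / w.sum()                                    # normalize to a probability vector
print("w:", w)                                     # expect every entry > 0

# Problem 3: one chain that is neither absorbing nor ergodic --
# two disjoint closed classes, and no state with self-transition probability 1.
Q = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
# No diagonal entry equals 1, so there is no absorbing state, and states {0, 1}
# can never reach states {2, 3}, so the chain is not ergodic either.
print("diag(Q):", np.diag(Q))

Running the sketch should print an approximately constant x, a strictly positive w, and a diagonal of Q containing no 1s, in line with what problems 1-3 ask to be shown in general.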
