Let P(1) and P(2) denote transition probability matrices


Let P(1) and P(2) denote transition probability matrices for ergodic Markov chains having the same state space. Let π1 and π2 denote the stationary (limiting) probability vectors for the two chains. Consider a process defined as follows:

(a) X0 = 1. A coin is then flipped once, and if it comes up heads, the remaining states X1, X2, ... are obtained from the transition probability matrix P(1); if tails, from the matrix P(2). Is {Xn, n ≥ 0} a Markov chain? If P{coin comes up heads} = p, what is lim n→∞ P{Xn = i}?
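A minimal Monte Carlo sketch of part (a), using a hypothetical pair of two-state chains of my own choosing (states relabeled 0 and 1, starting in state 0). Since the single coin flip selects one ergodic chain for the whole run, the long-run probability of each state should be the p-weighted mixture of the two stationary vectors:

```python
import random

def run(P, n):
    """Start in state 0 and take n steps under 2x2 transition matrix P."""
    x = 0
    for _ in range(n):
        x = 0 if random.random() < P[x][0] else 1
    return x

random.seed(1)
p = 0.5
P1 = [[0.5, 0.5], [0.5, 0.5]]  # stationary vector pi1 = (1/2, 1/2)
P2 = [[0.9, 0.1], [0.5, 0.5]]  # stationary vector pi2 = (5/6, 1/6)

reps, n = 20000, 50
hits = 0
for _ in range(reps):
    # One coin flip chooses the matrix for the entire trajectory.
    P = P1 if random.random() < p else P2
    hits += (run(P, n) == 0)

# Estimate of lim P{Xn = 0}; should be near p*(1/2) + (1-p)*(5/6) = 2/3.
print(hits / reps)
```

The estimate illustrates why the process is not a Markov chain: conditional on the history, the observed transitions reveal which matrix was chosen, so the past carries information beyond the current state.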

(b) X0 = 1. At each stage the coin is flipped, and if it comes up heads the next state is chosen according to P(1); if tails, according to P(2). In this case do the successive states constitute a Markov chain? If so, determine the transition probabilities. Show by a counterexample that the limiting probabilities are not the same as in part (a).
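In part (b) the per-step coin flip gives a genuine Markov chain with transition matrix pP(1) + (1−p)P(2). A short numerical sketch (NumPy, and the same hypothetical two-state matrices as above) comparing its stationary vector against the part (a) mixture pπ1 + (1−p)π2:

```python
import numpy as np

def stationary(P):
    """Solve pi P = pi with sum(pi) = 1 via a stacked linear system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

p = 0.5
P1 = np.array([[0.5, 0.5], [0.5, 0.5]])  # pi1 = (1/2, 1/2)
P2 = np.array([[0.9, 0.1], [0.5, 0.5]])  # pi2 = (5/6, 1/6)

mix_a = p * stationary(P1) + (1 - p) * stationary(P2)  # part (a): (2/3, 1/3)
mix_b = stationary(p * P1 + (1 - p) * P2)              # part (b): (5/8, 3/8)
print(mix_a, mix_b)
```

Here the averaged transition matrix is [[0.7, 0.3], [0.5, 0.5]], whose stationary vector (0.625, 0.375) differs from the mixture (2/3, 1/3), so these matrices serve as one possible counterexample.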
