1. The state of a process changes daily according to a two-state Markov chain. If the process is in state i during one day, then it is in state j the following day with probability Pi,j, where

P0,0 = 0.4, P0,1 = 0.6, P1,0 = 0.2, P1,1 = 0.8

Every day a message is sent. If the state of the Markov chain that day is i, then the message sent is "good" with probability pi and is "bad" with probability qi = 1 - pi, for i = 0, 1.

(a) If the process is in state 0 on Monday, what is the probability that a good message is sent on Tuesday?

(b) If the process is in state 0 on Monday, what is the probability that a good message is sent on Friday?

(c) In the long run, what proportion of messages are good?

(d) Let Yn equal 1 if a good message is sent on day n and let it equal 2 otherwise. Is {Yn, n ≥ 1} a Markov chain? If so, give its transition probability matrix. If not, briefly explain why not.
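For a quick numerical check of (a)-(c), here is a minimal Python/NumPy sketch. The transition matrix comes from the problem statement; the values p0 = 0.9 and p1 = 0.3 are placeholder assumptions for illustration only, since the problem leaves p0 and p1 as symbols. In symbols, part (a) works out to 0.4*p0 + 0.6*p1 (one step from state 0), and part (c) to 0.25*p0 + 0.75*p1, since the stationary distribution of this chain is (0.25, 0.75).

```python
import numpy as np

# Transition matrix from the problem statement.
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# p[i] = probability that the message is "good" when the chain is in state i.
# The problem leaves p0 and p1 symbolic; these values are hypothetical placeholders.
p = np.array([0.9, 0.3])

def good_message_prob(start_state, n_days, P, p):
    """P(good message n_days later), given the chain is in start_state today."""
    dist = np.linalg.matrix_power(P, n_days)[start_state]  # state distribution after n_days
    return dist @ p

# (a) Monday -> Tuesday is one step of the chain.
print("P(good on Tuesday | state 0 on Monday):", good_message_prob(0, 1, P, p))

# (b) Monday -> Friday is four steps of the chain.
print("P(good on Friday  | state 0 on Monday):", good_message_prob(0, 4, P, p))

# (c) Long-run proportion: weight p by the stationary distribution pi, where pi = pi P.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1)].flatten())
pi = pi / pi.sum()
print("Stationary distribution:", pi)          # (0.25, 0.75) for this chain
print("Long-run proportion of good messages:", pi @ p)
```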
