

Consider a person who repeatedly plays a game of chance (gambling) with two possible outcomes (win or lose), with probability p = 0.3 of winning. If the person bets an amount x and wins, he takes back his x dollars and receives another x dollars; otherwise, he loses his bet. Suppose that on each play he chooses his bet as follows: if he has less than $3 in hand, he bets all his money; otherwise, since his goal is to reach $5, he bets the difference between $5 and the amount in his pocket. He continues to play until he has either $0 or $5 in his pocket, at which point he stops. Let Xn be the amount (in whole dollars) in his pocket just after the nth play, and assume that X0, the player's initial amount, is an integer uniformly distributed between 1 and 4.
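The betting rule above fully determines the one-step transitions between the dollar amounts 0 through 5. A minimal sketch of how the transition matrix could be assembled, assuming NumPy and illustrative variable names (not part of the original exercise):

```python
import numpy as np

# Sketch: build the 6x6 transition matrix over states 0..5 dollars
# from the stated betting rule, assuming p = 0.3 to win.
p = 0.3
P = np.zeros((6, 6))
P[0, 0] = 1.0   # ruined ($0): absorbing state
P[5, 5] = 1.0   # goal reached ($5): absorbing state
for x in range(1, 5):
    bet = x if x < 3 else 5 - x      # below $3 bet everything, else bet up to $5
    P[x, x + bet] = p                # win: stake returned plus an equal amount
    P[x, x - bet] = 1 - p            # lose: forfeit the stake

print(P)
```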

(A) Construct the transition probability matrix of the Markov chain {Xn, n = 0, 1, 2, ...}.

(B) What is the probability that the game ends on the first play?

(C) If the game ends on the first play, what is the probability that the player is ruined?

(D) If the game continues after the first play, what is the player's expected wealth?

(E) What is the probability that the player has $3 or more after two plays?
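A possible numerical check for parts (B) through (E): propagate the uniform initial distribution of X0 through the matrix once and twice. This is only an illustration of the computation under the assumptions above; the exercise presumably expects the values to be derived by hand.

```python
import numpy as np

# Rebuild the transition matrix from the sketch above (p = 0.3, states 0..5).
p = 0.3
P = np.zeros((6, 6))
P[0, 0] = P[5, 5] = 1.0
for x in range(1, 5):
    bet = x if x < 3 else 5 - x
    P[x, x + bet], P[x, x - bet] = p, 1 - p

pi0 = np.array([0, 0.25, 0.25, 0.25, 0.25, 0])   # X0 uniform on {1, 2, 3, 4}
pi1 = pi0 @ P                                    # distribution of X1
pi2 = pi1 @ P                                    # distribution of X2

end_first = pi1[0] + pi1[5]                      # mass absorbed after one play
print("P(game ends on first play) =", end_first)                      # (B)
print("P(ruined | ends on first)  =", pi1[0] / end_first)             # (C)
cont = pi1[1:5]                                  # still playing after one play
print("E[wealth | game continues] =", (cont @ np.arange(1, 5)) / cont.sum())  # (D)
print("P(X2 >= 3)                 =", pi2[3:].sum())                   # (E)
```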

 
