State space


Dave's Photography Store stocks a particular model camera that can be ordered weekly. The demand for this camera is random. It is 0 with probability 0.35, 1 with probability 0.40, 2 with probability 0.20, and 3 with probability 0.05. At the end of the week, Dave places an order that is delivered in time for the next opening of the store on Monday. Dave's current policy is to order two (2) cameras if inventory is less than or equal to one (1) and order nothing otherwise. Let Xn be the number of cameras in inventory at the end of week n.
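
For concreteness, the weekly dynamics implied by this ordering policy can be written as a single update rule. The recurrence below is a sketch that assumes unmet demand is lost (inventory never goes negative), which is the usual reading of this problem; D(n+1) denotes the demand during week n+1.

X(n+1) = max(Xn + 2 - D(n+1), 0)   if Xn <= 1   (two cameras are ordered and arrive before the week opens)
X(n+1) = max(Xn - D(n+1), 0)       if Xn >= 2   (nothing is ordered)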

1. What is the state space S of the stochastic process {Xn; n >= 0}?

2. Explain why {Xn; n >= 0} is a Markov chain.

3. Write down the transition matrix of {Xn; n >= 0}, and draw the transition diagram of {Xn; n >= 0}.

4. Dave estimates that, at the end of week 0, there is a 60% chance that his inventory will be empty, a 30% chance that his inventory will have only one (1) camera, and a 10% chance that his inventory will have two (2) cameras. What is the probability mass function (PMF) of X3?
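
A minimal computational sketch for parts 3 and 4 is given below (Python with numpy; the state space, update rule, and lost-sales assumption mirror the recurrence above, and the variable names are illustrative only). It builds the one-step transition matrix from the demand distribution and the ordering policy, then propagates Dave's estimated distribution for X0 three steps forward.

import numpy as np

# Weekly demand distribution: P(D = d)
demand = {0: 0.35, 1: 0.40, 2: 0.20, 3: 0.05}

states = [0, 1, 2, 3]  # candidate state space S (part 1)

def next_state(x, d):
    # If x <= 1, two cameras are ordered and arrive before the week opens;
    # unmet demand is assumed lost, so inventory never goes negative.
    stock = x + 2 if x <= 1 else x
    return max(stock - d, 0)

# One-step transition matrix: P[i, j] = P(X(n+1) = j | Xn = i)  (part 3)
P = np.zeros((len(states), len(states)))
for i in states:
    for d, prob in demand.items():
        P[i, next_state(i, d)] += prob

# Dave's estimated distribution of X0 (part 4)
p0 = np.array([0.60, 0.30, 0.10, 0.00])

# PMF of X3: multiply the initial distribution by P three times
p3 = p0 @ np.linalg.matrix_power(P, 3)

print(P)   # each row should sum to 1
print(p3)  # probability mass function of X3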
