Steady-State Probabilities of Markov Chains: Multiple-Choice Review Questions


Question 1. The condition in which the probability of one event changes as a result of the occurrence of another related event is known as:

1. initial probability

2. transition matrix

3. stationary probability

4. conditional probability

Question 2. The steady-state probabilities of a doubly stochastic transition matrix are equal to (a numerical check follows the options)

1. 1.0

2. 1/(m*n), where m is the number of rows of the matrix and n is the number of columns of the matrix

3. 1/n, where n is the number of columns in the matrix

4. 0.50

5. None of the above
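As a check on Question 2, here is a minimal sketch in Python (assuming numpy is available; the 3x3 doubly stochastic matrix is made up for the example). Repeated multiplication by the matrix drives any starting distribution toward the uniform vector with entries 1/n:

```python
import numpy as np

# A made-up 3x3 doubly stochastic matrix: every row AND every column sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

pi = np.array([1.0, 0.0, 0.0])  # arbitrary initial distribution
for _ in range(200):            # repeated transitions: pi <- pi @ P
    pi = pi @ P

print(pi)  # approx [1/3, 1/3, 1/3], i.e. 1/n with n = 3
```

With n = 3 states here, the limit is 1/3 in each entry, regardless of the starting distribution.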

Question 3. The type of Markov state for which the chain will, under all conditions, never converge to a steady state is

1. Periodic State

2. Absorbing State

3. Ergodic State

4. Trapping State

5. Transient State

Question 4. Steady-state probabilities of Markov chains are (a sketch of the fixed-point property follows the options)

1. State probabilities that reach a value equal to 1.0 in all cases

2. The convergence to an equilibrium or "steady state" condition, which applies to all Markov chains

3. Equal to the product of the steady-state probabilities and the transition matrix (pi = pi P)

4. All of the above

5. None of the above
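The fixed-point property in option 3 of Question 4 can be made concrete with a short sketch (the 2-state matrix is invented for illustration): the steady-state vector pi satisfies pi = pi P, so it can be found by solving that linear system together with the normalization sum(pi) = 1:

```python
import numpy as np

# Invented 2-state transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Solve pi = pi @ P together with sum(pi) = 1:
# equivalent to (P.T - I) pi = 0 with a normalization row appended.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)                       # steady-state distribution, here [0.8, 0.2]
print(np.allclose(pi, pi @ P))  # True: multiplying by P leaves pi unchanged
```

Power iteration (repeatedly computing pi @ P, as in the earlier sketch) converges to the same vector; solving the linear system just reaches it in one step.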

Question 5. A transition matrix is

1. The current states of a system at time t

2. Conditional probabilities that involve moving from one state to another

3. The stationary assumption of a Markov chain

4. An m-by-n matrix of probabilities

5. None of the above

Question 6. A transition matrix in which the probabilities in each column also sum to 1.0 is referred to as (a small check function follows the options)

1. Steady State Probabilities

2. Conditional Probability Matrix

3. Stationary Matrix

4. Doubly Stochastic Transition Matrix

5. None of the above
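For Question 6, a small hedged helper (the function name is ours, not a standard API) that tests whether a matrix is doubly stochastic by checking both row sums and column sums:

```python
import numpy as np

def is_doubly_stochastic(P, tol=1e-9):
    """Hypothetical helper: True if P is nonnegative and every row
    and every column sums to 1.0."""
    P = np.asarray(P, dtype=float)
    return (np.all(P >= 0)
            and np.allclose(P.sum(axis=1), 1.0, atol=tol)   # row sums
            and np.allclose(P.sum(axis=0), 1.0, atol=tol))  # column sums

print(is_doubly_stochastic([[0.5, 0.5],
                            [0.5, 0.5]]))   # True
print(is_doubly_stochastic([[0.9, 0.1],
                            [0.4, 0.6]]))   # False: columns sum to 1.3 and 0.7
```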

Question 7. Which of the following is true regarding the Markov analysis methodology

1. States of nature are outcomes of a process (a machine operating or broken, the percentage of customers buying product A or B, etc.)

2. There exists an initial probability distribution associated with the states of nature (100% operational and 0% broken, or 80% of customers buy product A and 20% buy product B)

3. There are also transition (or conditional) probabilities of moving from one state to another (represented by the transition matrix)

4. All of the above

5. None of the above

Question 8. A Markov chain is (a simulation sketch follows the options)

1. A discrete-time stochastic process that describes the relation between the random variables X0, X1, X2, ... at successive points in time

2. A continuous-time stochastic process in which the state of the system can be viewed at any time, not just at discrete instants

3. A probability assessment that is not conditional

4. All of the above

5. None of the above
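To make the discrete-time description in option 1 of Question 8 concrete, here is a minimal simulation sketch (the states and transition matrix are invented for the example) that generates a sample path X0, X1, X2, ... by drawing each next state from the current state's row of the transition matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 2-state chain: 0 = "operating", 1 = "broken".
P = np.array([[0.9, 0.1],
              [0.7, 0.3]])

x = 0                 # X0: start in the "operating" state
path = [x]
for _ in range(10):   # draw X1, X2, ... from the current state's row of P
    x = int(rng.choice(2, p=P[x]))
    path.append(x)

print(path)  # e.g. [0, 0, 0, 1, 0, ...] -- one realization of the chain
```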

Question 9. The condition in which the probability relating the next period's state to the current state does not change over time is referred to as

1. Transition Matrix

2. Stationary Assumption

3. Markov Process

4. Markov Chain

5. Initial Probability Distribution

Question 10. A process whereby the input variables are random and defined by distributions rather than single numbers is known as a

1. Markov Process

2. Stochastic Process

3. Deterministic Process

4. All of the above

5. None of the above

Question 11. Which of the following is a true statement regarding Markov Analysis

1. Markov Analysis is a technique that involves predicting probabilities of future occurrences

2. Markov Analysis is a stochastic process in which current states of a system depend on previous states

3. The objective of Markov Analysis is to predict future states of nature given the probabilities of existing states

4. All of the above are true regarding Markov Analysis

5. All of the above are NOT true regarding Markov Analysis

Question 12. Which of the following are Markov properties (a validation sketch follows the options)

1. The states of nature are mutually exclusive and collectively exhaustive

2. Each entry in the transition matrix is a conditional probability that is nonnegative in value

3. The probabilities in the transition matrix sum to a value of 1.0 along each row of the matrix

4. All of the above

5. None of the above
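The checkable properties in Question 12 can be verified directly. The sketch below (with a made-up 3-state matrix) asserts nonnegativity and unit row sums, and shows the consequence that multiplying any distribution by the matrix preserves total probability:

```python
import numpy as np

# Made-up transition matrix obeying the properties in Question 12:
# nonnegative entries, each row summing to 1.0.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.1, 0.8]])

assert np.all(P >= 0)                   # property 2: nonnegative entries
assert np.allclose(P.sum(axis=1), 1.0)  # property 3: row sums equal 1.0

# Consequence: multiplying any distribution by P yields another distribution,
# so total probability stays 1.0 (the states are collectively exhaustive).
pi = np.array([0.2, 0.5, 0.3])
print((pi @ P).sum())  # 1.0
```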

Question 13. The Markov state characterized by all zeros in the retention cells (the diagonal of the matrix) and all ones or zeros in the non-retention cells is referred to as (a short demonstration follows the options)

1. Periodic State

2. Absorbing State

3. Trapping State

4. Ergodic State

5. Transient State
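Tying Questions 3 and 13 together, the classic 2-state periodic matrix (zeros in the retention cells on the diagonal, ones in the non-retention cells; chosen here for illustration) never converges: the state distribution simply oscillates, as this short sketch shows:

```python
import numpy as np

# Periodic 2-state chain: zero retention probabilities on the diagonal,
# ones in the non-retention cells.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

pi = np.array([1.0, 0.0])  # start with all probability in state 0
for t in range(6):
    print(t, pi)           # alternates [1, 0], [0, 1], [1, 0], ... forever
    pi = pi @ P
```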
