Which of the following is true about the transition matrix \(P\) of a Markov chain?
Which of the following is true about the transition graph of a Markov chain with transition matrix \(P\)?
What is the Markov property?
Consider a Markov chain \((X_t)_{t \ge 0}\) on state space \(S\). Which of the following equations is a direct consequence of the Markov property?
Consider a Markov chain \((X_t)_{t\geq0}\) with transition matrix \(P\) and initial distribution \(\mu\). Which of the following is true about the distribution of a sample path \((X_0, X_1, \ldots, X_T)\)?
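For reference, the distribution of a sample path factorizes as \(\Pr(X_0=x_0,\ldots,X_T=x_T)=\mu(x_0)\prod_{t=1}^{T}P(x_{t-1},x_t)\). A minimal sketch of this factorization, with \(\mu\) and \(P\) as plain Python lists and purely illustrative numerical values:

```python
def path_probability(mu, P, path):
    """Probability of the sample path (x_0, ..., x_T) under initial
    distribution mu and transition matrix P:
    mu[x_0] * P[x_0][x_1] * ... * P[x_{T-1}][x_T]."""
    prob = mu[path[0]]
    for prev, nxt in zip(path, path[1:]):
        prob *= P[prev][nxt]
    return prob

# Two-state example (states 0 and 1); these numbers are placeholders,
# not taken from any specific worked example.
mu = [0.5, 0.5]
P = [[0.9, 0.1],
     [0.5, 0.5]]
print(path_probability(mu, P, [0, 0, 1]))  # 0.5 * 0.9 * 0.1 ≈ 0.045
```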
In the weather model example, if today is dry, what is the probability that tomorrow will also be dry?
What is the initial distribution vector \(\mu\) for the weather model if the initial state is dry?
Using the transition matrix \(P\) for the weather model, what is the probability that it is dry after two days if it starts as dry?
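The two-day probability is the dry entry of \(\mu P^2\). The weather model's actual matrix is not reproduced here, so the entries below are hypothetical placeholders; a minimal sketch of the computation, assuming state 0 is dry and state 1 is wet:

```python
def step(dist, P):
    """One step of the chain: returns the row vector dist @ P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical weather matrix (states: 0 = dry, 1 = wet); the real
# numbers come from the worked example, these are placeholders.
P = [[0.9, 0.1],
     [0.5, 0.5]]
mu = [1.0, 0.0]          # start dry with certainty
two_days = step(step(mu, P), P)
print(two_days[0])       # P(dry after two days) = 0.9*0.9 + 0.1*0.5 ≈ 0.86
```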
In the random walk on the Petersen graph example, if the current state is vertex 9, what is the probability of transitioning to vertex 4 in the next step?
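In a simple random walk, the chain jumps to a uniformly chosen neighbor, so the probability of moving from \(u\) to an adjacent \(v\) is \(1/\deg(u)\) (and the Petersen graph is 3-regular). A sketch, assuming the standard labeling (outer cycle \(0\)–\(4\), inner vertices \(5\)–\(9\), spokes \(i \leftrightarrow i+5\), inner pentagram edges); the labeling in the original example may differ:

```python
# Standard Petersen graph labeling (an assumption: outer cycle 0-4,
# inner vertices 5-9, spokes i <-> i+5, inner "pentagram" edges).
edges = (
    [(i, (i + 1) % 5) for i in range(5)]            # outer cycle
    + [(i, i + 5) for i in range(5)]                # spokes
    + [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
)
adj = {v: set() for v in range(10)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def transition_prob(u, v):
    """Simple random walk: jump to a uniformly random neighbor of u."""
    return 1 / len(adj[u]) if v in adj[u] else 0.0

print(sorted(adj[9]))          # neighbors of vertex 9 under this labeling
print(transition_prob(9, 4))   # 1/deg(9) = 1/3 since 4 is adjacent to 9
```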
Which of the following is true about a doubly stochastic matrix?
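A doubly stochastic matrix has nonnegative entries with every row *and* every column summing to 1; the key consequence is that the uniform distribution is stationary. A minimal sketch with a made-up 3-state example:

```python
def is_doubly_stochastic(P, tol=1e-12):
    """True iff P has nonnegative entries and every row and every
    column sums to 1 (within tolerance tol)."""
    n = len(P)
    rows_ok = all(abs(sum(row) - 1) < tol for row in P)
    cols_ok = all(abs(sum(P[i][j] for i in range(n)) - 1) < tol
                  for j in range(n))
    nonneg = all(x >= 0 for row in P for x in row)
    return rows_ok and cols_ok and nonneg

# Made-up doubly stochastic example; one step from the uniform
# distribution lands back on the uniform distribution.
P = [[0.5, 0.5, 0.0],
     [0.25, 0.25, 0.5],
     [0.25, 0.25, 0.5]]
uniform = [1 / 3, 1 / 3, 1 / 3]
after = [sum(uniform[i] * P[i][j] for i in range(3)) for j in range(3)]
print(is_doubly_stochastic(P), after)  # True, still (approximately) uniform
```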