A Markov chain is a sequence of probability vectors x_0, x_1, x_2, ..., together with a stochastic matrix P, such that

    x_{k+1} = P x_k,   for k = 0, 1, 2, ...
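
As a concrete illustration, the iteration x_{k+1} = P x_k can be carried out numerically. The following is a minimal sketch assuming NumPy; the 2x2 stochastic matrix P and the initial probability vector x0 are hypothetical examples, not taken from the text.

```python
import numpy as np

# Hypothetical column-stochastic matrix: each column sums to 1.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Hypothetical initial probability vector x_0 (entries are nonnegative and sum to 1).
x = np.array([0.6, 0.4])

# Iterate the Markov chain: x_{k+1} = P x_k.
for k in range(10):
    x = P @ x
    print(f"x_{k + 1} = {x}")
```

Each x_k produced this way is again a probability vector, since multiplying a probability vector by a stochastic matrix preserves nonnegativity and the property that the entries sum to 1.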