
Markov Chain

Jan 11, 2026 · 1 min read

A Markov chain is a sequence of Probability Vectors $x_0, x_1, x_2, \dots$ together with a Stochastic Matrix $P$, such that:

$$
x_{k+1} = P x_k, \qquad P^k x_0 = x_k, \qquad k \in \mathbb{N}_0
$$
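The recurrence above can be sketched numerically. This is a minimal illustration, assuming a hypothetical 2-state column-stochastic matrix `P` (columns sum to 1) and an arbitrary initial probability vector `x0`; it checks that one step via $x_{k+1} = P x_k$ agrees with the closed form $x_k = P^k x_0$:

```python
import numpy as np

# Hypothetical 2-state stochastic matrix: each column sums to 1
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# An initial probability vector (entries nonnegative, summing to 1)
x0 = np.array([0.6, 0.4])

# One step of the chain: x_{k+1} = P x_k
x1 = P @ x0

# Equivalently, x_k = P^k x_0 for any k in N_0
x3_iter = P @ (P @ (P @ x0))
x3_power = np.linalg.matrix_power(P, 3) @ x0

print(x1)                                  # next-state probabilities
print(np.allclose(x3_iter, x3_power))      # both routes agree
print(np.isclose(x1.sum(), 1.0))           # x1 is still a probability vector
```

Because $P$ is stochastic, each $x_k$ remains a probability vector: multiplying by $P$ redistributes probability mass but never creates or destroys it.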

