State Transitions
The probability of moving from one state to another in a Markov chain. Represented by a transition matrix P, where entry (i, j) is the probability of transitioning from state i to state j; each row sums to 1. The n-step transition matrix is the n-th power of the one-step matrix, P^n.
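A minimal sketch of these ideas, using a hypothetical 2-state weather chain (the states and probabilities are illustrative, not from the source):

```python
import numpy as np

# Hypothetical 2-state chain: state 0 = sunny, state 1 = rainy.
# Entry (i, j) is P(next state = j | current state = i); each row sums to 1.
P = np.array([
    [0.9, 0.1],  # sunny -> sunny 0.9, sunny -> rainy 0.1
    [0.5, 0.5],  # rainy -> sunny 0.5, rainy -> rainy 0.5
])

# Check the row-stochastic property: each row sums to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# The n-step transition matrix is the n-th matrix power of P.
P3 = np.linalg.matrix_power(P, 3)

# P3[i, j] is the probability of being in state j exactly 3 steps
# after starting in state i, e.g. P3[0, 1] = P(rainy in 3 steps | sunny now).
print(P3)
```

Note that the rows of P^n also sum to 1, since a product of row-stochastic matrices is row-stochastic.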
mathematics-for-cs probability markov-chains state-transitions