Transition Matrices and Markov Chains
Understanding Transition Matrices
- A transition matrix represents the probabilities of moving from one state to another in a system.
- Each element $(p_{ij})$ in the matrix represents the probability of moving from state $i$ to state $j$.
- All probabilities in a row must sum to 1, since they cover all possible outcomes from that state.
For example, if we have three states (A, B, C), a transition matrix P would look like:
$$P = \begin{pmatrix} p_{11} & p_{12} & p_{13} \\ p_{21} & p_{22} & p_{23} \\ p_{31} & p_{32} & p_{33} \end{pmatrix}$$
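In code, a transition matrix is just a square array whose rows are probability distributions. A minimal sketch (using NumPy, with made-up probabilities for a hypothetical three-state system A, B, C) that stores such a matrix and checks the row-sum property:

```python
import numpy as np

# Hypothetical 3-state transition matrix (states A, B, C).
# Row i holds the probabilities of moving from state i to each state j,
# so every row must sum to 1.
P = np.array([
    [0.5, 0.3, 0.2],  # from A: stay in A, move to B, move to C
    [0.1, 0.6, 0.3],  # from B
    [0.2, 0.2, 0.6],  # from C
])

# Validate the row-sum property.
assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to 1"
print(P.sum(axis=1))  # [1. 1. 1.]
```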
Example: Consider a weather forecast:
- If it's sunny today, there's a 70% chance it'll be sunny tomorrow and a 30% chance it'll be rainy
- If it's rainy today, there's a 60% chance it'll be rainy tomorrow and a 40% chance it'll be sunny
The transition matrix would be:
$$P = \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix}$$
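Reading the entries as conditional probabilities (rows and columns ordered sunny, rainy):
$$p_{11} = P(\text{sunny tomorrow} \mid \text{sunny today}) = 0.7, \qquad p_{12} = P(\text{rainy tomorrow} \mid \text{sunny today}) = 0.3$$
$$p_{21} = P(\text{sunny tomorrow} \mid \text{rainy today}) = 0.4, \qquad p_{22} = P(\text{rainy tomorrow} \mid \text{rainy today}) = 0.6$$
and each row sums to 1, as required.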
Markov Chains
- A Markov chain is a sequence of random states in which the probability of the next state depends only on the current state, not on the full history.
- This is called the Markov property.
The key characteristic of a Markov chain is that it's "memoryless" - only the current state matters for predicting the next state.
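One way to see the memoryless property is to simulate the chain: each step samples the next state using only the current state's row of $P$. A minimal sketch using the weather matrix above (state 0 = sunny, 1 = rainy; the `simulate` helper and the fixed random seed are illustrative choices, not part of the notes):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
states = ["sunny", "rainy"]

rng = np.random.default_rng(0)

def simulate(start, days, P, rng):
    """Simulate a Markov chain: each step depends only on the current state."""
    path = [start]
    for _ in range(days):
        current = path[-1]
        # The next state is drawn from row `current` of P -- nothing earlier
        # in `path` is consulted, which is exactly the Markov property.
        nxt = rng.choice(len(P), p=P[current])
        path.append(nxt)
    return path

print([states[s] for s in simulate(0, 7, P, rng)])
```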
State Vectors
- The state vector represents the probability distribution across all states at a particular time.
- For $n$ states, it's a $1 \times n$ matrix: $$s = \begin{pmatrix} s_1 & s_2 & \cdots & s_n \end{pmatrix}$$
Finding Future States
To find the state vector after $n$ steps:
- Multiply the initial state vector by the transition matrix $n$ times
- Or equivalently, $s_n = s_0 P^n$
Using our weather example:
Initial state: Sunny (100%)
$$s_0 = \begin{pmatrix} 1 & 0 \end{pmatrix}$$
After one day:
$$s_1 = s_0 P = \begin{pmatrix} 1 & 0 \end{pmatrix} \begin{pmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{pmatrix} = \begin{pmatrix} 0.7 & 0.3 \end{pmatrix}$$
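This calculation, and further steps via $s_n = s_0 P^n$, can be checked numerically. A minimal sketch with NumPy (the choice of $n = 5$ is just an example):

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
s0 = np.array([1.0, 0.0])  # start sunny with probability 1

# One day ahead: s1 = s0 P
s1 = s0 @ P
print(s1)  # [0.7 0.3]

# n days ahead: s_n = s0 P^n
n = 5
sn = s0 @ np.linalg.matrix_power(P, n)
print(sn)
```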