My view of a Markov Chain
When a ball bounces, it next does one of three things: bursts, stops, or bounces again.
Historically, when the ball burst, the next possibilities were:
60% it burst again
30% it stopped
10% it bounced again
When the ball stopped, the next possibilities were:
60% it burst
20% it stopped
20% it bounced again
When the ball bounced, the next possibilities were:
70% it burst
10% it stopped
20% it bounced again
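These three cases can be collected into a transition matrix, with one row per current state and one column per next state. Here is a minimal sketch in Python with NumPy; the bursts/stops/bounces ordering is just my own choice for the example:

```python
import numpy as np

# State order (an arbitrary choice for this sketch): bursts, stops, bounces.
P = np.array([
    [0.6, 0.3, 0.1],  # from bursts:  60% bursts, 30% stops, 10% bounces
    [0.6, 0.2, 0.2],  # from stops:   60% bursts, 20% stops, 20% bounces
    [0.7, 0.1, 0.2],  # from bounces: 70% bursts, 10% stops, 20% bounces
])

# Every row sums to 1: the ball always lands in exactly one next state.
assert np.allclose(P.sum(axis=1), 1.0)
```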
To move from (state: bounces) to (state: bursts) over two durations,
the ball must take one of three paths:
1: stay in (state: bounces) for the first duration and move to (state: bursts) in the second duration (0.2 ⋅ 0.7);
2: move to (state: bursts) in the first duration and stay in that state for the second duration (0.7 ⋅ 0.6);
3: move to (state: stops) in the first duration and then to (state: bursts) in the second duration (0.1 ⋅ 0.6).
Therefore the probability is ((0.2 ⋅ 0.7) + (0.7 ⋅ 0.6) + (0.1 ⋅ 0.6)) = 0.14 + 0.42 + 0.06 = 0.62.
This gives a 62% chance that the ball will be in (state: bursts) after two durations, given that it started in (state: bounces).
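The same 0.62 falls out of matrix multiplication: the two-duration transition probabilities are exactly the entries of the squared transition matrix, so the (bounces, bursts) entry of P² gives the answer directly. A quick check, again just a sketch reusing the matrix from above:

```python
import numpy as np

# Same transition matrix as above (state order: bursts, stops, bounces).
P = np.array([
    [0.6, 0.3, 0.1],
    [0.6, 0.2, 0.2],
    [0.7, 0.1, 0.2],
])

# Two-duration transition probabilities are the entries of P squared.
P2 = np.linalg.matrix_power(P, 2)

bounces, bursts = 2, 0  # indices under the state order used above
print(round(P2[bounces, bursts], 2))  # 0.62
```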
If I have missed anything, please let me know.