Definition
The probability that the Markov chain ever enters transient state $j$, given that it starts in transient state $i$, is

$$f_{ij} = \frac{s_{ij} - \delta_{ij}}{s_{jj}},$$

where $\delta_{ij} = 1$ if $i = j$ and $\delta_{ij} = 0$ otherwise.
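In matrix form, collecting the $s_{ij}$ into a matrix $S$ indexed by the transient states gives $S = (I - P_T)^{-1}$, where $P_T$ is the transition matrix restricted to the transient states. A minimal sketch of this identity in code (the function and argument names are illustrative, not from the source):

```python
import numpy as np

def hitting_probabilities(P_T: np.ndarray) -> np.ndarray:
    """F[i, j] = f_ij: probability of ever entering transient state j,
    given that the chain starts in transient state i."""
    n = P_T.shape[0]
    S = np.linalg.inv(np.eye(n) - P_T)   # S[i, j] = s_ij, expected visits to j from i
    return (S - np.eye(n)) / np.diag(S)  # f_ij = (s_ij - delta_ij) / s_jj
```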
Notation
- $s_{ij}$: expected number of visits to state $j$ starting from $i$
- $s_{jj}$: expected number of visits to state $j$ starting from $j$
Example
For the gambler's ruin problem with win probability $p = 0.4$ (lose probability $0.6$) and $N = 7$, starting with 3 units:
- $s_{3,5} = 0.9228$ (expected time at 5 units)
- $s_{3,2} = 2.3677$ (expected time at 2 units)
- Probability of ever hitting 1: $f_{3,1} = \frac{s_{3,1}}{s_{1,1}} = \frac{1.4206}{1.6149} \approx 0.8797$
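A short sketch reproducing these numbers, assuming the gambler's ruin chain above with absorbing barriers at $0$ and $N = 7$, so the transient states are the fortunes $1$ through $6$:

```python
import numpy as np

p, N = 0.4, 7
transient = range(1, N)               # fortunes 1..6
n = len(transient)

# Transition matrix restricted to the transient states:
# from fortune i, win one unit w.p. p, lose one w.p. 1 - p.
P_T = np.zeros((n, n))
for k, i in enumerate(transient):     # k = i - 1
    if i + 1 < N:
        P_T[k, k + 1] = p             # win: i -> i + 1
    if i - 1 > 0:
        P_T[k, k - 1] = 1 - p         # lose: i -> i - 1

S = np.linalg.inv(np.eye(n) - P_T)    # S[i - 1, j - 1] = s_ij

print(S[2, 4])            # s_35 = 0.9228, expected time at 5 units
print(S[2, 1])            # s_32 = 2.3677, expected time at 2 units
print(S[2, 0] / S[0, 0])  # f_31 = s_31 / s_11 = 0.8797
```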