Example: Gambling Model

Consider a gambler who, at each play of the game, either wins $1 with probability p or loses $1 with probability 1 − p. If the gambler quits playing either when going broke or attaining a fortune of N, then the gambler's fortune is a Markov chain with transition probabilities

P(i, i+1) = p = 1 − P(i, i−1),  i = 1, 2, …, N − 1,
P(0, 0) = P(N, N) = 1.

States 0 and N are absorbing states since, once entered, they are never left.

Transition Matrix

The transition matrix P is (N + 1) × (N + 1), indexed by the states 0, 1, …, N.

Transition probabilities:

  • P(i, i+1) = p for i = 1, 2, …, N − 1
  • P(i, i−1) = 1 − p for i = 1, 2, …, N − 1
  • P(0, 0) = 1 (absorbing)
  • P(N, N) = 1 (absorbing)
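The transition probabilities above can be assembled into the full matrix. A minimal sketch using NumPy (the function name `gamblers_ruin_matrix` is illustrative, not from the source):

```python
import numpy as np

def gamblers_ruin_matrix(N, p):
    """Build the (N+1) x (N+1) transition matrix for the gambler's ruin chain."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = 1.0              # state 0 (broke) is absorbing
    P[N, N] = 1.0              # state N (target fortune) is absorbing
    for i in range(1, N):      # interior states 1, ..., N-1
        P[i, i + 1] = p        # win $1 with probability p
        P[i, i - 1] = 1 - p    # lose $1 with probability 1 - p
    return P

P = gamblers_ruin_matrix(10, 0.6)
print(P[5])  # row for fortune $5: mass on $4 and $6 only
```

Each row sums to 1, and rows 0 and N put all their mass on themselves, which is exactly what "absorbing" means.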

Simulation Examples

Parameters: p = 0.6 (win probability), N = 10 (target fortune), starting fortune = $5

Simulation 1 (Success):

  • Start: $5 → Win(0.6) → $6 → Lose(0.4) → $5 → Win(0.6) → $6 → Win(0.6) → $7 → Lose(0.4) → $6 → Win(0.6) → $7 → Win(0.6) → $8 → Win(0.6) → $9 → Win(0.6) → $10 (absorbed)
  • Result: Reached target in 9 games

Simulation 2 (Failure):

  • Start: $5 → Lose(0.4) → $4 → Lose(0.4) → $3 → Lose(0.4) → $2 → Win(0.6) → $3 → Lose(0.4) → $2 → Lose(0.4) → $1 → Lose(0.4) → $0 (absorbed)
  • Result: Went broke in 7 games
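Runs like the two traced above can be generated, and the success probability estimated, with a short simulation. A sketch using only the standard library (the helper name `simulate` is an assumption, not from the source):

```python
import random

def simulate(start, N, p, rng):
    """Play one gambler's ruin game; return (final fortune, number of plays)."""
    fortune, plays = start, 0
    while 0 < fortune < N:               # stop at the absorbing states 0 and N
        fortune += 1 if rng.random() < p else -1   # win or lose $1
        plays += 1
    return fortune, plays

rng = random.Random(42)                  # fixed seed for reproducibility
results = [simulate(5, 10, 0.6, rng) for _ in range(10_000)]
win_rate = sum(f == 10 for f, _ in results) / len(results)
print(f"estimated P(reach $10 before going broke) = {win_rate:.3f}")
```

For comparison, the classical gambler's ruin formula gives the exact success probability from fortune i as (1 − (q/p)^i) / (1 − (q/p)^N) with q = 1 − p; for i = 5, N = 10, p = 0.6 this is about 0.88, so the estimate should land near that value.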