Markov Process

Mar 29, 2026 · 1 min read

Definition

A Markov process $\{X_t\}$ is a stochastic process that satisfies the Markov property: conditional on the present state, the future evolution is independent of the past.
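As a minimal sketch of the definition, the two-state discrete-time chain below (with a hypothetical transition matrix `P`, not from this note) generates each step from the current state alone, which is exactly the Markov property:

```python
import random

# Hypothetical transition matrix on states {0, 1} (illustrative values).
# P[i][j] = probability of moving from state i to state j.
P = [
    [0.9, 0.1],
    [0.5, 0.5],
]

def step(state, rng):
    # The next state depends only on the current state, not the history:
    # this is the Markov property in action.
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=42):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate(10))
```

Note that `step` never looks at the path so far; passing it only `path[-1]` is what makes the simulated process Markov.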

Related

  • Markov Property
  • Discrete-time Markov Chain

