Definition
A Markov chain is a mathematical system that transitions from one random state to another within a state space. It is a Markov process, a type of stochastic process, in which the probability of transitioning to the next state depends only on the current state and not on the entire history of states that preceded it. For example, if the chance of rain tomorrow depends only on whether it is raining today, and not on any previous days (as in a simple weather model), then the weather can be represented by a Markov chain.
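The simple weather model mentioned above can be sketched in a few lines of Python. The transition probabilities below are assumed for illustration only; what matters is that tomorrow's state is sampled using nothing but today's state, which is the Markov property.

```python
import random

# State space: "rainy" or "sunny". The probabilities are illustrative
# assumptions, not values given in the text.
TRANSITIONS = {
    "rainy": {"rainy": 0.6, "sunny": 0.4},
    "sunny": {"rainy": 0.2, "sunny": 0.8},
}

def next_state(current):
    """Sample tomorrow's weather given only today's weather (Markov property)."""
    states = list(TRANSITIONS[current].keys())
    weights = list(TRANSITIONS[current].values())
    return random.choices(states, weights=weights)[0]

def simulate(start, days):
    """Simulate a chain of weather states starting from `start`."""
    chain = [start]
    for _ in range(days):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 7))  # e.g. ['sunny', 'sunny', 'rainy', 'rainy', ...]
```

Note that `simulate` never inspects anything but the most recent state in the chain; no earlier history influences the next transition.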