markoff process
English Thesaurus
1. a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (noun.process)
hypernym: stochastic process
    definition: a statistical process involving a number of random variables depending on a variable parameter (which is usually time) (noun.process)
hyponym: markoff chain, markov chain
    definition: a Markov process for which the parameter is discrete time values (noun.process)
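The hyponym gloss above describes a Markov chain: a Markov process observed at discrete time steps. A minimal sketch in Python, using a made-up two-state weather model (the state names and transition probabilities are illustrative assumptions, not part of this entry):

```python
import random

# Hypothetical two-state transition matrix for illustration only.
# Each row gives the distribution of the NEXT state given the
# CURRENT state -- the defining Markov property: no dependence
# on how the present state was reached.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(state):
    """Sample the next state using only the current state."""
    r = random.random()
    cumulative = 0.0
    for candidate, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return candidate
    return candidate  # guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Run the chain for `steps` discrete time values."""
    random.seed(seed)
    state = start
    chain = [state]
    for _ in range(steps):
        state = next_state(state)
        chain.append(state)
    return chain
```

Because each transition reads only the current state, the sampled sequence depends on the present state and not on the earlier history, matching the definition above.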
