markovian
English Thesaurus
1. relating to or generated by a Markov process (adj.pert)
pertainym: markoff process, markov process
definition: a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state (noun.process)
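The defining property above can be sketched in code: sampling the next state uses only the current state, never the path taken to reach it. This is a minimal illustration with a made-up two-state weather chain (the state names and probabilities are invented for the example).

```python
import random

# Illustrative transition probabilities; each row depends only on the
# current state, which is exactly the Markov property in the definition.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state given only the present state."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights, k=1)[0]

random.seed(0)
chain = ["sunny"]
for _ in range(5):
    chain.append(step(chain[-1]))
print(chain)
```

Because `step` receives only the current state, the simulated chain has no memory of its history, matching the definition of a Markov process.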