

Markov process




<probability, simulation> A stochastic process whose future behaviour depends only on its current state, not on how that state was reached; the sequence of states it passes through can be described by a Markov chain.
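Because each step depends only on the current state, such a process can be simulated with nothing more than a table of transition probabilities. A minimal sketch in Python (the two weather states and their probabilities are invented for illustration, not part of the definition):

```python
import random

# Hypothetical two-state Markov process: each row gives the possible
# next states and their probabilities, given only the current state.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Choose the next state; the choice depends only on the current state."""
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Return a length-n sequence of states: one realisation of the chain."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```

The dictionary that `transitions` holds is itself the Markov chain describing the process: the sequence of events is generated by repeatedly consulting only the current state.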




