

Markov chain




<probability> (Named after Andrei Markov) A model of a sequence of events in which the probability of each event depends only on the outcome of the event immediately preceding it, not on the earlier history of the sequence.

A Markov process is a process whose successive states are governed by a Markov chain.
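
For illustration only, here is a minimal sketch in Python (the state names and transition probabilities are invented for this example) of stepping through such a chain, where each move depends only on the current state:

    import random

    # Hypothetical transition probabilities: P(next state | current state).
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        """Pick the next state using only the current state (the Markov property)."""
        states, weights = zip(*TRANSITIONS[current].items())
        return random.choices(states, weights=weights)[0]

    # Walk the chain for a few steps.
    state = "sunny"
    for _ in range(10):
        state = next_state(state)
        print(state)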

In simulation, the Markov chain principle is used to select samples from a probability density function, which are then fed into the model.

Simscript II.5 uses this approach for some modelling functions.
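
The entry does not say how Simscript II.5 implements this; purely as a sketch of the general idea of drawing samples from a density by walking a Markov chain, the following Python fragment uses a simple Metropolis-style random walk against an assumed, unnormalised target density:

    import math
    import random

    def target(x):
        # Hypothetical unnormalised density (standard normal shape).
        return math.exp(-0.5 * x * x)

    def metropolis_samples(n, step=1.0, start=0.0):
        """Draw n dependent samples; the chain's stationary density is `target`."""
        x = start
        samples = []
        for _ in range(n):
            proposal = x + random.uniform(-step, step)
            # Accept with probability min(1, target(proposal) / target(x)).
            if random.random() < target(proposal) / target(x):
                x = proposal
            samples.append(x)
        return samples

    print(metropolis_samples(5))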



