SoylentNews
SoylentNews is people
https://soylentnews.org/

Title    Brush Up Your Markov Chains
Date    Sunday March 01 2015, @06:11PM
Author    LaminatorX
from the Eat-Pray-Love dept.
https://soylentnews.org/article.pl?sid=15/03/01/1451240

arti writes:

[Submitted via IRC]

Many of you will know about Markov chains. Named after Andrey Markov, [they] are mathematical systems that hop from one "state" (a situation or set of values) to another. For example, if you made a Markov chain model of a baby's behavior, you might include "playing," "eating," "sleeping," and "crying" as states, which together with other behaviors could form a 'state space': a list of all possible states. In addition, on top of the state space, a Markov chain tells you the probability of hopping, or "transitioning," from one state to any other state -- e.g., the chance that a baby currently playing will fall asleep in the next five minutes without crying first.
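
For the curious, here is a minimal Python sketch of that baby example: the state space is a short list, the transition table is a dictionary of probabilities (the numbers are invented purely for illustration), and each hop depends only on the current state, which is the defining "memoryless" property of a Markov chain.

    import random

    # State space for the baby example above.
    STATES = ["playing", "eating", "sleeping", "crying"]

    # Transition probabilities: TRANSITIONS[a][b] is the chance of hopping
    # from state a to state b. These numbers are made up for illustration;
    # each row must sum to 1.
    TRANSITIONS = {
        "playing":  {"playing": 0.5, "eating": 0.2, "sleeping": 0.2, "crying": 0.1},
        "eating":   {"playing": 0.3, "eating": 0.2, "sleeping": 0.4, "crying": 0.1},
        "sleeping": {"playing": 0.2, "eating": 0.3, "sleeping": 0.4, "crying": 0.1},
        "crying":   {"playing": 0.1, "eating": 0.3, "sleeping": 0.3, "crying": 0.3},
    }

    def next_state(current):
        # The next hop depends only on the current state's row.
        row = TRANSITIONS[current]
        return random.choices(list(row), weights=list(row.values()))[0]

    def simulate(start, steps):
        # Walk the chain for a fixed number of steps, recording each state.
        chain = [start]
        for _ in range(steps):
            chain.append(next_state(chain[-1]))
        return chain

    # Example run: ten transitions starting from "playing".
    print(simulate("playing", 10))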

Victor Powell and Lewis Lehe have produced a 'visual explanation' of Markov chains, showing how they are used in a variety of disciplines; they are useful to computer scientists, engineers, and many others. As they point out:

In the hands of meteorologists, ecologists, computer scientists, financial engineers and other people who need to model big phenomena, Markov chains can get to be quite large and powerful.

If you've not seen Markov chains in use before, or your knowledge is just a little rusty, take a look at the link and see if they can be of any use to you.

Links

  1. "Andrey Markov" - https://en.wikipedia.org/wiki/Andrey_Markov
  2. "'visual explanation'" - http://setosa.io/blog/2014/07/26/markov-chains/
