Markov chain - A random process in which the probability that a certain future state will occur depends only on the present or immediately preceding state of the system, and not on the events leading up to the present state.
Source: http://www.computeruser.com/resources/dictionary/definition.html
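The defining "memoryless" property can be sketched in a few lines of Python. This is a hypothetical two-state weather model (the states, probabilities, and function names below are illustrative, not from the source): the next state is chosen using only the current state, never the earlier history.

```python
import random

# Hypothetical transition table: probabilities depend only on the
# current state (the Markov property), not on how we got there.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, steps, seed=0):
    """Generate a state sequence; each step reads only the last state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        states, weights = zip(*TRANSITIONS[chain[-1]])
        chain.append(rng.choices(states, weights=weights, k=1)[0])
    return chain

print(simulate("sunny", 10))
```

Note that `simulate` only ever inspects `chain[-1]` when drawing the next state; dropping the rest of the history would not change the process, which is exactly what the definition above asserts.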