Distributed Markov Chains

S K Jha, M Mukund, R Saha and P S Thiagarajan

arXiv:1408.0979 (2014).

Abstract

The formal verification of large probabilistic models is important and challenging. Exploiting the concurrency that is often present is one way to address this problem. Here we study a restricted class of asynchronous distributed probabilistic systems in which the synchronizations determine the probability distribution for the next moves of the participating agents. The key restriction we impose is that the synchronizations are deterministic, in the sense that any two simultaneously enabled synchronizations must involve disjoint sets of agents. As a result, this network of agents can be viewed as a succinct and distributed presentation of a large global Markov chain. A rich class of Markov chains can be represented this way.
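To make the model described above concrete, here is a minimal sketch, under stated assumptions, of how such a network of agents might be represented and simulated. The names (SyncAction, DMC, check_determinacy) are illustrative and not taken from the paper; the sketch only captures the idea that each synchronization involves a fixed set of agents and fixes the probability distribution over their joint next local states.

```python
# Hypothetical sketch of a network of agents with deterministic synchronizations.
import random
from dataclasses import dataclass

@dataclass(frozen=True)
class SyncAction:
    agents: tuple    # agents participating in this synchronization
    guard: tuple     # local states the agents must be in for the action to be enabled
    outcomes: tuple  # ((joint_next_states, probability), ...)

class DMC:
    def __init__(self, initial_states, actions):
        self.state = dict(initial_states)  # agent -> current local state
        self.actions = actions

    def enabled(self):
        return [a for a in self.actions
                if all(self.state[ag] == s for ag, s in zip(a.agents, a.guard))]

    def check_determinacy(self):
        # Key restriction: simultaneously enabled synchronizations must
        # involve disjoint sets of agents.
        en = self.enabled()
        for i, a in enumerate(en):
            for b in en[i + 1:]:
                if set(a.agents) & set(b.agents):
                    return False
        return True

    def step(self, action):
        # Sample the joint next local states of the participating agents.
        nexts, probs = zip(*action.outcomes)
        chosen = random.choices(nexts, weights=probs)[0]
        for ag, s in zip(action.agents, chosen):
            self.state[ag] = s
```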

We define an interleaved semantics for our model in terms of the local synchronization actions. The network structure induces an independence relation on these actions, which, in turn, induces an equivalence relation over the interleaved runs in the usual way. We construct a natural probability measure over these equivalence classes of runs by exploiting Mazurkiewicz trace theory and the probability measure space of the associated global Markov chain.
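The independence relation and the induced equivalence on interleaved runs can be illustrated with a short sketch. Here two actions are taken to be independent exactly when they involve disjoint sets of agents, and two interleavings are identified when their projections onto every agent agree, which is the standard characterization of Mazurkiewicz trace equivalence over a distributed alphabet. The function and variable names are hypothetical.

```python
# Illustrative check of trace equivalence for interleaved runs, assuming each
# action is labelled with the set of agents it involves.

def independent(a, b, loc):
    """Two actions are independent iff they involve disjoint sets of agents."""
    return loc[a].isdisjoint(loc[b])

def projection(run, agent, loc):
    """Restrict an interleaved run to the actions involving a given agent."""
    return [a for a in run if agent in loc[a]]

def trace_equivalent(run1, run2, loc, agents):
    """Two interleavings lie in the same equivalence class iff their
    projections onto every agent agree."""
    return all(projection(run1, i, loc) == projection(run2, i, loc) for i in agents)

# Example: s1 involves agents 1,2 and s2 involves agents 3,4, so the two
# interleavings below represent the same partially ordered run.
loc = {"s1": {1, 2}, "s2": {3, 4}}
assert independent("s1", "s2", loc)
assert trace_equivalent(["s1", "s2"], ["s2", "s1"], loc, agents={1, 2, 3, 4})
```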

It turns out that verification of our model, called a DMC (distributed Markov chain), can often be carried out efficiently by exploiting the partial-order nature of the interleaved semantics. To demonstrate this, we develop a statistical model checking (SMC) procedure and use it to verify two large distributed probabilistic networks.
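The following is a generic Monte Carlo sketch of the sampling scheme that statistical model checking builds on; it is not the SMC procedure developed in the paper. The simulate_run and holds arguments, and the Chernoff-Hoeffding sample-size bound used here, are standard ingredients supplied for illustration.

```python
# Generic statistical model checking loop: sample bounded runs, check a
# property on each, and report the estimated satisfaction probability.
import math
import random

def required_samples(eps, delta):
    """Chernoff-Hoeffding bound: enough samples so the estimate is within
    eps of the true probability with confidence at least 1 - delta."""
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

def smc_estimate(simulate_run, holds, eps=0.01, delta=0.05):
    """Estimate the probability that `holds` is satisfied on a bounded run
    sampled by `simulate_run` (both supplied by the caller)."""
    n = required_samples(eps, delta)
    successes = sum(holds(simulate_run()) for _ in range(n))
    return successes / n

# Illustrative use with a trivial random walk standing in for a real DMC.
def simulate_run(steps=20):
    x, path = 0, []
    for _ in range(steps):
        x += random.choice([-1, 1])
        path.append(x)
    return path

print(smc_estimate(simulate_run, holds=lambda path: max(path) >= 3))
```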

  • Download PDF.