Brute-Force in Distributed Verification

As computer systems grow in complexity, it becomes increasingly important to develop formal methods for ensuring their quality and reliability. Various techniques for automated and semi-automated analysis and verification have been successfully applied to real-life computer systems. However, these techniques are, in general, computationally hard and memory intensive, and their applicability to extremely large systems is limited. The major hampering factor is the state space explosion problem, due to which large industrial models cannot be handled efficiently by a single state-of-the-art computer.

Much attention has been devoted to developing approaches that combat the state space explosion problem. Many techniques, such as abstraction, state compression, state space reduction, and symbolic state representation, are used to reduce the size of the problem to be handled, thus allowing a single computer to process larger systems. There are also techniques that focus purely on increasing the amount of available computational power. These include, for example, techniques that fight memory limits through efficient utilisation of external I/O devices [2,19,24,28,32], and techniques that introduce cluster-based algorithms to exploit the aggregate power of network-interconnected computers.

Cluster-based algorithms perform their computation simultaneously on a number of workstations that communicate and synchronise by means of message passing. Cluster-based algorithms can thus be characterised as parallel algorithms operating in a distributed-memory environment. These algorithms have proved their usefulness in the verification of large-scale systems. They have been successfully applied to symbolic model checking [22,23], analysis of stochastic [25] and timed [6] systems, equivalence checking [8], and other related problems [7,9,21].

The idea of parallel verification appeared already in the very early years of formal verification. However, the inaccessibility of cheap parallel computers, together with negative theoretical complexity results, kept this approach out of the mainstream of formal verification. The situation has changed dramatically over the past several years. Progress in computing over the past two decades has spanned several orders of magnitude with respect to various physical parameters such as computing power, memory size at all levels of the hierarchy from caches to disks, power consumption, physical size, and cost.
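
To make the cluster-based scheme discussed above more concrete, the following minimal sketch illustrates hash-based state partitioning, a scheme commonly used in distributed explicit-state reachability analysis: every state is assigned to exactly one owning node by a partition function, and states generated on one node but owned by another are handed over to their owner. The sketch is purely illustrative and is not the algorithm of any particular tool; the names owner_of, successors and kWorkers are our own, the transition relation is a toy one, and the per-worker queues simulate in memory what a real cluster-based tool would exchange via message passing (e.g. MPI).

// Illustrative sketch (assumption, not a tool's actual code): hash-based
// state partitioning for distributed reachability, simulated on one machine.
#include <cstddef>
#include <cstdint>
#include <deque>
#include <functional>
#include <iostream>
#include <unordered_set>
#include <vector>

using State = std::uint64_t;          // a state encoded as a single integer
constexpr int kWorkers = 4;           // number of (simulated) cluster nodes

// Partition function: every state has exactly one owning worker.
int owner_of(State s) { return static_cast<int>(std::hash<State>{}(s) % kWorkers); }

// Toy successor function standing in for the model's transition relation.
std::vector<State> successors(State s) {
    if (s >= 1000) return {};         // bound the toy state space
    return {2 * s + 1, 3 * s + 2};
}

int main() {
    std::vector<std::unordered_set<State>> visited(kWorkers);
    std::vector<std::deque<State>> queue(kWorkers);   // per-worker work queues

    State initial = 1;
    queue[owner_of(initial)].push_back(initial);

    bool work_left = true;
    while (work_left) {
        work_left = false;
        for (int w = 0; w < kWorkers; ++w) {          // round-robin over "nodes"
            while (!queue[w].empty()) {
                work_left = true;
                State s = queue[w].front();
                queue[w].pop_front();
                if (!visited[w].insert(s).second) continue;   // already explored
                for (State t : successors(s))
                    queue[owner_of(t)].push_back(t);  // "send" t to its owner
            }
        }
    }

    std::size_t total = 0;
    for (int w = 0; w < kWorkers; ++w) total += visited[w].size();
    std::cout << "reachable states: " << total << "\n";
}

In a real distributed setting the round-robin loop would be replaced by workers running concurrently on separate machines, cross-owner states would travel as messages over the network, and termination would be detected by a distributed termination detection protocol rather than by centrally inspecting all queues.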