Introduction and main results

An algorithm is said to be weakly scalable if it can solve progressively larger problems with an increasing number of processors in a fixed amount of time. According to classical Schwarz theory, the parallel Schwarz method (PSM) is not scalable (see, e.g., [2,7]). Recent results in computational chemistry, however, have shed more light on the scalability of the PSM: surprisingly, in contrast with classical Schwarz theory, the authors in [1] provide numerical evidence that in some cases the one-level PSM converges to a given tolerance within the same number of iterations independently of the number N of subdomains. This behaviour is observed if fixed-sized subdomains form a "chain-like" domain such that the intersection of the boundary of each subdomain with the boundary of the global domain is non-empty. This result was subsequently proved rigorously in [3,4,5] for the PSM and in [2] for other one-level methods. On the other hand, this weak scalability is lost if the fixed-sized subdomains form a "globular-type" domain Ω, where the boundaries of many subdomains lie in the interior of Ω. The following question therefore arises: is it possible to quantify the lack of scalability of the PSM in cases where individual subdomains are entirely embedded inside the global domain? To do so, for increasing N one would need to estimate the number of iterations required to achieve a given tolerance.

Some isolated results in this direction do exist in the literature. For instance, in [2] a heuristic argument is used to explain why, for the PSM applied to a 1D Laplace problem, an unfortunate initialisation leads to a contraction in the infinity norm being observed only after a number of iterations proportional to N.
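To fix ideas, the following is a minimal sketch of the one-level PSM iteration for the 1D Laplace model problem mentioned above; the decomposition of Ω into overlapping subdomains Ω_1, …, Ω_N arranged in a chain and the homogeneous Dirichlet condition on ∂Ω are illustrative assumptions, not details taken from [2]:
\[
\begin{cases}
-\dfrac{d^2 u_j^{n}}{dx^2} = f & \text{in } \Omega_j,\\[4pt]
u_j^{n} = u_{j-1}^{n-1} & \text{on } \partial\Omega_j \cap \Omega_{j-1},\\[2pt]
u_j^{n} = u_{j+1}^{n-1} & \text{on } \partial\Omega_j \cap \Omega_{j+1},\\[2pt]
u_j^{n} = 0 & \text{on } \partial\Omega_j \cap \partial\Omega,
\end{cases}
\qquad j = 1,\dots,N,
\]
with the convention that the conditions involving Ω_0 and Ω_{N+1} are absent for j = 1 and j = N. At each iteration n every subdomain solves its local problem in parallel, using the previous iterates of its neighbours as Dirichlet data on the interior interfaces; the question raised above then amounts to estimating, as a function of N, how many such iterations are needed before the error contracts below a given tolerance.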