“…While it is therefore clear that the only promising route to an efficient characterization of large and complex quantum systems leads through effective descriptions (such as those offered, e.g., by the theory of open quantum systems [19, 20, 21, 22], modern semiclassics [23], or random matrix theory [24, 25]), there is an intermediate range of system sizes where efficient numerical methods can (a) be gauged against each other, to benchmark their quantitative reliability, without any a priori restriction on the explored portion of Hilbert space, and (b) contribute to gauging effective theories against (numerically) exact solutions [26, 27, 28], at spectral densities where quantum granular effects may induce sizeable deviations [29] from effective-theory predictions (which always rely on some level of coarse graining). In our view, it is at these intermediate system sizes that efficient numerical simulation methods develop their full potential, since they can inspire and ease the development of powerful statistical methods and paradigms (such as scaling properties [18, 26, 30]), which then enable robust predictions in the realm of fully unfolding complexity.…”