In any statistical investigation, we deal with the application of probability theory to real-world problems, and the conclusions are inferences based on observations. To obtain plausible inferences, statistical analysis requires a careful understanding of the underlying probabilistic model, which constrains the extraction and interpretation of information from observational data and must first be checked under controlled conditions. However, these very first principles of statistical analysis are often neglected in favor of the superficial and automatic application of increasingly available ready-to-use software, which can result in misleading conclusions that confuse the effect of model constraints with meaningful properties of the process of interest. To illustrate the consequences of this approach, we consider the emerging research area of so-called ‘compound events’, defined as combinations of multiple drivers and/or hazards that contribute to hydro-climatological risk. In particular, we perform an independent validation analysis of a statistical testing procedure applied to binary series describing the joint occurrence of hydro-climatological events or extreme values, a procedure claimed to be superior to classical analysis based on the Pearson correlation coefficient. To this aim, we propose a theoretically grounded model relying on the Pearson correlation coefficient and the marginal rates of occurrence, which accurately reproduces the observed joint behavior of binary series and offers a sound simulation tool for informing risk assessment procedures. Our discussion of compound events highlights the dangers of renaming known topics, using imprecise definitions, and overlooking or misusing existing statistical methods. In turn, our model-based approach shows that consistent statistical analyses should rely on informed stochastic modeling in order to avoid both the proposal of flawed methods and the premature dismissal of well-devised theories.
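To make the role of these quantities concrete, note that for two binary (Bernoulli) series with marginal occurrence rates p1 and p2, the Pearson correlation rho fixes the joint occurrence probability exactly: P(X=1, Y=1) = p1*p2 + rho*sqrt(p1*(1-p1)*p2*(1-p2)). The sketch below is only an illustration of how a model built on these three quantities can be used for simulation; it is not the implementation described in the paper, and the function name simulate_binary_pair and its arguments are hypothetical.

```python
import numpy as np

def simulate_binary_pair(p1, p2, rho, n, seed=None):
    """Simulate n draws of two correlated 0/1 series with marginal
    occurrence rates p1, p2 and Pearson correlation rho, using the
    bivariate Bernoulli distribution implied by those quantities."""
    s = np.sqrt(p1 * (1 - p1) * p2 * (1 - p2))
    p11 = p1 * p2 + rho * s          # both events occur
    p10 = p1 - p11                   # only the first occurs
    p01 = p2 - p11                   # only the second occurs
    p00 = 1 - p11 - p10 - p01        # neither occurs
    probs = np.array([p11, p10, p01, p00])
    if np.any(probs < -1e-12):
        raise ValueError("rho is not attainable for these marginal rates")
    probs = np.clip(probs, 0, None)
    probs /= probs.sum()

    rng = np.random.default_rng(seed)
    cells = rng.choice(4, size=n, p=probs)   # 0:(1,1) 1:(1,0) 2:(0,1) 3:(0,0)
    x = np.isin(cells, [0, 1]).astype(int)
    y = np.isin(cells, [0, 2]).astype(int)
    return x, y

# Example: 5% and 8% occurrence rates, Pearson correlation 0.3
x, y = simulate_binary_pair(0.05, 0.08, 0.3, n=100_000, seed=1)
print(x.mean(), y.mean(), np.corrcoef(x, y)[0, 1], (x & y).mean())
```

In this construction the empirical marginal rates, Pearson correlation, and joint occurrence rate of the simulated series converge to the prescribed values, which is the sense in which marginal rates plus Pearson correlation suffice to reproduce the joint behavior of binary series.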