Environmental monitoring is a key component of understanding and managing ecosystems. Given that most monitoring efforts are still expensive and time-consuming, it is essential that monitoring programs are designed to be efficient and effective. In many situations, the expensive part of monitoring is not sample collection but sample processing, which leads to only a subset of the samples being processed. For example, sediment or ice cores can be quickly obtained in the field, but they require weeks or months of processing in a laboratory setting. Standard sub-sampling approaches often involve equally-spaced sampling. We use simulations to show how many samples, and which types of sampling approaches, are the most effective in detecting ecosystem change. We test these ideas with a case study of a Cladocera community assemblage reconstructed from a sediment core. We demonstrate that standard approaches to sample processing are less efficient than an iterative approach. For our case study, using an optimal sampling approach would have resulted in savings of 195 person-hours, or thousands of dollars in labor costs. We also show that, compared with these standard approaches, fewer samples are typically needed to achieve high statistical power. We explain how our approach can be applied to monitoring programs that rely on video records, eDNA, remote sensing, and other common tools that allow re-sampling.
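To make the contrast concrete, the two strategies can be sketched in code. This is a minimal illustration, not the paper's algorithm: the signal, thresholds, and sample budget are all hypothetical, and the iterative rule shown (refine the gap between the most dissimilar pair of already-processed slices) is just one plausible way to concentrate processing effort where change appears.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical core: 200 depth slices with an abrupt community shift at
# slice 120, plus measurement noise. (Illustrative data, not from the study.)
signal = np.concatenate([np.zeros(120), np.ones(80)]) + rng.normal(0, 0.1, 200)

def equally_spaced(n_total, n_samples):
    """Standard approach: indices of n_samples equally spaced slices."""
    return np.linspace(0, n_total - 1, n_samples).astype(int)

def iterative(signal, n_start=5, threshold=0.5, budget=20):
    """Iterative approach: start with a coarse equally-spaced grid, then
    repeatedly process the midpoint of the gap whose endpoints differ the
    most, stopping when no gap looks interesting or the budget is spent."""
    idx = sorted(equally_spaced(len(signal), n_start).tolist())
    while len(idx) < budget:
        diffs = [abs(signal[b] - signal[a]) for a, b in zip(idx, idx[1:])]
        k = int(np.argmax(diffs))
        if diffs[k] < threshold:
            break  # remaining gaps show little change
        mid = (idx[k] + idx[k + 1]) // 2
        if mid in idx:
            break  # gap cannot be refined further
        idx.insert(k + 1, mid)
    return idx

processed = iterative(signal)
print(f"{len(processed)} slices processed: {processed}")
```

With these toy settings the iterative scheme homes in on the transition near slice 120 using far fewer processed slices than an equally-spaced design of comparable resolution would require.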