Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended temporal periods. However, a lack of effective automated analysis has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community, coral cover or depth zone classes. Supervised and unsupervised ML models were effective at identifying relevant classes and individual sites within these. We also compare three approaches for extracting feature embeddings from soundscape recordings: acoustic indices designed by ecologists, a pretrained convolutional neural network (P-CNN) and a CNN trained on each dataset (T-CNN). We find that P-CNNs present a powerful tool for soundscape ecologists owing to their strong performance and low computational cost. Our findings have implications for soundscape ecology in any habitat.
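To make the first feature-extraction approach concrete, the sketch below computes one widely used ecoacoustic descriptor, the Acoustic Complexity Index (ACI), from a spectrogram. This is a minimal illustration using NumPy and SciPy; the function name, parameters and normalisation are assumptions for demonstration, not the implementation used in this study.

```python
import numpy as np
from scipy import signal

def acoustic_complexity_index(audio, sr, nperseg=512):
    """Illustrative ACI: per frequency bin, sum the absolute intensity
    differences between adjacent time frames, normalise by the bin's
    total intensity, then sum across bins."""
    # Magnitude spectrogram: rows are frequency bins, columns are time frames
    f, t, Sxx = signal.spectrogram(audio, fs=sr, nperseg=nperseg)
    # Frame-to-frame intensity variation within each frequency bin
    diffs = np.abs(np.diff(Sxx, axis=1)).sum(axis=1)
    # Total intensity per bin (small constant guards against division by zero)
    totals = Sxx.sum(axis=1)
    return float((diffs / (totals + 1e-12)).sum())

# Synthetic example: 1 s of broadband noise at a 22.05 kHz sample rate
rng = np.random.default_rng(0)
aci = acoustic_complexity_index(rng.standard_normal(22050), 22050)
```

In practice, index-based pipelines compute a vector of such descriptors per recording window, which then serves as the feature embedding for the downstream classifiers.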