Advances in in-situ marine life imaging have significantly increased the size and quality of available datasets, but automatic image analysis has not kept pace. Machine learning has shown promise for image processing, but its effectiveness is limited by several open challenges: the requirement for large expert-labeled training datasets, disagreement among expert annotators, under-representation of rare species, and unreliable or overconfident predictions. To overcome these obstacles for automated underwater imaging, we combine and test recent developments in deep classifier networks and self-supervised feature learning. We use unlabeled images to pretrain deep neural networks to extract task-relevant image features, allowing learning algorithms to cope with scarce expert labels, and carefully evaluate performance on subsequent label-based tasks. Performance on rare classes is improved by applying data rebalancing together with a Bayesian correction that avoids biasing inferred in-situ class frequencies. A divergence-based loss allows training on multiple, conflicting labels for the same image, leading to better estimates of uncertainty, which we quantify with a novel accuracy measure. Together, these techniques can reduce the required label counts roughly 100-fold while maintaining the accuracy of standard supervised training, shorten training time, cope with expert disagreement, and reduce overconfidence.
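Two of the techniques named above can be sketched compactly. The following is a minimal NumPy illustration, not the authors' implementation: a divergence-based loss that compares the model's prediction against the empirical distribution of several (possibly conflicting) expert labels, and a Bayes-rule prior correction that maps probabilities learned on a rebalanced training set back to in-situ class frequencies. All function names, the choice of KL divergence, and the specific correction formula are assumptions for illustration.

```python
import numpy as np

def soft_label_kl_loss(pred_probs, expert_labels, n_classes, eps=1e-12):
    """KL divergence between the empirical label distribution formed from
    several expert annotations of one image and the model's predicted
    class probabilities. Conflicting labels become a soft target."""
    counts = np.bincount(expert_labels, minlength=n_classes)
    target = counts / counts.sum()            # empirical distribution over experts
    pred = np.clip(pred_probs, eps, 1.0)
    mask = target > 0                         # 0 * log(0) := 0
    return float(np.sum(target[mask] * np.log(target[mask] / pred[mask])))

def prior_corrected_probs(pred_probs, train_prior, true_prior, eps=1e-12):
    """Reweight class probabilities from a model trained on a rebalanced
    dataset back to the in-situ class frequencies (Bayes' rule with the
    training prior divided out and the true prior multiplied in)."""
    w = np.asarray(true_prior) / np.clip(np.asarray(train_prior), eps, None)
    p = np.asarray(pred_probs) * w
    return p / p.sum()

# Three experts label one image: two say class 2, one says class 0.
loss = soft_label_kl_loss(np.array([0.1, 0.1, 0.8]), np.array([2, 2, 0]), 3)

# A model trained on a 50/50 rebalanced set, corrected to a 90/10 in-situ prior.
corrected = prior_corrected_probs(np.array([0.2, 0.8]), [0.5, 0.5], [0.9, 0.1])
```

In this sketch, full expert agreement makes the soft target one-hot and the loss reduces to ordinary cross-entropy-style behavior; disagreement keeps probability mass on several classes, which discourages overconfident predictions.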
Quantifying the error of predictions in earth system models is just as important as the quality of the predictions themselves. While machine learning methods become better by the day at emulating weather and climate forecasting systems, they are rarely used operationally. Two reasons for this are poor handling of extreme events and a lack of uncertainty quantification. The poor handling of extreme events can mainly be attributed to loss functions that emphasize accurate prediction of mean outcomes. Since extreme events are infrequent in climate and weather applications, capturing them accurately is not a natural consequence of minimizing such a loss. Uncertainty quantification for numerical weather prediction usually proceeds by creating an ensemble of predictions. The machine learning domain has adopted this to some extent, creating machine learning ensembles with multiple architectures trained on the same data or the same architecture trained on altered datasets. Nevertheless, few approaches currently exist for tuning a deep learning ensemble.

We introduce a new approach using a generative neural network, similar to those employed in adversarial learning, but we replace the discriminator with a new loss function. This gives us control over the statistical properties the generator should learn and greatly increases the stability of the training process. By generating a prediction ensemble during training, we can tune ensemble properties such as variance or skewness in addition to the mean. Early results of this approach will be demonstrated using simple 1D experiments, showing the advantage over classically trained neural networks. We highlight in particular the task of predicting extremes and the added value of ensemble predictions. Additionally, we demonstrate predictions of a Lorenz-96 system to show skill in forecasting chaotic systems.
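The core idea of replacing the discriminator with a loss on ensemble statistics can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' loss: it penalizes the error of the ensemble mean against the observation and the deviation of the ensemble variance from a desired spread; the function name, the squared-error form, and the weighting coefficients are all hypothetical, and higher moments such as skewness could be added as further terms in the same way.

```python
import numpy as np

def moment_matching_loss(ensemble, y_true, target_var, w_mean=1.0, w_var=0.1):
    """Statistics-based training loss replacing a GAN discriminator.

    ensemble:   array of shape (n_members, n_samples), the generated
                prediction ensemble for a batch of targets.
    y_true:     array of shape (n_samples,), the observed outcomes.
    target_var: desired ensemble variance (scalar or per-sample array).

    Returns a weighted sum of the ensemble-mean error and the
    ensemble-variance mismatch, so both can be tuned during training.
    """
    ens_mean = ensemble.mean(axis=0)          # mean across ensemble members
    ens_var = ensemble.var(axis=0)            # spread across ensemble members
    mean_term = np.mean((ens_mean - y_true) ** 2)
    var_term = np.mean((ens_var - target_var) ** 2)
    return w_mean * mean_term + w_var * var_term

# Two members that straddle the truth with unit spread incur zero loss.
ens = np.array([[1.0, -1.0],
                [-1.0, 1.0]])
loss = moment_matching_loss(ens, np.zeros(2), target_var=1.0)
```

Because the loss is an explicit, differentiable function of the generated ensemble, it can drive gradient-based training of the generator directly, which is what makes the procedure more stable than the adversarial min-max game it replaces.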