Context. Reconstructing images from very long baseline interferometry (VLBI) data with a sparse sampling of the Fourier domain (uv-coverage) constitutes an ill-posed deconvolution problem. It requires robust algorithms that maximize the information extracted from all of the sampled spatial scales and minimize the influence of the unsampled scales on image quality. Aims. We develop a new multiscale wavelet deconvolution algorithm, DoG-HiT, for imaging sparsely sampled interferometric data, which combines difference of Gaussian (DoG) wavelets and hard image thresholding (HiT). Based on DoG-HiT, we propose a multistep imaging pipeline for the analysis of interferometric data. Methods. DoG-HiT applies a compressed sensing approach to imaging by employing a flexible DoG wavelet dictionary designed to adapt smoothly to the uv-coverage. Initially, it uses only closure properties as data fidelity terms and performs nonconvex, nonsmooth optimization with an amplitude-conserving and total-flux-conserving hard-thresholding splitting. DoG-HiT calculates a multiresolution support as a by-product. The final reconstruction is refined through self-calibration loops and imaging with amplitude and phase information, applied to the multiresolution support only. Results. We demonstrate the stability of DoG-HiT and benchmark its performance against image reconstructions made with the CLEAN and regularized maximum-likelihood (RML) methods using synthetic data. The comparison shows that DoG-HiT matches the super-resolution achieved by the RML reconstructions and surpasses the sensitivity to extended emission reached by CLEAN. Conclusions. The application of regularized maximum-likelihood methods, outfitted with flexible multiscale wavelet dictionaries, to imaging of interferometric data matches the performance of state-of-the-art convex optimization imaging algorithms and requires fewer prior and user-defined constraints.
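To make the dictionary and thresholding steps concrete, below is a minimal Python sketch of a difference-of-Gaussian decomposition with per-scale hard thresholding, embedded in a single forward-backward-style iteration against a generic least-squares visibility term. The scale widths, the MAD-based threshold rule, and the simple data term are illustrative assumptions, not the published DoG-HiT pipeline (which fits closure quantities first and adapts the dictionary to the uv-coverage).

# Minimal sketch of a difference-of-Gaussian (DoG) wavelet decomposition with
# hard thresholding, in the spirit of DoG-HiT.  Illustration only: the scale
# widths, threshold rule, and quadratic data term are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_decompose(img, sigmas):
    """Split an image into DoG detail scales plus a smooth residual.
    The sum telescopes, so dog_reconstruct(dog_decompose(img, s)) == img."""
    scales, previous = [], img
    for sm in (gaussian_filter(img, s) for s in sigmas):
        scales.append(previous - sm)   # detail captured between two widths
        previous = sm
    scales.append(previous)            # coarsest (fully smoothed) plane
    return scales

def dog_reconstruct(scales):
    return np.sum(scales, axis=0)

def hard_threshold(scales, k=3.0):
    """Zero out coefficients below k times a per-scale MAD noise estimate."""
    out = []
    for c in scales[:-1]:
        sigma = 1.4826 * np.median(np.abs(c - np.median(c)))
        out.append(np.where(np.abs(c) > k * sigma, c, 0.0))
    out.append(scales[-1])             # keep the smooth plane untouched
    return out

def fb_step(img, F, v, step, sigmas):
    """One forward-backward style iteration for a generic data term
    ||F x - v||^2, with F a (hypothetical) sampled Fourier matrix and v the
    measured visibilities; the gradient step is followed by DoG thresholding."""
    grad = (F.conj().T @ (F @ img.ravel() - v)).real.reshape(img.shape)
    scales = dog_decompose(img - step * grad, sigmas)
    return dog_reconstruct(hard_threshold(scales))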
The next-generation Event Horizon Telescope (ngEHT) will be a significant enhancement of the Event Horizon Telescope (EHT) array, with ∼10 new antennas and instrumental upgrades of existing antennas. The increased uv-coverage, sensitivity, and frequency coverage allow a wide range of new science opportunities to be explored. The ngEHT Analysis Challenges have been launched to inform the development of the ngEHT array design, science objectives, and analysis pathways. For each challenge, synthetic EHT and ngEHT datasets are generated from theoretical source models and released to the challenge participants, who analyze the datasets using image reconstruction and other methods. The submitted analysis results are evaluated with quantitative metrics. In this work, we report on the first two ngEHT Analysis Challenges. These have focused on static and dynamical models of M87* and Sgr A* and shown that high-quality movies of the extended jet structure of M87* and near-horizon hourly timescale variability of Sgr A* can be reconstructed by the reference ngEHT array in realistic observing conditions using current analysis algorithms. We identify areas where there is still room for improvement of these algorithms and analysis strategies. Other science cases and arrays will be explored in future challenges.
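As an illustration of the kind of quantitative metric commonly used to compare a submitted reconstruction with a ground-truth model, the sketch below computes a peak normalized cross-correlation between two images. Which metrics the challenges actually use, and how they are weighted, is not specified here; this particular choice is an assumption made for illustration only.

# Hedged example of one common image-comparison metric: the normalized
# cross-correlation between a reconstruction and the ground-truth model,
# maximized over relative pixel shifts.
import numpy as np
from scipy.signal import fftconvolve

def nxcorr(recon, truth):
    """Peak normalized cross-correlation over all relative pixel shifts."""
    a = (recon - recon.mean()) / (recon.std() * recon.size)
    b = (truth - truth.mean()) / truth.std()
    corr = fftconvolve(a, b[::-1, ::-1], mode="full")  # correlation via convolution
    return corr.max()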
In the past few years, the Event Horizon Telescope (EHT) has provided the first-ever event-horizon-scale images of the supermassive black holes (BHs) M87* and Sagittarius A* (Sgr A*). The next-generation EHT (ngEHT) project is an extension of the EHT array that promises higher angular resolution and higher sensitivity to the dim, extended flux around the central ring-like structure, possibly connecting the accretion flow and the jet. The ngEHT Analysis Challenges aim to understand how well science can be extracted from synthetic images and movies, in order to inform the ngEHT array design and the development of analysis algorithms. In this work, we compare the accretion flow structure and dynamics in numerical fluid simulations that specifically target M87* and Sgr A* and were used to construct the source models in the challenge set. We consider (1) a steady-state axisymmetric radiatively inefficient accretion flow model with a time-dependent shearing hotspot, (2) two time-dependent single-fluid general relativistic magnetohydrodynamic (GRMHD) simulations from the H-AMR code, (3) a two-temperature GRMHD simulation from the BHAC code, and (4) a two-temperature radiative GRMHD simulation from the KORAL code. We find that the different models exhibit remarkably similar temporal and spatial properties, except for the electron temperature, since radiative losses substantially cool down electrons near the BH and in the jet sheath, signaling the importance of radiative cooling even for slowly accreting BHs such as M87*. We restrict ourselves to standard torus accretion flows and leave larger explorations of alternative accretion models to future work.
Context. Imaging in radio astronomy is an ill-posed inverse problem. However, with the increasing sensitivity and capabilities of telescopes, several strategies have been developed to solve this challenging problem. In particular, novel algorithms have recently been proposed using (constrained) nonlinear optimization and Bayesian inference. Aims. The Event Horizon Telescope (EHT) Collaboration convincingly investigated the fidelity of their image reconstructions with large surveys, solving the image reconstruction problem with different optimization parameters. For existing methods, this strategy faces a limitation when imaging active galactic nuclei: large surveys over different optimization parameters are computationally expensive and time-consuming. We present a novel nonconvex, multiobjective optimization modeling approach that yields a different type of claim on the recovered solutions and may provide a pathway to overcome this limitation. Methods. To this end, we use a multiobjective version of the genetic algorithm (GA): the Multiobjective Evolutionary Algorithm Based on Decomposition (MOEA/D). The GA strategies explore the objective function through evolutionary operations to find the different local minima and to avoid becoming trapped in saddle points. Results. First, we tested our algorithm (MOEA/D) using synthetic data based on the 2017 EHT array and a possible EHT plus next-generation EHT configuration. We successfully recover a fully evolved Pareto front of nondominated solutions for these examples. The Pareto front divides into clusters of image morphologies representing the full set of locally optimal solutions. We discuss approaches to find the most natural guess among these solutions and demonstrate its performance on synthetic data. Finally, we apply MOEA/D to the 2017 EHT observations of the black hole shadow in Messier 87. Conclusions. MOEA/D is very flexible, faster than Bayesian methods, and explores more solutions than regularized maximum-likelihood methods. We present this new algorithm in two papers: in the first, we explain the basic idea behind multiobjective optimization and MOEA/D and use it to recover static images; in the second, we extend the algorithm to dynamic reconstructions and to static and dynamic polarimetric reconstructions.
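To illustrate the decomposition idea behind MOEA/D, the following minimal Python sketch evolves a population of solutions for a toy two-objective problem (a quadratic misfit traded against an L1 penalty), using uniformly spread weight vectors, a Tchebycheff scalarization, and neighborhood-based replacement. The toy objectives and all hyperparameters are assumptions for illustration, not the imaging setup applied to the EHT data.

# Minimal MOEA/D-style sketch for a two-objective problem.  The toy objectives
# (quadratic misfit vs. L1 penalty on a small parameter vector), Tchebycheff
# scalarization and hyperparameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
dim, n_sub, n_neigh, n_gen = 8, 30, 5, 200
target = rng.normal(size=dim)                        # stands in for "data"

def objectives(x):
    return np.array([np.sum((x - target) ** 2),      # data-fidelity proxy
                     np.sum(np.abs(x))])              # sparsity regularizer

# Uniform weight vectors on the 2-objective simplex and their neighborhoods.
w = np.stack([np.linspace(0, 1, n_sub), 1 - np.linspace(0, 1, n_sub)], axis=1)
neigh = np.argsort(((w[:, None] - w[None]) ** 2).sum(-1), axis=1)[:, :n_neigh]

pop = rng.normal(size=(n_sub, dim))                   # one solution per subproblem
fit = np.array([objectives(x) for x in pop])
z = fit.min(axis=0)                                   # reference (ideal) point

def tcheby(f, lam):
    return np.max(lam * np.abs(f - z))

for _ in range(n_gen):
    for i in range(n_sub):
        a, b = rng.choice(neigh[i], size=2, replace=False)
        child = 0.5 * (pop[a] + pop[b]) + 0.1 * rng.normal(size=dim)  # crossover + mutation
        fc = objectives(child)
        z = np.minimum(z, fc)                          # update ideal point
        for j in neigh[i]:                             # replace worse neighbors
            if tcheby(fc, w[j]) <= tcheby(fit[j], w[j]):
                pop[j], fit[j] = child, fc

# 'pop' now approximates a Pareto front trading data fit against sparsity.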
Context. Very long baseline interferometry (VLBI) is a radio-astronomical technique in which the correlated signal from various baselines is combined into an image of the highest possible angular resolution. Due to the sparsity of the measurements, this imaging procedure constitutes an ill-posed inverse problem. For decades, the CLEAN algorithm has been the standard choice in VLBI studies, despite suffering from several serious disadvantages and pathologies that are exposed by the requirements of modern frontline VLBI applications. Aims. We developed a novel multiscale CLEAN deconvolution method (DoB-CLEAN) based on continuous wavelet transforms that addresses several pathologies of CLEAN imaging. We benchmarked this novel algorithm against CLEAN reconstructions on synthetic data and reanalyzed RadioAstron observations of BL Lac with DoB-CLEAN. Methods. The DoB-CLEAN method approaches the image via multiscalar and multidirectional wavelet dictionaries. Two different dictionaries were used: 1) a difference of elliptical spherical Bessel function dictionary fitted to the uv-coverage of the observation, used to sparsely represent the features in the dirty image; 2) a difference of elliptical Gaussian wavelet dictionary, well suited to represent relevant image features cleanly. The deconvolution was performed by switching between the dictionaries. Results. DoB-CLEAN achieves super-resolution compared to CLEAN and remedies the spurious regularization properties of CLEAN. In contrast to CLEAN, the representation via basis functions has a physical meaning, so the deconvolved image still fits the observed visibilities. Conclusions. State-of-the-art multiscalar imaging approaches seem to outperform single-scalar standard approaches in VLBI and are well suited to maximize the extraction of information in ongoing frontline VLBI applications.
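The dictionary-switching idea can be sketched as a simple multiscale CLEAN loop: one filter bank is used to locate peaks in the residual, another to accumulate the model components. In the placeholder below, both banks are isotropic Gaussians and the gain and stopping rule are arbitrary assumptions; the published DoB-CLEAN method instead uses elliptical difference-of-Bessel and difference-of-Gaussian wavelets fitted to the uv-coverage.

# Minimal sketch of a multiscale CLEAN loop with an "analysis" step (find the
# strongest smoothed peak) and a "model" step (accumulate a scaled component),
# echoing the two-dictionary idea.  Filters, gain and stopping rule are
# placeholders, not the published DoB-CLEAN dictionaries.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def multiscale_clean(dirty, dirty_beam, sigmas, gain=0.1, n_iter=500):
    residual = dirty.copy()
    model = np.zeros_like(dirty)
    for _ in range(n_iter):
        # Analysis: smooth the residual at each scale, pick the strongest peak.
        smoothed = [gaussian_filter(residual, s) for s in sigmas]
        peaks = [np.unravel_index(np.argmax(np.abs(sm)), sm.shape) for sm in smoothed]
        vals = [sm[p] for sm, p in zip(smoothed, peaks)]
        k = int(np.argmax(np.abs(vals)))
        y, x = peaks[k]
        amp = gain * vals[k]

        # Model: add a scale-k component; subtract it, convolved with the
        # dirty beam, from the residual.
        component = np.zeros_like(dirty)
        component[y, x] = amp
        component = gaussian_filter(component, sigmas[k])
        model += component
        residual -= fftconvolve(component, dirty_beam, mode="same")

        if np.abs(vals[k]) < 3 * np.std(residual):    # crude stopping criterion
            break
    return model, residual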