When the center of gravity is used to estimate the spot centroid in a Shack-Hartmann wavefront sensor, the measurement is corrupted by photon and detector noise. Parameters such as the window size often require careful optimization to balance noise error, dynamic range, and the linearity of the response coefficient under different photon fluxes. The method also has to be replaced by the correlation method for extended sources. We propose a centroid estimator based on stream processing, in which the center-of-gravity calculation window floats with the pixels streaming in from the detector. In comparison with conventional methods, we show that the proposed estimator simplifies the choice of optimized parameters, provides a unit-slope linear response, and reduces the influence of background and noise. The stream-based centroid estimator is also shown to work well for extended sources of limited size. A hardware implementation of the proposed estimator is discussed.
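For context, the baseline the abstract refers to is windowed center-of-gravity centroiding. The sketch below shows that baseline only, not the authors' stream-based estimator; the function name, the threshold parameter, and the background-subtraction step are illustrative assumptions.

```python
import numpy as np

def cog_centroid(spot, threshold=0.0):
    """Windowed center-of-gravity centroid of a spot image.

    This is the conventional baseline method, not the paper's
    stream-based estimator. `threshold` models a background floor
    subtracted before weighting (an illustrative choice).
    """
    img = np.clip(spot - threshold, 0.0, None)  # remove background floor
    total = img.sum()
    if total == 0:
        return np.nan, np.nan
    ys, xs = np.indices(img.shape)              # pixel coordinate grids
    # Intensity-weighted mean position in x and y.
    return (img * xs).sum() / total, (img * ys).sum() / total

# A synthetic single-pixel spot at column 3, row 2 of a 5x7 window:
spot = np.zeros((5, 7))
spot[2, 3] = 1.0
cx, cy = cog_centroid(spot)
```

Because every pixel in the window contributes, noise far from the spot biases the estimate, which is why the window size trades off noise error against dynamic range, as the abstract notes.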
The Zernike coefficients of a light wavefront can be calculated directly from the intensity ratios of pairs of spots in the reconstructed image plane of a holographic wavefront sensor (HWFS). However, the response curve of the HWFS depends heavily on the position and size of the detector window for each spot and on distortions introduced by other aberrations. In this paper, we propose a method that measures the intensity of each spot by setting a threshold to select effective pixels and computing the weighted average intensity within a selected window. Compared with integrating the intensity over a small window for each spot, we show through numerical simulation that the proposed method reduces the dependence of the HWFS's response curve on the choice of detector window. We also recorded an HWFS on a holographic plate using a blue laser and demonstrated its capability to detect the strength of encoded Zernike terms in an aberrated beam.
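The thresholded, weighted-average spot measure can be sketched as follows. The abstract does not specify the weighting, so this assumes each effective pixel is weighted by its own intensity; the function name and that choice are assumptions, not the authors' exact formulation.

```python
import numpy as np

def weighted_mean_intensity(window, threshold):
    """Weighted average intensity of the effective pixels in a detector
    window. Pixels at or below `threshold` are excluded; remaining
    pixels are weighted by their own intensity (one plausible reading
    of the abstract, not a confirmed implementation detail).
    """
    pix = window[window > threshold]      # keep only effective pixels
    if pix.size == 0:
        return 0.0
    return float((pix * pix).sum() / pix.sum())  # intensity-weighted mean

# Example: a 2x2 window with one sub-threshold pixel excluded.
w = np.array([[0.0, 1.0],
              [2.0, 3.0]])
val = weighted_mean_intensity(w, threshold=0.5)
```

Unlike a plain sum over the window, this measure changes only weakly when the window edge moves through near-zero background pixels, which is consistent with the claimed reduced dependence on window selection.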
An event-based image sensor works dramatically differently from conventional frame-based image sensors: it responds only to local brightness changes, whereas its counterparts output a linear representation of the illumination over a fixed exposure time. The output of an event-based image sensor is therefore an asynchronous stream of spatiotemporal event data tagged with the location, timestamp, and polarity of each triggered event. Compared with traditional frame-based image sensors, event-based image sensors offer high temporal resolution, low latency, high dynamic range, and low power consumption. Although event-based image sensors have been used in many computer vision, navigation, and even space situational awareness applications, little work has explored their applicability to wavefront sensing. In this work, we present the integration of an event camera into a Shack-Hartmann wavefront sensor and the use of event data to determine spot displacement and estimate the wavefront. We show that it achieves the same functionality at substantially higher speed and can operate in extremely low-light conditions. This makes an event-based Shack-Hartmann wavefront sensor a preferable choice for adaptive optics systems where the light budget is limited or high bandwidth is required.
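In standard Shack-Hartmann geometry, the spot displacement behind each lenslet maps to a local wavefront slope, independent of how the displacement was measured (frames or events). A minimal sketch of that mapping, with illustrative parameter names and values:

```python
def wavefront_slope(dx_pixels, pixel_pitch_m, focal_length_m):
    """Local wavefront slope (radians) behind one lenslet from the
    measured spot displacement. Standard small-angle Shack-Hartmann
    relation: slope = displacement / focal length.
    """
    return dx_pixels * pixel_pitch_m / focal_length_m

# Illustrative numbers: a 2-pixel shift on a 10 um pitch sensor
# behind a 5 mm focal-length lenslet gives a 4 mrad local slope.
slope = wavefront_slope(2, 10e-6, 5e-3)
```

The per-lenslet slopes are then assembled into a slope field and integrated (e.g., by zonal or modal reconstruction) to estimate the wavefront; the event camera changes only how `dx_pixels` is obtained.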