Abstract
Shack-Hartmann wavefront sensing (SHWFS) is a technique for measuring wavefront aberrations whose use in adaptive optics relies on fast position tracking of an array of spots. These sensors conventionally use frame-based cameras operating at a fixed sampling rate to report pixel intensities, even though only a fraction of the pixels carry signal. Prior in-lab experiments have shown the feasibility of event-based cameras for SHWFS, asynchronously reporting the spot locations as log-intensity changes on a microsecond time scale. In our work, we propose a convolutional neural network (CNN) called event-based wavefront network (EBWFNet) that achieves highly accurate estimation of the spot centroid position in real time. We developed custom Shack-Hartmann wavefront sensing hardware with a common aperture for synchronized frame- and event-based cameras, so that spot centroid locations computed from the frame-based camera may be used to train and test the event-CNN-based centroid position estimation method in an unsupervised manner. Field testing with this hardware allows us to conclude that the proposed EBWFNet achieves sub-pixel accuracy in real-world scenarios, with substantial improvement over the state-of-the-art event-based SHWFS. An ablation study reveals the impact of data processing, CNN components, and the training cost function; and an unoptimized MATLAB implementation is shown to run faster than 800 Hz on a single GPU.
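The frame-based centroid tracking the abstract refers to is conventionally an intensity-weighted center-of-mass computation over each lenslet subaperture; a minimal illustrative sketch (not the paper's method, and the function name and 5×5 example are hypothetical):

```python
import numpy as np

def centroid(subaperture):
    """Intensity-weighted center of mass of one Shack-Hartmann
    subaperture image. Returns (row, col) in pixels, sub-pixel precision."""
    img = np.asarray(subaperture, dtype=float)
    total = img.sum()
    if total == 0:
        raise ValueError("no signal in subaperture")
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

# A spot straddling two pixels gives a sub-pixel estimate.
spot = np.zeros((5, 5))
spot[2, 2] = spot[2, 3] = 1.0
print(centroid(spot))  # (2.0, 2.5)
```

The displacement of each such centroid from its reference position is proportional to the local wavefront slope over that subaperture, which is what an adaptive-optics loop consumes.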
© 2024 Optica Publishing Group