Abstract

In this introduction we provide an overview of the papers accepted for publication in the special issue on light detection and ranging (lidar). Four of the papers were published in JOSA A, and four were published in JOSA B. Together they represent different aspects of this important and fast-growing field and showcase state-of-the-art achievements in lidar science and engineering.

© 2021 Optica Publishing Group

The world is in the midst of a broad transformation in which our daily environment is becoming "smarter" and more autonomous. To implement such capabilities, our homes, offices, and cars are embedded with a wide variety of sensors capable of providing the relevant data processing unit with enough information to properly comprehend the situation and environment. One major sensing component that plays a significant role mainly, but not only, in autonomous cars is the light detection and ranging (lidar) device [1]. Lidar technologies are commonly used for high-resolution photonic sensing in many fields, such as geology, archaeology, seismology, agriculture, atmospheric sciences, and control and navigation [2]. They are used in both terrestrial and airborne operation scenarios. The beauty of this emerging field is that it is positioned at the interface between high industrial applicability and enormous scientific impact [3,4].

This special issue is devoted to a selection of papers dealing with different aspects of this important applied field and to showcasing the state-of-the-art achievements in lidar science and engineering. The special issue contains eight papers, published in JOSA A and JOSA B.

In JOSA A, one of the papers analyzes the impact of the spatial resolution of lidars when used for vehicle detection applications. The authors examine the effects of resolution on vehicle detection performance for both lidar and radar sensors and propose sub-sampling methods that improve the performance and efficiency of deep-neural-network-based solutions, offering an alternative to traditional sensor-design tradeoffs.
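As a simple illustration of point-cloud sub-sampling in general (not the specific methods proposed in the paper), a voxel-grid down-sampler keeps one return per occupied cell; the voxel size and toy data below are assumptions made for the sketch:

```python
import numpy as np

def voxel_subsample(points: np.ndarray, voxel: float) -> np.ndarray:
    """Keep the first point falling in each occupied voxel of edge `voxel`."""
    keys = np.floor(points / voxel).astype(np.int64)
    # np.unique over rows returns the index of the first point in each voxel
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

# Toy cloud: two near-duplicate returns plus one distant point.
pts = np.array([[0.00, 0.0, 0.0],
                [0.01, 0.0, 0.0],   # same 0.1 m voxel as the first point
                [1.00, 1.0, 1.0]])
print(voxel_subsample(pts, 0.1))    # two points survive
```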

Another paper applies structured-illumination imaging with an array of integrated optical phased arrays in order to obtain high resolution. The authors demonstrate Fourier-basis imaging in 1D using a six-element array of optical phased arrays, which interfere pairwise to sample up to 11 different spatial Fourier components and reconstruct a 1D delta-function target.
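The reconstruction idea can be sketched as follows: each interfering aperture pair measures one spatial Fourier component of the scene, and summing the corresponding basis functions recovers the target. The 11-frequency count comes from the text; the scene size, sampled frequencies, and target position are purely illustrative:

```python
import numpy as np

N = 64
scene = np.zeros(N)
scene[20] = 1.0                       # 1D delta-function target
x = np.arange(N)
freqs = np.arange(11)                 # 11 sampled spatial frequencies

# "Measure" one Fourier component per frequency, as an interfering
# aperture pair would, then reconstruct by summing the basis functions.
coeffs = [scene @ np.exp(-2j * np.pi * f * x / N) for f in freqs]
recon = np.real(sum(c * np.exp(2j * np.pi * f * x / N)
                    for f, c in zip(freqs, coeffs)))
print(int(np.argmax(recon)))          # peak lands at the target position, 20
```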

The performance of underwater lidar is often limited by scattering. One of the papers addresses this with the spatial and temporal filtering needed to enhance the quality of the captured data. The combined spatial and temporal filtering method improves the performance of underwater lidar systems beyond what either method provides independently.
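A toy model illustrates why the combination helps: a temporal (range) gate and a spatial (field-stop) filter each reject part of the scattered background, and their intersection rejects more than either alone. All numbers below are invented for this sketch, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Signal photons cluster in arrival time and angle; scattered background
# spreads over both (a toy model, not the paper's data).
t_sig = rng.normal(100.0, 1.0, 200)        # ns, target return
r_sig = np.abs(rng.normal(0.0, 0.5, 200))  # mrad, near-axis
t_bg = rng.uniform(0.0, 200.0, 2000)
r_bg = np.abs(rng.normal(0.0, 5.0, 2000))

t = np.concatenate([t_sig, t_bg])
r = np.concatenate([r_sig, r_bg])
is_sig = np.arange(t.size) < t_sig.size

temporal = np.abs(t - 100.0) < 3.0         # range (time) gate
spatial = r < 1.5                          # aperture / field stop
for name, keep in [("temporal", temporal), ("spatial", spatial),
                   ("combined", temporal & spatial)]:
    print(f"{name:9s}: background photons passed = {(~is_sig & keep).sum()}")
```

Since the combined filter is the intersection of the two, it can only pass fewer background photons than either filter alone, while the well-clustered signal is largely retained.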

Another paper analyzes object-recognition performance in the viewpoint-independent imaging case when reduced-dimensionality data are used. The author demonstrates viewpoint-independent recognition using lidar data sets from two vehicles and a simple algorithm for a two-class recognition problem, finding that point-separation histograms have good potential for viewpoint-independent recognition over a hemisphere.
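The appeal of such a descriptor can be illustrated with a minimal sketch: pairwise point separations are invariant to rotation and translation, so their histogram is a natural viewpoint-independent signature. The bin count, range, and random cloud below are assumptions for the sketch, not the paper's settings:

```python
import numpy as np

def separation_histogram(points, bins=32, max_sep=6.0):
    """Normalized histogram of all pairwise point separations."""
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    seps = d[np.triu_indices(len(points), k=1)]   # each pair counted once
    hist, _ = np.histogram(seps, bins=bins, range=(0.0, max_sep))
    return hist / hist.sum()

rng = np.random.default_rng(0)
cloud = rng.normal(size=(50, 3))
theta = 0.7                                       # arbitrary rotation angle
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
h1 = separation_histogram(cloud)
h2 = separation_histogram(cloud @ Rz.T)           # same object, rotated view
print(np.allclose(h1, h2))                        # rotation leaves it unchanged
```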

In JOSA B, one of the papers addresses the technological barrier to achieving both high-speed and wide-angle 3D imaging. The authors describe a novel interleaved scanner for an eye-safe 3D scanning lidar system to measure aerodynamic phenomena in a wind tunnel using elastic backscatter from seeding particles. The scanner assembly consists of a rotating polygon scanner for line scanning along the fast axis, a galvanometer scanner for scanning along the slow axis, angular position sensors, and motor controllers.

Another paper focuses on small-angle scattering and presents quantum parametric mode sorting realized in this case. Quantum parametric mode sorting has been shown to enable photon counting with precise time gating and exceptional noise rejection that significantly exceeds what is possible with linear filters. Previous experimental demonstrations used a collinear optical configuration. To apply the technique in remote sensing missions, the authors investigate its response to off-axis scattering. Their results show no measurable degradation in detecting non-collinear photons along both directions.

Another paper deals with frequency-modulated continuous wave (FMCW) lidar incorporating a distributed feedback (DFB) laser that performs linear frequency sweeping. The authors present, both theoretically and experimentally, an iterative algorithm for generating a linear frequency sweep of the DFB laser at 1550 nm. The algorithm is convergent and reaches the desired sweep linearity within only a few iterations.
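The flavor of such iterative sweep linearization can be sketched with a toy model: pre-distort the drive waveform by the measured sweep error and repeat until the residual nonlinearity is negligible. The quadratic tuning nonlinearity below is an illustrative stand-in, not the DFB physics or the paper's actual algorithm:

```python
import numpy as np

# Toy tuning model: the sweep frequency responds to the drive with a mild
# quadratic nonlinearity (an illustrative stand-in for the DFB laser).
def laser_response(drive):
    return drive + 0.1 * drive ** 2

target = np.linspace(0.0, 1.0, 256)   # desired linear frequency ramp
drive = target.copy()                 # first guess: drive with the target

for it in range(8):
    err = target - laser_response(drive)   # "measured" sweep error
    drive += err                           # pre-distort the drive waveform
    print(f"iteration {it}: max |error| = {np.abs(err).max():.1e}")
```

Because the toy response is close to linear, the fixed-point iteration contracts and the residual error shrinks by roughly a constant factor per pass, mirroring the "few iterations" convergence reported in the paper.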

One additional paper discusses the use of a train of laser pulses for direct time-domain measurements. The authors developed an amplified frequency-shifting loop with a bi-directionally operated acousto-optic modulator as the in-loop frequency shifter, generating a frequency-stepped pulse train (FSPT) whose frequency spacing is twice the modulator's operation frequency. Based on this configuration, an FSPT with 51 equidistant optical frequencies and 800 MHz frequency spacing around 1572 nm was generated, covering the whole R16 absorption peak of ${\rm CO}_{2}$ for direct spectral measurement in the time domain.
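The quoted numbers can be checked with simple arithmetic; note that the 400 MHz single-pass modulator shift below is inferred from the doubled 800 MHz spacing, not stated explicitly in the text:

```python
# Consistency check of the pulse-train parameters quoted above.
aom_drive_hz = 400e6                   # inferred single-pass AOM shift
spacing_hz = 2 * aom_drive_hz          # bi-directional pass doubles the shift
n_lines = 51
span_hz = (n_lines - 1) * spacing_hz
print(f"spacing: {spacing_hz / 1e6:.0f} MHz, span: {span_hz / 1e9:.0f} GHz")
# → spacing: 800 MHz, span: 40 GHz
```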

We hope that the collection of papers presented in this special issue will benefit readers and contribute to the significant progress that the field of lidar is experiencing.

REFERENCES

1. R. T. H. Collis, “Lidar,” Appl. Opt. 9, 1782–1788 (1970).

2. R. O. Dubayah and J. B. Drake, “Lidar remote sensing for forestry,” J. For. 98(6), 44–46 (2000).

3. F. G. Fernald, “Analysis of atmospheric lidar observations: some comments,” Appl. Opt. 23, 652–653 (1984).

4. S. Manivasagam, S. Wang, K. Wong, W. Zeng, M. Sazanovich, S. Tan, B. Yang, W.-C. Ma, and R. Urtasun, “LiDARsim: realistic LiDAR simulation by leveraging the real world,” in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020), pp. 11167–11176.
