Abstract

3D gated range-intensity correlation imaging (GRICI) can reconstruct a 3D scene with high range resolution in real time. However, in underwater range-gated imaging applications, targets with low reflectivity or at a far distance typically have a low signal-to-noise ratio (SNR), especially in turbid water. Usually, a global threshold is set to suppress noise in gated images, which easily produces data holes in the degraded depth map reconstructed by 3D GRICI. To solve this problem, we propose a binning-based local-threshold filtering (BLF) algorithm to fill depth data holes. Firstly, the raw gated images are added to obtain a sum image, and global threshold filtering and pixel binning of the sum image yield a reference image. Secondly, hole pixels are registered according to the reference image. Finally, a smaller local threshold is reset for the hole pixels, and a repaired depth map is given by 3D GRICI. The experimental results show that the proposed method can effectively fill depth data holes: by suppressing water noise, the peak signal-to-noise ratio of the repaired depth map is increased from 10.23 dB to 22.45 dB, and the range resolution is improved from 6.25 mm to 4.12 mm. In addition, the 3D depth of field is enlarged, and the limit detection distance is increased by 21% in the experiment. This research can promote the practical applications of 3D GRICI.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

3D range-gated imaging [1–5] not only has the characteristics of high spatial resolution and long detection range, but can also acquire target 3D information with high range resolution in real time. It has been used in underwater target detection [2,4], navigation [6], and marine scientific research [4,5]. Up to now, 3D range-gated imaging has mainly included the time slicing method [7] and gated range-intensity correlation imaging (GRICI), also known as super-resolution 3D laser imaging [1,3]. Compared with the time slicing method, GRICI has a lower data volume and higher range resolution, and has great potential in underwater imaging. However, due to strong light scattering in turbid water, gated images still have low SNR and low target-to-background contrast, especially for low-reflectivity or distant targets. Most of the water background noise can be removed by threshold filtering in 3D GRICI, but this leaves holes without depth data in the depth map. These holes affect target detection and recognition. Therefore, it is necessary to fill depth holes and repair degraded depth maps.

Up to now, many researchers have made important contributions to the reparation of depth maps in different 3D imaging techniques. J. Park et al. repaired depth maps by adding edge information and weights to non-local mean filtering for 3D time-of-flight cameras [8]. J. Liu et al. improved the fast marching method, using color images as guiding information for hole filling in Kinect depth maps [9]. M. Camplani et al. iteratively applied an adaptively weighted joint bilateral filter to repair depth maps of low-cost depth cameras [10]. In recent years, recovering target depth images based on single-photon counting in extreme underwater environments has been a hot topic [11–13]. However, the methods mentioned above are not fully applicable to depth maps acquired by 3D GRICI. Therefore, a binning-based local-threshold filtering (BLF) algorithm is proposed for 3D GRICI in this paper.

The remainder of this paper is organized as follows. In Section 2, we describe the BLF method in detail. Section 3 shows the experiments of the proposed method, and Section 4 gives conclusions.

2. Binning-based local-threshold filtering (BLF) algorithm

At present, two kinds of 3D GRICI have been developed: trapezoidal GRICI [1] and triangular GRICI [3]. In this paper, we mainly focus on the enhancement of triangular GRICI, since the range resolution of the triangular algorithm is higher than that of the trapezoidal algorithm [3,14]. Figure 1(a) shows a typical range-gated imaging system, which includes a pulsed laser, a gated camera and a timing control unit. The timing control unit synchronizes the pulsed laser and the gated camera to obtain raw 2D gated images IA and IB with triangular range-intensity profiles (RIPs) by triangular GRICI. Both targets and background noise in IA and IB participate in the reconstruction of the raw depth map. Without noise filtering in the raw depth map, targets are easily submerged in background noise, as illustrated in Fig. 1(b). Generally, a global threshold Gth is set to filter noise in IA and IB to form the new information images InewA and InewB, but the pixels with gray values below the threshold are filtered out, which easily results in depth data holes and generates the degraded depth map in Fig. 1(b), especially when targets have low reflectivity, are far from the imaging system, or lie at the boundary near RA and RB.


Fig. 1. (a) The principle of underwater 3D triangular GRICI, (b) Target submerged by water noise without threshold filtering in the raw depth map, and depth data holes in degraded depth map under global threshold filtering.


In this paper, the BLF algorithm is proposed to repair depth data holes for 3D GRICI. As shown in Fig. 2(a), to repair data holes in a degraded depth map, a reference image is first formed: IA and IB are added and filtered by a global threshold of $\sqrt 2 {G_{th}}$ to form a sum image Isum, and the reference image Iref is then obtained by pixel binning of the sum image. As shown in Fig. 2(b), the hole pixels which need to be filled for targets in the degraded depth map can be registered from the reference image. A smaller local threshold is reset for the hole pixels, and a repaired depth map is reconstructed from the corresponding pixels of IA and IB by 3D GRICI.


Fig. 2. BLF algorithm to repair degraded depth map of 3D GRICI, (a) Flow chart of BLF algorithm according to a reference image, (b) Hole pixel registration processing to repair degraded depth map based on pixel binning.


The key to this method is to obtain a reference image containing all targets to guide the reparation of the degraded depth map. Since the gated images have triangular RIPs, there are low-intensity parts at the end of the tail of IA and at the beginning of the head of IB. These parts are easily affected by noise and lost in global threshold filtering, and the pixel gray value in the new information image is expressed as

$${I_{\textrm{new}A}}(i,j) = \begin{cases} 0, & I_A(i,j) < G_{th}\\ I_A(i,j), & I_A(i,j) \ge G_{th} \end{cases}$$
where InewA(i, j) is the pixel gray value in the new information image InewA, IA(i, j) is the pixel gray value of the raw 2D gated image IA, and pixels with gray values below the global threshold Gth are set to zero in InewA. InewB(i, j) has the same expression and processing. Equation (1) shows that the noise in InewA and InewB is eliminated, but target information with low gray values is eliminated along with it, as seen in Fig. 2(a).
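As a minimal NumPy sketch (not the authors' implementation), Eq. (1) can be applied per pixel as follows; the function name and example values are illustrative:

```python
import numpy as np

def global_threshold(I, G_th):
    """Eq. (1): set pixels with gray values below the global threshold G_th to zero."""
    I_new = I.copy()
    I_new[I < G_th] = 0
    return I_new

# Illustrative 8-bit gated image and threshold
I_A = np.array([[10, 20], [15, 40]], dtype=np.uint8)
I_newA = global_threshold(I_A, G_th=15)  # -> [[0, 20], [15, 40]]
```

The pixel at gray value 10 is zeroed, while pixels at or above the threshold survive unchanged.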

To retrieve the lost target information, the raw 2D gated images IA and IB are added and filtered to obtain a sum image with a trapezoidal RIP as

$${I_{sum}}(i,j) = \begin{cases} 0, & I_A(i,j) + I_B(i,j) < \sqrt{2}\,G_{th}\\ I_A(i,j) + I_B(i,j), & \sqrt{2}\,G_{th} \le I_A(i,j) + I_B(i,j) \le 255\\ 255, & I_A(i,j) + I_B(i,j) > 255 \end{cases}$$
where Isum(i, j) is the pixel gray value in the sum image. The global filtering threshold of (IA + IB) is $\sqrt 2 {G_{th}}$, since the background noise levels in IA and IB are approximately equal [7]. Compared with IA and IB, the RIP of Isum is trapezoidal, and the enhanced gray values preserve more complete target information.
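A hedged sketch of Eq. (2) in NumPy (the function name and sample arrays are illustrative, and 8-bit images are assumed):

```python
import numpy as np

def sum_image(I_A, I_B, G_th):
    """Eq. (2): add the gated images, zero sums below sqrt(2)*G_th,
    and saturate at the 8-bit maximum of 255."""
    s = I_A.astype(np.int32) + I_B.astype(np.int32)  # avoid uint8 overflow
    s[s < np.sqrt(2) * G_th] = 0
    return np.clip(s, 0, 255).astype(np.uint8)

I_A = np.array([[10, 200], [5, 120]], dtype=np.uint8)
I_B = np.array([[12, 100], [4, 120]], dtype=np.uint8)
I_sum = sum_image(I_A, I_B, G_th=15)  # threshold sqrt(2)*15 ~ 21.2
```

Note the cast to a wider integer type before adding: summing two uint8 images directly would wrap around instead of saturating at 255.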

Although the target information in Isum is enhanced, it is still not enough for gated images with low SNR. Pixel binning performed directly on the CCD is known as hardware binning [15,16]. This is also the basis of our method: an image with low SNR and missing target information can be improved by pixel binning. Therefore, the sum image Isum is further subjected to pixel binning to form the reference image Iref. Assume that Isum is composed of M×N pixels; Iref is obtained from Isum via u×v binning, so it is composed of (M×N)/(u×v) pixels. Through pixel binning, Iref has higher gray values and more complete target information, which can be used to guide the filling of holes in the degraded depth map.
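The u×v binning step can be sketched as below; summing (rather than averaging) each block is one choice consistent with the stronger gray values described above, and the function name is illustrative:

```python
import numpy as np

def bin_pixels(I, u=2, v=2):
    """u x v pixel binning: each output pixel accumulates a u x v block,
    so an M x N image becomes (M/u) x (N/v) with boosted gray values."""
    M, N = I.shape
    M, N = M - M % u, N - N % v          # crop so the shape divides evenly
    return I[:M, :N].reshape(M // u, u, N // v, v).sum(axis=(1, 3))

I_sum = np.arange(16).reshape(4, 4)
I_ref = bin_pixels(I_sum)                # 4x4 -> 2x2, each entry a 2x2 block sum
```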

When filling holes, the first important step is finding the positions of the hole pixels, which we call hole pixel registration, as shown in Fig. 2(b). To determine whether a pixel in the degraded depth map is a hole to be filled, the following conditions must be satisfied simultaneously.

Firstly, all pixels in the degraded depth map that do not have depth data are called candidate pixels, and they are given by

$${D_{\deg }}(i,j) = 0$$
where Ddeg(i, j) is the pixel depth value of the degraded depth map; hole pixels must be selected among these candidate pixels. Secondly, according to Fig. 2(b), one pixel of Iref is binned from 2×2 pixels of Isum, and Isum has the same size as the degraded depth map. Therefore, 2×2 pixels of the degraded depth map correspond to one pixel of Iref. If the gray value of the pixel in Iref is not zero, the corresponding 2×2 pixels in the degraded depth map should be hole pixels that need to be filled with depth data. This condition is given by
$${I_{ref}}(\textrm{ceil}(i/u),\textrm{ceil}(j/v)) > 0$$
where ceil() is the round-up function. Finally, when filling a hole, 3D GRICI is used to recalculate the depth data, which requires that the gray values in the raw 2D images IA and IB both be greater than zero, as given by
$${I_A}(i,j) > 0$$
$${I_B}(i,j) > 0$$
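Combining Eqs. (3)–(6), the registration test can be vectorized over the whole map. This is a sketch under the paper's assumptions (same-size images, u×v binning); the names and the tiny example arrays are illustrative:

```python
import numpy as np

def register_holes(D_deg, I_ref, I_A, I_B, u=2, v=2):
    """Mark hole pixels that satisfy Eqs. (3)-(6) simultaneously:
    zero depth, a non-zero binned reference pixel, and non-zero gray
    in both raw gated images."""
    M, N = D_deg.shape
    # With 0-based array indices, floor(i/u) plays the role of
    # ceil(i/u) for the paper's 1-based pixel coordinates.
    ref_up = I_ref[np.arange(M) // u][:, np.arange(N) // v]
    return (D_deg == 0) & (ref_up > 0) & (I_A > 0) & (I_B > 0)

D_deg = np.array([[0, 5], [0, 0]])   # two candidate rows contain zeros
I_ref = np.array([[30]])             # one binned reference pixel, non-zero
I_A   = np.array([[3, 0], [2, 1]])
I_B   = np.array([[4, 9], [0, 2]])
mask = register_holes(D_deg, I_ref, I_A, I_B)
```

Only pixels passing all four tests are registered; for instance, the pixel with IB = 0 is excluded even though its depth is missing.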

According to the hole pixel registration processing, one can find hole pixels by traversing the entire degraded depth map. When hole pixels are found, a smaller local threshold Lth is reset for them, and the depth data is recalculated by 3D GRICI as shown in

$${D_{\textrm{hole}}}(i,j) = \frac{{{\tau _A}c}}{{2n}} + \frac{{{I_B}(i,j)}}{{{I_A}(i,j) + {I_B}(i,j)}}\frac{{{t_L}c}}{{2n}},\quad {I_A}(i,j) \ge {L_{th}}\ \textrm{and}\ {I_B}(i,j) \ge {L_{th}}$$
where Dhole(i, j) is the depth data used to fill the hole, τA is the time delay between the laser pulse and the gate pulse, c is the speed of light in vacuum, tL is the laser pulse width, which equals the gate width, IA(i, j) is the pixel gray value of the raw 2D gated image IA with a time delay of τA, IB(i, j) is the gray value of the gated image IB with a time delay of (τA + tL), and n is the refractive index of water. Different from the global threshold Gth applied to all pixels of the raw 2D images IA and IB, the local threshold Lth applies only to the regions of hole pixels. Since the gray values must both be greater than zero when using 3D GRICI, Lth is typically set to 1.
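A per-pixel sketch of Eq. (7), with n = 1.33 as an assumed refractive index of fresh water (the function name and sample values are illustrative):

```python
def fill_hole_depth(I_A, I_B, tau_A, t_L, n=1.33, L_th=1):
    """Eq. (7): recompute depth for a hole pixel when both gated
    intensities reach the local threshold L_th; returns None otherwise."""
    c = 3.0e8  # speed of light in vacuum, m/s
    if I_A >= L_th and I_B >= L_th:
        return tau_A * c / (2 * n) + I_B / (I_A + I_B) * t_L * c / (2 * n)
    return None

# Example with delay/gate values of the same order as the experiment:
# tau_A = 25 ns, t_L = 3 ns gives a depth near 3 m in water.
d = fill_hole_depth(I_A=8, I_B=8, tau_A=25e-9, t_L=3e-9)
```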

When all the holes are filled, a complete repaired depth map is obtained. Compared to other hole-filling methods, our method directly calculates the depth data rather than estimating it. When all holes are filled, the data at the end of the tail of IA and the beginning of the head of IB are retrieved, and this filling process increases the 3D depth of field (DOF) of 3D GRICI. In addition, after filling the holes, the detection distance of range-gated imaging is also improved. The improvements brought by the proposed method are verified in detail in the next section.

3. Experimental results and discussion

For experimental research, an underwater GRICI system named Fengyan was established, as shown in Fig. 3; its internal structure is shown in Fig. 1(a). A 532 nm laser is used as the illuminator in Fengyan, with a full width at half-maximum (FWHM) of 1 ns at a typical pulse repetition frequency of 30 kHz. For the gated camera, a gated intensifier is coupled to a CCD with 1360×1024 pixels. The timing control unit, realized by a field-programmable gate array, provides the desired time sequence for the pulsed laser and the gated camera.


Fig. 3. Experiment setup and targets, (a) Underwater GRICI system of Fengyan, (b) Pool for experiment, (c) Standard pyramid target, (d) Whiteboard target.


In Fig. 3, the standard pyramid target is made of plastic with 11 steps, each 1 cm high and 1 cm wide. The top four steps of the pyramid target are covered with low-reflectivity green tape. The size of the whiteboard is 44×44 cm, and its material and surface reflectivity are the same as those of the pyramid target. The pool has a length of 20 m and a width of 1 m. In the pool, the pyramid target is at a distance of 3 m from Fengyan. Under the multi-pulse time delay integration method [17], the equivalent laser pulse width and the gate width are both 3 ns. The frame rate of the CCD is 10 frames per second. The time delays are set to 25 ns and 28 ns for the adjacent raw 2D gated images IA and IB. For comparison, we also obtained raw 2D images of the pyramid target without the low-reflectivity tape under the same parameters.

As shown in Figs. 4(a) and 4(b), the gray values of the steps with green tape are significantly lower than those of the other steps due to low reflectivity, especially in the raw image IB. At the same time, one can see from Fig. 4(e) that the raw gated image contains obvious water background noise, and the gray value of the middle region of the intensity profile is close to the water noise level. Although the depth data of the steps with green tape is intact in the raw depth map, water noise appears around the pyramid target. To remove the water noise, a global threshold of Gth = 15 is set to obtain the new information images InewA and InewB by threshold filtering in Figs. 4(c) and 4(d). From Fig. 4(f), most of the water noise around the target is removed by threshold filtering, but so is the low-gray target information. The degraded depth map in Fig. 4(h) is obtained from InewA and InewB by 3D GRICI, and there are many holes in the middle of the low-reflectivity area. To retrieve the depth information lost in the middle of the degraded depth map, a reference image is obtained as shown in Fig. 5(b). The raw gated images IA and IB are added and then filtered with a threshold of 21 (i.e., $\sqrt 2 {G_{th}}$) to obtain the sum image Isum in Fig. 5(a). Finally, 2×2 binning is applied to Isum to obtain the reference image, which shows that the lost data in the middle of the image has been retrieved.


Fig. 4. Experiment results of low-reflectivity targets, (a) Raw gated image IA, (b) Raw gated image IB, (c) Information image InewA, (d) Information image InewB, (e) Intensity profile of row 470 of IB along the dotted line, (f) Intensity profile of row 470 of InewB along the dotted line, (g) Raw depth map without threshold filtering, (h) Degraded depth map with depth holes by threshold filtering.



Fig. 5. Sum image and reference image, (a) Sum image Isum, (b) Reference image Iref, (c) Intensity profile of row 470 of Isum along the dotted line, (d) Intensity profile of row 470 of Iref along the dotted line.


Finally, after finding the depth holes which need to be filled, we set the local threshold Lth = 1 and obtain the depth data of Fig. 6(c) using 3D GRICI. It shows that the depth data lost in the middle of the degraded depth map in Fig. 4(h) is retrieved, and there is no water noise around the pyramid target.


Fig. 6. Comparative experiment results, (a) Pyramid target without green tape, (b) Depth map of pyramid target without green tape, (c) Repaired depth map with the proposed BLF algorithm, (d) Repaired depth map with JBF algorithm, (e) Range profiles of the row 470 of depth maps of Figs. 4(h), 6(b), 6(c) and 6(d).


Figure 6(b) shows the depth map of the pyramid target of Fig. 6(a) without the green tape. Comparing Fig. 6(c) and Fig. 6(b), the quality of the depth map repaired by our method is similar to that of the depth map without the tape. We also compare our method with the joint bilateral filtering (JBF) algorithm [18], one of the most widely used algorithms in other depth-image filling applications. The repaired depth map with JBF is shown in Fig. 6(d); compared with Fig. 6(b), the depth data in the middle of Fig. 6(d) are wrong.

Compared with the real depth map of Fig. 6(b), the evaluation parameters of the depth maps of Figs. 4(h), 6(c) and 6(d) are shown in Table 1. The range resolution of Fengyan is improved from 6.25 mm to 4.12 mm, and the peak signal-to-noise ratio (PSNR) of the depth map is increased from 10.23 dB to 22.45 dB. From the range profiles of row 470 in Fig. 6(e), one can clearly see that the 3D DOF of Fengyan is also increased by 3 cm after filling the holes in the degraded depth map with the proposed algorithm instead of traditional JBF. Compared with JBF, our method fills the depth data holes of the degraded depth map more accurately and effectively.


Table 1. Pixel number of depth data, range resolution and PSNR of degraded depth map and repaired depth maps when compared with depth map of pyramid target without green tape in Fig. 6(b)


Fig. 7. Repaired results of depth maps from 11 m to 14 m, (a) Raw depth maps without threshold filtering, (b) Degraded depth maps with global threshold filtering, (c) Repaired depth maps with BLF.


To verify the repair effect of the BLF algorithm on underwater 3D imaging at the limit detection distance of Fengyan, we replaced the pyramid target with the whiteboard target in the experimental scene of Fig. 3. The laser pulse width, pulse repetition frequency, gate width, frame rate and gain of the Fengyan system were unchanged. We gradually moved the whiteboard away from Fengyan and continually adjusted the delay of the system. When the signal intensity of the whiteboard received by Fengyan equals the water noise intensity, it is impossible to obtain 3D information by 3D GRICI with threshold filtering; at this point, the limit detection distance of Fengyan is considered reached, and the raw images IA and IB were acquired. Then, the whiteboard target was gradually moved closer to Fengyan, the raw images IA and IB were captured every 1 m, and capturing stopped when the whiteboard target could be clearly detected. In the experiment, the limit detection distance of Fengyan under water was 14 m, and at 11 m the target could be clearly detected.

In Fig. 7, one can clearly see that some whiteboard information is visible without threshold filtering, but there is much water noise around it. However, the 3D information of the whiteboard is lost from 12 m to 14 m when the water noise is filtered out with Gth = 5. After using the proposed BLF method to fill the holes and enhance the 3D information of the target, the repaired depth map at 14 m is similar to the degraded depth map at 11 m. Thus, in the experiment the proposed method increases the limit detection distance of Fengyan from 11 m to 14 m, an increase of 21%.

4. Conclusion

In this paper, we proposed a BLF algorithm which can accurately fill the depth data holes of depth maps acquired by 3D GRICI. The method directly calculates the depth data rather than estimating it, so the filled data is more accurate. Experimental results have shown that the proposed method can effectively fill holes caused by low reflectivity, and the PSNR, range resolution and DOF of the repaired depth map are all improved for the standard pyramid target. The experimental results for the whiteboard target show that the limit detection distance is also increased by the proposed algorithm. Owing to its characteristics, the BLF algorithm can also be applied to trapezoidal 3D GRICI. We believe that the proposed BLF algorithm has potential to promote the practical applications of 3D GRICI in underwater imaging and remote surveillance [19,20].

Funding

National Natural Science Foundation of China (NSFC) (61875189); National Key Research and Development Program of China (2016YFC0500103); Strategic Priority Program of the Chinese Academy of Sciences (XDC03060103, XDA22030205); Youth Innovation Promotion Association of the Chinese Academy of Sciences (2017155).

Acknowledgment

The authors acknowledge the financial funding of this work.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. M. Laurenzis, F. Christnacher, and D. Monnin, “Long-range three-dimensional active imaging with super resolution depth mapping,” Opt. Lett. 32(21), 3146–3148 (2007). [CrossRef]  

2. F. Christnacher, M. Laurenzis, D. Monnin, G. Schmitt, N. Metzger, S. Schertzer, and T. Scholtzl, “3D laser gated viewing from a moving submarine platform,” Proc. SPIE 9250, 92500F (2014). [CrossRef]  

3. X. W. Wang, Y. F. Li, and Y. Zhou, “Triangular-range-intensity profile spatial-correlation method for 3D super-resolution range-gated imaging,” Appl. Opt. 52(30), 7399–7406 (2013). [CrossRef]  

4. X. W. Wang, L. Sun, P. S. Lei, S. T. Fan, H. Dong, Y. Q. Yang, X. Zhong, J. N. Chen, J. He, and Y. Zhou, “Underwater 3D triangular range-intensity correlation imaging beyond visibility range (invited),” Infrared and Laser Eng. 47(9), 903001 (2018). [CrossRef]  

5. P. Mariani, I. Quincoces, K. H. Haugholt, Y. Chardard, A. W. Visser, C. Yates, G. Piccinno, G. Reali, P. Risholm, and J. T. Thielemann, “Range-Gated Imaging System for Underwater Monitoring in Ocean Environment,” Sustainability 11(1), 162 (2018). [CrossRef]  

6. D. Monnin, G. Schmitt, C. Fischer, M. Laurenzis, and F. Christnacher, “Active-imaging-based underwater navigation,” Proc. SPIE 9649, 96490H (2015). [CrossRef]  

7. J. Busck and H. Heiselberg, “Gated viewing and high-accuracy three-dimensional laser radar,” Appl. Opt. 43(24), 4705–4710 (2004). [CrossRef]  

8. J. Park, H. Kim, Y. W. Tai, M. S. Brown, and I. Kweon, “High quality depth map upsampling for 3D-TOF cameras,” International Conference on Computer Vision, Barcelona, 1623–1630 (2011).

9. J. Liu, X. Gong, and J. Liu, “Guided inpainting and filtering for Kinect depth maps,” Proceedings of the 21st International Conference on Pattern Recognition, Tsukuba, 2055–2058 (2012).

10. M. Camplani and L. Salgado, “Adaptive spatio-temporal filter for low-cost camera depth maps,” IEEE International Conference on Emerging Signal Processing Applications, 33–36 (2012).

11. A. Halimi, A. Maccarone, A. McCarthy, S. McLaughlin, and G. S. Buller, “Object Depth Profile and Reflectivity Restoration From Sparse Single-Photon Data Acquired in Underwater Environments,” IEEE Trans. Comput. Imaging 3(3), 472–484 (2017). [CrossRef]  

12. Y. Altmann, X. Ren, A. McCarthy, G. S. Buller, and S. McLaughlin, “Lidar Waveform-Based Analysis of Depth Images Constructed Using Sparse Single-Photon Data,” IEEE Trans. on Image Process. 25(5), 1935–1946 (2016). [CrossRef]  

13. A. Maccarone, A. McCarthy, X. Ren, R. E. Warburton, A. M. Wallace, J. Moffat, Y. Petillot, and G. S. Buller, “Underwater depth imaging using time-correlated single-photon counting,” Opt. Express 23(26), 33911–33926 (2015). [CrossRef]  

14. X. D. Zhang, Y. L. Wu, H. F. Chen, and H. M. Yan, “High-resolution three-dimensional active imaging with uniform distance resolution,” Opt. Commun. 312, 47–51 (2014). [CrossRef]  

15. H. Nasibov, A. Kholmatov, B. Akselli, A. Nasibov, and S. Baytaroglu, “Performance Analysis of the CCD Pixel Binning Option in Particle-Image Velocimetry Measurements,” IEEE/ASME Trans. Mechatron. 15(4), 527–540 (2010). [CrossRef]  

16. C. Draijer, F. Polderdijk, A. van der Heide, B. Dillen, W. Klaassens, and J. Bosiers, “A 28 mega pixel large area full frame CCD with 2×2 on-chip RGB charge-binning for professional digital still imaging,” IEEE International Electron Devices Meeting, IEDM Technical Digest, Washington, DC4, 810 (2005).

17. X. W. Wang, Y. F. Li, and Y. Zhou, “Multi-pulse time delay integration method for flexible 3D super-resolution range-gated imaging,” Opt. Express 23(6), 7820–7831 (2015). [CrossRef]  

18. J. Kopf, M. F. Cohen, D. Lischinski, and M. Uyttendaele, “Joint bilateral upsampling,” ACM SIGGRAPH 2007 papers, 2007.

19. B. Göhler and P. Lutzmann, “Review on short-wavelength infrared laser gated-viewing at Fraunhofer IOSB,” Opt. Eng. 56(3), 031203 (2016). [CrossRef]  

20. X. W. Wang, L. Sun, X. Zhong, P. S. Lei, H. Dong, S. T. Fan, J. N. Chen, Y. Q. Yang, J. He, Y. Zhou, P. Huang, and M. Z. Li, “3D NIR laser night vision based on gated range-intensity correlation imaging,” Proc. SPIE 11182, 111820I (2019). [CrossRef]  

References

  • View by:

  1. M. Laurenzis, F. Christnacher, and D. Monnin, “Long-range three-dimensional active imaging with super resolution depth mapping,” Opt. Lett. 32(21), 3146–3148 (2007).
    [Crossref]
  2. F. Christnacher, M. Laurenzis, D. Monnin, G. Schmitt, N. Metzger, S. Schertzer, and T. Scholtzl, “3D laser gated viewing from a moving submarine platform,” Proc. SPIE 9250, 92500F (2014).
    [Crossref]
  3. X. W. Wang, Y. F. Li, and Y. Zhou, “Triangular-range-intensity profile spatial-correlation method for 3D super-resolution range-gated imaging,” Appl. Opt. 52(30), 7399–7406 (2013).
    [Crossref]
  4. X. W. Wang, L. Sun, P. S. Lei, S. T. Fan, H. Dong, Y. Q. Yang, X. Zhong, J. N. Chen, J. He, and Y. Zhou, “Underwater 3D triangular range-intensity correlation imaging beyond visibility range(invited),” Infrared and Laser Eng. 47(9), 903001 (2018).
    [Crossref]
  5. P. Mariani, I. Quincoces, K. H. Haugholt, Y. Chardard, A. W. Visser, C. Yates, G. Piccinno, G. Reali, P. Risholm, and J. T. Thielemann, “Range-Gated Imaging System for Underwater Monitoring in Ocean Environment,” Sustainability 11(1), 162 (2018).
    [Crossref]
  6. D. Monnin, G. Schmitt, C. Fischer, M. Laurenzis, and F. Christnacher, “Active-imaging-based underwater navigation,” Proc. SPIE 9649, 96490H (2015).
    [Crossref]
  7. J. Busck and H. Heiselberg, “Gated viewing and high-accuracy three-dimensional laser radar,” Appl. Opt. 43(24), 4705–4710 (2004).
    [Crossref]
  8. J. Park, H. Kim, Y. W. Tai, M. S. Brown, and I. Kweon, “High quality depth map upsampling for 3D-TOF cameras,” International Conference on Computer Vision, Barcelona, 1623–1630 (2011).
  9. J. Liu, X. Gong, and J. Liu, “Guided inpainting and filtering for Kinect depth maps,” Proceedings of the 21st International Conference on Pattern Recognition, Tsukuba, 2055–2058 (2012).
  10. M. Camplani and L. Salgado, “Adaptive spatio-temporal filter for low-cost camera depth maps,” IEEE International Conference on Emerging Signal Processing Applications, 33–36 (2012).
  11. A. Halimi, A. Maccarone, A. McCarthy, S. McLaughlin, and G. S. Buller, “Object Depth Profile and Reflectivity Restoration From Sparse Single-Photon Data Acquired in Underwater Environments,” IEEE Trans. Comput. Imaging 3(3), 472–484 (2017).
    [Crossref]
  12. Y. Altmann, X. Ren, A. McCarthy, G. S. Buller, and S. McLaughlin, “Lidar Waveform-Based Analysis of Depth Images Constructed Using Sparse Single-Photon Data,” IEEE Trans. on Image Process. 25(5), 1935–1946 (2016).
    [Crossref]
  13. A. Maccarone, A. McCarthy, X. Ren, R. E. Warburton, A. M. Wallace, J. Moffat, Y. Petillot, and G. S. Buller, “Underwater depth imaging using time-correlated single-photon counting,” Opt. Express 23(26), 33911–33926 (2015).
    [Crossref]
  14. X. D. Zhang, Y. L. Wu, H. F. Chen, and H. M. Yan, “High-resolution three-dimensional active imaging with uniform distance resolution,” Opt. Commun. 312, 47–51 (2014).
    [Crossref]
  15. H. Nasibov, A. Kholmatov, B. Akselli, A. Nasibov, and S. Baytaroglu, “Performance Analysis of the CCD Pixel Binning Option in Particle-Image Velocimetry Measurements,” IEEE/ASME Trans. Mechatron. 15(4), 527–540 (2010).
    [Crossref]
  16. C. Draijer, F. Polderdijk, A. van der Heide, B. Dillen, W. Klaassens, and J. Bosiers, “A 28 mega pixel large area full frame CCD with 2/spl times/2 on-chip RGB charge-binning for professional digital still imaging,” IEEE International Electron Devices Meeting, IEDM Technical Digest, Washington, DC4, 810 (2005).
  17. X. W. Wang, Y. F. Li, and Y. Zhou, “Multi-pulse time delay integration method for flexible 3D super-resolution range-gated imaging,” Opt. Express 23(6), 7820–7831 (2015).
    [Crossref]
  18. J. Kopf, M. F. Cohen, D. Lischinski, and M. Uyttendaele, “Joint bilateral upsampling,” ACM SIGGRAPH 2007 papers, 2007.
  19. B. Göhler and P. Lutzmann, “Review on short-wavelength infrared laser gated-viewing at Fraunhofer IOSB,” Opt. Eng. 56(3), 031203 (2016).
    [Crossref]
  20. X. W. Wang, L. Sun, X. Zhong, P. S. Lei, H. Dong, S. T. Fan, J. N. Chen, Y. Q. Yang, J. He, Y. Zhou, P. Huang, and M. Z. Li, “3D NIR laser night vision based on gated range-intensity correlation imaging,” Proc. SPIE 11182, 111820I (2019).
    [Crossref]


Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Figures (7)

Fig. 1. (a) The principle of underwater 3D triangular GRICI, (b) Target submerged by water noise in the raw depth map without threshold filtering, and depth data holes in the degraded depth map under global threshold filtering.
Fig. 2. BLF algorithm to repair the degraded depth map of 3D GRICI: (a) Flow chart of the BLF algorithm according to a reference image, (b) Hole-pixel registration to repair the degraded depth map based on pixel binning.
Fig. 3. Experimental setup and targets: (a) Underwater GRICI system of Fengyan, (b) Pool for the experiment, (c) Standard pyramid target, (d) Whiteboard target.
Fig. 4. Experimental results for low-reflectivity targets: (a) Raw gated image IA, (b) Raw gated image IB, (c) Information image InewA, (d) Information image InewB, (e) Intensity profile of row 470 of IB along the dotted line, (f) Intensity profile of row 470 of InewB along the dotted line, (g) Raw depth map without threshold filtering, (h) Degraded depth map with depth holes after threshold filtering.
Fig. 5. Sum image and reference image: (a) Sum image Isum, (b) Reference image Iref, (c) Intensity profile of row 470 of Isum along the dotted line, (d) Intensity profile of row 470 of Iref along the dotted line.
Fig. 6. Comparative experimental results: (a) Pyramid target without green tape, (b) Depth map of the pyramid target without green tape, (c) Repaired depth map with the proposed BLF algorithm, (d) Repaired depth map with the JBF algorithm, (e) Range profiles of row 470 of the depth maps in Figs. 4(h), 6(b), 6(c), and 6(d).
Fig. 7. Repaired results of depth maps from 11 m to 14 m: (a) Raw depth maps without threshold filtering, (b) Degraded depth maps with global threshold filtering, (c) Repaired depth maps with BLF.

Tables (1)

Table 1. Pixel number of depth data, range resolution, and PSNR of the degraded depth map and the repaired depth maps, compared with the depth map of the pyramid target without green tape in Fig. 6(b)

Equations (7)


$$I_{\mathrm{new}A}(i,j)=\begin{cases}0, & I_A(i,j)<G_{th}\\ I_A(i,j), & I_A(i,j)\ge G_{th}\end{cases}$$

$$I_{sum}(i,j)=\begin{cases}0, & I_A(i,j)+I_B(i,j)<2G_{th}\\ I_A(i,j)+I_B(i,j), & 2G_{th}\le I_A(i,j)+I_B(i,j)\le 255\\ 255, & I_A(i,j)+I_B(i,j)>255\end{cases}$$

$$D_{\deg}(i,j)=0$$

$$I_{ref}\big(\operatorname{ceil}(i/u),\operatorname{ceil}(j/v)\big)>0$$

$$I_A(i,j)>0$$

$$I_B(i,j)>0$$

$$D_{\mathrm{hole}}(i,j)=\frac{\tau_A c}{2n}+\frac{I_B(i,j)}{I_A(i,j)+I_B(i,j)}\cdot\frac{t_L c}{2n},\qquad I_A(i,j)\ge L_{th}\ \&\&\ I_B(i,j)\ge L_{th}$$
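As a rough illustration of how these equations combine, the sketch below chains the global threshold filtering, the saturated sum image, reshape-based pixel binning, hole registration, and the local-threshold depth refill. It is a minimal NumPy sketch, not the authors' implementation: the function name `blf_repair`, the array layout, and the zero-indexed binning (floor division in place of `ceil` on 1-based indices) are all assumptions, and the physical constants (gate delay `tau_A`, pulse width `t_L`, speed of light `c`, refractive index `n`) are placeholders to be supplied by the user.

```python
import numpy as np

def blf_repair(I_A, I_B, G_th, L_th, u, v, tau_A, t_L, c=2.25e8, n=1.0):
    """Hypothetical sketch of binning-based local-threshold filtering (BLF).

    I_A, I_B : raw 8-bit gated images (same shape)
    G_th     : global threshold; L_th : smaller local threshold (L_th < G_th)
    u, v     : binning factors for the reference image
    tau_A, t_L, c, n : gate delay, laser pulse width, light speed, refractive index
    """
    I_A = I_A.astype(np.float64)
    I_B = I_B.astype(np.float64)

    # Global threshold filtering of each raw gated image (information images)
    I_newA = np.where(I_A >= G_th, I_A, 0.0)
    I_newB = np.where(I_B >= G_th, I_B, 0.0)

    # Sum image: zero below 2*G_th, saturated at the 8-bit ceiling
    s = I_A + I_B
    I_sum = np.where(s < 2 * G_th, 0.0, np.minimum(s, 255.0))

    # Pixel binning: each u x v block of the sum image becomes one reference pixel
    H, W = I_sum.shape
    Hb, Wb = -(-H // u), -(-W // v)            # ceiling division for block counts
    padded = np.zeros((Hb * u, Wb * v))
    padded[:H, :W] = I_sum
    I_ref = padded.reshape(Hb, u, Wb, v).sum(axis=(1, 3))

    # Triangular GRICI depth from the filtered images; holes stay at zero
    denom = I_newA + I_newB
    valid = denom > 0
    D = np.zeros_like(I_A)
    D[valid] = tau_A * c / (2 * n) + I_newB[valid] / denom[valid] * t_L * c / (2 * n)

    # Register hole pixels: zero depth, nonzero reference bin, nonzero raw intensities
    i_idx, j_idx = np.indices(I_A.shape)
    hole = (D == 0) & (I_ref[i_idx // u, j_idx // v] > 0) & (I_A > 0) & (I_B > 0)

    # Refill registered holes that pass the smaller local threshold
    refill = hole & (I_A >= L_th) & (I_B >= L_th)
    D[refill] = (tau_A * c / (2 * n)
                 + I_B[refill] / (I_A[refill] + I_B[refill]) * t_L * c / (2 * n))
    return D
```

With `tau_A = 0`, `t_L = 1`, and `c/(2n) = 1`, the refilled depth of a hole pixel reduces to the intensity ratio `I_B / (I_A + I_B)`, which makes the local-threshold step easy to verify on toy inputs.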
