Abstract

In this paper, we discuss the reconstruction and recognition of partially occluded objects using photon counting integral imaging (II). Irradiance scenes are numerically reconstructed for the reference target in three-dimensional (3D) space. Photon counting scenes are estimated for unknown input objects using maximum likelihood estimation (MLE) of the Poisson parameter. We propose nonlinear matched filtering in 3D space to recognize partially occluded targets. The recognition performance is substantially improved over nonlinear matched filtering of elemental images without 3D reconstruction. The discrimination capability is analyzed in terms of the Fisher ratio (FR) and receiver operating characteristic (ROC) curves.

© 2007 Optical Society of America

1. Introduction

Imaging systems that use a photon counting detector are advantageous for detection at very low light levels, requiring less power than conventional imaging systems. Photon counting imaging systems have been applied to various fields such as night vision, laser radar imaging, radiological imaging, stellar imaging, and medical imaging [1-10]. Automatic target recognition (ATR) [11-13] with a photon counting detector was proposed in [2]. Photon counting recognition techniques have been applied to infrared imaging [3]. Three-dimensional (3D) active sensing by LADAR has been researched in [4-7]. In [8], the Bhattacharyya distance was investigated for target detection. Recently, nonlinear matched filtering and photon counting linear discriminant analysis have been proposed in [9] and [10], respectively.

Integral imaging (II) records and visualizes 3D information using a lenslet array [14-22]. Captured elemental images have different perspectives of the objects. 3D scenes can be reconstructed optically or numerically. Partially obstructed objects can be visualized by computational reconstruction of sectional images at different depth levels. Research has been carried out to overcome partial occlusion for the purpose of object recognition [21,22]. Reconstruction of occluded objects using an array of cameras is reported in [23].

In this paper, we propose photon counting nonlinear matched filtering in 3D space. Photon counting nonlinear matched filtering measures the correlation between reconstructed irradiance scenes and estimated photon counting scenes in 3D space. We modify the computational reconstruction method [19] to generate irradiance scenes. The photon counting scenes are composed of the Poisson parameters estimated by maximum likelihood estimation (MLE). This nonlinear matched filtering has the same first- and second-order statistical properties as the ordinary nonlinear matched filtering performed without reconstruction [9]. We show that the proposed system substantially improves the recognition performance in terms of Fisher ratios and ROC curves.

The organization of the paper is as follows. In Sec. 2, we present the modified reconstruction method for irradiance scenes and the proposed reconstruction method for photon counting scenes. The nonlinear matched filtering in 3D space is proposed in Sec. 3. Photon counting images are simulated from experimentally derived irradiance information, and in Sec. 4 these simulated images are used to compare the performance of nonlinear matched filtering on reconstructed images in 3D space with that on elemental images. Conclusions follow in Sec. 5.

2. Computational reconstruction of integral imaging

In this section, the computational reconstruction of II is described. The estimation of photon counting scenes is also discussed.

2.1 Reconstruction of irradiance information

As illustrated in Fig. 1(a), the II recording system is composed of a micro-lens array, an imaging lens, and a CCD (charge-coupled device) camera. The micro-lens array consists of a large number of small convex lenslets. The ray information captured by each lenslet appears as a small two-dimensional (2D) image with a different point of view. The imaging lens is used to compensate for the short focal length of the micro-lenslets. Apart from the magnification factor and negligible aberration of the imaging lens, the image on the image plane of the micro-lens array and the image captured by the CCD camera are assumed to be identical.

 

Fig. 1. (a) II recording system, (b) II reconstruction model.


In the computational reconstruction of II, the volumetric information of 3D objects is numerically reconstructed by means of the ray projection method [19]. Elemental images are projected through a virtual pinhole array onto reconstruction planes at arbitrary depths. Therefore, volumetric scenes are reconstructed to reproduce the original 3D information.

Suppose that a point A in Fig. 1(b) is located at [iA, jA, zA] on the 3D object surface. The power density at point A is denoted as xA. Let xn be the captured irradiance corresponding to point A on the imaging plane of the n-th micro-lenslet. These points are located at [in, jn, gA], n=1, …, NA, and are assumed to have unit area; on is the center of the n-th micro-lenslet. Under the assumption that the distance zA between point A and the micro-lens array is large, the same power is transferred from xA and collected as x1, …, xNA; the scale factors between xA and xn, n=1, …, NA, are approximately the same. Therefore, xA can be estimated as proportional to the average of xn:

$$\hat{x}_A \approx \frac{1}{N_A}\sum_{n=1}^{N_A}x_n. \tag{1}$$
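As an illustration of this reconstruction step, the following minimal NumPy sketch back-projects the elemental images onto a plane at depth z and averages the overlapping contributions as in Eq. (1). The function name, the array layout, and the pixel-pitch parameterization are our own assumptions for illustration and are not taken from the implementation of Ref. [19].

import numpy as np

def reconstruct_plane(elemental, pitch_px, gap, z):
    """Back-project a (K, L, h, w) stack of elemental images onto the plane at
    depth z and average overlapping pixels, i.e. the estimate of Eq. (1).
    pitch_px is the lenslet pitch in sensor pixels; gap is the distance
    between the lenslet array and its image plane (same unit as z)."""
    K, L, h, w = elemental.shape
    shift = pitch_px * gap / z          # per-lenslet shift on the plane (pixels)
    H = int(h + shift * (K - 1)) + 1
    W = int(w + shift * (L - 1)) + 1
    acc = np.zeros((H, W))
    cnt = np.zeros((H, W))
    for k in range(K):
        for l in range(L):
            r0, c0 = int(round(k * shift)), int(round(l * shift))
            acc[r0:r0 + h, c0:c0 + w] += elemental[k, l]
            cnt[r0:r0 + h, c0:c0 + w] += 1
    return acc / np.maximum(cnt, 1)     # average where elemental images overlap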

2.2 Reconstruction of photon counting information

Assuming the fluctuations in irradiance are small compared to the fluctuations produced by the quantized nature of the radiation, the Poisson distribution is considered for the photon counting model. In this case, the Poisson parameter for the photon counts at each pixel can be assumed to be proportional to the irradiance of the pixel in the detector [9,10]. Therefore, the probability of a photon event at pixel i is given by

$$P_d\big(y(i);\lambda(i)\big)=\frac{\lambda(i)^{y(i)}\,e^{-\lambda(i)}}{y(i)!},\qquad y(i)=0,1,2,\ldots, \tag{2}$$
$$\lambda(i)=N_P\left[\frac{x(i)}{\sum_{i=1}^{N_T}x(i)}\right], \tag{3}$$

where y(i) is the number of photons detected at pixel i, NP is the expected number of photon counts in the scene, x(i) is the irradiance at pixel i, and NT is the total number of pixels in the scene. Let yn, n=1, …, NA, be the photon counts detected with the parameter λn associated with xn, and let λA be the parameter associated with xA. Under the assumption that x1, …, xNA are measurements of xA, the parameters λ1, …, λNA likewise originate from λA. Therefore, the joint probability density function of the photon counts is

$$P_d(y_1,\ldots,y_{N_A};\lambda_A)=\prod_{n=1}^{N_A}\frac{\lambda_A^{y_n}\,e^{-\lambda_A}}{y_n!}. \tag{4}$$

The MLE (maximum likelihood estimation) [24] of λA is

$$\hat{\lambda}_A=\frac{1}{N_A}\sum_{n=1}^{N_A}y_n. \tag{5}$$
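The photon counting simulation of Eqs. (2)-(3) and the per-voxel estimate of Eq. (5) can be sketched as follows. The MLE of λA is numerically the same averaging as Eq. (1) applied to the photon-count elemental images, so the sketch reuses the reconstruct_plane helper above; the function names and the global random generator are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def simulate_photon_counts(irradiance, n_p):
    """Draw a photon-count realization of an irradiance image, Eqs. (2)-(3):
    each pixel's Poisson parameter is its irradiance normalized over the scene
    and scaled by the expected total number of photons n_p (NP in the text)."""
    lam = n_p * irradiance / irradiance.sum()
    return rng.poisson(lam)

def estimate_photon_scene(photon_elemental, pitch_px, gap, z):
    """MLE of the Poisson parameter at every voxel of the plane at depth z,
    Eq. (5): the sample mean of the photon counts that back-project to the
    voxel, computed with the same averaging as Eq. (1)."""
    return reconstruct_plane(photon_elemental.astype(float), pitch_px, gap, z)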

3. Recognition of occluded objects from the photon counting scene

Matched filtering for photon counting recognition estimates the correlation between the irradiance image of a reference target and an unknown input image obtained during the photon counting event. The nonlinear matched filtering is defined as the nonlinear correlation normalized with the power ν of the photon counting image [9]:

$$C_{rs}(\nu)=\frac{\sum_{i=1}^{N_T}x_r(i)\,y_s(i)}{\left(\sum_{i=1}^{N_T}x_r(i)^2\right)^{1/2}\left(\sum_{i=1}^{N_T}y_s(i)\right)^{\nu}}, \tag{6}$$

where xr(i) is the irradiance at pixel i of the reference image of target r, and ys(i) is the photon count at pixel i of the unknown input object s. The first- and second-order characteristics of this nonlinear matched filtering are that the mean of Crs(1) is constant and the variance is approximately proportional to 1/NP [9].
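A direct sketch of Eq. (6) is given below, assuming the reference irradiance and the photon-count image are supplied as same-sized arrays; the helper name and interface are our own assumptions.

import numpy as np

def nonlinear_matched_filter(x_ref, y_photon, nu=1.0):
    """Nonlinear matched filtering of Eq. (6): correlation between the
    reference irradiance x_ref and the photon-count (or estimated-parameter)
    image y_photon, normalized by the reference energy and by the total count
    raised to the power nu."""
    x = np.asarray(x_ref, dtype=float).ravel()
    y = np.asarray(y_photon, dtype=float).ravel()
    return np.dot(x, y) / (np.sqrt(np.sum(x ** 2)) * np.sum(y) ** nu)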

We propose nonlinear matched filtering in 3D space as

$$D_{rs}(\nu)=\sum_{d=1}^{N_d}\left[\frac{\sum_{i\in\Omega_d}\hat{x}_r(i;z_d)\,\hat{\lambda}_s(i;z_d)}{\left(\sum_{i\in\Omega_d}\hat{x}_r(i;z_d)^2\right)^{1/2}\left(\sum_{i\in\Omega_d}\hat{\lambda}_s(i;z_d)\right)^{\nu}}\right], \tag{7}$$

where x̂r(i;zd) is the reconstructed irradiance information of the reference target at depth zd; i denotes the position of a virtual voxel on the reconstruction plane Ωd, so that voxels on the plane Ωd have the same longitudinal distance zd from the micro-lens array; and λ̂s(i;zd) is the estimated Poisson parameter on the reconstruction plane Ωd for the unknown input target. It is noted that the scaling factors between xA and xn, n=1, …, NA, are set to 1 without loss of generality for the reconstruction and recognition purposes.

Since there is no information about the object location in 3D space, the nonlinear matched filtering is performed over the entire reconstruction space Ωd, d=1, …, Nd, where Nd is the total number of depth levels. The nonlinearity determined by ν in the denominator of Eq. (7) yields the same first- and second-order characteristics as Eq. (6), since x̂r(i) and λ̂s(i) are merely linear combinations of the irradiance and the photon counts in the elemental images.
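Combining the sketches above, Eq. (7) can be evaluated plane by plane over the chosen depth levels. The function below is a hypothetical composition of the earlier helpers, not the authors' implementation.

def nonlinear_matched_filter_3d(ref_elemental, photon_elemental, depths,
                                pitch_px, gap, nu=1.0):
    """3D nonlinear matched filtering of Eq. (7): reconstruct the reference
    irradiance scene (Eq. (1)) and estimate the photon counting scene
    (Eq. (5)) at every depth z_d, correlate them with the normalization of
    Eq. (6), and sum over the N_d depth levels."""
    score = 0.0
    for z in depths:
        x_hat = reconstruct_plane(ref_elemental, pitch_px, gap, z)
        lam_hat = estimate_photon_scene(photon_elemental, pitch_px, gap, z)
        score += nonlinear_matched_filter(x_hat, lam_hat, nu)
    return score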

We employ the Fisher ratio (FR) as the performance metric [3,9]. Receiver operating characteristic (ROC) curves are also presented in the next section to investigate the discrimination capability of the proposed system.
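For completeness, a minimal sketch of how we assume these two metrics are computed from the matched-filter outputs follows: the usual two-class Fisher ratio (squared difference of class means over the sum of class variances, as commonly used in [3,9]) and a threshold-sweep ROC. The function names are illustrative.

import numpy as np

def fisher_ratio(scores_true, scores_false):
    """Two-class Fisher ratio assumed here: (m_t - m_f)^2 / (var_t + var_f)."""
    t, f = np.asarray(scores_true), np.asarray(scores_false)
    return (t.mean() - f.mean()) ** 2 / (t.var() + f.var())

def roc_curve(scores_true, scores_false):
    """ROC points obtained by sweeping a decision threshold over the pooled
    matched-filter outputs; returns false- and true-positive rates."""
    t, f = np.asarray(scores_true), np.asarray(scores_false)
    thresholds = np.sort(np.concatenate([t, f]))[::-1]
    tpr = np.array([(t >= th).mean() for th in thresholds])
    fpr = np.array([(f >= th).mean() for th in thresholds])
    return fpr, tpr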

4. Experimental and simulation results

In this section, we present the experimental and simulation results of the reconstruction and the recognition of occluded objects in 3D space.

4.1 Reconstruction results

The II recording system is composed of a micro-lens array and a pick-up camera, as shown in Fig. 1(a). The pitch of each micro-lenslet is 1.09 mm, and the focal length of each micro-lenslet is about 3 mm. The size of the toy cars is about 4.5 cm×2.5 cm×2.5 cm. To simulate partial occlusion, a tree model is placed between the toy car and the micro-lens array. The distance between the imaging lens and the micro-lens array is 9.5 cm, the distance between the micro-lens array and the occluding objects is 4~5 cm, and the distance between the micro-lens array and the toy car is 9.5 cm. Figure 2 shows the elemental images of the white car used as the reference, the partially occluded white toy car used as the true-class target, and the yellow car used as the false-class target. The size of the elemental image array is 1161×1419 pixels, and the number of elemental images is 18×22. Figure 3 shows the central parts of the elemental images in Fig. 2. The movies in Fig. 4 show the reconstructed sectional images in 3D space for the partially occluded true-class target. The reconstruction distance for Fig. 4(a) is 30~50 mm and for Fig. 4(b) is 66~80 mm.

 

Fig. 2. Elemental images for (a) reference target, (b) true-class target with partial occlusion, (c) false-class target.


 

Fig. 3. Central images of the elemental image arrays in Fig. 2: (a) reference target, (b) true-class target with partial occlusion, (c) false-class target.


 

Fig. 4. Movies of reconstructed images of the partially occluded true-class target, (a) d=30~50 mm (the movie file size: 4.61 MB) [Media 1] (b) d=66~80 mm (the movie file size: 6.27 MB) [Media 2].


4.2 Recognition results

In this subsection, we show the recognition results for the reference target and two unknown input objects. Photon counting images are generated from each elemental image array in Fig. 3, each with a random number of photons. One hundred photon counting images for each object are generated to compute the means and variances of the nonlinear matched filtering with ν=1. It is noted that only the gray-level irradiance in Fig. 3 is used for the photon-count simulation. Several values of NP, the mean number of photon counts in the entire image, are used to test the recognition performance. Figures 5(a) and 5(b) show the experimental results of nonlinear matched filtering for elemental images, that is, without reconstruction [see Eq. (6)], and with reconstruction in 3D space [see Eq. (7)], respectively. The red solid line represents the sample mean for the true-class target with occlusion, and the blue dotted line represents the sample mean for the false-class target. Error bars stand for mrs±σrs, where mrs and σrs are the sample mean and the sample standard deviation of the matched filtering, respectively. Nd in Eq. (7) is set to 1, with the depth of the reconstruction plane at 74 mm.
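A hypothetical end-to-end sketch of this Monte Carlo evaluation on synthetic elemental images is shown below; the random arrays are stand-ins for the captured irradiance of Fig. 3, and all shapes and parameter values are illustrative, not the experimental ones.

import numpy as np

rng = np.random.default_rng(1)
ref_elem = rng.random((18, 22, 64, 64))                       # reference irradiance (synthetic)
true_elem = ref_elem * rng.uniform(0.8, 1.2, ref_elem.shape)  # occluded true class (perturbed copy)
false_elem = rng.random((18, 22, 64, 64))                     # false-class object (synthetic)

def trial(input_elem, n_p):
    """One photon-count realization followed by the 3D matched filter, nu = 1."""
    y = simulate_photon_counts(input_elem, n_p)
    return nonlinear_matched_filter_3d(ref_elem, y, depths=[74.0],
                                       pitch_px=64, gap=3.0, nu=1.0)

scores_true = np.array([trial(true_elem, 500) for _ in range(100)])
scores_false = np.array([trial(false_elem, 500) for _ in range(100)])
print(scores_true.mean(), scores_true.std())     # sample mean and error bar, true class
print(scores_false.mean(), scores_false.std())   # sample mean and error bar, false class
print(fisher_ratio(scores_true, scores_false))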

Figure 6(a) shows the Fisher ratios. The Fisher ratio decreases when a smaller number of photon counts is used. Figures 6(b) and 6(c) show the ROC curves for Np=500 and Np=100, respectively. The recognition performance of the proposed technique is substantially improved compared with the conventional method.

 

Fig. 5. Sample means and error bars (standard deviation) of nonlinear matched filtering for (a) elemental images without reconstruction, (b) reconstructed input scenes in 3D space.


 

Fig. 6. Comparison of recognition performance between elemental images and 3D reconstruction: (a) Fisher ratios; ROC curves for (b) Np=500, (c) Np=100.


In Eq. (7), it is noted that the estimated Poisson parameters of the unknown input object and the reconstructed irradiance information of the reference object are used instead of the photon-count and irradiance elemental images, respectively. When the object is partially occluded, the target can be visualized by II computational reconstruction as shown in Fig. 4(b). This reconstruction process might be equivalent to extracting “good” features for matched-filtering recognition. For photon counting matched filtering, 3D reconstruction of irradiance scenes and estimation of photon counting scenes are shown to be proper feature extraction techniques, resulting in substantially improved performance compared with the conventional method. Another merit of these reconstruction techniques is that the nonlinear matched filtering in 3D space has the same first- and second-order statistical properties as the ordinary nonlinear matched filtering, which is superior to linear matched filtering in terms of recognition performance [9].

5. Conclusions

In this paper, we have proposed nonlinear matched filtering in 3D space using the computational reconstruction of elemental images. Computational II visualizes the partially occluded object by generating sectional images at different depth planes. Computationally reconstructed irradiance scenes and estimated photon counting scenes are nonlinearly correlated for pattern recognition purposes. We have observed in the experiments that the proposed nonlinear matched filtering with reconstruction in 3D space provides better performance than nonlinear matched filtering between elemental images in 2D space.

Acknowledgements

This research was supported in part by the Daegu University Research Grant 2007.

References and links

1. J. W. Goodman, Statistical Optics (John Wiley & Sons, Inc., 1985), Chap. 9.

2. G. M. Morris, “Scene matching using photon-limited images,” J. Opt. Soc. Am. A 1, 482–488 (1984).

3. E. A. Watson and G. M. Morris, “Comparison of infrared upconversion methods for photon-limited imaging,” J. Appl. Phys. 67, 6075–6084 (1990).

4. D. Stucki, G. Ribordy, A. Stefanov, H. Zbinden, J. G. Rarity, and T. Wall, “Photon counting for quantum key distribution with Peltier cooled InGaAs/InP APDs,” J. Mod. Opt. 48, 1967–1981 (2001).

5. P. A. Hiskett, G. S. Buller, A. Y. Loudon, J. M. Smith, I. Gontijo, A. C. Walker, P. D. Townsend, and M. J. Robertson, “Performance and design of InGaAs/InP photodiodes for single-photon counting at 1.55 um,” Appl. Opt. 39, 6818–6829 (2000).

6. L. Duraffourg, J.-M. Merolla, J.-P. Goedgebuer, N. Butterlin, and W. Rhods, “Photon counting in the 1540-nm wavelength region with a Germanium avalanche photodiode,” IEEE J. Quantum Electron. 37, 75–79 (2001).

7. J. G. Rarity, T. E. Wall, K. D. Ridley, P. C. M. Owens, and P. R. Tapster, “Single-photon counting for the 1300–1600-nm range by use of Peltier-cooled and passively quenched InGaAs avalanche photodiodes,” Appl. Opt. 39, 6746–6753 (2000).

8. P. Refregier, F. Goudail, and G. Delyon, “Photon noise effect on detection in coherent active images,” Opt. Lett. 29, 162–164 (2004).

9. S. Yeom, B. Javidi, and E. Watson, “Photon counting passive 3D image sensing for automatic target recognition,” Opt. Express 13, 9310–9330 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-23-9310.

10. S. Yeom, B. Javidi, and E. Watson, “Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging,” Opt. Express 15, 1513–1533 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=oe-15-4-1513.

11. F. Sadjadi, ed., Selected Papers on Automatic Target Recognition (SPIE-CDROM, 1999).

12. A. Mahalanobis, R. R. Muise, S. R. Stanfill, and A. V. Nevel, “Design and application of quadratic correlation filters for target detection,” IEEE Trans. Aerosp. Electron. Syst. 40, 837–850 (2004).

13. H. Kwon and N. M. Nasrabadi, “Kernel RX-algorithm: a nonlinear anomaly detector for hyperspectral imagery,” IEEE Trans. Geosci. Remote Sens. 43, 388–397 (2005).

14. J.-S. Jang and B. Javidi, “Time-multiplexed integral imaging for 3D sensing and display,” Optics and Photonics News 15, 36–43 (2004).

15. J. Y. Son, V. V. Saveljev, J. S. Kim, S. S. Kim, and B. Javidi, “Viewing zones in three-dimensional imaging systems based on lenticular, parallax-barrier, and microlens-array plates,” Appl. Opt. 43, 4985–4992 (2004).

16. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proceedings of the IEEE 94, 591–607 (2006).

17. Y. Frauel, E. Tajahuerce, O. Matoba, M. A. Castro, and B. Javidi, “Comparison of passive ranging integral imaging and active imaging digital holography for 3D object recognition,” Appl. Opt. 43, 452–462 (2004).

18. R. Martinez-Cuenca, A. Pons, G. Saavedra, M. Martinez-Corral, and B. Javidi, “Optically-corrected elemental images for undistorted integral image display,” Opt. Express 14, 9657–9663 (2006), http://www.opticsinfobase.org/abstract.cfm?URI=oe-14-21-9657.

19. S. H. Hong, J. S. Jang, and B. Javidi, “Three-dimensional volumetric object reconstruction using computational integral imaging,” Opt. Express 12, 483–491 (2004), http://www.opticsinfobase.org/abstract.cfm?URI=oe-12-3-483.

20. J.-H. Park, J. Kim, and B. Lee, “Three-dimensional optical correlator using a sub-image array,” Opt. Express 13, 5116–5126 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-13-5116.

21. B. Javidi, R. Ponce-Diaz, and S.-H. Hong, “Three-dimensional recognition of occluded objects by using computational integral imaging,” Opt. Lett. 31, 1106–1108 (2006).

22. S. H. Hong and B. Javidi, “Distortion-tolerant 3D recognition of occluded objects using computational integral imaging,” Opt. Express 14, 12085–12095 (2006), http://www.opticsinfobase.org/abstract.cfm?URI=oe-14-25-12085.

23. V. Vaish, R. Szeliski, C. L. Zitnick, S. B. Kang, and M. Levoy, “Reconstructing occluded surfaces using synthetic apertures: stereo, focus and robust measures,” Proceedings of the IEEE CVPR’06 (2006).

24. S. M. Kay, Fundamentals of Statistical Signal Processing (Prentice Hall, New Jersey, 1993).


Supplementary Material (2)

» Media 1: AVI (4725 KB)     
» Media 2: AVI (6431 KB)     
