## Abstract

In this paper, we present three-dimensional (3D) object reconstruction using photon-counted elemental images acquired by a passive 3D Integral Imaging (II) system. The maximum likelihood (ML) estimator is derived to reconstruct the irradiance of the 3D scene pixels, and the reliability of the estimator is described by confidence intervals. For applications in photon-scarce environments, our proposed technique provides 3D reconstruction for better visualization as well as a significant reduction in the computational burden and in the bandwidth required for transmission of integral images. The performance of the reconstruction is illustrated qualitatively and compared quantitatively with the Peak Signal-to-Noise Ratio (PSNR) criterion.

©2008 Optical Society of America

## 1. Introduction

There has been growing interest in three-dimensional (3D) imaging systems recently [1–4], and one of the promising methods for 3D sensing and visualization is Integral Imaging (II) [5–8], based on Integral Photography [9,10]. In this technique, in addition to irradiance, directional information of the rays is recorded by acquiring two-dimensional images from different perspectives of the scene. In particular, in Synthetic Aperture Integral Imaging (SAII) systems [11], an imaging device scans a planar grid in order to capture high resolution 2D images. These elemental images are used for reconstruction, which can be performed optically [12,13] or computationally by applying the inverse procedure of the recording [6,7,11]. The sensitivity of SAII reconstruction results to pickup position uncertainty is studied in [14]. The application of II systems has been extended to object recognition, depth estimation, occlusion removal and multiple viewing point generation [15–19]. Multiple image acquisition is also applied to retrieve high spatial-resolution information from low resolution elemental images [20,21].

In some applications, low illumination levels can lead to small irradiances at the image plane. There are various applications for photon-counting imaging, such as night vision [22–24], laser radar imaging [27,28], single photon emission tomography [29] and astronomical imaging [30]. Photon-counting imaging systems in general require less received power than conventional imaging systems that generate gray-scale irradiance images. The computational burden of processing photon-counted images with a very low number of photons is much smaller than that of processing regular gray-scale images, and such images have the potential to be compressed with higher compression ratios, which consequently requires less bandwidth for transmission.

There are numerous techniques for estimating an irradiance image from the counts of photon detectors in the two-dimensional case. The approaches are based on the statistical model of light fluctuation measurements, which states that, under practical assumptions on the imaging conditions, the photo-count statistics follow a Poisson distribution [31]. Therefore the problem of reconstructing photon-counted images is equivalent to a Poisson inverse problem [32].

Object recognition and classification for 3D objects using photon-counting integral imaging is explored in [33]. In this paper, we study the Maximum Likelihood (ML) irradiance estimation of 3D objects using photon-counted elemental images captured by an SAII system, and the results are compared with the computational reconstruction using gray-scale irradiance elemental images. The sections of this paper are as follows. Section 2 is an overview of SAII and computational reconstruction. In Section 3, a model for photon-counted images is described, and Section 4 presents the maximum likelihood estimation of the 3D objects’ irradiance, where the uncertainty surrounding the estimate is expressed with confidence intervals. Section 5 shows the experimental results, and Section 6 concludes.

## 2. Three dimensional imaging and computational reconstruction

Synthetic Aperture Integral Imaging (SAII) is a three-dimensional passive imaging technique in which, in addition to the irradiance, directional information of the rays is acquired. For this purpose, an imaging device such as a digital camera is moved across a synthetic aperture to capture 2D images from different perspectives of the scene. The synthetic aperture is a planar grid with defined separations in the x and y directions, while the normal to the grid is parallel to the optical axis of the camera lens. The output is a series of high resolution two-dimensional images called elemental images, which are used for reconstruction. The pickup process of SAII is illustrated in Fig. 1, in which a simple model for the 3D scene is chosen, consisting of two planar objects located at two different distances from the pickup plane.

The lens along with its sensor translates on the grid with separations of *S _{x}* and *S _{y}* in the *x* and *y* directions, respectively, in order to capture the elemental images. An array of these elemental images and the reconstruction at two planes where the objects are located is shown in Fig. 2.

One possible approach to computational 3D II reconstruction is to simulate the reverse of the pickup process using geometrical optics. In this method, a 2D plane of the 3D scene located at a particular distance is reconstructed by back propagating the elemental images to that distance through simulated pinhole arrays. The back projection process consists of magnifying each elemental image with respect to the distance of the desired reconstruction plane and shifting it according to the location of its associated imaging device on the pickup plane. The magnified elemental images overlap on the desired plane such that the objects originally located at that distance are reconstructed properly and appear in focus, while other objects become smeared. The full 3D scene is the collection of the reconstructions at all of the 2D planes.

Magnifying a large number of high resolution elemental images and handling them is a computationally intensive process. For an object located at distance *z*=*z _{0}* from the lens of the imaging device, the magnification factor is *M _{0}*=*z _{0}*/*g*, where *g* is the distance between the pickup grid and the image plane [see Fig. 1]. Thus another approach for computational reconstruction is applied that avoids magnification. It is illustrated in Fig. 2 that if the camera shifts by *S _{x}*, the image of an object located at *z*=*z _{0}* shifts by *S _{x}*/*M _{0}*. Thus, the objects are reconstructed by shifting the elemental images opposite to the direction of the image shift due to the camera motion. This is equivalent to shrinking the pickup grid by the factors *S _{x}*/*M _{0}* and *S _{y}*/*M _{0}* in the x and y directions, respectively.

The final reconstruction plane consists of the partial overlap of the shifted elemental images, expressed as follows:

in which subscripts *k* and *l* indicate the location of the elemental image, I_{kl}, in the pickup grid. Notice that the size of the objects reconstructed with this method is 1/*M _{0}* of their actual size; if the actual size is required, the reconstructed plane should be magnified.
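
The shift-and-average reconstruction described above can be sketched in a few lines of NumPy; the grid separations, geometry, and pixel pitch below are illustrative assumptions, not the experimental parameters of this paper.

```python
import numpy as np

def reconstruct_plane(elemental, s_x, s_y, g, z0, pitch):
    """Shift-and-average reconstruction of one depth plane.

    elemental : array of shape (K, L, H, W) holding the elemental images
    s_x, s_y  : camera separations on the pickup grid
    g         : distance between the pickup grid and the image plane
    z0        : distance of the reconstruction plane from the pickup grid
    pitch     : sensor pixel pitch (same length units as s_x, s_y, g, z0)
    """
    K, L, H, W = elemental.shape
    m0 = z0 / g                                   # magnification factor M0 = z0 / g
    out = np.zeros((H, W))
    for k in range(K):
        for l in range(L):
            # each camera shift S produces an image shift S / M0 on the sensor;
            # shift the elemental image in the opposite direction (in pixels)
            dy = int(round(k * s_y / (m0 * pitch)))
            dx = int(round(l * s_x / (m0 * pitch)))
            out += np.roll(elemental[k, l], (-dy, -dx), axis=(0, 1))
    return out / (K * L)                          # average of the overlapping images
```

A plane at the object's distance comes out sharp because the shifted copies align there, while objects at other depths are averaged into a blur.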

## 3. Photon-counting detection model

The association of the irradiance image with the photon-counted image is studied in order to estimate the irradiance from photon counts. Each photon of the light carries energy *hυ*, where *h* is Planck’s constant (6.6262×10^{-34} J·s) and *υ* is the mean frequency of the quasi-monochromatic light source. Suppose that the energy incident on one pixel of the photosurface during the time Δ*T* is *E _{x}*; then the mean number of photons detected during this time interval can be expressed as (*ηE _{x}*)/(*hυ*), where *η*≤1 is the sensor quantum efficiency and represents the average number of photoelectrons generated per incident photon [31].
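
As a numeric illustration of this relation (the quantum efficiency, wavelength, and per-pixel energy below are assumed values for the sake of the example):

```python
h = 6.6262e-34       # Planck's constant, J*s
c = 3.0e8            # speed of light, m/s
wavelength = 500e-9  # assumed mean wavelength of the quasi-monochromatic light, m
nu = c / wavelength  # mean optical frequency, Hz

eta = 0.5            # assumed quantum efficiency, eta <= 1
E_x = 1.0e-18        # assumed energy incident on one pixel during dT, J

mean_photoevents = eta * E_x / (h * nu)   # (eta * E_x) / (h * nu)
print(mean_photoevents)                   # about 1.26 photoevents on average
```
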

The incident energy is the irradiance integrated over the time interval Δ*T*, i.e. *E _{x}*=*I _{x}*Δ*T*. Consequently the irradiance is proportional to the mean number of photoevents. On the other hand, the statistical properties of photoevents show that the number of photons detected by a photosurface smaller than the coherence area of the incident light, in a time interval Δ*T* smaller than the coherence time of the light, follows a Poisson density function [31]. However, in the practical cases of our interest, neither of these conditions holds, i.e. the pixel area and exposure time are significantly larger than the coherence area and coherence time of the passive illumination. Nevertheless, for polarized thermal illumination with a high number of degrees of freedom, the degeneracy parameter approaches zero, so the probability of detecting *C _{x}* photons at pixel *x* during the exposure time, given the irradiance *I _{x}*, follows the Poisson distribution expressed in Eq. (2) [31].

Since Eq. (2) is a realistic model for passive integral imaging, we simulate the photon limited images from the irradiance images according to this probability density function, with a constraint on the total number of photons detected by each sensor.

Assume that *I _{x}* is the normalized irradiance at pixel *x*, such that $\sum _{x=1}^{{N}_{T}}{I}_{x}=1$ , where *N _{T}* is the total number of pixels of the image. In order to simulate a photon-counted image that has *N _{p}* photons on average, a Poisson random number *C _{x}* with mean parameter *N _{p}I _{x}* is generated, i.e. *C _{x}*|*I _{x}*~*Poisson*(*N _{p}I _{x}*). It is confirmed with Eq. (3) that the expected number of photons in the generated image is *N _{p}*, so the normalization is required to meet the constraint on the number of photons per image.
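
The simulation described above amounts to drawing one Poisson variate per pixel; a minimal NumPy sketch (with an arbitrary stand-in irradiance image) is:

```python
import numpy as np

rng = np.random.default_rng(0)

def photon_count_image(irradiance, n_p):
    """Simulate a photon-counted image with n_p photons on average."""
    I = irradiance / irradiance.sum()   # normalize so the pixel values sum to 1
    return rng.poisson(n_p * I)         # C_x | I_x ~ Poisson(n_p * I_x)

gray = rng.random((64, 64))             # stand-in irradiance image
counts = photon_count_image(gray, n_p=1000)
print(counts.sum())                     # close to 1000, as Eq. (3) predicts
```
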

## 4. Three dimensional reconstruction using photon-counted elemental images

Assume that an object is being imaged with an SAII system and that the image of a single pixel of that object, located at distance *z _{0}* from the pickup grid, is captured at pixel *p*≡(*x*,*y*) of the first sensor. As discussed in Section 2, the image of this object pixel appears in the elemental images periodically at the following positions:

If the intensities falling on all such pixels of the irradiance elemental images are assumed to be equal, i.e.
${I}_{\mathrm{kl}}\left(p+\Delta {p}_{\mathrm{kl}}\right)={I}_{p}^{{z}_{0}}$
, then the collection of the values of such pixels in the photon-counted elemental images obtained according to Eq. (2) represents an ensemble of realizations of Poisson random variables with mean
${N}_{p}{I}_{p}^{{z}_{0}}$
, denoted by {*C _{kl}*(*p*+Δ*p _{kl}*)}, where *N _{p}* is constant and equal to the expected number of photons per elemental image. Our purpose is to estimate the irradiance of each pixel of the 3D object from this set of photon counts. The likelihood for the hypothesis ${I}_{p}^{{z}_{0}}$ can be calculated as follows:

Then the log-likelihood is:

$${C}_{\mathrm{kl}}\mid {I}_{p}^{{z}_{0}}\phantom{\rule{.2em}{0ex}}~\phantom{\rule{.2em}{0ex}}\mathrm{Poisson}\left({N}_{p}{I}_{p}^{{z}_{0}}\right)\Rightarrow $$

$$l\left({I}_{p}^{{z}_{0}}\right)=\sum _{k=0}^{K-1}\sum _{l=0}^{L-1}\left(-{N}_{p}{I}_{p}^{{z}_{0}}+{C}_{\mathrm{kl}}\left(p+\Delta {p}_{\mathrm{kl}}\right)\mathrm{log}\left({N}_{p}{I}_{p}^{{z}_{0}}\right)-\mathrm{log}({C}_{\mathrm{kl}}\left(p+\Delta {p}_{\mathrm{kl}}\right)!)\right)$$

The Maximum Likelihood (ML) estimate is obtained from $\frac{\partial l\left({I}_{p}^{{z}_{0}}\right)}{\partial {I}_{p}^{{z}_{0}}}=0$ , which leads to

$${\stackrel{~}{I}}_{p}^{{z}_{0}}=\frac{1}{{N}_{p}\mathrm{KL}}\sum _{k=0}^{K-1}\sum _{l=0}^{L-1}{C}_{\mathrm{kl}}\left(p+\Delta {p}_{\mathrm{kl}}\right)$$

So the ML estimate of the irradiance, ${\stackrel{~}{I}}_{p}^{{z}_{0}}$ , is proportional to the average of the corresponding observed samples in the elemental images. This average is the minimal sufficient statistic of the independent identically distributed Poisson random variables, and it is known that it is the Uniformly Minimum Variance Unbiased Estimator (UMVUE) for the mean of Poisson data [34].
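
A small simulation (with assumed values for *N _{p}*, the grid size, and the true irradiance) illustrates that the sample mean of the counts divided by *N _{p}* is an unbiased estimate of the pixel irradiance:

```python
import numpy as np

rng = np.random.default_rng(1)

n_p = 1000                 # assumed expected photons per elemental image
K, L = 16, 16              # pickup grid size, so KL = 256 samples
I_true = 2.0e-5            # assumed normalized irradiance of one object pixel

# KL i.i.d. Poisson counts C_kl with mean N_p * I, one per elemental image
counts = rng.poisson(n_p * I_true, size=K * L)

# ML estimate: average of the observed counts divided by N_p
I_ml = counts.mean() / n_p
print(I_ml)                # scatters around I_true; the variance shrinks as KL grows
```
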

Alternatively, the computational reconstruction of SAII can be explained point by point: in order to reconstruct one point of the object located at a specific distance, the elemental images are shifted such that all the pixels containing the image of that object point overlap, and the average of the associated intensities of these overlapping pixels is the reconstructed point. Using the notation of Eq. (4), the computational reconstruction of SAII calculated from Eq. (1) can be written as ${I}_{p}^{{z}_{0}}=\frac{1}{\mathrm{KL}}\sum _{k=0}^{K-1}\sum _{l=0}^{L-1}{I}_{\mathrm{kl}}\left(p+\Delta {p}_{\mathrm{kl}}\right)$ . It can be seen that the ML estimate of the irradiance for a single point has the same form as the computational reconstruction of that point, which means that the computational reconstruction of a plane of the scene located at a specific distance, using photon-counted elemental images, is also the ML irradiance estimate of the scene at that plane. Thus, the irradiance of three-dimensional objects can be estimated from photon-counted elemental images.

It should be noted that in practical cases the sensors see the scene from different perspectives, and a particular object pixel may not appear inside the field of view of all sensors or might be occluded by other objects in some perspectives. As a result, the number of samples of the Poisson random variable may not be exactly equal to the number of elemental images for all object pixels. However, as long as the number of samples remains above 30, it can be considered within the large-sample inference domain [34]. In our experiments we assume that this condition holds for all object pixels.

The reliability of an estimator is indicated through the confidence interval, which bounds the estimation error [34]. The estimation error is defined as the absolute difference between an estimated parameter and its actual value. Smaller confidence intervals mean smaller estimation error and therefore a more reliable estimator. The probability that the estimation error is within the bounds determined by the confidence interval is assigned to be 1-*α*, where *α*∈(0,1) is chosen according to the desired accuracy. It is shown in Appendix A how one gets the following expression for the confidence interval of the grayscale irradiance ${I}_{p}^{{z}_{0}}$ :

where *KL* is the total number of elemental images and *z*_{α/2} is the upper 100(*α*/2)% point of the standard normal distribution, which is known for each desired α as shown in Appendix A.

Equation (8) indicates that by increasing the number of photons, *N _{p}*, or the number of elemental images, KL, the confidence intervals shrink, which is equivalent to a decrease in the estimation uncertainty. In other words, a decrease in the number of photons can be compensated by increasing the number of elemental images.
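
This trade-off can be made concrete. Assuming the normal-approximation interval of Appendix A, with half-width ${z}_{\alpha /2}\sqrt{{\stackrel{~}{I}}_{p}^{{z}_{0}}/({N}_{p}\mathrm{KL})}$ (our reading of Eq. (8); the numbers below are illustrative), halving the photons per image while doubling the number of elemental images leaves the interval unchanged:

```python
from math import sqrt

z = 1.96           # z_{alpha/2} for alpha = 0.05, i.e. a 95% confidence interval
I_hat = 2.0e-5     # assumed irradiance estimate

def ci_half_width(n_p, kl):
    # assumed half-width z * sqrt(I / (N_p * KL)): shrinks with either factor
    return z * sqrt(I_hat / (n_p * kl))

print(ci_half_width(1000, 256))   # baseline
print(ci_half_width(500, 512))    # same product N_p * KL, hence the same width
```
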

Another significant factor affecting the irradiance estimation is the number of object pixels, or the total number of pixels per elemental image, *N _{T}*. Suppose two objects, identical except in size, are imaged separately while the number of photons per elemental image is the same for both. Intuitively, we expect more error in the irradiance estimate of the larger object. This fact is explained by the confidence intervals as follows. The larger object has more pixels, and consequently, to meet the condition explained in Section 3, i.e.
$\sum _{x=1}^{{N}_{T}}{I}_{x}=1$
, the normalized pixels of the larger object have smaller values compared with those of the small object. Assume that the normalized irradiance of a single pixel of the large object is
$\frac{{I}_{p}^{{z}_{0}}}{\gamma}$
; the associated pixels in the photon-counted elemental images then follow a Poisson distribution with mean
$\left(\frac{{N}_{p}{I}_{p}^{{z}_{0}}}{\gamma}\right)$
, so the confidence interval of the irradiance of that pixel is computed as:

By increasing the number of object pixels, γ increases and, according to Eq. (9), the confidence interval expands and the irradiance estimation error increases. As a result, in order to achieve the same accuracy for the irradiance estimation of the large object as for the small object, we need to increase the number of photons per elemental image or, equivalently, the number of captured elemental images.

## 5. Experimental results

We compare, qualitatively and quantitatively, the 3D computational reconstruction using the irradiance elemental images of an SAII experiment with the reconstruction using the corresponding photon-counted elemental images.

The experimental scene is composed of two toy cars and a model helicopter located at different distances from the sensor plane. The closest and farthest objects are located 24 cm and 40 cm away from the center of the pickup grid. The scene is illuminated with diffuse incoherent light. The imaging device is a digital camera with a focal plane array of size 22.7×15.6 mm and a 10 µm pixel pitch. The effective focal length of the camera lens is about 20 mm. The camera is shifted with equal separations of 5 mm in both the x and y directions on a planar grid. The size of the synthetic aperture is 80×80 mm^{2}, and 16×16 elemental images are captured.

Figure 3 shows the 3D reconstruction of the scene at the two distances of the objects according to Eq. (1), using 256 irradiance elemental images. As is clear, at each distance one of the objects is in focus while the others appear washed out.

According to section 3, we generate photon-counted elemental images from grayscale irradiance images which are captured experimentally, with the constraint on the total number of photons per elemental image. Consistent with the ML estimator calculated in section 4, the irradiance of the 3D scene pixels is reconstructed at different distances using these photon-counted elemental images.

In Fig. 4, two samples of the photon-counted elemental images are shown with expected numbers of photons N_{p}=10^{3} and N_{p}=10^{5} in parts (a) and (d), respectively, while the corresponding irradiance image is captured from the center of the pickup grid. Clearly, recognizing the objects visually is not trivial in the 2D image with N_{p}=10^{3}. It is illustrated that the objects become recognizable after 3D computational reconstruction using all the gathered 2D photon-counted elemental images, and the results can be compared qualitatively with the reconstruction using irradiance elemental images presented in Fig. 3. The movies of the reconstruction from z=24 cm to z=40 cm are presented in Fig. 5, showing that the objects come into focus at their corresponding distances even with elemental images containing a very low number of photons.

We use the Peak Signal-to-Noise Ratio (PSNR), given in Eq. (10), to quantitatively compare the computational reconstruction using the original grayscale elemental images with the reconstructions using photon-counted elemental images. Here MSE is the Mean Square Error, which provides an estimate of the average error per reconstructed pixel of the 3D scene, and *I _{max}* is the maximum irradiance of the grayscale elemental images.
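
For reference, the PSNR of Eq. (10) can be computed directly; the helper below is a straightforward implementation of the stated definition.

```python
import numpy as np

def psnr(reference, estimate, i_max):
    """PSNR in dB; i_max is the maximum irradiance of the gray-scale images."""
    mse = np.mean((np.asarray(reference) - np.asarray(estimate)) ** 2)
    return 10.0 * np.log10(i_max ** 2 / mse)

# identical images give infinite PSNR; a unit error everywhere gives 0 dB
print(psnr(np.zeros(4), np.ones(4), 1.0))   # 0.0
```
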

Figure 6 shows that, as anticipated, with an increasing expected number of photons per elemental image, *N _{p}*, the error decreases and as a result the PSNR increases. The increase of the PSNR is consistent with the confidence intervals of Eq. (8), which show that the estimation error is proportional to
$\frac{1}{\sqrt{{N}_{p}}}$
; consequently we expect the PSNR to be proportional to log(*N _{p}*).

In the photon-counted image generation, the pixel values of the normalized irradiance images are in the range [0, 2.5×10^{-5}], so with *N _{p}*=10^{3} the probability of counting more than one photon per pixel is less than 0.03%, i.e. Pr(*C*>1|*I _{max}*)<0.03%, as calculated from Eq. (2). Therefore, we can assume to good approximation that the count at each pixel is either zero or one. Thus, with little error, we treat photon-counted images with a very low number of photons as binary images.

According to this fact, a possible performance criterion is to compare the reconstructions using binarized photon-counted images with the reconstructions using elemental images binarized with Otsu’s thresholding method [35]. This method finds the threshold that minimizes the intraclass variance of the black and white pixels. The reconstruction using Otsu’s method at z=240 mm is shown in Fig. 7, in which it can be seen that the details of the image are lost. This result is compared with the reconstructions using photon-counted and irradiance elemental images, respectively, in Fig. 8. For a quantitative comparison we have computed the PSNR at different planes of the reconstruction resulting from these binarized elemental images. We find the PSNR is approximately 5 dB smaller than the PSNR using photon-counted elemental images with *N _{p}*=10^{3}, shown in Fig. 5. It is interesting to note that using a 16×16 array of elemental images, 1000 photo-counts per elemental image on average, and a mean illumination wavelength of 500 nm, the total received energy would be only approximately 10^{-16} J.
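
The zero-or-one photon approximation above can be checked directly from the Poisson model, using the paper's values *N _{p}*=10^{3} and a maximum normalized irradiance of 2.5×10^{-5}:

```python
from math import exp

lam = 1.0e3 * 2.5e-5                  # largest mean count per pixel: N_p * I_max
# Pr(C > 1) = 1 - P(C = 0) - P(C = 1) for a Poisson variable with mean lam
p_more_than_one = 1.0 - exp(-lam) * (1.0 + lam)
print(p_more_than_one)                # roughly 3e-4, i.e. about 0.03%
```
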

The computational burden and the required memory are important issues when manipulating large data sets. In integral imaging, one typically deals with hundreds of millions of pixels. In different applications, various processes are performed on images, such as classification, pattern recognition, tracking, filtering, restoration, and denoising, for which the complexity and computational burden are directly related to the number of nonzero pixels. For the cases where the probability of detecting more than one photon per pixel is very small, the number of nonzero pixels is, to good accuracy, approximately equal to the total number of detected photons. Therefore, using photon-counted elemental images can reduce the number of operations to the order of the total number of detected photons. One common computational process is the Fast Fourier Transform, in which *n*log_{2}*n* operations are needed, where *n* is the number of nonzero pixels. The experimental gray scale images have 2×10^{5} nonzero pixels on average, so computing an FFT for the corresponding photon-counted images with 10^{3} photons reduces the number of operations by a factor of about 380. In addition, photon-counted images have the potential to be compressed with high compression ratios by using efficient binary image compression methods.

## 6. Conclusion

In this paper, we have proposed the irradiance estimation of three-dimensional objects using photon-counted elemental images obtained by an SAII system. The irradiance at different distances from the pickup plane is estimated with a Maximum Likelihood estimator, where the photon counts follow a Poisson distribution with mean parameter proportional to the desired irradiance. Confidence intervals are used to investigate the reliability of the estimator and show that the parameters of the imaging system can be modified in order to reach the desired accuracy for the irradiance estimation. It is shown that the computational reconstruction of 3D objects in integral imaging systems coincides with Maximum Likelihood irradiance estimation using photon-counted elemental images. The qualitative results of the irradiance estimation illustrate that visualization improves when multi-perspective photon-counted images are used. The performance is studied quantitatively using the PSNR, which increases with the number of photons per elemental image. Photon-counted images are the result of reduced input light, or can be simulated from the irradiance values using the Poisson model for photon events. In either case, we may take advantage of the low number of photons generated by the photon-counting detector to improve the computational efficiency of data processing.

## Appendix A

The ensemble of realizations from Poisson random variables with mean
${N}_{p}{I}_{p}^{{z}_{0}}$
is denoted by {*C _{kl}*(*p*+Δ*p _{kl}*)}, where ${I}_{p}^{{z}_{0}}$ is the irradiance of one pixel of the 3D object located at *z _{0}*.

The size of this ensemble is equal to the number of elemental images, KL. By applying the Central Limit Theorem to these KL independent identically distributed random variables we have:

As a result, for the pre-assigned *α*∈(0,1), we claim that:

which leads to the following confidence interval for ${I}_{p}^{{z}_{0}}$ :

where *z*_{α/2} is the upper 100(*α*/2)% point of the standard normal distribution, obtained for each desired α as follows:

## References and links

**1. **Y. Frauel, T. Naughton, O. Matoba, E. Tajahuerce, and B. Javidi, “Three Dimensional Imaging and Display Using Computational Holographic Imaging,” Proc. IEEE **94**, 636–654 (2006). [CrossRef]

**2. **B. Javidi and F. Okano, eds., *Three Dimensional Television, Video, and Display Technologies* (Springer, Berlin, 2002).

**3. **M. Levoy and P. Hanrahan, “Light field rendering,” Proc. ACM SIGGRAPH, 31–42 (1996).

**4. **B. Javidi, S.-H. Hong, and O. Matoba, “Multi dimensional optical sensors and imaging systems,” Appl. Opt. **45**, 2986–2994 (2006). [CrossRef] [PubMed]

**5. **A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE **94**, 591–607 (2006). [CrossRef]

**6. **H. Arimoto and B. Javidi, “Integral three-dimensional imaging with computed reconstruction,” Opt. Lett. **26**, 157–159 (2001). [CrossRef]

**7. **A. Stern and B. Javidi, “3-D computational synthetic aperture integral imaging (COMPSAII),” Opt. Express **11**, 2446–2451 (2003). [CrossRef] [PubMed]

**8. **H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A **15**, 2059–2065 (1998). [CrossRef]

**9. **M. G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” J. Phys. **7**, 821–825 (1908).

**10. **H. E. Ives, “Optical properties of a Lippmann lenticuled sheet,” J. Opt. Soc. Am. **21**, 171–176 (1931). [CrossRef]

**11. **S. Jang and B. Javidi, “Three-dimensional synthetic aperture integral imaging,” Opt. Lett. **27**, 1144–1146 (2002). [CrossRef]

**12. **T. Okoshi, “Three-dimensional displays,” Proc. IEEE **68**, 548–564 (1980). [CrossRef]

**13. **Y. Igarishi, H. Murata, and M. Ueda, “3D display system using a computer-generated integral photograph,” Jpn. J. Appl. Phys. **17**, 1683–1684 (1978). [CrossRef]

**14. **B. Tavakoli, M. Danesh Panah, B. Javidi, and E. Watson, “Performance of 3D integral imaging with position uncertainty,” Opt. Express **15**, 11889–11902 (2007). [CrossRef] [PubMed]

**15. **B. Wilburn, N. Joshi, V. Vaish, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” Proc. of the ACM **24**, 765–776 (2005).

**16. **M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, “Integral imaging with improved depth of field by use of amplitude modulated microlens array,” Appl. Opt. **43**, 5806–5813 (2004). [CrossRef] [PubMed]

**17. **O. Matoba, E. Tajahuerce, and B. Javidi, “Real-time three-dimensional object recognition with multiple perspectives imaging,” Appl. Opt. **40**, 3318–3325 (2001). [CrossRef]

**18. **Y. Frauel and B. Javidi, “Digital three-dimensional image correlation by use of computer-reconstructed integral imaging,” Appl. Opt. **41**, 5488–5496 (2002). [CrossRef] [PubMed]

**19. **F. A. Sadjadi and A. Mahalanobis, “Target-adaptive polarimetric synthetic aperture radar target discrimination using maximum average correlation height filters,” Appl. Opt. **45**, 3063–3070 (2006). [CrossRef] [PubMed]

**20. **L. Erdmann and K. J. Gabriel, “High resolution digital photography by use of a scanning microlens array,” Appl. Opt. **40**, 5592–5599 (2001). [CrossRef]

**21. **K. Nitta, R. Shogenji, S. Miyatake, and J. Tanida, “Image reconstruction for thin observation module by bound optics by using the iterative backprojection method,” Appl. Opt. **45**, 2893–2900 (2006). [CrossRef] [PubMed]

**22. **G. M. Morris, “Scene matching using photon-limited images,” J. Opt. Soc. Am. A. **1**, 482–488 (1984). [CrossRef]

**23. **G. M. Morris, “Image correlation at low light levels: a computer simulation,” Appl. Opt. **23**, 3152–3159 (1984). [CrossRef] [PubMed]

**24. **E. Watson and G. M. Morris, “Comparison of infrared upconversion methods for photon-limited imaging,” J. Appl. Phys. **67**, 6075–6084 (1990). [CrossRef]

**25. **E. Watson and G. M. Morris, “Imaging thermal objects with photon-counting detector,” Appl. Opt. **31**, 4751–4757 (1992). [CrossRef] [PubMed]

**26. **D. Stucki, G. Ribordy, A. Stefanov, H. Zbinden, J. G. Rarity, and T. Wall, “Photon counting for quantum key distribution with Peltier cooled InGaAs/InP APDs,” J. Mod. Opt. **48**, 1967–1981 (2001). [CrossRef]

**27. **P. A. Hiskett, G. S. Buller, A. Y. Loudon, J. M. Smith, I Gontijo, A. C. Walker, P. D. Townsend, and M. J. Robertson, “Performance and design of InGaAs/InP photodiodes for single-photon counting at 1.55 um,” Appl. Opt. **39**, 6818–6829 (2000). [CrossRef]

**28. **L. Duraffourg, J.-M. Merolla, J.-P. Goedgebuer, N. Butterlin, and W. Rhods, “Photon Counting in the 1540-nm Wavelength Region with a Germanium Avalanche photodiode,” IEEE J. Quantum Electron. **37**, 75–79 (2001). [CrossRef]

**29. **K. Lange and R. Carson, “EM reconstruction algorithms for emission and transmission tomography,” J. Comput. Assist. Tomogr. **8**, 306–316 (1984).

**30. **M. Guillaume, P. Melon, and P. Refregier, “Maximum-likelihood estimation of an astronomical image from a sequence at low photon levels,” J. Opt. Soc. Am. A. **15**, 2841–2848 (1998). [CrossRef]

**31. **J. W. Goodman, *Statistical optics* (John Wiley & Sons, inc., 1985), Chap 9.

**32. **E. Kolaczyk, “Bayesian multi-scale models for Poisson processes,” J. Amer. Stat. Assoc. **94**, 920–933 (1999). [CrossRef]

**33. **S. Yeom, B. Javidi, and E. Watson, “Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging,” Opt. Express **15**, 1513–1533 (2007). [CrossRef] [PubMed]

**34. **N. Mukhopadhyay, *Probability and Statistical Inference* (Marcel Dekker, Inc. New York, 2000).

**35. **N. Otsu, “A Threshold Selection Method from Gray-Level Histograms,” IEEE Trans. Syst. Man. Cybern **9**, 62–66 (1979). [CrossRef]