Abstract

In this paper, a statistical approach is presented for three-dimensional (3D) visualization and recognition of objects with a very small number of photons, based on a parametric estimator. A truncated Poisson probability density function is assumed for modeling the distribution of the small photon-count observations. For 3D visualization and recognition of photon-limited objects, an integral imaging system is employed, and virtual geometrical ray propagation is used for 3D reconstruction of the objects. A maximum likelihood estimator (MLE) and statistical inference algorithms are applied to the photon-counted elemental images captured with integral imaging. We demonstrate that the MLE using a truncated Poisson model for estimating the average number of photons at each voxel of a photon-starved 3D object has a smaller estimation error than the MLE using a Poisson model. We also present experiments investigating the effect of 3D sensing parallax on the recognition performance under a fixed mean number of photons.

©2009 Optical Society of America

1. Introduction

There are many potential benefits and applications for three-dimensional (3D) sensing, visualization, and recognition of objects using integral imaging (II), including photon-counting scenarios [1-8]. In [6], photon-counting 3D object recognition using II was shown. Recently, 3D visualization of photon-limited objects by II was reported, in which a maximum likelihood estimator (MLE) [9-10] was applied to reconstruct the irradiance of the 3D scene pixels under a Poisson distribution [7]. However, in very low illumination environments the single scaling parameter of a Poisson distribution may overestimate or underestimate the dispersion of the observed photon counts, so the estimates of the Poisson parameter provided by the MLE may be inaccurate. In this paper, we focus on a statistical approach that provides 3D visualization and recognition of objects with a small number of photons. For 3D sensing and visualization of photon-starved objects, multi-view imaging, that is, integral imaging, is utilized. We consider cases where the average total number of photons in the scene is very small (less than 50). The image detector array then senses a very small number of photons per elemental image (or view image): only a few pixels of each elemental image detect a photon, the vast majority of the pixels detect none, and the likelihood that a pixel detects more than one photon is extremely small. For these reasons, we adopt a truncated Poisson distribution [11] to model the photon-counted elemental images. The MLE of the truncated Poisson distribution is applied to visualize the photon-starved objects. For 3D recognition, statistical sampling algorithms [9] are developed to measure the statistical characteristics of the photon-counted objects, and statistical hypothesis testing is used to analyze the difference between the sampling distributions of two populations.

2. Analysis

Typically, a Poisson model is used for modeling the distribution of discrete count observations or for approximating it. The probability of counting C photons at an observation area or a detector pixel over a time interval of T seconds is well known to follow a Poisson distribution [12]:

$$\Pr(C\mid\rho,T)=\frac{(\rho T)^{C}e^{-\rho T}}{C!},\qquad(1)$$
where $\rho$ is the intensity parameter measured in photons per second, and the random variable C takes the values 0, 1, 2, …. We assume that the observation areas have uniform size and are distributed to cover the imaging plane.
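As a quick numerical check, Eq. (1) can be evaluated directly; this is a minimal sketch in which the rate $\rho$ and interval T are illustrative values, not taken from the experiments, and the function name is our own:

```python
from math import exp, factorial

def poisson_prob(c, rho, T):
    """Pr(C = c | rho, T) = (rho*T)^c * exp(-rho*T) / c!  (Eq. (1))."""
    lam = rho * T  # expected photon count over the observation interval
    return lam ** c * exp(-lam) / factorial(c)

# probabilities of detecting 0, 1, or 2 photons at a pixel with rho*T = 0.1
probs = [poisson_prob(c, rho=0.1, T=1.0) for c in range(3)]
```

For a photon-starved pixel (rho*T much less than 1), almost all of the probability mass sits on counts 0 and 1, which motivates the truncated model introduced below.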

In the II system, we assume that a camera array of regular square elements with uniform size records the view images of a 3D object illuminated by a photon beam. The irradiance at a voxel (pixel point) of the 3D object is projected and recorded at the corresponding pixel position of each elemental (view) image. The values of such pixels are assumed to be random variables following a Poisson density function, and the Poisson parameter $\rho$ in Eq. (1) for the photon counts at each pixel of the photon-limited image is assumed to be proportional to the irradiance of the corresponding pixel in the elemental image. Therefore, the photon-limited images can be simulated from the elemental images by generating a Poisson random number $C_p$ with mean parameter $\tilde N I_p$ at each pixel of the irradiance image, where the subscript p denotes the pixel index, $\tilde N$ is the mean number of photons in the photon-limited image, and $I_p$ is the normalized irradiance [6]. For computational reconstruction [7] of a photon-limited object in II, geometrical ray optics computationally simulates the reverse of the pickup process: a 2D sectional image of the 3D object located at a particular distance from the sensor is reconstructed by back-propagating the elemental images through a virtual pinhole array.
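The simulation of photon-limited elemental images described above can be sketched as follows; the random test image and the function name `photon_limited_image` are illustrative assumptions, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def photon_limited_image(irradiance, mean_photons):
    """Draw C_p ~ Poisson(N_tilde * I_p) at every pixel, where I_p is the
    irradiance normalized to unit sum over the elemental image."""
    I_p = irradiance / irradiance.sum()      # normalized irradiance
    return rng.poisson(mean_photons * I_p)   # integer photon counts per pixel

# a random stand-in for one elemental image (a real capture would be used)
elemental = rng.random((64, 64))
counts = photon_limited_image(elemental, mean_photons=10)
```

With a mean of 10 photons spread over 64x64 pixels, nearly every pixel records zero photons, which is the photon-starved regime the paper targets.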

It is possible for the dispersion of the Poisson model to overestimate or underestimate the dispersion of the observed number of photons in photon-starved environments, because the single parameter of the Poisson distribution is often insufficient to represent the population. In modeling discrete counts, the observed counts can be assumed to come from a mixture density. A truncated Poisson distribution [11] is one in which the frequencies of only particular classes are observed from the sample space. Since the image channels in the II system are illuminated by only a few photons, a truncated Poisson density function can be applied for modeling the distribution of the count observations as follows:

$$\Pr(C_p^t=c_p)=\frac{\Pr(C_p=c_p)}{\Pr(C_p<k)}=\frac{\exp(-\tilde N I_p)\,(\tilde N I_p)^{c_p}/c_p!}{\sum_{i=0}^{k-1}\exp(-\tilde N I_p)\,(\tilde N I_p)^{i}/i!},\qquad(2)$$
where it is assumed that only the classes corresponding to counts below a particular value k are observed. The mean and variance of the truncated Poisson distribution are given by:
$$E[C_p^t]=aE[C_p],\qquad V[C_p^t]=aE[C_p]+(a-a^{2})\{E[C_p]\}^{2},\qquad(3)$$
where $a=\left[\sum_{i=0}^{k-1}\exp(-\tilde N I_p)(\tilde N I_p)^{i}/i!\right]^{-1}$ is the reciprocal of the normalizing probability in Eq. (2). It is noted that if the image pixel contains very few photons, i.e. $\tilde N I_p \ll 1$, the mean of the truncated Poisson distribution is not equal to its variance, that is, $E[C_p^t]\neq V[C_p^t]$. In other words, the variance of the Poisson distribution $\Pr(C_p=c_p)$ overestimates or underestimates the variance of the observed samples under few-photon illumination.
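A minimal sketch of the truncated model in Eq. (2), confirming numerically that the truncated mean and variance differ when $\tilde N I_p \ll 1$; the helper name `truncated_poisson_pmf` and the rate value are our own assumptions:

```python
import math

def truncated_poisson_pmf(c, lam, k=2):
    """Pr(C_t = c) = Pr(C = c) / Pr(C < k) for counts c below k (Eq. (2))."""
    pmf = lambda i: math.exp(-lam) * lam ** i / math.factorial(i)
    return pmf(c) / sum(pmf(i) for i in range(k))

lam = 0.1                                  # N_tilde * I_p << 1
p = [truncated_poisson_pmf(c, lam) for c in range(2)]
mean = sum(c * pc for c, pc in enumerate(p))
var = sum((c - mean) ** 2 * pc for c, pc in enumerate(p))
# mean != var: the plain Poisson variance (= lam) misstates the dispersion
```

For k = 2 the truncated law reduces to a Bernoulli distribution with success probability lam/(1 + lam), so the mean is lam/(1 + lam) while the variance is strictly smaller.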

For 3D visualization of an object under extremely low illumination, we estimate each voxel value of the 3D object by applying an MLE based on the truncated Poisson distribution to the samples obtained from the one pixel of each photon-limited elemental image that corresponds to the given pixel point in the image reconstruction plane. The log-likelihood function is given by [9]:

$$L\bigl(\tilde N I_p\mid c_p(1),\ldots,c_p(N_xN_y)\bigr)=\log\prod_{n=1}^{N_xN_y}\frac{\exp(-\tilde N I_p)\,(\tilde N I_p)^{c_p(n)}/c_p(n)!}{\sum_{i=0}^{k-1}\exp(-\tilde N I_p)\,(\tilde N I_p)^{i}/i!},\qquad(4)$$
where the samples $c_p(n)$ are independently obtained from one pixel value of each ray-propagated elemental image in the overlapped area, and $N_xN_y$ is the total number of elemental images. Equation (4) provides an MLE as follows:
$$\mathrm{MLE}(\tilde N I_p)=\arg\max_{\tilde N I_p}L\bigl(\tilde N I_p\mid c_p(1),\ldots,c_p(N_xN_y)\bigr).\qquad(5)$$
In order to maximize the likelihood function in Eq. (4), the following expression is obtained by setting its derivative with respect to $\tilde N I_p$ to zero [9]:
$$\mathrm{MLE}(\tilde N I_p)=II_p(x,y,z_0)=\left.\frac{1}{N_xN_y}\sum_{n=1}^{N_xN_y}c_p(n)\middle/\left(1-\frac{1}{N_xN_y}\sum_{n=1}^{N_xN_y}c_p(n)\right)\right.=\frac{\overline{\mathrm{MLE}}(\tilde N I_p)}{1-\overline{\mathrm{MLE}}(\tilde N I_p)},\qquad(6)$$
where $\overline{\mathrm{MLE}}(\tilde N I_p)$ is calculated from the samples under a standard Poisson model, the particular value k is set to 2, and $z_0$ is the distance between the arbitrary point of the 3D object and the camera array [7]. It is noted that the MLE of the irradiance at an arbitrary point of the original 3D object, $\mathrm{MLE}(\tilde N I_p)$, is greater than $\overline{\mathrm{MLE}}(\tilde N I_p)$, which is obtained with the standard Poisson model. To evaluate the performance of $\mathrm{MLE}(\tilde N I_p)$, we assume that the MLE obtained with the truncated Poisson distribution is $\overline{\mathrm{MLE}}(\tilde N I_p)\times\gamma$, where the factor $\gamma$ is greater than or equal to 1 ($\gamma\geq 1$). Therefore, the estimator error can be defined as follows [9]:
$$\left[\hat I_p\pm z_{\alpha/2}\left(\frac{1}{N_xN_y\tilde N}\right)^{1/2}\frac{\sqrt{\hat I_p}}{\gamma}\right],\qquad(7)$$
where $z_{\alpha/2}$ is the upper $100(\alpha/2)\%$ point of the standard normal distribution and $\overline{\mathrm{MLE}}(\tilde N I_p)=\tilde N\hat I_p$. If $\gamma=1$, the confidence interval of $\mathrm{MLE}(\tilde N I_p)$ is equal to that of $\overline{\mathrm{MLE}}(\tilde N I_p)$. According to Eq. (7), the estimation error is proportional to $1/\gamma$. It is therefore noted that the MLE using a truncated Poisson model for estimating the average number of photons for each voxel of a photon-starved 3D object has a smaller estimation error than the MLE using a Poisson model. In the following, we describe the design procedure for 3D recognition of objects with a very small number of photons. The integrated image $II_p$ is reconstructed n times by using the computational II reconstruction algorithm and the parametric MLE [9], with the photon counts modeled by the truncated Poisson density function. Let $II_p^i$ denote the ith integrated image reconstructed from the corresponding photon-limited elemental image set, where each photon-limited elemental image is generated with a random number of photons. The statistical sampling distribution [9] of the dispersion parameter of the reconstructed integrated image is obtained by calculating the statistical standard deviation of each $II_p^i$, where i = 1,…,n. Then, given the n ordered data samples $X_{(1)},X_{(2)},\ldots,X_{(n)}$, the empirical cumulative density function (ECDF) [13] of the statistical distribution is computed. A statistical Kolmogorov-Smirnov (K-S) test [13] is applied to reach a statistical decision about the populations on the basis of the sampling-distribution information. The statistical method calculates the maximum distance between the ECDFs $F_r(u)$ and $F_i(u)$ of the reference and unknown input data sets.
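The closed-form estimator of Eq. (6) is straightforward to implement. The sketch below simulates 0/1 counts at one voxel (under the k = 2 truncation the counts follow a Bernoulli law with parameter lam/(1 + lam)) and recovers the rate; the sample size, the rate value, and the function name are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def truncated_poisson_mle(counts):
    """Closed-form MLE of N_tilde*I_p for k = 2 (Eq. (6)):
    cbar / (1 - cbar), where cbar is the plain Poisson MLE (sample mean)."""
    cbar = np.mean(counts)
    return cbar / (1.0 - cbar)

# 0/1 counts at one voxel across N_x*N_y = 10000 simulated elemental images
lam = 0.2
counts = (rng.random(10000) < lam / (1.0 + lam)).astype(int)
estimate = truncated_poisson_mle(counts)  # recovers lam up to sampling noise
```

Note that the estimate always exceeds the sample mean (the plain Poisson MLE) whenever any photon is detected, which is the inflation the paper quantifies with the factor gamma.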

3. Experimental results

Experiments verifying the proposed statistical approach for 3D visualization and recognition of objects with a very small number of photons are presented. We recorded a 3D object's elemental image set by moving a CCD camera transversally in both the x and y directions in the II system [7]. The image sensor array has a resolution of 2028×2044 pixels and a pixel size of 9 μm × 9 μm. Each imaging channel in the II system captured a 2D elemental image containing directional information about the object (a toy car), and 10×10 elemental images were generated. In the experiments, the elemental image sets of the rear and front views of the toy car were recorded (see Fig. 1). The front and rear views are used as two different classes of data for classification.


Fig. 1 Toy car used in the experiments. (a) Front view. (b) Rear view.


The photon-counted elemental images of the toy car were generated according to a photon-counting detection model [5-6]. Figure 2 shows the sectional images of the 3D scenes for the front and rear views of the toy car, reconstructed from the corresponding photon-counted elemental image sets using the proposed truncated photon-counting model. The cars were reconstructed at distance z0 = 100 cm, and the expected number of photons in each elemental image was set to $\tilde N=10$ (k in Eq. (4) is set to 2). The reconstructed images are statistically analyzed in order to inspect the 3D recognition enhancement for objects with a very small number of photons. A statistical K-S test is applied to assess the 3D recognition performance. We first computationally reconstruct one sectional image of the 3D scene located at distance z0 = 100 cm from the photon-limited elemental image set using the truncated photon-counting method. The process is repeated n times, and each time the statistical standard deviation of the reconstructed sectional image is computed. We define this standard deviation as the random variable σ. As a result, we have n samples of the random variable σ, from which its statistical sampling distribution, or histogram, is formed.
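The sampling procedure for σ described above can be sketched as follows; here a direct Poisson draw stands in for the full computational II back-propagation, and the random stand-in scene and function name are our own assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigma_samples(irradiance, mean_photons=10, n=200):
    """Repeat the photon-limited 'reconstruction' n times and record the
    standard deviation of each result, giving n samples of sigma.
    The Poisson draw stands in for the full integral-imaging pipeline."""
    I_p = irradiance / irradiance.sum()
    return np.array([rng.poisson(mean_photons * I_p).std() for _ in range(n)])

scene = rng.random((32, 32))   # stand-in for the reconstructed sectional image
sigmas = sigma_samples(scene)  # n samples of the random variable sigma
```

The resulting array is the empirical sampling distribution (histogram) of σ that the K-S test below compares between the reference and input classes.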


Fig. 2 Sectional images of the 3D scenes reconstructed from very small number of photons-counted elemental images set by using the proposed truncated photon counting model. The total number of the elemental images (parallax) Ne was either 4 or 25. (a) Reconstructed front view car with Ne=4, (b) reconstructed rear view car with Ne=4, (c) reconstructed front view car with Ne=25, (d) reconstructed rear view car with Ne=25.


The statistical K-S test is performed to compare the two histograms of the reference and unknown input data and reach a statistical decision. The K-S statistic tests the null hypothesis ($H_0: F_r(u)=F_i(u)$) that the two histograms follow the same distribution at a given level of significance. Figure 3(a) shows the computed K-S test p-value versus the sample size n for the random variable σ, where the sample size used to calculate σ was varied from n = 100 to n = 1000 in intervals of 100. The expected number of photons and the total number of elemental images were set to $\tilde N=10$ and Ne = 4, respectively. As shown in Fig. 3(a), the statistical p-value calculated by the proposed truncated photon-counting method was less than that of the conventional photon-counting method over the range 100 ≤ n ≤ 1000. The p-value is a measure of how much evidence we have against the null hypothesis: a small p-value is evidence against the null hypothesis, while a large p-value means little or no evidence against it. The hypothesis may be rejected if the p-value is less than 0.01. It is noted in Fig. 3(a) that the p-value computed by the proposed truncated photon-counting method for the random variable σ is smaller than 0.01 over the range 500 ≤ n ≤ 1000 and decreases very rapidly as the sample size increases, while the p-values computed by the conventional Poisson counting method were larger than 0.19 over the same range. The K-S test statistic calculated by the truncated photon-counting method therefore discriminates between the two classes with over 99% confidence, whereas the conventional Poisson method fails to reject the null hypothesis (p > 0.19), i.e., it cannot distinguish the two classes at better than 81% confidence.
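The two-sample K-S decision can be sketched with SciPy's `scipy.stats.ks_2samp`; the Gaussian stand-ins for the two σ sampling distributions are illustrative, not the experimental data:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)

# stand-ins for the sigma histograms of the reference and input classes
sigma_ref = rng.normal(1.00, 0.05, 1000)
sigma_inp = rng.normal(1.10, 0.05, 1000)

stat, p_value = ks_2samp(sigma_ref, sigma_inp)
# a p-value below the 0.01 significance level rejects H0: F_r(u) = F_i(u)
```

Here the two classes are clearly separated, so the p-value falls far below the 0.01 threshold; identical distributions would instead yield a large p-value.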


Fig. 3 Experimental results of K-S test for the equality of two histograms of the front view and rear view cars for random variable σ versus the sample size n. “PC” and “TPC” refer to photon counting and truncated photon counting, respectively. (a) Ne=4 and (b) Ne=25.


Varying the parallax in II affects the photon-counting 3D recognition performance for a fixed number of photons. We reconstruct the 3D images with a varying number of elemental images (parallax) while keeping the same average number of photons, $\tilde N=10$. Figure 3(b) shows the computed K-S test p-value versus the sample size n for the random variable σ with Ne = 25. As shown in Fig. 3(b), the p-values computed using the truncated photon counting are much lower than those of the Poisson distribution, that is, 1.54×10−66 versus 2.14×10−56 for sample size n = 1000, respectively. The p-values computed using both the truncated photon counting and the Poisson distribution decrease rapidly as the total number of elemental images increases; however, the truncated photon-counting method retains a much lower p-value than the Poisson distribution method, and the difference between the two methods becomes more substantial as the total number of elemental images (parallax) increases. Therefore, for a given sample size n, the test statistic for the random variable σ calculated by the truncated photon-counting method discriminates between the two data sets with much better accuracy than the Poisson method. Figure 4 shows the cumulative density function plots of the statistical sampling distributions of the known reference and unknown input data from the toy car for the random variable σ, where the sample size n was 1000, the total number of elemental images (parallax) Ne is 4, and the mean number of photons $\tilde N$ is 10. The graphs show that the K-S distance Λ obtained by the truncated photon-counting method is larger than that of the Poisson distribution method: the maximum difference between the cumulative distributions was 0.0480 for the Poisson distribution method and 0.1297 for the proposed truncated photon-counting method. Thus, the truncated photon-counting method may provide better results for statistical classification.
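The K-S distance Λ between two empirical CDFs can be computed directly; `ks_distance` is a hypothetical helper and the Gaussian samples are illustrative stand-ins for the σ data:

```python
import numpy as np

def ks_distance(ref, inp):
    """Lambda = max over u of |F_r(u) - F_i(u)|, evaluated on the pooled
    samples, where F is the empirical CDF of each data set."""
    grid = np.sort(np.concatenate([ref, inp]))
    F_r = np.searchsorted(np.sort(ref), grid, side='right') / ref.size
    F_i = np.searchsorted(np.sort(inp), grid, side='right') / inp.size
    return np.abs(F_r - F_i).max()

rng = np.random.default_rng(4)
a = rng.normal(0.0, 1.0, 1000)
b = rng.normal(0.5, 1.0, 1000)
dist = ks_distance(a, b)  # larger distance -> easier to separate the classes
```

A larger Λ between the reference and input CDFs corresponds to the stronger class separation reported for the truncated photon-counting method in Fig. 4.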


Fig. 4 Cumulative density function plots of the statistical sampling distributions of the toy car. The front view and rear view of toy car are used as reference and unknown input data for the random variable σ. (a) Poisson distribution method and (b) truncated photon counting method.


4. Conclusion

We have proposed multi-view 3D visualization and recognition of objects with a small number of photons using a truncated photon-counting method. For 3D visualization and recognition of photon-starved objects, the sectional image of a 3D scene is estimated using a parametric MLE with the photon counts modeled by the truncated Poisson density function. Statistical inference algorithms are applied to reach a statistical decision about the populations on the basis of the statistical sampling distributions. It is shown that the parametric MLE using a truncated Poisson model for estimating the average number of photons for each voxel of a photon-starved 3D object has a smaller estimation error than the MLE using a Poisson model. The 3D recognition performance for objects with a small number of photons can be enhanced by using the truncated photon-counting method. We have also investigated the effect of 3D imaging parallax on the recognition performance when a fixed mean number of photons is used.

References and links

1. B. Javidi, F. Okano, and J. Sun, 3D Imaging, Visualization, and Display Technologies (Springer, 2008).

2. G. Lippmann, “La photographie intégrale,” C. R. Acad. Sci. 146, 446–451 (1908).

3. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A 15(8), 2059–2065 (1998). [CrossRef]  

4. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006). [CrossRef]  

5. O. Matoba, E. Tajahuerce, and B. Javidi, “Real-time three-dimensional object recognition with multiple perspectives imaging,” Appl. Opt. 40(20), 3318–3325 (2001). [CrossRef]  

6. S. Yeom, B. Javidi, and E. Watson, “Three-dimensional distortion-tolerant object recognition using photon-counting integral imaging,” Opt. Express 15(4), 1513–1533 (2007). [CrossRef]   [PubMed]  

7. I. Moon and B. Javidi, “Three-dimensional recognition of photon-starved events using computational integral imaging and statistical sampling,” Opt. Lett. 34(6), 731–733 (2009). [CrossRef]   [PubMed]  

8. Proc. IEEE, special issue on 3D Technologies for Imaging and Displays, 94 (2006).

9. N. Mukhopadhyay, Probability and Statistical Inference (Marcel Dekker, New York, 2000).

10. M. Guillaume, P. Melon, P. Refregier, and A. Llebaria, “Maximum-likelihood estimation of an astronomical image from a sequence at low photon levels,” J. Opt. Soc. Am. A 15(11), 2841–2848 (1998). [CrossRef]  

11. V. Rohatgi and A. K. Md. Ehsanes Saleh, An Introduction to Probability and Statistics (Wiley, 2000).

12. J. W. Goodman, Statistical Optics (Wiley, 1985).

13. M. Hollander and D. Wolfe, Nonparametric Statistical Methods (Wiley, 1999).
