Abstract

We experimentally demonstrate light-field moment microscopy (LFMM). The proposed technique employs a better estimation of the intensity derivative in solving the Poisson equation and therefore significantly reduces the noise and error in the reconstructed light-field moments. The light field can then be reconstructed from the moments, enabling perspective views and depth estimation of the object. The proposed LFMM can be implemented with a standard commercial light microscope. This opens up new possibilities for standard microscopes in high-resolution light-field observations.

© 2015 Optical Society of America

1. Introduction

Conventional cameras and microscopes capture a two-dimensional (2D) image of the scene, whereas the depth (z) information is missing, as the acquisition essentially integrates the light field along the z direction. However, depth information is very important in many applications, such as entertainment, optical inspection, and biological imaging. In the last decade, a rapid, scanless three-dimensional (3D) imaging approach called light field imaging has received increasing attention because of its high compatibility with various applications, ranging from photographic cameras [1] to microscopes [2, 3]. From the geometric-optics point of view, light field imaging simultaneously records the 2D spatial and 2D angular information of each light ray in the bundle of rays observed by the digital image sensor, thus allowing the generation of perspective views and focal stacks of the scene by computationally tracing the rays back to certain planes. Technically, the light field can be captured by using a microlens array with a standard camera [1, 2], a camera array [4, 5], or an absorbing mask [6]. Of all these, microlens-array-based light field instruments have received particular interest because of their commercial success and potential application in light field microscopy [2, 3, 7]. However, the microlens array introduces an intrinsic trade-off between spatial and angular resolution: one must sacrifice spatial resolution in order to obtain angular resolution, which determines the number of perspective views, the number of focal stacks, and the accuracy of the depth-map estimation.

Alternatively, one can collect the angular moment of the light rays using the light field moment imaging (LMI) technique recently proposed by Orth and Crozier [8]. With LMI, one does not need to sacrifice spatial resolution to achieve perspective views, since the raw images are taken by standard cameras or light microscopes. Recently, LMI has also been used in Fourier hologram synthesis [9] and coherent diffractive imaging [10]. The principle of LMI is to extract the first-order angular moment of the light field by solving the Poisson equation, which connects the Laplacian of the scalar potential of the light field with the intensity derivative, in a way similar to the transport of intensity equation [11]. We note that the Poisson solver is severely affected by how accurately the intensity derivative can be estimated. To obtain a good estimate, one can, from a purely mathematical perspective, take the two intensity images at two lateral planes as close together as possible. However, as the two planes get closer to each other, sensor noise becomes increasingly significant and eventually submerges the intensity derivative, leading to a faulty reconstruction.

In this paper, we propose a light field moment microscopy (LFMM) technique, inspired by the work of Orth and Crozier [8], for the inspection of micro-scale samples. The significant difference between our LFMM and standard LMI is that we use a better method to estimate the intensity derivative in the Poisson solver. Both numerical and optical experiments are carried out to demonstrate our method. The paper is organized as follows: in Sec. 2, we review the basic principle of LMI. We then discuss the noise problem in LMI and describe our noise-reduction method in Sec. 3. The simulation and experimental results of the proposed approach are described in Sec. 4 and Sec. 5, respectively. Finally, the conclusion is drawn in Sec. 6.

2. Light field moment imaging

According to the plenoptic function [12], the light field can be parameterized as a five-dimensional function L(x, y, z, tanθX, tanθY), where (x, y, z) are the spatial coordinates and (tanθX, tanθY) are the angular coordinates. The function L describes the position and propagation direction of each ray in the light field. The lateral intensity I(x, y) observed at a plane z can be expressed as the integral of L over the angular dimensions [13]. LMI connects the derivative of I(x, y; z) with respect to z to a scalar potential U(x, y; z) as [8]:

∂I(x, y; z)/∂z = ∇²U(x, y; z),  (1)

where ∇² = ∂²/∂x² + ∂²/∂y² is the Laplacian, and the potential U(x, y; z) is defined by ∇U(x, y; z) = I(x, y; z)M(x, y; z), with M(x, y; z) = [s(x, y; z), t(x, y; z)] being the vector of the normalized first-order moments of the light field

[s, t] = ∬ L [tanθX, tanθY] dtanθX dtanθY / ∬ L dtanθX dtanθY.  (2)

It is clear that in the coherent limit, Eq. (1) reduces to the transport of intensity equation, and the scalar potential U becomes the phase of the field. Numerically, Eq. (1) can be solved using a Fourier-transform-based algorithm

U = F⁻¹{H × F{∂I/∂z}},  (3)

where F and F⁻¹ denote the forward and inverse Fourier transforms, which can be computed using the fast Fourier transform algorithm, and the "filter" H is defined as

H(fx, fy) = −[4π²(fx² + fy²)]⁻¹ for fx² + fy² ≠ 0, and H(fx, fy) = 0 for fx² + fy² = 0,  (4)

with (fx, fy) being the spatial frequency coordinates. The intensity derivative is estimated using the first-order finite difference of the two intensity images I₁ = I(x, y; z₁) and I₂ = I(x, y; z₂) captured at two adjacent planes [8]

∂I/∂z ≈ (I₁ − I₂)/Δz,  (5)

where Δz = z₁ − z₂ denotes the defocus distance between the two images.
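As an illustrative sketch (not the authors' code), Eqs. (3)–(5) amount to a finite-difference estimate followed by an FFT-based inverse Laplacian. All function and variable names below are our own, with spatial frequencies expressed in cycles per pixel:

```python
import numpy as np

def solve_lmi_poisson(I1, I2, dz):
    """Recover the scalar potential U from two defocused images.

    Implements Eq. (3), U = F^{-1}{H x F{dI/dz}}, with the
    finite-difference estimate of Eq. (5) and the filter of Eq. (4).
    """
    dIdz = (I1 - I2) / dz                      # Eq. (5)

    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx)                    # cycles per pixel
    fy = np.fft.fftfreq(ny)
    FX, FY = np.meshgrid(fx, fy)
    f2 = FX**2 + FY**2

    H = np.zeros_like(f2)                      # Eq. (4): zero at DC
    H[f2 != 0] = -1.0 / (4 * np.pi**2 * f2[f2 != 0])

    return np.fft.ifft2(H * np.fft.fft2(dIdz)).real
```

Note that the Poisson equation leaves the mean (DC) value of U undetermined, which is why the filter is zeroed at the origin; the recovered potential is unique only up to an additive constant.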

Once the scalar potential U(x, y; z) is computed [Eq. (3)], we can reconstruct the angular moment vector field M via the definition of U(x, y; z)

[s(x, y), t(x, y)] = ∇U / [(I₁ + I₂)/2],  (6)

where the mean intensity is used. With the retrieved moments [s(x, y), t(x, y)], one can construct an approximate light field of the scene at the z₁ plane under the empirical assumption that the angular distribution of the rays is Gaussian with standard deviation σ

L(x, y, tanθX, tanθY) = I₂(x, y) exp{−[tanθX − s(x, y)]²/σ² − [tanθY − t(x, y)]²/σ²}.  (7)

Orth and Crozier have shown that σ² = NA² gives a good perspective-shifting effect [8]. By extracting 2D slices of the light field L at different (tanθX, tanθY) coordinates, we obtain different perspective views of the scene, similar to the perspective-shifting effect observed in light-field photography [1] and light-field microscopy [2].
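Under the Gaussian model of Eq. (7), extracting a perspective view is a pointwise operation on the recovered moment maps. A minimal sketch with illustrative names (with σ² = NA², per [8]):

```python
import numpy as np

def perspective_view(I, s, t, tan_tx, tan_ty, sigma):
    """2D slice of the Gaussian light field of Eq. (7) at the
    angular coordinates (tan_tx, tan_ty).

    I, s, t are the intensity and moment maps; sigma is the
    assumed Gaussian angular spread (sigma = NA).
    """
    return I * np.exp(-((tan_tx - s)**2 + (tan_ty - t)**2) / sigma**2)
```

A view taken along the local moment direction (tan_tx = s, tan_ty = t) returns the full intensity at that pixel; oblique views are attenuated according to the Gaussian angular spread, which is what produces the perspective-shifting impression.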

3. Noise reduction LFMM

It is seen from Eq. (5) in Section 2 that the intensity derivative ∂I/∂z is estimated from the intensity measured at two neighboring planes. Ideally, the estimation accuracy is determined by the distance Δz between the two planes, according to finite difference theory [14]. Thus, Δz should be as small as possible in order to obtain a good estimate; otherwise, the influence of the higher-order terms cannot be ignored, the linearity assumption behind the finite difference approximation becomes invalid, and significant noise appears in the reconstructed light field. Therefore, the most direct way to increase the estimation accuracy of ∂I/∂z is to capture the two images at planes as close together as possible. However, in actual experiments it becomes difficult to distinguish the difference between I₁(x, y; z₁) and I₂(x, y; z₂) when the two planes are too close, owing to the dynamic range and noise characteristics of the camera, in particular when the object is weakly diffractive. As a result, significant noise appears in the difference I₁(x, y; z₁) − I₂(x, y; z₂). Furthermore, the Poisson solver described in Eq. (3) requires the construction of the function H(fx, fy) defined in Eq. (4). This filter takes extremely large values as fx² + fy² → 0, and is therefore very sensitive to noise in the low-spatial-frequency region. This means that the reconstructed scalar potential U(x, y; z) is vulnerable to noise in this region as well, limiting the quality of the perspective views in LMI.

It is now clear that the approach Orth and Crozier proposed in [8] entails a trade-off between the estimation accuracy of ∂I/∂z and noise. In this work, we propose to fit the intensity derivative using multiple images to address this problem. In this algorithm, the intensity derivative ∂I/∂z is calculated in a pixelwise manner. At each pixel (xi, yj), where i, j = 1,…,N, we first fit the intensity evolution along z to an m-th order polynomial. The intensity derivative at this pixel, ∂I(xi, yj)/∂z, is then obtained directly as the first-order coefficient of the polynomial. One can see that all the images contribute to the estimation of the intensity derivative, which improves the estimation accuracy. A similar method has been proposed in the field of phase retrieval for improving the accuracy of phase reconstruction [15].
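The pixelwise fit can be vectorized: np.polyfit accepts a 2-D array of observations and fits one polynomial per column in a single least-squares solve. The sketch below uses our own naming conventions; the layout of the image stack and the fitting order are assumptions for illustration:

```python
import numpy as np

def fit_dIdz(stack, z, order):
    """Pixelwise estimate of dI/dz at z = 0 by polynomial fitting.

    stack: (K, Ny, Nx) intensities captured at axial positions z (length K).
    Fits an `order`-th degree polynomial to each pixel's intensity
    evolution along z and returns the first-order (linear) coefficient,
    i.e. the derivative at the focal plane z = 0.
    """
    K, ny, nx = stack.shape
    # One least-squares solve for all pixels: one column per pixel.
    coeffs = np.polyfit(z, stack.reshape(K, -1), order)  # (order+1, Ny*Nx)
    # Coefficients are ordered highest degree first, so the linear
    # term (the derivative at z = 0) is the second-to-last row.
    return coeffs[-2].reshape(ny, nx)
```

Because the design matrix depends only on z, the fit is a single matrix solve shared across all pixels, which also makes the pixelwise treatment easy to parallelize.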

Now let us provide a more theoretical analysis of the problem. To begin, we take the Taylor series expansion of the intensity captured at a defocus distance Δz, obtaining [14]

I(Δz) = I(0) + (Δz/1!) ∂I/∂z + ((Δz)²/2!) ∂²I/∂z² + ((Δz)³/3!) ∂³I/∂z³ + ((Δz)⁴/4!) ∂⁴I/∂z⁴ + …,  (8)

where I(0) is the intensity at the focal plane and ∂I/∂z is the intensity derivative that we wish to measure at the focal plane. It is clear that the error between the estimated derivative and the ground-truth derivative of the intensity comes from the higher-order terms in Eq. (8)

∂I/∂z = [I(Δz) − I(0)]/Δz − [(Δz/2!) ∂²I/∂z² + ((Δz)²/3!) ∂³I/∂z³ + ((Δz)³/4!) ∂⁴I/∂z⁴ + …].  (9)

Apparently, we need to estimate the higher-order terms in order to improve the estimation accuracy of ∂I/∂z. It has been shown that the n-th order derivative of a function can be computed numerically from n + 1 or more equally spaced measurements using polynomial fitting. We can then use this more accurate estimate of ∂I/∂z in the Poisson solver, and Eq. (3) can be rewritten as

U = F⁻¹{H × F{[I(Δz) − I(0)]/Δz − [(Δz/2!) ∂²I/∂z² + ((Δz)²/3!) ∂³I/∂z³ + ((Δz)³/4!) ∂⁴I/∂z⁴ + …]}}.  (10)

A better solution is expected from solving this higher-order Poisson equation. In the following sections, we provide both numerical simulation and experimental results to demonstrate this method.
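A tiny numerical check of Eqs. (8)–(10): when the axial intensity evolution contains appreciable quadratic and cubic terms, the first-order difference of Eq. (5) carries a large truncation error, while a polynomial fit recovers the focal-plane slope almost exactly. The coefficients below are invented purely for illustration:

```python
import numpy as np

z = np.linspace(-6.0, 6.0, 13)             # axial positions in um, spacing 1 um
true_slope = 0.05                          # dI/dz at the focal plane (z = 0)

# Illustrative intensity evolution with higher-order terms, cf. Eq. (8)
I = 1.0 + true_slope * z + 0.01 * z**2 + 0.002 * z**3

i0 = z.size // 2                           # index of the focal plane (z = 0)
fd = (I[i0 + 1] - I[i0]) / (z[i0 + 1] - z[i0])   # finite difference, Eq. (5)

coeffs = np.polyfit(z, I, 7)               # 7th-order polynomial fit
fit = np.polyval(np.polyder(coeffs), 0.0)  # derivative of the fit at z = 0

# The finite difference is biased by the truncation terms of Eq. (9);
# the polynomial fit is exact up to numerical precision here.
print(abs(fd - true_slope), abs(fit - true_slope))
```

In this noise-free case the fit error stems only from floating-point round-off; with noisy measurements the fit additionally averages over all planes, which is the noise-reduction effect exploited in the following sections.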

4. Numerical simulation

In the numerical simulation, we used a pure phase object to verify the usefulness of the higher-order terms, as this provides a more quantitative understanding of the method. To keep things simple, we use a coherent imaging system in our simulation; as discussed above, in the coherent limit the light-field moment reduces to the spatial frequency. We first numerically construct a 4f imaging system based on wave optics, propagate the object wavefront to a number of axial planes on both sides of the focal plane using the Fresnel transform [16, 17], and store the intensity obtained at each of these planes. The distance between every two neighboring planes is set to the same value. Using this intensity data set, we can examine the evolution of the intensity at each pixel across these planes and fit it to a curve specified by a high-order polynomial. The noise-reduced ∂I/∂z is then obtained directly as the first-order coefficient of the polynomial.

In the simulation, we obtained intensity images with an axial spacing of Δz = 1 μm using our numerical coherent imaging system. The simulation results are shown in Fig. 1 and Fig. 2. Figure 1(a) shows the axial evolution of the intensity at a certain pixel, fitted to 1st-, 7th- and 13th-order polynomials, respectively. This clearly shows that estimating the intensity derivative using the first-order difference [I(Δz) − I(0)]/Δz gives only a very rough estimate, with significant deviation from the true value (shown by the blue markers). High-order polynomial fitting gives a far better estimation. To quantify the accuracy, we plot the fit error in Fig. 1(b); the RMS errors corresponding to the 1st-, 7th- and 13th-order polynomial fits are 0.0249, 0.0065 and 0.0008, respectively. It is therefore expected that the moment M and the light field L can be better recovered using this improved estimate of the intensity derivative, i.e., by solving Eq. (10) instead of Eq. (3).


Fig. 1 Simulation results: (a) change of the intensity at a certain pixel during propagation along the z direction (blue markers), with polynomial fits of the 1st, 7th and 13th order, and (b) the corresponding fit errors.



Fig. 2 The s (first row) and t (second row) components of the moment vector field M. The two images in the first column show the ground-truth data for comparison; the images in the other three columns are recovered using the intensity derivative estimated by fitting the intensity variation along z to 1st-, 7th- and 13th-order polynomials, respectively.


Poisson noise was added to the simulated intensity patterns in order to test the noise-reduction performance of our LFMM technique. The recovered moment vectors M using different orders of polynomial fitting are plotted in Fig. 2. The first column in Fig. 2 shows the s and t components of the ideal moment M; these two images serve as the ground-truth data for testing the proposed method. The images in the second column were computed by solving Eq. (3), and the images in the third and fourth columns were computed by solving the high-order LMI equation [Eq. (10)] with the intensity derivative fitted to the 7th and 13th order, respectively.

It is clearly seen from Fig. 2 that estimating the intensity derivative using the finite difference gives the worst result, due to the noise sensitivity of the LMI solver [8]. The result is significantly improved when using the higher-order estimation of the intensity derivative. However, we observe in the third and fourth columns that the angular moments recovered using the 13th-order polynomial fit are worse than those recovered using the 7th-order fit. The reason is that a polynomial of too high an order may fit the noise as signal while still yielding smaller RMS errors. This is the overfitting issue [18] encountered in curve-fitting problems.

5. Experimental results

We then set up an experimental system, shown in Fig. 3(a), to demonstrate the proposed LFMM technique. The system is simply a standard microscope (Nikon Ni) equipped with a 10×, 0.3 NA microscope objective (Nikon) and an sCMOS camera (PCO Edge 4.2 M). The sample was a mosquito's mouth-parts, a common biological sample slide available on the market. We captured a stack of intensity images of the sample with an axial spacing of Δz = 1 μm by tuning the focus knob. Three of the intensity images are shown in Fig. 3(b).


Fig. 3 (a) Experimental setup. The intensity focal-stack images of the mosquito's mouth-parts on a microscope slide are captured by adjusting the focus knob. (b) Three of the sample images that we captured.


It is clearly seen from Fig. 3(b) that the intensity images captured at adjacent planes are very similar to each other. This means that the corresponding intensity difference is easily corrupted by noise, as evidenced in Fig. 4(a). In contrast, the noise can be significantly suppressed by the proposed method, as shown in Figs. 4(b)–4(d), in which the intensity derivative was calculated by fitting the experimental data set to polynomials of the 5th, 7th and 13th order.


Fig. 4 Intensity derivative ∂I/∂z calculated by using (a) the finite difference [Eq. (5)], and (b) the 5th-, (c) the 7th-, and (d) the 13th-order fitting polynomials [Eq. (9)].


With these data at hand, we can recover the scalar potential U and the moment M by solving Eq. (10) and Eq. (6). The experimental results are plotted in Fig. 5. The potentials and moments recovered using our method (second and third columns) contain finer and more detailed structures than the results recovered using the finite difference (first column). This suggests that the proposed LFMM can significantly reduce the noise. In our experiments, we also observed the effect of overfitting in the recovered U and M, shown in the fourth column of Fig. 5 as well as in Fig. 4(d): noise is present in these images because fitting the data to a polynomial of too high an order takes the noise as signal.


Fig. 5 Experimental results: the recovered scalar potential U (first row) and the s (second row) and t (third row) components of the moment vector field M, using the intensity derivative estimated by fitting the intensity variation along z to polynomials of different orders.


We can then construct the light field L of the scene according to Eq. (7) after recovering the moment vector M. Different perspective views of the sample can then be obtained by extracting the corresponding 2D intensity slices from the constructed light field L. Examples of the extracted perspective views of the mosquito's mouth-parts at the angular coordinates (0°, 5.8°) are shown in Fig. 6. See Visualization 1, Visualization 2, Visualization 3, and Visualization 4 for the complete perspective views of the light field. The constructed light fields shown in Fig. 6 and the Visualizations suggest that the proposed LFMM technique has a much better overall noise performance than standard LMI. The noise performance of the LFMM depends on the estimation accuracy of the intensity derivative. In our experiments, we observed that the intensity derivative estimated using the 7th-order fit offers the best perspective-shifting impression (see Visualization 3). The other two either suffer from noise due to underfitting (see Visualization 2), or from degradation, especially in the marginal perspective views (see Visualization 4), owing to overfitting.


Fig. 6 2D intensity slices extracted from the constructed 4D light field at the angular coordinates (θX, θY) = (0°, 5.8°). It is clearly seen that the 7th-order fitting offers the best performance. See Visualization 1, Visualization 2, Visualization 3, and Visualization 4 for views of the perspective-shifting effect.


6. Conclusion

To summarize, we have described a noise-reduction LFMM technique. The proposed technique makes a better estimation of the intensity derivative in solving the LMI equation by using polynomial fitting instead of the finite difference, resulting in a significant reduction of noise. We have performed both simulations and experiments to demonstrate the proposed technique. From the experimental results, we can clearly see light-field-like perspective views of a biological sample using a standard microscope, without the sacrifice of spatial resolution found in standard light-field microscopy [2, 3, 7]. We believe that the proposed technique will open up new possibilities for standard microscopes in high-resolution light-field observations.

It is also necessary to discuss the computational complexity of the noise-reduction algorithm. Note that the computational complexity of polynomial fitting using the least-squares method is O(M²P), where M is the number of points and P is the order of the polynomial to be fitted. So, for images of N × N pixels as in our case, the total computational cost is O(M²N²P). It is worth pointing out that the pixelwise nature of the algorithm lends itself to parallel computing, which can significantly speed up the computation. In a follow-up study, we will try to find a better approach to reduce the noise with fewer images and a lighter computational load.

Acknowledgments

We thank Antony Orth and Zhilong Jiang for helpful discussions on LMI. This project was supported by the National Science Foundation of China (Grant Numbers 61377005, 61327902, 61172178, 61371132 and 61471043), the Recruitment Program of Global Youth Experts, the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20121101110022), the International Science and Technology Cooperation Program of China (No. 2014DFR10960), and the Shanghai Pujiang Program.

References and links

1. R. Ng, M. Levoy, M. Brédif, G. Duval, M. Horowitz, and P. Hanrahan, “Light field photography with a hand-held plenoptic camera,” Stanford Tech. Report. CTSR2005-02 (2005).

2. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM Trans. Graph. 25, 924–934 (2006). [CrossRef]  

3. M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microscopy 235, 144–162 (2009). [CrossRef]  

4. B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005). [CrossRef]  

5. X. Lin, J. Wu, and Q. Dai, “Camera array based light field microscopy,” in Optical Molecular Probes, Imaging and Drug Delivery, paper JT3A.48, (Optical Society of America, 2015). [CrossRef]  

6. A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007). [CrossRef]  

7. R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014). [CrossRef]   [PubMed]  

8. A. Orth and K. B. Crozier, “Light field moment imaging,” Opt. Lett. 38, 2666–2668 (2013). [CrossRef]   [PubMed]  

9. N. Chen, J.-H. Park, J. Yeom, J. Kim, G. Li, and B. Lee, “Fourier hologram synthesis from two photographic images captured at different focal planes,” Signal Recovery and Synthesis, Paper JTu4A-6 (Optical Society of America, 2014). [CrossRef]  

10. Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014). [CrossRef]  

11. C. Zuo, Q. Chen, and A. Asundi, “Light field moment imaging: comment,” Opt. Lett. 39, 654 (2014). [CrossRef]   [PubMed]  

12. M. Levoy, “Light fields and computational imaging,” Comput. Aug. 39(8), 46–55 (2006). [CrossRef]  

13. Z. Zhang and M. Levoy, “Wigner distributions and how they relate to the light field,” IEEE International Conference on Computational Photography (ICCP), 1–10 (2009).

14. G. Strang, Computational Science and Engineering, vol. 1 (Wellesley-Cambridge Press Wellesley, 2007).

15. L. Waller, L. Tian, and G. Barbastathis, “Transport of intensity phase-amplitude imaging with higher order intensity derivatives,” Opt. Express 18, 12552–12561 (2010). [CrossRef]   [PubMed]  

16. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company Publishers, 2005).

17. D. G. Voelz, Computational Fourier Optics: a MATLAB Tutorial (SPIE Press Bellingham, 2011).

18. P. Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Cambridge University Press, 2012). [CrossRef]  


Lin, X.

X. Lin, J. Wu, and Q. Dai, “Camera array based light field microscopy,” in Optical Molecular Probes, Imaging and Drug Delivery, paper JT3A.48, (Optical Society of America, 2015).
[Crossref]

Liu, C.

Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014).
[Crossref]

McDowall, I.

M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microscopy 235, 144–162 (2009).
[Crossref]

Mohan, A.

A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007).
[Crossref]

Ng, R.

M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM Trans. Graph. 25, 924–934 (2006).
[Crossref]

R. Ng, M. Levoy, M. Brédif, G. Duval, M. Horowitz, and P. Hanrahan, “Light field photography with a hand-held plenoptic camera,” Stanford Tech. Report. CTSR2005-02 (2005).

Orth, A.

Pak, N.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Pan, X.

Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014).
[Crossref]

Park, J.-H.

N. Chen, J.-H. Park, J. Yeom, J. Kim, G. Li, and B. Lee, “Fourier hologram synthesis from two photographic images captured at different focal planes,” Signal Recovery and Synthesis, Paper JTu4A-6 (Optical Society of America, 2014).
[Crossref]

Prevedel, R.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Raskar, R.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007).
[Crossref]

Schrodel, T.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Strang, G.

G. Strang, Computational Science and Engineering, vol. 1 (Wellesley-Cambridge Press Wellesley, 2007).

Talvala, E.-V.

B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005).
[Crossref]

Tian, L.

Tumblin, J.

A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007).
[Crossref]

Vaish, V.

B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005).
[Crossref]

Veeraraghavan, A.

A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007).
[Crossref]

Voelz, D. G.

D. G. Voelz, Computational Fourier Optics: a MATLAB Tutorial (SPIE Press Bellingham, 2011).

Waller, L.

Wang, L.

Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014).
[Crossref]

Wetzstein, G.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Wilburn, B.

B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005).
[Crossref]

Wu, J.

X. Lin, J. Wu, and Q. Dai, “Camera array based light field microscopy,” in Optical Molecular Probes, Imaging and Drug Delivery, paper JT3A.48, (Optical Society of America, 2015).
[Crossref]

Yeom, J.

N. Chen, J.-H. Park, J. Yeom, J. Kim, G. Li, and B. Lee, “Fourier hologram synthesis from two photographic images captured at different focal planes,” Signal Recovery and Synthesis, Paper JTu4A-6 (Optical Society of America, 2014).
[Crossref]

Yoon, Y.-G.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Zhang, Z.

M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microscopy 235, 144–162 (2009).
[Crossref]

Z. Zhang and M. Levoy, “Wigner distributions and how they relate to the light field,” IEEE International Conference on Computational Photography (ICCP), 1–10 (2009).

Zhu, J.

Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014).
[Crossref]

Zimmer, M.

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Zuo, C.

ACM Trans. Graph. (3)

M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM Trans. Graph. 25, 924–934 (2006).
[Crossref]

B. Wilburn, N. Joshi, V. Vaish, E.-V. Talvala, E. Antunez, A. Barth, A. Adams, M. Horowitz, and M. Levoy, “High performance imaging using large camera arrays,” ACM Trans. Graph. 24, 765–776 (2005).
[Crossref]

A. Veeraraghavan, R. Raskar, A. Agrawal, A. Mohan, and J. Tumblin, “Dappled photography: Mask enhanced cameras for heterodyned light fields and coded aperture refocusing,” ACM Trans. Graph. 26, 69–80 (2007).
[Crossref]

AIP Advances (1)

Z. Jiang, X. Pan, C. Liu, L. Wang, and J. Zhu, “Light field moment imaging with the ptychographic iterative engine,” AIP Advances 4, 107108 (2014).
[Crossref]

Comput. Aug. (1)

M. Levoy, “Light fields and computational imaging,” Comput. Aug. 39(8), 46–55 (2006).
[Crossref]

J. Microscopy (1)

M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microscopy 235, 144–162 (2009).
[Crossref]

Nature Methods (1)

R. Prevedel, Y.-G. Yoon, M. Hoffmann, N. Pak, G. Wetzstein, S. Kato, T. Schrodel, R. Raskar, M. Zimmer, and E. S. Boyden, “Simultaneous whole-animal 3D imaging of neuronal activity using light-field microscopy,” Nature Methods 11, 727–730 (2014).
[Crossref] [PubMed]

Opt. Express (1)

Opt. Lett. (2)

Other (8)

N. Chen, J.-H. Park, J. Yeom, J. Kim, G. Li, and B. Lee, “Fourier hologram synthesis from two photographic images captured at different focal planes,” Signal Recovery and Synthesis, Paper JTu4A-6 (Optical Society of America, 2014).
[Crossref]

X. Lin, J. Wu, and Q. Dai, “Camera array based light field microscopy,” in Optical Molecular Probes, Imaging and Drug Delivery, paper JT3A.48, (Optical Society of America, 2015).
[Crossref]

Z. Zhang and M. Levoy, “Wigner distributions and how they relate to the light field,” IEEE International Conference on Computational Photography (ICCP), 1–10 (2009).

G. Strang, Computational Science and Engineering, vol. 1 (Wellesley-Cambridge Press Wellesley, 2007).

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company Publishers, 2005).

D. G. Voelz, Computational Fourier Optics: a MATLAB Tutorial (SPIE Press Bellingham, 2011).

P. Flach, Machine Learning: The Art and Science of Algorithms that Make Sense of Data (Combridge University Press, 2012).
[Crossref]

R. Ng, M. Levoy, M. Brédif, G. Duval, M. Horowitz, and P. Hanrahan, “Light field photography with a hand-held plenoptic camera,” Stanford Tech. Report. CTSR2005-02 (2005).

Supplementary Material (4)

Name      Description
» Visualization 1: AVI (2576 KB)      Perspective view of the light field with standard LMI
» Visualization 2: AVI (2492 KB)      Perspective view of the light field with 5th-order fitting
» Visualization 3: AVI (1973 KB)      Perspective view of the light field with 7th-order fitting
» Visualization 4: AVI (2417 KB)      Perspective view of the light field with 13th-order fitting



Figures (6)

Fig. 1 Simulation results: (a) the intensity of the field at a given pixel as it propagates along the z direction (blue markers), together with polynomial fits of the 1st, 7th, and 13th orders, and (b) the corresponding fitting errors.

Fig. 2 The s (first row) and t (second row) components of the moment vector field M. The two images in the first column show the ground-truth data for comparison; the images in the other three columns are recovered using the intensity derivative estimated by fitting the intensity variation along z with 1st-, 7th-, and 13th-order polynomials, respectively.

Fig. 3 (a) Experimental setup. The intensity focal stack of the mosquito mouthparts on a microscope slide is captured by adjusting the focus knob. (b) Three of the captured sample images.

Fig. 4 Intensity derivative ∂I/∂z calculated using (a) the finite difference [Eq. (5)], and (b) the 5th-, (c) the 7th-, and (d) the 13th-order fitting polynomials [Eq. (9)].

Fig. 5 Experimental results: the recovered scalar potential U (first row) and the s (second row) and t (third row) components of the moment vector field M, using the intensity derivative estimated by fitting the intensity variation along z with polynomials of different orders.

Fig. 6 2D intensity slice extracted from the reconstructed 4D light field at angular coordinates (θX, θY) = (0°, 5.8°). The 7th-order fitting clearly offers the best performance. See Visualization 1, Visualization 2, Visualization 3, and Visualization 4 for the perspective-shifting effect.

Equations (10)


\frac{\partial I(x,y;z)}{\partial z} = \nabla^2 U(x,y;z). \tag{1}

[s, t] = \frac{\iint L\,[\tan\theta_X,\ \tan\theta_Y]\, \mathrm{d}\tan\theta_X\, \mathrm{d}\tan\theta_Y}{\iint L\, \mathrm{d}\tan\theta_X\, \mathrm{d}\tan\theta_Y}. \tag{2}

U = \mathcal{F}^{-1}\left\{ H \times \mathcal{F}\left\{ \frac{\partial I}{\partial z} \right\} \right\}, \tag{3}

H(f_x, f_y) = \begin{cases} -\left[ 4\pi^2 (f_x^2 + f_y^2) \right]^{-1}, & \text{for } f_x^2 + f_y^2 \neq 0 \\ 1, & \text{for } f_x^2 + f_y^2 = 0 \end{cases} \tag{4}

\frac{\partial I}{\partial z} \approx \frac{I_1 - I_2}{\Delta z}, \tag{5}

[s(x,y),\ t(x,y)] = \frac{\nabla U}{(I_1 + I_2)/2}, \tag{6}

L(x, y, \tan\theta_X, \tan\theta_Y) = I_2(x,y)\, \exp\left\{ -\left[\tan\theta_X - s(x,y)\right]^2/\sigma^2 - \left[\tan\theta_Y - t(x,y)\right]^2/\sigma^2 \right\}. \tag{7}

I(\Delta z) = I(0) + \frac{\Delta z}{1!}\frac{\partial I}{\partial z} + \frac{(\Delta z)^2}{2!}\frac{\partial^2 I}{\partial z^2} + \frac{(\Delta z)^3}{3!}\frac{\partial^3 I}{\partial z^3} + \frac{(\Delta z)^4}{4!}\frac{\partial^4 I}{\partial z^4} + \cdots \tag{8}

\frac{\partial I}{\partial z} = \frac{I(\Delta z) - I(0)}{\Delta z} - \left[ \frac{\Delta z}{2!}\frac{\partial^2 I}{\partial z^2} + \frac{(\Delta z)^2}{3!}\frac{\partial^3 I}{\partial z^3} + \frac{(\Delta z)^3}{4!}\frac{\partial^4 I}{\partial z^4} + \cdots \right]. \tag{9}

U = \mathcal{F}^{-1}\left\{ H \times \mathcal{F}\left\{ \frac{I(\Delta z) - I(0)}{\Delta z} - \left[ \frac{\Delta z}{2!}\frac{\partial^2 I}{\partial z^2} + \frac{(\Delta z)^2}{3!}\frac{\partial^3 I}{\partial z^3} + \frac{(\Delta z)^3}{4!}\frac{\partial^4 I}{\partial z^4} + \cdots \right] \right\} \right\}. \tag{10}
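The pipeline these equations describe (an FFT-based Poisson solve for the scalar potential U, a polynomial fit along z to estimate the intensity derivative, and moments taken from the gradient of U) can be sketched numerically. The Python sketch below is illustrative only, not the authors' implementation: the synthetic focal stack, the grid size, the fit order, and the use of the mid-stack image in place of (I₁ + I₂)/2 are all assumptions for demonstration.

```python
import numpy as np

def poisson_solve(dIdz):
    """Solve dI/dz = laplacian(U) for U via FFT, in the spirit of Eqs. (3)-(4)."""
    ny, nx = dIdz.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx), np.fft.fftfreq(ny))
    denom = -4 * np.pi**2 * (FX**2 + FY**2)
    H = np.ones_like(denom)          # H = 1 at the zero frequency, per Eq. (4)
    nz = denom != 0
    H[nz] = 1.0 / denom[nz]          # H = -[4 pi^2 (fx^2 + fy^2)]^{-1} elsewhere
    return np.real(np.fft.ifft2(H * np.fft.fft2(dIdz)))

def dIdz_polyfit(stack, zs, order=7):
    """Estimate dI/dz at z = 0 by fitting each pixel's intensity-vs-z curve
    with a polynomial (the idea behind Eq. (9)); `order` is the fit order."""
    ny, nx = stack.shape[1:]
    coeffs = np.polyfit(zs, stack.reshape(len(zs), -1), order)
    # the derivative of the fitted polynomial at z = 0 is its linear coefficient
    return coeffs[-2].reshape(ny, nx)

# toy focal stack: a Gaussian blob whose intensity varies smoothly with z
zs = np.linspace(-1, 1, 15)
y, x = np.mgrid[-32:32, -32:32]
blob = np.exp(-(x**2 + y**2) / 200.0)
stack = np.array([blob * (1 + 0.3 * z + 0.1 * z**2) for z in zs])

dIdz = dIdz_polyfit(stack, zs, order=7)   # fitted derivative at the focal plane
U = poisson_solve(dIdz)                   # scalar potential from the Poisson solve
I0 = stack[len(zs) // 2]                  # stand-in for (I1 + I2)/2 in Eq. (6)
gy, gx = np.gradient(U)
s, t = gx / I0, gy / I0                   # moment components, as in Eq. (6)
```

With the synthetic quadratic intensity variation above, the fitted derivative at z = 0 recovers the linear term of the variation pixel by pixel, which is why higher-order fits suppress the truncation error that a plain finite difference [Eq. (5)] leaves behind.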
