Abstract

We develop a 3D region tracking method based on a Maximum A Posteriori (MAP) estimator and adapt it to digital hologram sequences to efficiently track biological microorganisms in holographic microscopy data. In our approach, the target surface is modeled as the iso-surface of a level set function which is evolved at each frame via the Hamilton-Jacobi level set update rule in an Euler-Lagrange framework. The statistical characteristics of the target microorganism versus the background are exploited to evolve the interface at each frame, so the algorithm works independently of the shape or morphology of the target. We use a bivariate Gaussian distribution to model the reconstructed hologram data, which enables us to take into account the correlation between the amplitude and phase of the reconstructed wavefront and to obtain a more accurate tracking solution.

© 2007 Optical Society of America

1. Introduction

Dynamic events play a central role in the minute world of microorganisms. Motion, growth, division and cell migration are examples of mechanisms commonly used by many living organisms to interact efficiently with the surrounding environment and to remain viable. Understanding these behaviors requires time-lapse monitoring of microorganisms either in vivo or in vitro, and will eventually provide a solid foundation for studying complex behaviors at the sub-cellular level.

Coherent optical imaging systems using digital holography [1-9] have been studied extensively for three-dimensional (3D) imaging, visualization and recognition of objects. Recently, 3D digital holographic microscopy has been adapted for 3D visualization and recognition of biological microorganisms [6-8]. In this methodology, 3D images of minute microorganisms are computationally reconstructed from a single recorded on-axis digital hologram using the Fresnel transformation at different longitudinal depths.

There have been numerous approaches to the detection and tracking problem in 2D images [10-14]. A number of these methods rely on computation of the motion field under the assumption of a translational, Euclidean or affine transformation [15, 16]. However, non-rigid objects which may change shape or morphology require a different treatment. Contour based object tracking is one possible solution [17-19]. Such methods need to detect the object of interest only once; thereafter, tracking is accomplished by moving the previous target contour toward the current actual boundaries. The active contour method [19] is limited to small displacements since it evaluates local intensity changes along the boundary to anchor the contour accordingly. Region based methods, on the other hand, exploit the information inside and outside of the target contour to achieve a more robust and flexible method [17, 18]. In this paper, we address a novel application: a region tracking method based on a Maximum A Posteriori (MAP) estimator, adapted to 3D holographic data sequences to track microorganisms in a sequence of 3D holographic microscopy images. In our formulation, the target surface is modeled as the iso-surface of a level set function. At each incoming hologram frame, the iso-surface from the previous frame is used as the initial target surface, while the derived Bayesian function uses the statistical properties of the voxels to provide a speed function for evolving the target level set between frames. This yields the iso-surface of the level set function that encloses the target in the most recent hologram. We believe the novelty of this paper is twofold. First, it performs tracking in volume data with both intensity and phase provided by holography, which makes tracking of transparent microorganisms or phase objects possible. Second, the correlation between the amplitude and the phase of each voxel in the reconstructed holographic volume data [8] is not overlooked, which makes the data model more accurate.

In section 2, we briefly overview concepts of Single Exposure On Line (SEOL) digital holography and reconstruction. The MAP tracker is derived in section 3 followed by its level set implementation in section 4. The experimental results are demonstrated in section 5 by tracking a diatom alga in a sequence of holograms. The microorganism motion is digitally simulated from a single hologram recorded by SEOL digital holographic microscopy.

2. Single Exposure On Line (SEOL) digital holographic microscopy

Digital holography has been studied extensively for 3D micro/macro imaging [1-7]. Given a recorded digital hologram, one can computationally reconstruct the whole complex wavefront along the optical axis using inverse diffraction in the Fresnel or Fraunhofer regions [1].

In contrast to phase-shifting techniques [2], in which several recordings are needed, SEOL holography [6, 7] requires a single exposure. Thus, SEOL can be used to study dynamic phenomena in living microorganisms. Figure 1 illustrates the SEOL setup [6]. Let the scalar electric field within the specimen be V(x,y,z), in which the amplitude and phase are related to the absorption and refractive index change within the specimen. The field distribution at distance z0 from the specimen can be recorded on an optoelectronic sensor (e.g. a CCD). Assume that the specimen has a thickness δ. Using the recorded hologram, H(x,y), it is possible to recover the wavefront in the specimen volume by reconstructing in Δz steps from z0 to z0+δ, propagating the angular spectrum back to the specimen plane using two consecutive Fourier transforms [1, 6-8].
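As a minimal illustration of this reconstruction step (not the authors' code; the grid size, stand-in hologram and the function name angular_spectrum_propagate are hypothetical), the angular spectrum propagation with two FFTs can be sketched as:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pitch, z):
    """Propagate a complex field a distance z (meters) with the angular
    spectrum method: FFT, multiply by the propagation kernel, inverse FFT.
    Evanescent components (spatial frequencies beyond 1/wavelength) are
    suppressed."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)            # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kernel = np.exp(2j * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    kernel[arg < 0] = 0.0                        # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

# Stand-in hologram; the actual recording is the 2048x2048 H(x, y).
H = np.random.default_rng(0).standard_normal((64, 64))
depths = 0.5e-3 + 1e-6 * np.arange(16)           # z0 .. z0 + 15 um in 1 um steps
volume = np.stack([angular_spectrum_propagate(H, 514.5e-9, 9e-6, z)
                   for z in depths])             # complex volume V(x, y, z)
```

Stacking the reconstructed planes in this way yields the complex volume used by the tracker in the following sections.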

 

Fig. 1. Single Exposure On Line digital holography setup. The diffracted field from the specimen is magnified by a 100x/0.8NA microscope objective.


The result is a complex dataset which conveys the amplitude and phase information of the electric field at the specimen plane. An arbitrary number of such reconstructions can be stacked to reconstruct the specimen volume. Figure 2 illustrates a sample volumetric reconstruction.

 

Fig. 2. Diatom alga with a 100x/0.8NA microscope objective: (a) movie (332 KB) of the magnitude reconstruction in 1 µm steps along the optical axis [Media 1]; (b) phase of a single reconstruction plane in focus; (c) magnitude of three planes in the reconstruction volume.


For biological specimens, the magnitude and phase elements are correlated. This is because intracellular structures and organelles with a higher absorption coefficient usually have a higher density of complex molecules and proteins, and thus a higher refractive index, compared to the cytosol, which consists mainly of hydrophilic cytoplasmic components. This fact is later used to model the hologram reconstruction data with a bivariate Gaussian distribution.

3. Maximum A Posteriori (MAP) tracker

Region tracking can be formulated as an estimation problem. In this paper, we adopt a MAP estimator to perform region tracking in complex 3D volumes reconstructed from a sequence of acquired holograms. Our objective is to track the subspace T^(t) ⊂ ℜ³ that represents the desired target in the volume V^(t)(x,y,z) formed by a stack of reconstructed planes. Each voxel is a complex number, and the superscript (t) denotes the frame index in the sequence. We make no assumption on the geometry, morphology or evolution of the target, nor do we assume a well defined boundary between the target and background. However, we assume that the statistical properties of the target have small variations between two successive frames, which holds true for microorganisms. We assume statistically independent voxels that follow a joint bivariate Gaussian distribution, N(µ_α, µ_φ, σ_α, σ_φ, ρ_αφ), for magnitude (α) and phase (φ). This data model has been shown to work well for holographic microscopy data [8].

In a Bayesian model, the distinguishing factor between the target and background is their statistical properties. In essence, each voxel can be associated with a number representing its likelihood of belonging to the target or to the background. This assignment is accomplished by observing the voxels in the spatial proximity of the voxel of interest through a probabilistic likelihood formulation. We show in Appendix A that by assuming statistically independent voxels in the target and background regions with distinct distributions P_tar and P_bck, respectively, one can write the likelihood function of the voxel located at r0=(x0,y0,z0) as:

P(r0) = Π_{r∈κ} P_tar[V^(t+1)(r) | V^(t), T^(t)]^{w(r)} · P_bck[V^(t+1)(r) | V^(t), T^(t)]^{1−w(r)},   (1)

where κ denotes a spherical neighborhood region around r0 with diameter d, although a cubical or any other symmetric region would be equally appropriate [15, 17, 18]. Here, w(·) is a binary volume mask which is 1 for target voxels and 0 elsewhere. Using Eq. (1) and including the target surface prior as in Eq. (A2), we can define the energy function at point r0 as the negative log-likelihood of the local spherical subregion:

E(r0) = −log{P(r0)} = −Σ_{r∈κ} w(r) log P_tar(r) − Σ_{r∈κ} (1−w(r)) log P_bck(r) − log P(T^(t+1)),   (2)

Appendix A shows how this expression can be evaluated from given reconstruction data.

4. Level set surface evolution

The target surface can be modeled in the level set framework [20], which is based on an Eulerian, initial value partial differential equation formulation. Essentially, instead of explicitly treating a surface as a finite set of points, the interface is embedded into a higher dimensional function, the level set φ: ℜ³→ℜ. Taking Γ(t) to be the target surface (a balloon) at time instant t, the embedding level set is defined as φ(p,t)=±q, where p denotes a point in ℜ³ and q is the shortest Euclidean distance of p from the interface Γ(t). The generalized Jordan theorem states that a non-self-intersecting closed interface splits the space into exactly two disjoint subspaces, one bounded and one unbounded. The plus and minus signs are used for the background (exterior) and target (interior) regions, respectively. In this formulation, the surface of interest can be explicitly written as the iso-surface of the level set, Γ(t=t0) = {p | φ(p,t=t0)=0}.
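As a concrete illustration of this embedding (grid size and radius are arbitrary choices for this sketch, not values from the experiments), a signed distance level set for a spherical balloon can be built on a voxel grid:

```python
import numpy as np

# Signed distance level set for a spherical "balloon" of radius r centered
# at c on a 64^3 grid: phi < 0 inside the target, phi > 0 in the background,
# and the zero iso-surface is the interface Gamma.
z, y, x = np.mgrid[0:64, 0:64, 0:64]
c, r = (32, 32, 32), 15.0
phi = np.sqrt((x - c[0])**2 + (y - c[1])**2 + (z - c[2])**2) - r
inside = phi < 0                     # the target region T
print(inside.sum())                  # roughly (4/3) * pi * r^3 voxels
```

The interface never has to be stored explicitly; any operation on the target region reduces to thresholding phi.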

The choice of level sets for modeling object boundaries is primarily inspired by the similarities between the movement of such organisms and level set evolution. On one hand, due to the highly sophisticated interaction of minute organisms with their environment, the shape, volume, orientation and position of a microorganism can change dramatically over time. On the other hand, the level set formulation provides a framework in which a 3D boundary can be embedded in a flexible balloon with simple evolution equations [17, 18], which impose no restrictions on the shape, size or topology of the target boundary unless we choose to impose some, e.g. smoothness. In addition, level sets easily handle targets that split or merge, which can be a challenge for explicit boundary representations such as polygons, Fourier descriptors or splines. Ref. [8] applies a stochastic optimization on a single plane for segmentation purposes. With the level set approach, however, nothing stochastic is involved: with proper parameters, each point on the contour is guaranteed to move closer to the actual target boundary at each step.

It can be shown [20] that if each point propagates normal to the interface (n⃗=∇φ/|∇φ|), then the evolution of Γ can be modeled by a partial differential Hamilton-Jacobi equation, ∂φ(p,t)/∂t = F(p)|∇φ|, where F(p) is the speed function. In discrete space-time, the continuous operators can be replaced with their discrete counterparts, yielding the update rule for the level set:

φ(p, t+1) = φ(p, t) + F(p) |∇_ijk φ|,   (3)

where i, j and k are indices spanning the 3D spatial grid. Tracking the target subspace in the 3D hologram reconstruction necessitates minimization of the energy functional E(r) in Eq. (2). Thus it is required to compute the derivatives of E(r) with respect to r=(x,y,z). The corresponding Euler-Lagrange equations from the calculus of variations result in:

∂E/∂r = ( −Σ_{r∈κ} w(r) log P_tar(r) + Σ_{r∈κ} (1−w(r)) log P_bck(r) + log P(T^(t+1)) ) n⃗,   (4)

where n⃗ is the outward normal to the target surface. The term log P(T^(t+1)) acts as the a priori for the target subspace. We implement the a priori, namely the smoothness of the target surface, by setting it proportional to the divergence of the normal vector to the target subspace boundary:

−log P(T^(t+1)) = ∇·n⃗ = ∇·( ∇φ(p,t) / |∇φ(p,t)| ),   (5)

This term essentially prevents sharp corners (with a high divergence of the normal vector) from becoming sharper, while it becomes negligible where the surface is almost flat. Thus, the resulting speed function for the level set update equation in Eq. (3) becomes:

F(p) = Σ_{r∈κ} w(r) log P_tar(r) − Σ_{r∈κ} (1−w(r)) log P_bck(r) + ∇·( ∇φ(p,t) / |∇φ(p,t)| ).   (6)

The speed function at each point drives the interface outward (in the direction of n⃗) if the target likelihood of the neighboring sphere κ is larger than the background likelihood (positive F(p)), and vice versa. Also, wherever the initial surface coincides with the actual target surface, the speed function is 0 and the iso-surface remains unchanged.
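A single evolution step can be sketched as follows. This is a simplified illustration under several assumptions that are ours, not the authors': a box-shaped neighborhood replaces the spherical κ for speed, φ is taken negative inside the target, the data term's sign is chosen to match that orientation (so a higher target likelihood lowers φ and grows the region), and the function name evolve_step and its default parameters are hypothetical.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def evolve_step(phi, log_ptar, log_pbck, d=5, dt=0.2, mu=1.0):
    """One discrete evolution step of the level set (phi < 0 inside the
    target).  The data speed compares summed target vs. background
    log-likelihoods over a local box neighborhood of width d; the term
    mu * div(grad phi / |grad phi|) smooths the surface (curvature flow)."""
    w = (phi < 0).astype(float)                      # binary target mask
    n = d ** 3                                        # neighborhood voxel count
    data = n * (uniform_filter(w * log_ptar, size=d)
                - uniform_filter((1.0 - w) * log_pbck, size=d))
    g = np.gradient(phi)                              # [d/dz, d/dy, d/dx]
    mag = np.sqrt(sum(gi ** 2 for gi in g)) + 1e-12
    curv = sum(np.gradient(g[i] / mag, axis=i) for i in range(3))
    # phi decreases (target grows) where the data term favors the target;
    # positive curvature raises phi, shrinking sharp convex bumps.
    return phi + dt * (mu * curv - data) * mag

# Example: with neutral likelihoods, only the curvature term acts and a
# spherical balloon slowly shrinks (smoothing behavior).
z, y, x = np.mgrid[0:24, 0:24, 0:24]
phi = np.sqrt((x - 12.0)**2 + (y - 12.0)**2 + (z - 12.0)**2) - 8.0
phi_next = evolve_step(phi, np.zeros_like(phi), np.zeros_like(phi))
```

In practice the step would be iterated at each frame until the speed function vanishes along the interface.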

5. Experimental results

We present experimental results for the proposed tracking algorithm. Seawater algae are photosynthetic organisms that occur as single eukaryotic cells, filaments and colonies, and are among the important species in many aquatic ecosystems. Among them, freshwater diatom algae populations are indicators of pollution, water temperature, nutrient levels and salinity, among other things. They can also swim using their flagellar machinery. In order to simulate the motion of a microorganism, a hologram of a diatom algae specimen is recorded by SEOL digital holography as described in section 2. The Argon laser operates at 514.5 nm; the CCD sensor has 2048×2048 pixels at 9 µm pitch, and the specimen is sandwiched between two 0.17 mm cover slips. The size of a diatom alga is approximately 30–60 µm. The diffracted field is magnified by a 100x/0.8NA objective. One of the diatoms in the field of view is segmented automatically in the original hologram by the bivariate snake algorithm [8]. This segment is then both translated and rotated over the background in small steps to produce a set of holograms resembling the realistic swimming of the alga in seawater. We move the microorganism in 21 steps, corresponding to the number of recorded holograms during motion.

Each recorded hologram, H^(t)(x,y), is reconstructed in 16 planes, all in the vicinity of the target and separated by Δz=1 µm, to produce the complex volumetric image V^(t)(x,y,z). Eq. (6) is calculated at each frame and used in the level set update rule, Eq. (3), to obtain a new level set. For evaluating voxel likelihoods in Eq. (6), a spherical neighborhood with d=20 voxels is used. The movie in Fig. 3 shows the magnitude of the focused plane in the reconstruction space, i.e. |V^(t)(x,y,z0)|. A slice of the recovered target surface is marked with a white line. The interface conforms to the target as it evolves. This technique simultaneously tracks and segments the target object, which is beneficial for automatic recognition purposes [6-8]. Unlike Ref. [8], it utilizes a stack of reconstructions for tracking; thus, even if the object of interest moves out of the best focus plane, the other reconstruction planes provide the necessary data for tracking. In addition, it has been shown that Bayesian techniques do not require exact edge information [8], so slightly out of focus organisms, which appear diffracted in the reconstruction plane, can also be tracked with reasonable accuracy.

 

Fig. 3. (1.91 MB) A sequence of cross sections of the reconstructed volume of a diatom alga, contoured by the iso-surface of the proposed 3D tracker. [Media 2]


Most tracking methods are designed for 1D signals or 2D real-valued images, whereas our approach is developed for 3D holographic data. The results are demonstrated with holographic microscopy and real microorganisms, for which the actual boundary information, the analogue of ground truth data in 1D tracking, is not available. Thus, it is difficult to quantitatively evaluate, compare or interpret comparisons with 1D or 2D methods unless the position of the microorganisms is known. Holographic techniques can provide sub-micron accuracy in imaging and reconstruction of objects, which is the fundamental limit of the proposed 3D volume tracking system. With advances in coherent light sources and image sensor technologies, holographic imaging resolution continues to improve.

6. Conclusions

We presented a new application for tracking microorganisms in holographic microscopy images. The statistics of the target versus the background are used to evolve the target 3D surface. Using a bivariate Gaussian distribution to model the volume data, we incorporate the correlation of the wavefront's amplitude and phase to devise a more accurate tracking solution. We showed that an implicit level set formulation of the object surface, integrated with Bayesian evolution rules, provides a promising technique for tracking objects in 3D images that can be further investigated in terms of computational cost, robustness and accuracy. We wish to thank Inkyu Moon for help with the optical experiments. This work has been supported by DARPA.

Appendix A.

It is possible to find the probabilistically optimal estimate of the new target subspace, T̂^(t+1), by maximizing the posterior probability P(T^(t+1) | V^(t+1), V^(t), T^(t)). Therefore the MAP estimator is:

T̂^(t+1) = argmax_{T⊂ℜ³} { P(T^(t+1) | V^(t+1), V^(t), T^(t)) }.   (A1)

Using Bayes' rule, the posterior probability can be written in terms of the likelihood function and the prior probability:

T̂^(t+1) = argmax_{T⊂ℜ³} { P(V^(t+1) | V^(t), T^(t), T^(t+1)) × P(T^(t+1) | V^(t), T^(t)) }.   (A2)

Two terms are involved on the right hand side of (A2). The first term exploits the given knowledge of the target subspace at times t and t+1 to evaluate how well the observed volume at time t+1 fits. The second term can be thought of as a prior on the possible transformations that evolve the target subspace from frame t to t+1. Since we only assume that the target exhibits smooth surfaces and that its topology is independent of previous frames, the prior term reduces to P(T^(t+1)). This is a realistic assumption for biological microorganisms.

By assuming a bivariate Gaussian distribution of magnitude and phase for each voxel, the first two terms in Eq. (6) can be evaluated numerically from the voxel data as [8]:

−Σ_{r∈κ} w_ℜ(r) log P_ℜ(r) = N_ℜ(w) log(2π σ̂_φ σ̂_α √(1−ρ̂²)) + (1/(2σ̂_φ²)) Σ_{r∈κ} (φ(r)−μ̂_φ)² + (1/(2σ̂_α²(1−ρ̂²))) Σ_{r∈κ} ( α(r) − μ̂_α − ρ̂ σ̂_α (φ(r)−μ̂_φ)/σ̂_φ )²,   (A3)

in which ℜ ∈ {tar, bck}; voxels inside κ are marked as belonging to the target or background region by the binary window w(·). Thus, N_ℜ(w) denotes the number of voxels inside κ that belong to region ℜ. Note that in Ref. [8] the likelihood is calculated for only two regions, interior and exterior, because the initial contour shape is arbitrarily chosen. In the present work, however, the likelihoods are calculated over local subspaces, κ, throughout the entire volume. The parameters defining the bivariate distribution are the sample mean µ̂, variance σ̂², and correlation ρ̂, calculated from the final target and background subspaces in the previous frame.
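For illustration, Eq. (A3) can be evaluated directly. The sketch below (the function name neg_loglik and the sample parameters are hypothetical, not from the experiments) uses the same factorization as the equation: the marginal of the phase times the conditional of the magnitude given the phase.

```python
import numpy as np

def neg_loglik(alpha, phi, mu_a, mu_p, sig_a, sig_p, rho):
    """Negative log-likelihood of magnitude (alpha) and phase (phi) samples
    under a bivariate Gaussian with correlation rho, in the factorized form
    of Eq. (A3): phase marginal term plus magnitude-given-phase term."""
    n = alpha.size
    resid = alpha - mu_a - rho * sig_a * (phi - mu_p) / sig_p
    return (n * np.log(2 * np.pi * sig_p * sig_a * np.sqrt(1 - rho ** 2))
            + np.sum((phi - mu_p) ** 2) / (2 * sig_p ** 2)
            + np.sum(resid ** 2) / (2 * sig_a ** 2 * (1 - rho ** 2)))

# Hypothetical voxel samples and parameter estimates:
rng = np.random.default_rng(0)
alpha = rng.normal(1.0, 0.2, 500)
phi = rng.normal(0.5, 0.1, 500)
nll = neg_loglik(alpha, phi, 1.0, 0.5, 0.2, 0.1, 0.3)
```

In the tracker, this quantity would be computed twice per neighborhood, once with the target parameters and once with the background parameters, to form the two sums in Eq. (6).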

References and links

1. J. W. Goodman, Introduction to Fourier Optics, 3rd ed. (Roberts & Company, Englewood, Colorado, 2005).

2. Y. Frauel, T. J. Naughton, O. Matoba, E. Tajahuerce, and B. Javidi, "Three-dimensional imaging and processing using computational holographic imaging," Proc. IEEE 94, 636–653 (2006).

3. E. Tajahuerce, O. Matoba, and B. Javidi, "Shift-invariant three-dimensional object recognition by means of digital holography," Appl. Opt. 40, 3877–3886 (2001).

4. O. Matoba, T. J. Naughton, Y. Frauel, N. Bertaux, and B. Javidi, "Real-time three-dimensional object reconstruction by use of a phase-encoded digital hologram," Appl. Opt. 41, 6187–6192 (2002).

5. T. Nomura, S. Murata, E. Nitanai, and T. Numata, "Phase-shifting digital holography with a phase difference between orthogonal polarizations," Appl. Opt. 45, 4873–4877 (2006).

6. B. Javidi, S. Yeom, I. Moon, and M. Daneshpanah, "Real-time automated 3D sensing, detection, and recognition of dynamic biological micro-organic events," Opt. Express 14, 3806–3829 (2006).

7. A. Stern and B. Javidi, "Theoretical analysis of three-dimensional imaging and recognition of microorganisms with a single-exposure on-line holographic microscope," J. Opt. Soc. Am. A 24, 163–168 (2007).

8. M. DaneshPanah and B. Javidi, "Segmentation of 3D holographic images using bivariate jointly distributed region snake," Opt. Express 14, 5143–5153 (2006).

9. P. Ferraro, G. Coppola, S. De Nicola, A. Finizio, and G. Pierattini, "Digital holographic microscope with automatic focus tracking by detecting sample displacement in real time," Opt. Lett. 28, 1257–1259 (2003).

10. F. Sadjadi, ed., Selected Papers on Automatic Target Recognition (SPIE-CDROM, 2000).

11. P. Refregier, Noise Theory and Application to Physics (Springer, 2005).

12. F. Sadjadi and A. Mahalonobis, "Target adaptive polarimetric SAR target discrimination using MACH filters," Appl. Opt. 45, 7365–7374 (2006).

13. H. Kwon and N. M. Nasrabadi, "Kernel RX-algorithm: a nonlinear anomaly detector for hyperspectral imagery," IEEE Trans. Geosci. Remote Sens. 43, 388–397 (2005).

14. A. Mahalonobis, R. Muise, and S. Stanfill, "Performance evaluation of quadratic correlation filters for target detection and discrimination in IR imagery," Appl. Opt. 43, 5198–5205 (2004).

15. A. Blake and A. Yuille, eds., Active Vision (MIT Press, Cambridge, 1992).

16. F. Goudail and P. Refregier, "Optimal target tracking on image sequences with a deterministic background," J. Opt. Soc. Am. A 14, 3197–3207 (1997).

17. S. C. Zhu and A. Yuille, "Region competition: unifying snakes, region growing, and Bayes/MDL for multiband image segmentation," IEEE Trans. Pattern Anal. Mach. Intell. 18, 884–900 (1996).

18. A. Yilmaz, X. Li, and M. Shah, "Contour-based object tracking with occlusion handling in video acquired using mobile cameras," IEEE Trans. Pattern Anal. Mach. Intell. 26, 1531–1536 (2004).

19. M. Kass, A. Witkin, and D. Terzopoulos, "Snakes: active contour models," Proc. ICCV, 259–268 (1987).

20. J. Sethian, Level Set Methods: Evolving Interfaces in Geometry, Fluid Mechanics, Computer Vision and Material Sciences (Cambridge Univ. Press, 1999).
