## Abstract

We present a new phase unwrapping approach, which allows reconstruction of optically thick objects that are optically thin from at least one viewing angle, by considering the information stored in the object phase maps captured from consecutive angles. Our algorithm combines 1-D phase unwrapping in the angular dimension with conventional 2-D phase unwrapping, to achieve unwrapping of the object from the optically thick perspective. We thus obtain quantitative phase imaging of objects that were previously impossible to image in certain viewing angles. To demonstrate our approach, we present both numerical simulation and experimental results for quantitative phase imaging of biological cells.

© 2017 Optical Society of America

## 1. Introduction

Imaging biological cells *in vitro* is useful for both medical diagnosis and biological research. Yet, isolated cells *in vitro* are mostly transparent, and therefore cannot be imaged well via standard bright-field microscopy. Quantitative phase-contrast interferometric microscopy enables the imaging of isolated cells *in vitro*, without the use of cell staining, by measuring how much the light is delayed when passing through the sample [1, 2].

The reconstruction process includes the retrieval of the 2-D quantitative phase map from an interferogram [3], which is an ill-posed problem. While retrieving the initial phase map from an interferogram is a fairly simple process [4, 5], the phase obtained is wrapped, meaning that it is bounded to the range [$0$, $2\pi $), and contains $2\pi $ jumps in spatial locations where the optical path difference (OPD) is larger than the illumination wavelength. For many optically thin objects, where the local spatial gradient in the actual phase map is low enough relative to the resolution of the sensor to prevent aliasing, the phase can be retrieved in a satisfactory manner by various digital algorithms [6–8]. Nevertheless, for optically thick objects, where this requirement is not met, it is impossible to recover the integer multiples of $2\pi $ from the information present in the 2-D wrapped phase map, resulting in an erroneous calculation of the final phase map.

In the field of synthetic aperture radar (SAR) interferometry, where the data usually consists of a stack of time-consecutive interferograms, it was previously suggested to utilize the time dimension in order to perform 1-D phase unwrapping along the time axis rather than 2-D phase unwrapping for each interferogram, to allow proper unwrapping even in cases of high gradients in the phase maps [9]. The concept was later developed to 3-D phase unwrapping in a 3-D space-time domain [10], based on the concept that in 3-D phase unwrapping more structural information is available than in the 2-D case or in the 1-D case, helping achieve a more reliable reconstruction.

In this paper, we suggest, for the first time to our knowledge, to use the angular dimension in order to earn information helpful for phase unwrapping of optically thick objects that cannot be reconstructed by conventional 2-D phase unwrapping algorithms. In this approach, we reconstruct the object phase for the angular view from which it is optically thick by utilizing multiple interferometric projections acquired from consecutive angles in small angular increments, containing at least one viewing angle where the object is optically thin. This is carried out by combining 1-D phase unwrapping in the angular dimension with conventional 2-D phase unwrapping in the image space of the phase maps. In this process, the information retrieved from the thin angular view enables reconstruction of phase maps of optically thick objects that cannot be straightforwardly reconstructed using 2-D phase information alone. Although the presented approach is general for angular phase-map acquisition, it is specifically useful for tomographic phase microscopy [11, 12], where multiple angular phase maps are acquired for 3-D refractive-index map reconstruction, enabling the reconstruction of quantitative phase projections even from the thick angular views.

## 2. Theory and existing approaches

The interferogram recorded by the camera contains the differential phase, as follows:

$$I(i,j) = |E_s|^2 + |E_r|^2 + 2|E_s||E_r|\cos[\varphi(i,j)], \qquad (1)$$

where $E_s$ and $E_r$ are the sample and reference complex waves, respectively, and $\varphi$ is the phase difference between them. As can be seen from Eq. (1), the interferogram does not capture the relative phase directly, but rather its cosine. Unlike the phase, which expresses the relative delay the light has accumulated in traversing the cell sample in relation to the clear medium, and as such is not a bounded entity, the cosine of the phase is a $2\pi$-periodic, bounded function. Thus, retrieving the actual phase might not be a trivial task in locations where it exceeds $2\pi$. The relation between the actual, unwrapped phase map $\psi(i,j)$ and the wrapped phase map $\varphi(i,j)$ is given by:

$$\psi(i,j) = \varphi(i,j) + 2\pi\, m(i,j), \qquad (2)$$

where $m(i,j)$ is an integer, which might be different for every pixel $(i,j)$.
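The wrapped-unwrapped relation above can be sanity-checked numerically; a minimal sketch assuming a wrapping convention of $[0, 2\pi)$:

```python
import numpy as np

rng = np.random.default_rng(0)
psi = rng.uniform(-30, 30, size=(4, 4))   # "actual" unwrapped phase map
phi = np.mod(psi, 2 * np.pi)              # wrapped measurement, in [0, 2*pi)

# Eq. (2): psi = phi + 2*pi*m, with an integer m for every pixel.
m = np.round((psi - phi) / (2 * np.pi))
assert np.allclose(phi + 2 * np.pi * m, psi)
assert np.allclose(m, np.floor(psi / (2 * np.pi)))  # m counts whole periods
```

The unwrapping problem is exactly the recovery of $m(i,j)$ when only $\varphi$ is available.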

In general, conventional phase unwrapping algorithms are unable to resolve the unwrapped phase maps of objects with actual local spatial gradients with absolute values higher than $\pi $ radians. The source of this limitation is the demand for the samples to be uncorrupted by aliasing: according to the Nyquist sampling theorem, the sampling rate must always be at least twice the maximal frequency of the sampled function. In terms of sampling resolution, this means that the sampling distance must always be no longer than half of the smallest period of the sampled function. Since the interferogram contains a cosine function, the phase difference between adjacent sampling points of the actual phase cannot be higher than half of $2\pi $, thus no higher than $\pi $ [6]. As long as this condition is met, the sampling distance is small enough relative to the change in the phase, and there is no aliasing, meaning that the phase data can be fully retrieved by a phase unwrapping algorithm. If this condition is not met, the conventional algorithmic approach cannot retrieve the actual phase profile, as we have no knowledge of how many periods have passed between adjacent pixels. As a result, we might wrongly estimate the phase difference as bounded by $\pi $.
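This aliasing limit can be illustrated with two 1-D phase ramps; a minimal numerical sketch (using a $(-\pi,\pi]$ wrapping convention for convenience):

```python
import numpy as np

# True (unwrapped) 1-D phase ramps with different per-sample gradients.
slow = np.linspace(0.0, 20.0, 200)   # step ~0.1 rad (< pi: no aliasing)
fast = np.linspace(0.0, 20.0, 6)     # step 4 rad    (> pi: aliased)

# Wrapping maps the phase into (-pi, pi].
wrap = lambda psi: np.angle(np.exp(1j * psi))

# Integrating wrapped differences recovers `slow` exactly...
assert np.allclose(np.unwrap(wrap(slow)), slow)
# ...but fails on `fast`, since a >pi step is indistinguishable from its
# 2*pi-shifted alias.
print(np.allclose(np.unwrap(wrap(fast)), fast))  # False
```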

The fundamental principle of 1-D phase unwrapping is based on the assumption that the phase map is smooth enough and the sampling resolution is high enough such that the Nyquist theorem is fulfilled, and thus the actual phase difference between each two locations is within the range of [*-π, π*). Under this assumption, the phase map can be retrieved by moving along the phase contour and integrating the phase differences relative to a known boundary condition [13].
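The integration of wrapped differences [13] takes only a few lines; a sketch equivalent in effect to `numpy.unwrap`:

```python
import numpy as np

def unwrap_1d(phi):
    """Itoh-style 1-D unwrapping: wrap each finite difference into
    (-pi, pi], then cumulatively sum it back onto the first sample,
    which acts as the boundary condition."""
    d_wrapped = np.angle(np.exp(1j * np.diff(phi)))   # wrapped differences
    return phi[0] + np.concatenate(([0.0], np.cumsum(d_wrapped)))

# Smooth ramp whose true steps stay below pi: fully recoverable.
psi = np.linspace(0, 15, 100)
phi = np.angle(np.exp(1j * psi))                      # wrapped measurement
assert np.allclose(unwrap_1d(phi), psi)
assert np.allclose(unwrap_1d(phi), np.unwrap(phi))
```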

In 2-D phase unwrapping, there are many potential paths that can be integrated, and thus it is possible to overcome isolated problematic locations that do not meet this sampling condition, by choosing an integration path that does not stumble upon such points [7,8]. However, if the phase gradient in the transition from the background to the object is steep all around the object, as can occur in the images of optically thick objects, the sampling condition is not met for the entire boundary area. In such conditions, finding a correct path to integrate along may be impossible, resulting in failure of conventional phase unwrapping algorithms. Non-path based algorithms [6] would also inevitably fail in such a scenario, as there is not enough information in a single image in order to solve this inverse problem.

## 3. Proposed phase unwrapping algorithm

In the proposed approach, the phase unwrapping procedure uses information present in the angular dimension retrieved from interferograms taken at consecutive angles, in addition to the conventional 2-D information which is present in the object phase map.

There are two basic conditions that must be satisfied for the proposed algorithm to work properly. The first is the existence of a proper boundary condition, meaning that in addition to the phase map from the optically thick perspective, there is at least one additional phase map of the same imaged object, in which the object is sufficiently optically thin such that its phase map satisfies the sampling theorem, and thus can be properly retrieved using a conventional 2-D phase unwrapping algorithm. The second condition is that the Nyquist sampling theorem is met for corresponding pixels in the angular dimension for the rotation increment $\Delta \theta $.

Mathematically, this problem can be formulated as follows. Neglecting diffraction for simplicity, the 2-D phase map $\psi(x',y';\theta)$ of an object with a refractive-index distribution $n(x,y,z)$ at viewing angle $\theta$ can be written as:

$$\psi(x',y';\theta) = \frac{2\pi}{\lambda}\int_0^h \left[n(x,y,z) - n_0\right] dz', \qquad (3)$$

where $\lambda$ is the illumination wavelength, $n_0$ is the refractive index of the surrounding medium, and $h$ is the maximal thickness of the sample. During angular projection acquisition, we assume that the $(x',z')$ axes are rotated at angle $\theta$ around the $y$ axis, relative to a stationary coordinate system $(x,y,z)$, fulfilling:

$$x' = x\cos\theta - z\sin\theta, \quad y' = y, \quad z' = x\sin\theta + z\cos\theta. \qquad (4)$$

The derivative of the phase with respect to the rotation angle $\theta$ can be computed using the chain rule:

$$\frac{d\psi}{d\theta} = \frac{\partial\psi}{\partial x'}\frac{dx'}{d\theta} + \frac{\partial\psi}{\partial y'}\frac{dy'}{d\theta} + \frac{\partial\psi}{\partial z'}\frac{dz'}{d\theta}. \qquad (5)$$

In practice, the second term equals zero since $y'$ is not a function of $\theta$ [Eq. (4)], and so does the third term since $\psi(x',y';\theta)$ is not a function of $z'$. Altogether, only the first term is left, yielding:

$$\frac{d\psi}{d\theta} = \frac{\partial\psi}{\partial x'}\frac{dx'}{d\theta}. \qquad (6)$$

The first factor on the right-hand side can be calculated from Eq. (3) according to Leibniz's rule to be:

$$\frac{\partial\psi}{\partial x'} = \frac{2\pi}{\lambda}\int_0^h n_{x'}(x',y',z')\,dz', \qquad (7)$$

where $n_{x'}(x',y',z')$ is the derivative of the refractive index of the object relative to the $x'$ coordinate. The second factor can be calculated from Eq. (4) to be:

$$\frac{dx'}{d\theta} = -x\sin\theta - z\cos\theta, \qquad (8)$$

which is exactly $-z'$, according to Eq. (4).

From Eqs. (4), (6), (7), and (8), we then obtain the angular smoothness term, as follows:

$$\frac{d\psi}{d\theta}(x',y';\theta) = -z'\,\frac{2\pi}{\lambda}\int_0^h n_{x'}(x',y',z')\,dz'. \qquad (9)$$

In discrete representation, we can describe this derivative as follows:

$$\frac{d\psi}{d\theta} \approx \frac{\psi(i,j;\theta+\Delta\theta) - \psi(i,j;\theta)}{\Delta\theta}. \qquad (10)$$

The second condition that must be fulfilled for every $\theta$ is:

$$\left|\psi(i,j;\theta+\Delta\theta) - \psi(i,j;\theta)\right| < \pi, \qquad (11)$$

which, by Eq. (10), can be written in terms of the angular derivative as:

$$\left|\frac{d\psi}{d\theta}\right| < \frac{\pi}{\Delta\theta}. \qquad (12)$$

The smaller $\Delta\theta$ is, the larger the smoothness term [Eq. (9)] can be while still allowing the sampling condition to be met (meaning that a less smooth object is allowed). For most biological applications, we can assume that a small rotation increment will not change the phase map drastically, meaning that Eqs. (11) and (12) will be satisfied for a reasonably small $\Delta\theta$. As an example, for a perfect spherical cell, the smoothness term will be zero (since its projection is identical from every viewing angle), and thus for every rotation increment, the condition of Eq. (12) will be satisfied.

Given that these two conditions are satisfied, we can calculate the phase value of a pixel based on its value in a slightly different viewing angle. In other words, we can apply 1-D phase unwrapping in the angular dimension.

As we would still like to utilize the 2-D information stored in each separate phase map, especially for locations where the second condition is not met, the result of the 1-D phase unwrapping is combined with conventional 2-D phase unwrapping to achieve an optimal result. The implementation details of the proposed algorithm are summarized in Fig. 1.

As can be seen in Fig. 1, the input of the algorithm is a set of *N* wrapped phase maps, where at least one phase map can be properly unwrapped using a conventional 2-D phase unwrapping algorithm. This map, termed the optically-thin phase map, is selected based on the geometry of the object. In this paper, this is done manually.

In step 1 of the algorithm, conventional 2-D phase unwrapping should be applied to each phase map independently. For this aim, the algorithm proposed by Herráez et al. [8], based on sorting by reliability following a noncontinuous path, has proven to be especially effective, giving better results than many competing algorithms even in highly aliased data.

In step 2, we use the conventionally unwrapped phase maps (even if the conventional unwrapping has practically failed) to create binary masks indicating the area of the object in each of the phase maps. This is valid since even in cases of highly aliased data, the borders of the object are usually distinct from the background, even if not accurate in value, allowing a good estimation of the object area. Creating these binary masks can be done using a threshold separating the object from the background, followed by morphological closing, used to fill empty spots caused by improper unwrapping, and morphological opening, used to remove noise that remains in the binary mask after thresholding [14]. Creating these binary masks is needed both for image registration and for step 6, elaborated below. As the 1-D phase unwrapping in the angular dimension is performed on matching pixels in phase maps acquired from different angles, special attention must be paid to the definition of matching pixels in the registration process. In our case, proper registration between phase maps captured from different angles occurs when the center of mass coincides with the center of the image. The binary masks can thus be used to find the center of mass of every phase map and re-center each of the maps. This registration step would also be useful for tomographic reconstruction and 3-D refractive-index retrieval, if applied later.
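Step 2 can be sketched with standard morphology and center-of-mass tools; the threshold value and integer re-centering below are illustrative placeholders, not the paper's tuned parameters:

```python
import numpy as np
from scipy import ndimage

def make_mask_and_recenter(phase_map, threshold=0.5):
    """Step 2 sketch: threshold the (possibly badly unwrapped) phase map,
    clean the mask morphologically, then shift the map so the mask's
    center of mass sits at the image center."""
    mask = phase_map > threshold
    mask = ndimage.binary_closing(mask)   # fill holes from improper unwrapping
    mask = ndimage.binary_opening(mask)   # remove isolated noise pixels
    cy, cx = ndimage.center_of_mass(mask)
    h, w = phase_map.shape
    dy, dx = int(round(h / 2 - cy)), int(round(w / 2 - cx))
    # Integer re-centering via roll; subpixel shifts would use ndimage.shift.
    recenter = lambda a: np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return recenter(phase_map), recenter(mask)

# Off-center square "object" of phase 3 rad on a zero background:
pm = np.zeros((32, 32)); pm[2:10, 2:10] = 3.0
centered, mask = make_mask_and_recenter(pm)
cy, cx = ndimage.center_of_mass(mask)
assert abs(cy - 16) <= 1 and abs(cx - 16) <= 1
```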

In step 3, the phase maps are arranged in an array, such that the first element will be the optically-thin unwrapped phase map (the phase map that can be properly unwrapped conventionally, acting as a boundary condition). Following it, the other wrapped phase maps taken at consecutive angles are ordered, reaching up to the optically-thick angles (in which the object is optically thick and regular unwrapping algorithms fail). Each of the *N* phase maps in this array should already be properly registered, with the assistance of the binary masks created in step 2. The output of step 1 and the binary masks themselves should also be centered accordingly.

Steps 4-6 are then performed iteratively, for *k* = 2…*N*. In step 4, we apply 1-D phase unwrapping for each (*i*, *j*)'th pixel of the *k*'th matrix in the array, using the matching (*i*, *j*)'th pixel of the previous (*k*‒1)'th matrix, by integrating wrapped differences [13].

During the 1-D phase unwrapping, every value relies on the value from the previous phase map being correct; it is therefore useful to also utilize the 2-D information present in each of the phase maps. Since it is possible that for certain locations the sampling condition will be satisfied in the *x*-*y* plane but not in the angular dimension, step 5 selects, for each pixel, the maximal phase value among the results returned by the 1-D angular phase unwrapping and the conventional 2-D phase unwrapping. This is valid assuming that in the case of under-sampling in the angular dimension, the phase will be underestimated when integrating from the optically-thin angle to the optically-thick angle, and that in the case of under-sampling in the image dimension, the phase will also be underestimated. Hence, choosing the maximal value selects the un-aliased dimension.

In step 6, before finalizing the result and unwrapping the next phase map in the array, we ensure that the boundaries of the object remain intact: the current unwrapped phase profile is multiplied pixel-wise by its respective binary mask created in step 2. It is also useful to apply a Gaussian low-pass filter, to compensate for small imperfections in registration. Once this is done, the resultant unwrapped phase map is positioned in the phase-map array, replacing the respective wrapped phase map. Finally, upon replacing all wrapped phase maps with their unwrapped versions, we obtain the final unwrapped stack of phase maps from all perspectives, including the optically-thick ones.
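Steps 4-6 can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 2-D unwrapper here is a naive row-then-column pass standing in for the reliability-sorting algorithm [8], and the Gaussian smoothing of step 6 is omitted:

```python
import numpy as np

def wrap(p):
    return np.angle(np.exp(1j * p))  # map into (-pi, pi]

def unwrap2d_simple(phi):
    """Placeholder 2-D unwrapper (unwrap columns, then rows)."""
    return np.unwrap(np.unwrap(phi, axis=0), axis=1)

def angular_unwrap(wrapped_stack, masks, psi_thin):
    """Steps 4-6 sketch. wrapped_stack[k]: wrapped map at the k-th angle,
    ordered from optically thin to optically thick; masks[k]: binary object
    masks from step 2; psi_thin: properly unwrapped optically-thin map
    (the boundary condition)."""
    out = [psi_thin * masks[0]]
    for k in range(1, len(wrapped_stack)):
        prev = out[-1]
        # Step 4: 1-D unwrapping in the angular dimension, per pixel.
        psi_1d = prev + wrap(wrapped_stack[k] - wrap(prev))
        # Step 5: take the pixel-wise max against the 2-D result.
        psi = np.maximum(psi_1d, unwrap2d_simple(wrapped_stack[k]))
        # Step 6: keep object boundaries intact via the binary mask.
        out.append(psi * masks[k])
    return np.stack(out)
```

For a synthetic stack whose phase grows smoothly with angle far beyond $2\pi$, `angular_unwrap` recovers the high phase values that the per-map 2-D pass alone underestimates.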

## 4. Results

In order to prove the validity of the proposed algorithm, we tested the algorithm on both simulated and experimental data. In both cases, the reconstruction process up to the phase unwrapping stage was performed using a conventional Fourier-space filtering algorithm that crops one of the cross-correlation terms [4], and the conventional 2-D phase unwrapping (step 1) was performed using the method suggested by Herráez et al. [8]. For comparison, we also implemented the 3-D version of Herráez's phase unwrapping algorithm [15].

#### 4.1 Simulation results

To demonstrate the superior performance of the algorithm for optically thick objects, we first conducted a numerical simulation. The input was a given refractive-index distribution of a 3-D test target imitating a yeast cell in suspension during reproduction by budding (cell division). The simulation imitated a cell rotated around its center of mass with an angular increment of 5° from 0° to 180°.

Figure 2(a) shows a number of 256×256-pixel phase maps taken from different angles. Figures 2(b)-2(d) show the respective reconstructed phase maps using the conventional 2-D phase unwrapping based on sorting by reliability following a noncontinuous path [8], applied to each phase map separately [Fig. 2(b)] in comparison to using 3-D phase unwrapping based on sorting by reliability following a noncontinuous path [15], applied to the entire set of phase maps [Fig. 2(c)], and to using the proposed angular phase unwrapping approach [Fig. 2(d)].

As shown in Fig. 2, the conventional 2-D phase unwrapping algorithm [Fig. 2(b)] succeeded for the optically thin angles (0°, 50°, and 70°), but suffered from a substantial underestimation of the phase delay in the optically thick angular projections shown (85°, 100°, and 105°). The 3-D phase unwrapping algorithm [Fig. 2(c)] was able to recover high phase values, but still failed in specific spatial locations. On the other hand, the new angular phase unwrapping algorithm [Fig. 2(d)] yields an estimation that is much closer to the actual phase map [Fig. 2(a)].

The reason for the failure of the 2-D phase unwrapping algorithm in the optically thick angles is the lack of information in isolated phase maps due to aliasing. For the 3-D phase unwrapping algorithm, if one dimension of the volume has a particularly high gradient, then the unwrapping including that dimension is prone to error [16], unlike the new algorithm, which separates the 2-D image dimensions from the angular dimension, thus allowing for reliable phase unwrapping in one direction even if the other direction is aliased. Even though the information from all phase maps is considered in the 3-D algorithm and the phase unwrapping path is based on reliability, it lacks the external geometric input required to choose the correct integration path, and can only rely on estimating reliability from the data itself, causing optically thin phase maps to be corrupted instead of utilizing the correct phase maps to resolve the problematic ones. We can see, for example, that for 0° and 50°, which are far enough from the problematic angles, the reconstruction result is similar to the one yielded by the 2-D phase unwrapping algorithm; but closer to the optically thick angles affected by aliasing (70°), the quality of the unwrapping degrades. This influence from other phase maps is what helps this algorithm yield better results than its 2-D counterpart for the optically thick angles (85°, 100°, and 105°), though aliasing still remains at the edges. On the other hand, in the new angular phase unwrapping approach, the reliable phase map is chosen based on a geometrical understanding of the object, adding external information that assists in properly unwrapping optically thick phase maps without corrupting un-aliased phase maps.

For the above set of 37 256×256-pixel phase maps, the run time of the 2-D phase unwrapping was 571.9 sec, running on Matlab R2013b on a desktop computer with an Intel Core i7-4790 3.60 GHz CPU and 16.0 GB of RAM (64 bit). On the same computer, the 3‑D phase unwrapping run time was 26128.6 sec, and the new angular phase unwrapping algorithm run time was 572.3 sec (including the 2-D phase unwrapping run time and two runs of the 1-D phase unwrapping algorithm: one from 0° to 90° and one from 180° to 90°).

In terms of computational complexity, for *L* phase maps of size *M* × *N*, the asymptotic complexity of both the 2-D phase unwrapping algorithm and the new angular phase unwrapping algorithm approaches *O*(*L·M·N·log*(*M·N*)). On the other hand, the asymptotic complexity of the 3-D phase unwrapping algorithm approaches *O*(*L·M·N·log*(*L·M·N*)), signifying a major advantage in computational complexity for the new algorithm.

#### 4.2 Experimental setup

In the experiment, we acquired angular off-axis interferometric projections of yeast cells (*Saccharomyces cerevisiae*, longitudinal diameter range of 5–10 μm) in suspension during their reproduction by budding. The cells were rotated around their center of mass with an angular increment of 5°, ranging from 0° to 180°, while being trapped and manipulated by holographic optical tweezers [17]. The interferometric acquisition was done by an external off-axis interferometric module [18]. A simplified scheme of the experimental setup is shown in Fig. 3. In this setup, light from a diode-pumped laser is reflected to the sample by mirror M1 and then magnified by a 60× oil-immersion microscope objective. The enlarged image is projected by tube lens TL and mirror M2 onto the exit of the microscope, where an external off-axis interferometric module is positioned [1,18]. In this module, the magnified sample beam is split using beam splitter BS. One of the beams is spatially filtered using lenses L1 and L2 and pinhole P, which creates a reference beam that does not contain spatial modulation from the sample. The reference beam is back-reflected by mirror M3. The other beam from BS is projected through retro-reflector RR at a small angle, and together with the reference beam creates an off-axis interferogram on the camera. Off-axis image interferograms were acquired while rotating the trapped cell by the holographic optical tweezers [17].

#### 4.3 Experimental results

Figure 4 shows selected phase maps from the set of 256×256-pixel phase maps taken at an angular increment of 5° across a range of 180° using the optical setup shown in Fig. 3, either reconstructed individually using the conventional 2-D phase unwrapping [Fig. 4(a)] or using the proposed angular phase unwrapping approach [Fig. 4(b)]. For some of the rotation angles, both budding cells are visible and separated; yet, for others one cell hides the second one, such that they appear to be one thick cell, creating the necessity for the algorithm due to the increased optical thickness of the sample.

As can be seen in Fig. 4, the conventional 2-D phase unwrapping algorithm [Fig. 4(a)] yields unsatisfactory results for the reconstruction of the optically-thick phase maps, exhibiting a distinct underestimation of the phase, while the new algorithm yields higher phase values [Fig. 4(b)], in agreement with the simulation results.

## 5. Discussion and conclusions

The new approach for angular phase unwrapping presented in this paper enables the phase unwrapping of optically thick objects, using the projections of the object from other perspectives, assuming that they meet two basic criteria. The first criterion is that for at least one viewing angle the object is optically thin, in the sense that the local spatial gradient does not exceed an absolute value of $\pi $ radians for the actual phase map acquired at this angle. The second requirement is that the phase gradient between most corresponding pixels in consecutive acquisition angles does not exceed an absolute value of $\pi $ radians, a requirement that can be achieved for most biological samples, provided that the angular increments are small enough. In fact, by utilizing the 1-D phase unwrapping for the angular dimension, instead of the conventional 2-D phase unwrapping approach separately on each phase projection, we transferred the $\pi $ radians gradient limitation from the *x*-*y* plane of the phase map, where we cannot control the gradient for a given image (other than increasing the sampling resolution), to the angular dimension, where we can control the gradient by rotating the object in smaller increments. As it is possible that for certain locations, the gradient will be sufficiently low in the *x-y* plane but not in the angular dimension, we choose the value of each pixel as the maximum result yielded by applying either 1-D phase unwrapping or 2-D phase unwrapping, a valid choice as long as we move from the optically-thin angle to the optically-thick angle.

Note that ordering the phase maps in an array going from the optically-thin to optically-thick phase maps is crucial for the success of the procedure, since when moving in this direction, the area taken by the object decreases, such that pixels that used to belong to the object now become background pixels. Such pixels are at high risk of not meeting the sampling condition, since they experience the very same steep phase jump that made the conventional 2-D phase unwrapping fail. However, as we multiply the resultant unwrapped phase map by a binary mask zeroing the background pixels, the result is not damaged by the presence of aliasing in those pixels. Nevertheless, if we were to go in the opposite direction, from the optically-thick phase map to optically-thin phase maps, background pixels would turn into object pixels, creating aliasing that cannot be solved.

As this algorithm works on corresponding pixels in different angular phase projections, it requires proper registration between the different phase maps, as elaborated in Section 3. It also requires acquisition over a continuous angular range from the optically-thin angle to the optically-thick angle in small angular increments, to allow continuity for the 1-D phase unwrapping in the angular dimension.

To save computation time, it may be possible to perform edge detection on the wrapped phase maps to detect the object, and omit the background pixels for the unwrapping process.

Note that while this algorithm is most compatible with tomographic phase microscopy, where phase projections are already acquired from multiple angles for 3-D refractive-index reconstruction, it can be implemented for standard phase imaging as well, by capturing phase maps in small angular increments over an arc going from the optically-thin to optically-thick phase map, and it does not necessitate the coverage of the entire angular range, as is needed for tomography, as long as the conditions elaborated above are met.

The algorithm suggested here allows quantitative phase imaging of thick objects that were previously impossible to properly reconstruct, while still maintaining low computational complexity in comparison to 3-D phase unwrapping methods [15]. We expect this algorithm to be useful for imaging a variety of non-spherical organisms (such as *Euglena gracilis* and *Blepharisma*) or aggregations of several cells in suspension, which induce thick dimensions in certain perspectives. In addition, this algorithm is useful for quantitative phase imaging of fixed cells, since in this case the cells might be spread out and flat, and thus the thickness of the sample is significantly higher when the viewing angle is not perpendicular to the slide.

## Funding

Horizon 2020 European Research Council (ERC) Grant No. 678316.

## Acknowledgments

We thank Dr. Itay Barnea from the OMNI group in Tel Aviv University for useful biological discussions.

## References and links

**1. **N. T. Shaked, “Quantitative phase microscopy of biological samples using a portable interferometer,” Opt. Lett. **37**(11), 2016–2018 (2012). [CrossRef] [PubMed]

**2. **P. Girshovitz and N. T. Shaked, “Generalized cell morphological parameters based on interferometric phase microscopy and their application to cell life cycle characterization,” Biomed. Opt. Express **3**(8), 1757–1773 (2012). [CrossRef] [PubMed]

**3. **G. Dardikman, M. Habaza, L. Waller, and N. T. Shaked, “Video-rate processing in tomographic phase microscopy of biological cells using CUDA,” Opt. Express **24**(11), 11839–11854 (2016). [CrossRef] [PubMed]

**4. **P. Girshovitz and N. T. Shaked, “Real-time quantitative phase reconstruction in off-axis digital holography using multiplexing,” Opt. Lett. **39**(8), 2262–2265 (2014). [CrossRef] [PubMed]

**5. **P. Girshovitz and N. T. Shaked, “Fast phase processing in off-axis holography using multiplexing with complex encoding and live-cell fluctuation map calculation in real-time,” Opt. Express **23**(7), 8773–8787 (2015). [CrossRef] [PubMed]

**6. **D. C. Ghiglia and M. D. Pritt, *Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software* (Wiley, 1998).

**7. **R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry: two dimensional phase unwrapping,” Radio Sci. **23**(4), 713–720 (1988). [CrossRef]

**8. **M. A. Herráez, D. R. Burton, M. J. Lalor, and M. A. Gdeisat, “Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path,” Appl. Opt. **41**(35), 7437–7444 (2002). [CrossRef] [PubMed]

**9. **J. M. Huntley and H. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt. **32**(17), 3074 (1993). [CrossRef] [PubMed]

**10. **M. Costantini, F. Malvarosa, F. Minati, L. Pietranera, and G. Milillo, “A three-dimensional phase unwrapping algorithm for processing of multitemporal SAR interferometric measurements,” in Proceedings of IEEE conference on Geoscience and Remote Sensing Symposium (IEEE, 2002), pp. 1741–1743. [CrossRef]

**11. **W. Choi, C. Fang-Yen, K. Badizadegan, S. Oh, N. Lue, R. R. Dasari, and M. S. Feld, “Tomographic phase microscopy,” Nat. Methods **4**(9), 717–719 (2007). [CrossRef] [PubMed]

**12. **F. Charrière, A. Marian, F. Montfort, J. Kuehn, T. Colomb, E. Cuche, P. Marquet, and C. Depeursinge, “Cell refractive index tomography by digital holographic microscopy,” Opt. Lett. **31**(2), 178–180 (2006). [CrossRef] [PubMed]

**13. **K. Itoh, “Analysis of the phase unwrapping algorithm,” Appl. Opt. **21**(14), 2470 (1982). [CrossRef] [PubMed]

**14. **R. A. Lotufo and E. R. Dougherty, *Hands-on Morphological Image Processing* (SPIE, 2003).

**15. **H. Abdul-Rahman, M. Gdeisat, D. Burton, and M. Lalor, “Fast three-dimensional phase-unwrapping algorithm based on sorting by reliability following a non-continuous path,” in *Optical Metrology, International Society for Optics and Photonics* (2005), pp. 32–40.

**16. **E. Barnhill, P. Kennedy, C. L. Johnson, M. Mada, and N. Roberts, “Real-time 4D phase unwrapping applied to magnetic resonance elastography,” Magn. Reson. Med. **73**(6), 2321–2331 (2015). [CrossRef] [PubMed]

**17. **M. Habaza, B. Gilboa, Y. Roichman, and N. T. Shaked, “Tomographic phase microscopy with 180° rotation of live cells in suspension by holographic optical tweezers,” Opt. Lett. **40**(8), 1881–1884 (2015). [CrossRef] [PubMed]

**18. **P. Girshovitz and N. T. Shaked, “Compact and portable low-coherence interferometer with off-axis geometry for quantitative phase microscopy and nanoscopy,” Opt. Express **21**(5), 5701–5714 (2013). [CrossRef] [PubMed]