## Abstract

This paper presents a method to unwrap phase pixel by pixel by solely using geometric constraints of the structured light system without requiring additional image acquisition or another camera. Specifically, an artificial absolute phase map, Φ_min, at a given *virtual* depth plane *z* = *z_min*, is created from geometric constraints of the calibrated structured light system; the wrapped phase is pixel-by-pixel unwrapped by referring to Φ_min. Since Φ_min is defined in the projector space, the unwrapped phase obtained from this method is absolute for each pixel. Experimental results demonstrate the success of this proposed novel absolute phase unwrapping method.

© 2016 Optical Society of America

## 1. Introduction

Three-dimensional (3D) shape measurement has numerous applications including in-situ quality control in manufacturing and disease diagnoses in medical practices.

Among all 3D shape measurement techniques developed, using phase instead of intensity has the merits of robustness to sensor noise and surface reflectivity variations, as well as the ability to achieve high spatial and/or temporal resolutions [1]. Over the years, numerous phase retrieval methods have been developed, including the Fourier method [2], the windowed Fourier method [3], and phase-shifting methods [4]. Overall, a typical fringe analysis method only provides phase values ranging from −*π* to +*π* with a modulus of 2*π*, and thus a phase unwrapping algorithm has to be employed to obtain the continuous phase map before 3D reconstruction.

Conventionally, there are two types of phase unwrapping methods: spatial phase unwrapping and temporal phase unwrapping. Spatial phase unwrapping detects 2*π* discontinuities from the phase map itself and removes them by adding or subtracting integer multiples *K*(*x*, *y*) of 2*π* accordingly. The integer number *K*(*x*, *y*) is often referred to as the fringe order. The book edited by Ghiglia and Pritt [5] summarizes numerous phase unwrapping algorithms, some faster yet less robust and some more robust yet slower; the review paper by Su and Chen [6] covers a wide range of reliability-guided phase unwrapping algorithms. Regardless of its robustness and speed, a spatial phase unwrapping algorithm typically only generates a *relative phase* map: a phase map that is relative to a point on the phase map itself within a connected component. It is therefore difficult for any spatial phase unwrapping method to be employed when multiple isolated objects are to be simultaneously measured in an *absolute* sense. Furthermore, the majority of spatial phase unwrapping algorithms fail when abrupt surface geometry changes introduce more than 2*π* of phase change from one pixel to its neighboring pixels.
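
As a concrete illustration of the spatial approach (not the method proposed in this paper), the following minimal NumPy sketch unwraps a 1D wrapped phase signal and recovers only *relative* phase:

```python
import numpy as np

# A 1-D illustration of spatial phase unwrapping: np.unwrap scans the signal
# and adds or subtracts multiples of 2*pi wherever neighboring samples jump
# by more than pi. The result is only a *relative* phase map -- it matches the
# true phase up to a constant 2*pi*K offset.
true_phase = np.linspace(0.0, 6.0 * np.pi, 200)   # three fringe periods
wrapped = np.angle(np.exp(1j * true_phase))       # wrap into (-pi, pi]
unwrapped = np.unwrap(wrapped)                    # remove the 2*pi jumps
# Here the first sample of 'wrapped' happens to equal the true phase, so the
# relative offset is zero and 'unwrapped' coincides with 'true_phase'.
```

This also shows why the spatial approach fails for isolated objects: each connected region gets its own unknown 2*π*K offset.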

Temporal phase unwrapping, in contrast, tries to fundamentally eliminate the problems associated with spatial phase unwrapping by acquiring more information. In essence, instead of finding the number of 2*π*, or fringe order *K*(*x*, *y*), to be added to each pixel from the phase values surrounding that pixel, temporal phase unwrapping finds the fringe order *K*(*x*, *y*) by referring to additional captured information, such as more fringe patterns. In other words, temporal phase unwrapping looks for information acquired temporally instead of spatially. Over the years, numerous temporal phase unwrapping methods have been developed, including two- or multi-frequency (or -wavelength) phase-shifting techniques [7–9], gray-coding plus phase-shifting methods [10, 11], a spatial-coding plus phase-shifting method [12], and phase-coding plus phase-shifting methods [13–15]. Temporal phase unwrapping can provide *absolute phase* since the phase is unwrapped by referring to pre-defined information. The aforementioned temporal phase unwrapping methods work well for retrieving absolute phase, yet they require capturing additional images to determine the fringe order *K*(*x*, *y*). Since more images are acquired, temporal phase unwrapping slows down measurement speed, which is undesirable for high-speed applications.
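
To make the two-frequency idea concrete, here is an idealized, noise-free sketch (not the method proposed in this paper) in which a second, single-period phase map supplies the fringe order for the high-frequency phase:

```python
import numpy as np

# Sketch of two-frequency temporal phase unwrapping: phi_unit is a
# single-period phase map (it needs no unwrapping itself) that determines
# the fringe order K of the high-frequency wrapped phase phi_high.
N = 8                                             # fringe periods in the high-frequency pattern
x = np.linspace(0.0, 1.0, 400, endpoint=False)    # normalized pixel coordinate
absolute = 2.0 * np.pi * N * x                    # ground-truth absolute phase
phi_high = np.angle(np.exp(1j * absolute))        # wrapped high-frequency phase
phi_unit = 2.0 * np.pi * x                        # single-period (unit-frequency) phase

# Scaling the unit phase by N predicts the absolute phase; the rounded gap
# to the wrapped phase is the per-pixel fringe order.
K = np.round((N * phi_unit - phi_high) / (2.0 * np.pi))
recovered = phi_high + 2.0 * np.pi * K            # absolute phase, pixel by pixel
```

In practice the unit-frequency phase comes from extra captured patterns, which is precisely the acquisition overhead the paragraph above describes.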

To address the reduced acquisition speed of conventional temporal phase unwrapping approaches, researchers have attempted to add a second camera to the standard single-camera, single-projector structured light system for absolute phase unwrapping [16–19]. Because the second camera captures images from another perspective, stereo geometric constraints and epipolar geometry can be used to determine the fringe order *K*(*x*, *y*) without conventional spatial or temporal phase unwrapping. Furthermore, because the projector projects encoded structured patterns onto the scene, the phase information can be used to ease the stereo matching problem of a traditional dual-camera stereo technique: a point on the left camera is constrained to match points on the right camera with the *same phase* value. Since the wrapped phase map is periodic and contains stripes, the candidate matches on the right camera are not unique. By applying the epipolar geometric constraint of the stereo cameras, the corresponding points are limited to a few points on an epipolar line (only one point per fringe period). Finally, the correct corresponding point can be determined by verifying against the second camera image, the calibration volume, and other techniques. This approach has proven successful for capturing complex geometry in an absolute sense. However, it usually requires global backward and forward checking to select the correct corresponding point out of many candidates. Because global searching is required, its computation speed is slow, and it is difficult to measure objects with sharply changing surface geometries. Furthermore, such a system requires accurately calibrating three sensors (two cameras and one projector), which is usually nontrivial.

To overcome the limitations of approaches that require global backward and forward searching, Lohry et al. [20] developed a method that combines the phase-based approach with conventional stereo matching to speed up the whole process. The method includes two stages: 1) using a stereo matching algorithm to obtain a *coarse disparity* map, avoiding global searching and checking; and 2) using the local wrapped phase to further refine the coarse disparity for higher measurement accuracy. To obtain more accurate disparity maps without increasing the number of images used, Lohry et al. [20] embedded a statistical pattern into the regular fringe pattern. This method does not require any geometric constraint imposed by the projector, and thus no projector calibration is required, further simplifying system development. However, due to the pixel-by-pixel disparity refinement, the processing speed is still limited. In general, it remains difficult for any of these methods to achieve real-time processing without significant hardware-level implementation and optimization. And because of the second camera, they all increase hardware cost and algorithm complexity.

This paper proposes a novel absolute phase unwrapping method that determines absolute phase solely through geometric constraints of the structured light system without requiring another camera, more fringe patterns, or global search. Since no additional images are required, the measurement speed is not compromised for 3D shape measurement; and because no global search is required, the processing speed can be high. In brief, an artificial absolute phase map, Φ_min, at a given depth *z* = *z_min* is created from geometric constraints of the structured light system. The wrapped phase is then unwrapped pixel by pixel by referring to this artificially created phase map Φ_min. Since Φ_min is defined in the projector space, the unwrapped phase obtained from this method is absolute. Experimental results demonstrate the success of this proposed novel absolute phase unwrapping method, despite its limited working depth range.

Section 2 explains the principles of the proposed absolute phase unwrapping method. Section 3 presents experimental results to validate the proposed method and illustrate its limitations. Section 4 discusses the merits and limitations of the proposed absolute phase unwrapping method, and finally, Section 5 summarizes the paper.

## 2. Principle

This section thoroughly explains the principle of the proposed method. Specifically, we will present the three-step phase-shifting algorithm and the standard pinhole camera model, and then detail the proposed pixel-by-pixel absolute phase unwrapping method through theoretical derivations and graphical illustrations.

#### 2.1. Three-step phase-shifting algorithm

Using phase instead of intensity for 3D optical metrology is advantageous since it is more robust to noise and surface reflectivity variations. Over the years, many fringe analysis techniques have been developed to retrieve phase information, including the Fourier method and various phase-shifting methods [4]. Compared to other phase retrieval methods (e.g., Fourier or windowed Fourier), phase-shifting methods have the advantages of measurement accuracy and robustness. Without loss of generality, this research uses a three-step phase-shifting algorithm for phase retrieval as an example to verify the performance of our proposed absolute phase unwrapping algorithm. Three phase-shifted fringe images with equal phase shifts can be mathematically written as

$$I_1(x, y) = I'(x, y) + I''(x, y)\cos(\phi - 2\pi/3),$$
$$I_2(x, y) = I'(x, y) + I''(x, y)\cos(\phi),$$
$$I_3(x, y) = I'(x, y) + I''(x, y)\cos(\phi + 2\pi/3),$$

where *I*′(*x*, *y*) is the average intensity, *I*″(*x*, *y*) is the intensity modulation, and *ϕ*(*x*, *y*) is the phase to be solved for. Solving these three equations simultaneously leads to

$$\phi(x, y) = \tan^{-1}\left[\frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3}\right].$$

This equation provides the wrapped phase ranging from −*π* to +*π* with 2*π* discontinuities. To remove the 2*π* discontinuities, a spatial or temporal phase unwrapping algorithm can be used. Phase unwrapping essentially determines an integer number *K*(*x*, *y*) for each point such that the unwrapped phase can be obtained as

$$\Phi(x, y) = \phi(x, y) + 2\pi \times K(x, y).$$

Here *K*(*x*, *y*) is often referred to as the fringe order. If *K*(*x*, *y*) is pre-defined in an absolute sense (such as that obtained from a temporal phase unwrapping algorithm), the unwrapped phase Φ(*x*, *y*) is the *absolute phase*. A spatial phase unwrapping algorithm typically yields *K*(*x*, *y*) relative to one point on the wrapped phase map, and thus can only generate *relative phase*. Note that throughout this paper we denote Φ(*x*, *y*) as the unwrapped phase of *ϕ*(*x*, *y*).
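
The three-step phase retrieval can be sketched in a few lines of NumPy; `arctan2` implements the arctangent with correct quadrant handling so the result lands in (−π, π]:

```python
import numpy as np

def wrapped_phase(I1, I2, I3):
    """Wrapped phase from three fringe images with -2*pi/3, 0, +2*pi/3 shifts.

    Implements phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3), which keeps
    the result in (-pi, pi] regardless of the signs of numerator/denominator.
    """
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

# Synthetic check: build three phase-shifted images from a known phase map.
phi = np.linspace(-3.0, 3.0, 50)                    # known phase in (-pi, pi)
Iavg, Imod = 0.5, 0.4                               # average intensity, modulation
I1 = Iavg + Imod * np.cos(phi - 2.0 * np.pi / 3.0)
I2 = Iavg + Imod * np.cos(phi)
I3 = Iavg + Imod * np.cos(phi + 2.0 * np.pi / 3.0)
# wrapped_phase(I1, I2, I3) reproduces phi
```

Note that the average intensity and modulation cancel out of the ratio, which is why the method is robust to reflectivity variations.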

Instead of using a conventional temporal phase unwrapping method that obtains the absolute phase map by capturing more fringe images, we propose a new method that obtains the absolute phase map pixel by pixel solely from geometric constraints of the structured light system, without requiring any additional image acquisition or a second camera.

#### 2.2. Structured light system model

We first discuss the modeling of the structured light system since it is critical to understanding how the proposed method uses geometric constraints for pixel-by-pixel absolute phase unwrapping. We use the well-known pinhole model to describe the imaging system. This model essentially describes the projection from 3D world coordinates (*x^w*, *y^w*, *z^w*) to 2D imaging coordinates (*u*, *v*). The linear pinhole model can be mathematically represented as

$$s\begin{bmatrix}u\\v\\1\end{bmatrix} = \begin{bmatrix} f_u & \gamma & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1\end{bmatrix} \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3\end{bmatrix} \begin{bmatrix}x^w\\y^w\\z^w\\1\end{bmatrix},$$

where *r_ij* and *t_i* respectively represent the rotation and the translation from the world coordinate system to the lens coordinate system; *s* is a scaling factor; *f_u* and *f_v* respectively describe the effective focal lengths; *γ* is the skew factor of the *u* and *v* axes; and (*u*_0, *v*_0) is the principal point, the intersection of the optical axis with the imaging plane.

To simplify the mathematical representation, we define the 3 × 4 projection matrix **P** as

$$\mathbf{P} = \begin{bmatrix} f_u & \gamma & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1\end{bmatrix} \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3\end{bmatrix} = \begin{bmatrix} p_{11} & p_{12} & p_{13} & p_{14} \\ p_{21} & p_{22} & p_{23} & p_{24} \\ p_{31} & p_{32} & p_{33} & p_{34}\end{bmatrix}.$$

**P** can be estimated through a well-established camera calibration approach.
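
To make the projection concrete, the following NumPy sketch assembles **P** from intrinsic and extrinsic parameters and projects a world point; all numeric values here are illustrative placeholders, not calibration results:

```python
import numpy as np

# Hypothetical intrinsic parameters (pixels) and trivial extrinsics: the
# world coordinate system is chosen to coincide with the lens coordinate system.
fu, fv, gamma = 800.0, 800.0, 0.0   # effective focal lengths and skew
u0, v0 = 320.0, 240.0               # principal point
A = np.array([[fu, gamma, u0],
              [0.0,  fv,  v0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros((3, 1))  # rotation and translation [R | t]
P = A @ np.hstack([R, t])           # 3x4 projection matrix P = A [R | t]

def project(P, xyz):
    """Project a 3D world point to pixel coordinates (u, v)."""
    uvw = P @ np.append(xyz, 1.0)   # homogeneous projection, s*[u, v, 1]
    return uvw[:2] / uvw[2]         # divide out the scaling factor s

u, v = project(P, [0.1, 0.05, 1.0])  # a point 1 m in front of the lens
```

The same function serves for both camera and projector once their respective **P** matrices are known.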

The same lens model is applicable to the projector since a projector can be treated as the inverse of a camera [21]. If the camera and the projector are calibrated under the same world coordinate system, their projection matrices are physically correlated. For simplicity, we typically coincide the world coordinate system with the camera lens coordinate system or the projector lens coordinate system. Therefore, we have two sets of equations, one for the camera lens and the other for the projector lens:

$$s^c\,[u^c, v^c, 1]^t = \mathbf{P}^c\,[x^w, y^w, z^w, 1]^t,$$
$$s^p\,[u^p, v^p, 1]^t = \mathbf{P}^p\,[x^w, y^w, z^w, 1]^t,$$

where superscript ^p represents the projector, superscript ^c represents the camera, and ^t denotes the transpose of a matrix.

After structured light system calibration, the projection matrices **P**^c and **P**^p are known. The camera and projector equations above provide 6 equations with 7 unknowns (*s^c*, *s^p*, *x^w*, *y^w*, *z^w*, *u^p*, *v^p*) for each camera pixel (*u^c*, *v^c*), so one additional constraint equation is required to solve for all unknowns uniquely. For example, to recover the (*x^w*, *y^w*, *z^w*) coordinates in a 3D shape measurement system, the absolute phase can be used for a phase-shifting method [21]. The absolute phase, Φ(*x*, *y*), essentially creates a one-to-many mapping constraint that maps one point on the camera image plane (*u^c*, *v^c*) to a line, *u^p* or *v^p*, on the projector image plane with exactly the same phase value.

Assume that the fringe patterns vary sinusoidally along the *u^p* direction and remain constant along the *v^p* direction. If the absolute phase Φ is known for a given point, *u^p* can be solved as

$$u^p = \Phi \times T / (2\pi),$$

assuming the absolute phase is 0 at *u^p* = 0 and increases with *u^p*. Here, *T* is the fringe period in pixels.

#### 2.3. Absolute phase unwrapping using minimum phase map

Figure 1 graphically illustrates that, using simple geometric optics and pinhole models of the lenses, the camera sensor plane can be mapped to the projector sensor plane if the object plane is a flat surface precisely placed at *z^w* = *z_min*. Once the mapped region is found on the projector sensor plane, the corresponding phase map can be pre-defined. Therefore, for the virtually defined *z_min* plane, the corresponding phase map Φ_min can be precisely created. In this paper, we propose to use this artificially created phase map Φ_min for absolute phase unwrapping.

Mathematically, for a given camera pixel (*u^c*, *v^c*), if we know the *z^w* value, all seven unknowns, including (*u^p*, *v^p*), can be uniquely solved from the camera and projector equations above. Once (*u^p*, *v^p*) is known, the corresponding absolute phase value for that camera pixel (*u^c*, *v^c*) can be uniquely defined as

$$\Phi(u^c, v^c) = 2\pi \times u^p / T,$$

assuming the fringe period is *T* pixels and the fringe patterns vary sinusoidally along the *u^p* direction.

Therefore, for a virtual measurement plane at *z^w* = *z*_0, one artificial absolute phase map can be defined pixel by pixel. If *z^w* = *z*_0 = *z_min* is the closest depth of interest, we define this artificially created phase map as the minimum phase map Φ_min, which apparently is a function of *z_min*, fringe period *T*, and the projection matrices, i.e.,

$$\Phi_{min} = f(z_{min}, T, \mathbf{P}^c, \mathbf{P}^p).$$

As aforementioned, once a structured light system is calibrated under the same world coordinate system, the projection matrices **P**^c and **P**^p are known. Given *z_min*, we can solve for the corresponding *x^w* and *y^w* for each camera pixel (*u^c*, *v^c*) from the camera model:

$$\begin{bmatrix} x^w \\ y^w \end{bmatrix} = \begin{bmatrix} p^c_{11} - u^c p^c_{31} & p^c_{12} - u^c p^c_{32} \\ p^c_{21} - v^c p^c_{31} & p^c_{22} - v^c p^c_{32} \end{bmatrix}^{-1} \begin{bmatrix} (u^c p^c_{33} - p^c_{13})\,z_{min} + u^c p^c_{34} - p^c_{14} \\ (v^c p^c_{33} - p^c_{23})\,z_{min} + v^c p^c_{34} - p^c_{24} \end{bmatrix},$$

where *p^c_ij* denotes the element of **P**^c in the *i*-th row and *j*-th column. With known (*x^w*, *y^w*), the projector model then yields the corresponding (*u^p*, *v^p*) for each camera pixel.

Once (*u^p*, *v^p*) is calculated, we can determine the absolute phase value Φ_min(*u^c*, *v^c*) corresponding to *z_min* for that pixel using the phase definition above. Because Φ_min(*u^c*, *v^c*) is created pixel by pixel on the camera imaging sensor, such a phase map can be used to unwrap the wrapped phase map pixel by pixel. And since this phase is defined in the projector space, the unwrapped phase obtained by referring to Φ_min(*u^c*, *v^c*) is absolute.

Figure 2 illustrates the basic concept of using the minimum phase map to correct 2*π* discontinuities. Assume the region of the projector that the camera captures at *z* = *z_min* is shown in the red dashed window; the wrapped phase, *ϕ*_1, directly obtained from three phase-shifted fringe patterns has one 2*π* discontinuity, as shown in Fig. 2(a). The corresponding Φ_min is the continuous (unwrapped) phase in the projector space, as shown in Fig. 2(b). The cross sections of the phase maps are shown in Fig. 2(c). This example shows that if the camera wrapped phase is below Φ_min, 2*π* should be added to it for phase unwrapping. And if the wrapped phase *ϕ* is captured at *z* > *z_min*, as illustrated in the solid blue windowed region, 2*π* should likewise be added wherever the wrapped phase is below Φ_min.

Figure 3 illustrates the cases of unwrapping camera-captured phase maps containing 3 and 4 fringe periods. Figure 3(a) shows a case with two 2*π* discontinuous locations, Point *A* and Point *B*. Between Point *A* and Point *B*, the phase difference Φ_min − *ϕ* is larger than 0 but less than 2*π*; to the right of Point *B*, the phase difference is larger than 2*π*. Therefore, 2*π* should be added to unwrap the points between Point *A* and Point *B*, and 4*π* should be added to the right of Point *B*.

For cases with 4 fringe periods, as shown in Fig. 3(b): if 0 < Φ_min − *ϕ* < 2*π* (i.e., between Point *A* and Point *B*), 2*π* should be added; if 2*π* < Φ_min − *ϕ* < 4*π* (i.e., between Point *B* and Point *C*), 4*π* should be added; and if 4*π* < Φ_min − *ϕ* < 6*π* (i.e., beyond Point *C*), 6*π* should be added.

In general, the fringe order *K* for each pixel must satisfy the condition

$$2\pi \times (K - 1) < \Phi_{min} - \phi < 2\pi \times K,$$

and thus *K* can be determined as

$$K = \mathrm{ceil}\left[\frac{\Phi_{min} - \phi}{2\pi}\right].$$

Here, ceil[ ] is the ceiling operator that gives the nearest upper integer.
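
The unwrapping rule can be written compactly; in this sketch `phi` is the wrapped phase and `phi_min` the artificial minimum phase map, both as NumPy arrays of the same shape:

```python
import numpy as np

def unwrap_with_phi_min(phi, phi_min):
    """Pixel-wise absolute phase unwrapping by the minimum phase map:
    K = ceil[(Phi_min - phi) / (2*pi)], then Phi = phi + 2*pi*K."""
    K = np.ceil((phi_min - phi) / (2.0 * np.pi))
    return phi + 2.0 * np.pi * K
```

By construction, every unwrapped value satisfies Φ_min ≤ Φ < Φ_min + 2*π*, which is exactly the single-fringe-period depth window that bounds the method's measurement range.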

## 3. Experiment

To verify the performance of the proposed absolute phase unwrapping method, we developed a structured light system, shown in Fig. 4, that includes a single CCD camera (Model: The Imaging Source DMK 23U618) with an 8 mm focal length lens (Model: Computar M0814-MP2) and a digital light processing (DLP) projector (Model: Dell M115HD). The camera resolution is 640 × 480. The lens is a 2/3-inch lens with an aperture of F/1.4. The projector’s native resolution is 1280 × 800, with a fixed 14.95 mm focal length lens having an aperture of F/2.0. The projection distance ranges from 0.97 m to 2.58 m. The system was calibrated using the method developed by Li et al. [22], and the camera lens coordinate system was chosen as the world coordinate system for both the camera and the projector.

We tested the proposed absolute phase unwrapping method by measuring a single object. Figure 5 shows the results. In this and all following experiments, the fringe period is 20 pixels, and three equally phase-shifted fringe patterns are captured. Figure 5(a) shows a photograph of the object to be measured, which has complex 3D geometric structures. Figure 5(b) shows one of the three captured fringe patterns. From the three phase-shifted fringe patterns, the wrapped phase is computed, as shown in Fig. 5(c). The phase map contains many fringe periods and thus has to be unwrapped before 3D reconstruction. We then generated the minimum phase map Φ_min at depth *z_min* = 880 mm, as shown in Fig. 5(d). Using the minimum phase map, we determined the fringe order for the wrapped phase map shown in Fig. 5(c), from which the unwrapped phase was obtained. Figure 5(e) shows the unwrapped phase map. Since the unwrapped phase is absolute, we can use the calibration data to reconstruct the 3D geometry using the method discussed by Zhang and Huang [21]. Figure 5(f) shows the recovered 3D geometry, which is continuous and smooth, suggesting that the proposed absolute phase unwrapping method works well for single-object measurement.

Since the proposed phase unwrapping method obtains absolute phase, it should be possible to simultaneously measure multiple isolated objects. To verify this capability, we measured the two isolated 3D objects shown in Fig. 6(a). Figure 6(b) shows one fringe pattern, and Fig. 6(c) shows the wrapped phase map. Using the same minimum phase map shown in Fig. 5(d), we generated the unwrapped phase shown in Fig. 6(d). Finally, the 3D geometry was recovered, as shown in Fig. 6(e). Clearly, both objects are properly reconstructed. This experiment demonstrates that two isolated complex objects can indeed be properly measured, confirming that the proposed method performs pixel-by-pixel absolute phase unwrapping.

We also experimentally compared our proposed absolute phase unwrapping method with a conventional temporal phase unwrapping method. Figures 7–8 show the results. In this experiment, we used 7 binary patterns to determine the fringe order *K*(*x*, *y*), which was used to temporally unwrap the phase obtained from three phase-shifted fringe patterns [23]. Figure 7(a) shows a photograph of the measured scene; again, we used two isolated 3D objects. Figure 7(b) shows the wrapped phase map from the phase-shifted fringe patterns. Figure 7(c) shows the unwrapped phase map obtained by applying the conventional temporal phase unwrapping method. Since the system is calibrated, the 3D shape was further reconstructed from the unwrapped phase map. Figure 7(d) shows the 3D result rendered in shaded mode. There are obvious phase unwrapping artifacts (i.e., spikes) when no filtering is applied. This is a very common problem associated with any temporal phase unwrapping approach, due to sampling error and camera noise [24]. In this research, we simply apply a median filter to locate the incorrectly unwrapped phase points and adjust them using the approach detailed by Karpinsky et al. [25]. Figure 7(e) shows the unwrapped phase, and Fig. 7(f) shows the final 3D reconstruction after applying an 11 × 11 median filter. As anticipated, the spiky noisy points are effectively reduced.

We then used our proposed approach to unwrap the phase map shown in Fig. 7(b) with the minimum phase map shown in Fig. 8(a). The unwrapped phase and 3D reconstruction are shown in Figs. 8(b)–8(c). It should be noted that no filtering was applied, and the result shows no spiky noise. This experiment demonstrates that our proposed method is actually more robust than temporal phase unwrapping. This is because the proposed method determines fringe order by referring to an artificially generated, ideal, noise-free phase map Φ_min. In contrast, the conventional temporal phase unwrapping method determines fringe order by referring to other camera-captured information that inherently contains noise.

To further visualize the difference between the unwrapped phase from our proposed method and that from the conventional temporal phase unwrapping method, the same cross section of the two unwrapped phase maps shown in Figs. 7(e) and 8(b) is plotted in Fig. 8(d). The two curves overlap well on the object surface, further verifying that the phase obtained from our proposed phase unwrapping method is absolute.

Finally, we measured a sphere spanning a large depth range to compare our approach with the conventional temporal phase unwrapping approach. Figure 9 shows the results. For a large depth range, the proposed method fails to correctly measure the overall object surface, as shown in Figs. 9(a) and 9(c); yet the conventional temporal phase unwrapping method works well, as shown in Figs. 9(b) and 9(d), indicating that the proposed method does not have the same depth measurement capability as the conventional temporal phase unwrapping algorithm.

To understand the depth range limitation of the proposed method, we need to understand how the phase is unwrapped when an object surface point is far away from the *z_min* plane. Figure 10 illustrates the maximum depth range, Δ*z_max*, that the proposed method can handle. Point *A* on the *z_min* plane and Point *B* on the object plane are imaged to the same camera pixel, yet they are illuminated by different points on the projector. If Point *A* and Point *B* have more than a 2*π* phase difference in the projected patterns, the proposed method fails to determine the correct fringe order.

Assuming the angle between the projection direction and the camera capture direction is *θ*, and the spatial span of one projected fringe period is Δ*y*, simple trigonometric derivation shows that the maximum depth range our proposed method can handle is

$$\Delta z_{max} = \frac{\Delta y}{\tan\theta}.$$

This strong limitation is practically reasonable. For example, consider the experimental system used in all our experiments: the angle between the projector optical axis and the camera optical axis is approximately *θ* = 10°. If we project horizontal fringe patterns with a fringe period of 20 pixels, the spatial span of one period is approximately Δ*y* = 20/800 = 0.025, or 2.5% of the overall projection range along the *y* (vertical) direction; here 800 is the height of the projector sensor in pixels. For this case, the depth range is limited to Δ*z_max* = Δ*y*/tan *θ* ≈ 0.14, or 14% of the projection range. Furthermore, since our camera only captures approximately 3/4 of the projector’s projection area, the overall maximum depth range is approximately 0.14 × 4/3 ≈ 0.19, or 19% of the camera’s sensing range. If the camera senses 300 mm along the *y* axis, the overall depth range of the proposed method is approximately 58 mm, which is reasonable for many applications. To further increase the maximum depth range, one can increase the fringe period or decrease the angle between the projector and the camera.
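
The numbers above can be checked with a few lines of arithmetic (all values taken from the text):

```python
import math

# Depth-range estimate for the described system: theta = 10 deg, a 20-pixel
# fringe period on an 800-pixel-tall projector, a camera covering ~3/4 of the
# projection area, and ~300 mm of camera sensing range along y.
theta = math.radians(10.0)
dy = 20.0 / 800.0                  # one fringe period, fraction of projection height
dz_max = dy / math.tan(theta)      # ~0.14 of the projection range
dz_camera = dz_max * 4.0 / 3.0     # ~0.19 of the camera's sensing range
depth_mm = dz_camera * 300.0       # ~57 mm, matching the text's estimate after rounding
```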

## 4. Discussion

This proposed pixel-wise absolute phase unwrapping method has the following advantages:

- *High-speed 3D shape measurement*. Unlike traditional temporal phase unwrapping methods, the proposed absolute phase unwrapping method does not require any additional image acquisition, and thus it is more suitable for high-speed applications.
- *High-speed processing*. The proposed method is inherently a per-pixel operation that does not refer to neighboring pixels or use any filters; the processing speed is fast, especially if implemented on a parallel processor (e.g., a graphics processing unit, GPU).
- *Simple system setup*. Unlike state-of-the-art methods that add a second camera to avoid additional image acquisition, the proposed method does not change the single-projector, single-camera structured light system setup, and thus it can be directly employed by any conventional structured light system.
- *Simultaneous measurement of multiple objects*. Similar to temporal phase unwrapping methods, the proposed absolute phase unwrapping is pixel by pixel, and thus can measure multiple isolated objects at exactly the same time, as demonstrated by the experimental data in Sec. 3.
- *Robustness in fringe order determination*. The phase unwrapping artifacts (i.e., spikes) are minimal without any filtering, indicating that fringe order determination is very robust. This is because the proposed method determines fringe order by referring to an artificially generated, ideal, noise-free absolute phase map Φ_min. In comparison, the conventional temporal phase unwrapping method determines fringe order by referring to other camera-captured information that contains noise.

However, this proposed absolute phase unwrapping method is not trouble free, as demonstrated in our experimental data (Fig. 9). The major limitations are:

- *Confined measurement depth range*. As mentioned above, the maximum measurement depth range that the proposed approach can handle corresponds to 2*π* of phase change from the object plane to the minimum phase generation plane. In other words, no point on the object surface should be far enough from *z_min* to cause more than 2*π* of phase change. This is practically reasonable since the overall maximum depth range for our measurement system is approximately 19% of the camera’s overall sensing range.
- *Good z_min estimation*. Since the maximum depth range is limited by the distance from the *z_min* plane to the object plane, a more accurate *z_min* estimate leads to a larger depth measurement range, while an incorrect *z_min* could lead to incorrect phase unwrapping. In our research, we coincide the world coordinate system with the camera lens coordinate system, so the *z_min* plane has the minimum *z^w* value for 3D reconstruction. One can then estimate the *z_min* of interest by a variety of means, one of which is using a ruler to measure the distance from the closest object point to the camera lens.

Even with these limitations, the proposed pixel-by-pixel absolute phase unwrapping without the use of any additional image or hardware can substantially benefit the optical metrology field, especially for applications where high-speed absolute 3D shape measurement is required.

## 5. Summary

This paper has presented a method to unwrap phase pixel by pixel by referring to an artificial minimum phase map created solely from geometric constraints of the structured light system. Unlike conventional temporal phase unwrapping algorithms, the proposed absolute phase unwrapping method requires no additional image acquisition; and unlike absolute phase measurement methods that add a second camera, it requires no extra hardware. The proposed method therefore gains measurement speed without increasing system complexity or cost. Experimental results demonstrated the success of the proposed pixel-by-pixel absolute phase unwrapping method. Despite its confined depth range, the proposed method is of significance to applications where high-speed, absolute 3D shape measurement is necessary.

## Acknowledgments

We would like to thank Beiwen Li, Chufan Jiang, and Bogdan Vlahov for proofreading and critiquing the entire paper. We also thank other students for their kind discussions.

This study was sponsored by the National Science Foundation (NSF) under grant number CMMI-1521048. The views expressed in this paper are those of the authors and not necessarily those of the NSF.

## References and links

**1. **S. Zhang, “Recent progresses on real-time 3-D shape measurement using digital fringe projection techniques,” Opt. Laser Eng. **48**, 149–158 (2010). [CrossRef]

**2. **M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. **22**, 3977–3982 (1983). [CrossRef] [PubMed]

**3. **Q. Kemao, “Windowed Fourier transform for fringe pattern analysis,” Appl. Opt. **43**, 2695–2702 (2004). [CrossRef] [PubMed]

**4. **D. Malacara, ed., *Optical Shop Testing* (John Wiley and Sons, 2007), 3rd ed. [CrossRef]

**5. **D. C. Ghiglia and M. D. Pritt, eds., *Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software* (John Wiley and Sons, 1998).

**6. **X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Laser Eng. **42**, 245–261 (2004). [CrossRef]

**7. **Y.-Y. Cheng and J. C. Wyant, “Two-wavelength phase shifting interferometry,” Appl. Opt. **23**, 4539–4543 (1984). [CrossRef] [PubMed]

**8. **Y.-Y. Cheng and J. C. Wyant, “Multiple-wavelength phase shifting interferometry,” Appl. Opt. **24**, 804–807 (1985). [CrossRef]

**9. **D. P. Towers, J. D. C. Jones, and C. E. Towers, “Optimum frequency selection in multi-frequency interferometry,” Opt. Lett. **28**, 1–3 (2003). [CrossRef]

**10. **G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: Analysis and compensation of the systematic errors,” Appl. Opt. **38**, 6565–6573 (1999). [CrossRef]

**11. **Q. Zhang, X. Su, L. Xiang, and X. Sun, “3-D shape measurement based on complementary gray-code light,” Opt. Laser Eng. **50**, 574–579 (2012). [CrossRef]

**12. **Y. Li, H. Jin, and H. Wang, “Three-dimensional shape measurement using binary spatio-temporal encoded illumination,” J. Opt. A, Pure Appl. Opt. **11**, 075502 (2009). [CrossRef]

**13. **Y. Wang and S. Zhang, “Novel phase coding method for absolute phase retrieval,” Opt. Lett. **37**, 2067–2069 (2012). [CrossRef] [PubMed]

**14. **C. Zhou, T. Liu, S. Si, J. Xu, Y. Liu, and Z. Lei, “Phase coding method for absolute phase retrieval with a large number of codewords,” Opt. Express **20**, 24139–24150 (2012). [CrossRef]

**15. **C. Zhou, T. Liu, S. Si, J. Xu, Y. Liu, and Z. Lei, “An improved stair phase encoding method for absolute phase retrieval,” Opt. Laser Eng. **66**, 269–278 (2015). [CrossRef]

**16. **Z. Li, K. Zhong, Y. Li, X. Zhou, and Y. Shi, “Multiview phase shifting: a full-resolution and high-speed 3d measurement framework for arbitrary shape dynamic objects,” Opt. Lett. **38**, 1389–1391 (2013). [CrossRef] [PubMed]

**17. **K. Zhong, Z. Li, Y. Shi, C. Wang, and Y. Lei, “Fast phase measurement profilometry for arbitrary shape objects without phase unwrapping,” Opt. Laser Eng. **51**, 1213–1222 (2013). [CrossRef]

**18. **C. Bräuer-Burchardt, P. Kühmstedt, and G. Notni, “Code minimization for fringe projection based 3d stereo sensors by calibration improvement,” Tech. rep., arXiv (2014). (available at arXiv:1404.7298).

**19. **K. Song, S. Hu, X. Wen, and Y. Yan, “Fast 3D shape measurement using fourier transform profilometry without phase unwrapping,” Opt. Laser. Eng. **84**, 74–81 (2016). [CrossRef]

**20. **W. Lohry, V. Chen, and S. Zhang, “Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration,” Opt. Express **22**, 1287–1301 (2014). [CrossRef] [PubMed]

**21. **S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. **45**, 083601 (2006). [CrossRef]

**22. **B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured light system with an out-of-focus projector,” Appl. Opt. **53**, 3415–3426 (2014). [CrossRef] [PubMed]

**23. **S. Zhang, “Flexible 3D shape measurement using projector defocusing: Extended measurement range,” Opt. Lett. **35**, 931–933 (2010).

**24. **C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Laser. Eng. **84**, 84–103 (2016). [CrossRef]

**25. **N. Karpinsky, M. Hoke, V. Chen, and S. Zhang, “High-resolution, real-time three-dimensional shape measurement on graphics processing unit,” Opt. Eng. **53**, 024105 (2014). [CrossRef]