## Abstract

We propose a hybrid computational framework to reduce motion-induced measurement error by combining Fourier transform profilometry (FTP) with phase-shifting profilometry (PSP). The proposed method is composed of three major steps: Step 1 extracts continuous relative phase maps for each isolated object with the single-shot FTP method and spatial phase unwrapping; Step 2 obtains an absolute phase map of the entire scene using the PSP method, albeit with motion-induced errors on the extracted absolute phase map; and Step 3 shifts the continuous relative phase maps from Step 1 to generate final absolute phase maps for each isolated object by referring to the absolute phase map with error from Step 2. Experiments demonstrate the success of the proposed computational framework for measuring multiple isolated rapidly moving objects.

© 2016 Optical Society of America

## 1. Introduction

The rapidly evolving three-dimensional (3D) shape measurement technologies have enjoyed a wide range of applications from industrial inspection to biomedical science. Non-contact structured light technology has been increasingly appealing to researchers in many different fields due to its flexibility and accuracy [1]. Yet a crucial challenge for this field of research is to perform accurate 3D shape measurement of dynamically moving or deformable objects, where object motion typically introduces measurement errors.

To alleviate the measurement errors induced by object motion, it is desirable to reduce the number of fringe images required to reconstruct 3D geometry. The approaches that minimize the number of projection patterns include single-shot Fourier transform profilometry (FTP) [2], the 1 + 1 FTP approach [3], the *π*-shift FTP approach [4], the 2 + 1 phase-shifting approach [5] and three-step phase-shifting profilometry (PSP) [6,7]. The approach that requires the fewest projection patterns is standard FTP, which extracts phase information from a single-shot fringe image. This property of the FTP approach is extremely advantageous when rapid motion is present in the measured scene. Most single-shot FTP approaches adopt spatial phase unwrapping, which detects 2*π* discontinuities solely from the wrapped phase map itself and removes them by adding or subtracting integer *k*(*x, y*) multiples of 2*π*. This integer *k*(*x, y*) is often called the *fringe order*. However, a fundamental limitation of spatial phase unwrapping is that the obtained unwrapped phase map is *relative*: the phase value on a spatially unwrapped phase map depends on the phase value of the starting point within a connected component. As a result, it cannot handle scenes with spatially isolated objects. Although researchers have come up with approaches that embed markers into the projected single pattern [8–10], absolute phase retrieval can be problematic if the embedded markers are not clear on an isolated object.

Temporal phase unwrapping, which determines the fringe order *k*(*x, y*) by acquiring additional information, has the advantage of robust absolute phase recovery, especially for static scenes. Some widely adopted techniques include multi-frequency (or multi-wavelength) phase-shifting techniques [11–15], binary [16] or Gray [17] stripe coding strategies, and phase coding strategies [18–20]. These approaches commonly require several additional fringe images (typically more than three) to determine the fringe order for absolute phase retrieval, which is undesirable for dynamic scene measurements.

To address this limitation of temporal phase unwrapping, Zuo et al. [21] proposed a four-pattern strategy to further reduce the total number of patterns, and Wang et al. [22] combined spatial and temporal phase unwrapping within a phase-shifting plus gray-coding framework. These approaches function well in most cases, especially when the object movement is not rapid. However, Zuo’s approach [21] still requires the imaged scene to remain stationary within the four consecutively captured frames, and Wang’s approach [22] requires the measured objects to remain stationary within the first three phase-shifted fringes of the entire fringe sequence, which are not valid assumptions if the object motion is extremely rapid. Cong et al. [23] proposed a Fourier-assisted PSP approach which corrects the phase shift error caused by motion with the assistance of FTP, yet in this particular research, marker points are used to determine the fringe order, which could encounter similar problems as the marker-based approaches mentioned above. Recently, our research group [24,25] proposed to obtain the absolute phase map with the assistance of geometric constraints. Hyun and Zhang [24] proposed an enhanced two-frequency method to reduce the noise and improve the robustness of conventional two-frequency phase unwrapping, yet it still uses the six images required by the conventional two-frequency method, so speed remains a major concern. An et al. [25] introduced an absolute phase recovery framework that solely uses geometric constraints to perform phase unwrapping. This method does not require capturing additional images to determine the fringe order, and was later combined with FTP to perform single-shot absolute phase recovery [26]. However, this single-shot approach cannot handle object depth variations that exceed 2*π* in the phase domain [25], meaning that the measurement depth range could be quite constrained since the FTP method typically requires using a high-frequency pattern.

Apart from reducing motion-induced errors by modifying the phase computational frameworks, researchers are also seeking alternative solutions by adding more hardware. Approaches that use more than one camera have been proven successful by several reported research works [27–30]. The fundamental mechanism behind this type of approach lies in the fact that any motion-induced measurement error appears simultaneously in the different cameras, and thus the correspondence detection is not affected. However, the cost of adding another camera can be high, especially when high measurement quality is required. Moreover, only the sampled area that is viewed by all imaging sensors (i.e. two cameras and one projector) can be reconstructed.

In this research, we propose a hybrid computational framework for motion-induced error reduction. Our proposed approach uses a total of four patterns to conduct absolute 3D shape measurement. First, we perform single-shot phase extraction using FTP with a high-frequency pattern projection. Then, we identify each isolated object and obtain a continuous relative phase map for each object through spatial phase unwrapping. To determine the rigid shift from the relative to the absolute phase map, we use low-frequency three-step phase-shifted patterns plus geometric constraints to produce an absolute phase map; phase errors caused by motion between the three frames are inevitable, yet we can still obtain insights by finding the most common integer fringe order shift *k _{s}* from this PSP-extracted phase map to the FTP continuous relative phase map. Our proposed method combines spatial and temporal phase unwrapping, in which we use spatial phase unwrapping to reduce motion-induced errors, and temporal phase unwrapping to obtain the absolute phase map. Our proposed method does not involve any additional hardware for motion-induced error reduction. Experiments have demonstrated the success of our proposed computational framework for measuring multiple spatially isolated objects with rapid motion.

Section 2 introduces the relevant theoretical background and the framework of our proposed research; Section 3 illustrates the experimental validations of our proposed research; Section 4 discusses the strengths and limitations of our proposed computational framework; and Section 5 summarizes our proposed research.

## 2. Principle

In this section, we will introduce the relevant theoretical foundations of this proposed framework, which include the principles of FTP, PSP, phase unwrapping with geometric constraints, the motion-induced error in PSP method to be addressed in this research, as well as our proposed hybrid absolute phase computational framework.

#### 2.1. Fourier transform profilometry (FTP)

The basic principles of the FTP approach can be expressed as follows. In theory, a typical fringe image can be represented as

$$I(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y)], \qquad (1)$$

where *I*′(*x, y*) denotes the average intensity, *I*″(*x, y*) stands for the intensity modulation, and *ϕ*(*x, y*) is the phase information to be extracted. According to the well-known Euler’s formula, Eq. (1) can be re-formulated as

$$I(x, y) = I'(x, y) + \frac{I''(x, y)}{2}\left[e^{j\phi(x, y)} + e^{-j\phi(x, y)}\right]. \qquad (2)$$

By applying a band-pass filter in the frequency domain, the average intensity *I*′(*x, y*) and the conjugate frequency component can be removed, leaving only one carrier frequency component

$$I_f(x, y) = \frac{I''(x, y)}{2}e^{j\phi(x, y)}, \qquad (3)$$

from which the phase can be extracted as

$$\phi(x, y) = \tan^{-1}\left\{\frac{Im[I_f(x, y)]}{Re[I_f(x, y)]}\right\}, \qquad (4)$$

where *Re*[*I _{f}*(*x, y*)] and *Im*[*I _{f}*(*x, y*)] respectively represent the real and the imaginary part of the final image *I _{f}*(*x, y*). Consequently, Eq. (4) produces a wrapped phase map with 2*π* discontinuities. To obtain a continuous phase map without 2*π* jumps, a spatial or temporal phase unwrapping approach can be applied. In general, the key for phase unwrapping is to determine the integer fringe order *k*(*x, y*) for each pixel which removes the 2*π* discontinuities. The relationship between a wrapped phase map and an unwrapped phase map can be expressed as

$$\Phi(x, y) = \phi(x, y) + 2\pi \times k(x, y). \qquad (5)$$
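As a minimal sketch of the FTP pipeline above (band-pass filtering in the frequency domain followed by an arctangent phase computation), the following NumPy example recovers the wrapped phase of a synthetic fringe image; the function name and the simple rectangular band-pass window are our own illustrative choices, not the paper's implementation:

```python
import numpy as np

def ftp_wrapped_phase(fringe, f0):
    """Single-shot FTP: wrapped phase from one fringe image.

    fringe : 2D array with a cosine carrier of f0 cycles per image width
             along x. A band-pass filter keeps only the +f0 lobe, removing
             the average intensity (DC) and the conjugate component; the
             angle of the remaining complex signal is the wrapped phase.
    """
    cols = fringe.shape[1]
    F = np.fft.fft(fringe, axis=1)               # spectrum along the fringe direction
    fx = np.fft.fftfreq(cols) * cols             # frequency axis in cycles per image
    window = (fx > f0 / 2) & (fx < 3 * f0 / 2)   # rectangular band-pass around +f0
    If = np.fft.ifft(F * window[np.newaxis, :], axis=1)
    return np.angle(If)                          # wrapped phase in (-pi, pi]

# Synthetic test: 16 fringe periods across a 256-pixel-wide image.
x = np.arange(256)
phi_true = 2 * np.pi * 16 * x / 256
fringe = np.tile(0.5 + 0.4 * np.cos(phi_true), (64, 1))
phi = ftp_wrapped_phase(fringe, 16)              # matches phi_true up to 2*pi wrapping
```

Real fringe images require a window tuned to the actual carrier frequency and benefit from smooth apodization to limit spectral leakage.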

#### 2.2. Phase shifting profilometry (PSP)

The PSP method, different from the single-shot FTP method, uses a set of phase-shifted fringe images for phase computation. For a three-step phase-shifting approach, which requires the least number of phase-shifting steps, the fringe images used can be described as

$$I_1(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) - 2\pi/3], \qquad (6)$$
$$I_2(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y)], \qquad (7)$$
$$I_3(x, y) = I'(x, y) + I''(x, y)\cos[\phi(x, y) + 2\pi/3]. \qquad (8)$$

The phase *ϕ*(*x, y*) can be extracted by simultaneously solving Eqs. (6)–(8):

$$\phi(x, y) = \tan^{-1}\left[\frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3}\right]. \qquad (9)$$

Again, the phase *ϕ*(*x, y*) obtained here has 2*π* discontinuities. Similarly, we can adopt a spatial or temporal phase unwrapping framework to obtain the unwrapped phase map.
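For reference, solving Eqs. (6)–(8) reduces to a single arctangent; this short sketch (our own naming) verifies it on a synthetic phase value:

```python
import numpy as np

def three_step_phase(I1, I2, I3):
    """Wrapped phase from three fringe images phase-shifted by -2pi/3, 0, +2pi/3.

    Solving Eqs. (6)-(8) simultaneously gives
    phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3).
    """
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

# Synthetic check: three intensity samples of a known phase phi0 = 1.0 rad.
phi0 = 1.0
I1, I2, I3 = (0.5 + 0.3 * np.cos(phi0 + d) for d in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3))
phi = three_step_phase(I1, I2, I3)   # recovers phi0
```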

#### 2.3. Phase unwrapping using geometric constraint

As recently proposed by An et al. [25], one of the methods that removes 2*π* discontinuities on a wrapped phase map is by using geometric constraints. Figure 1 illustrates the fundamental principle of this type of method. Suppose the region that the camera captures on the CCD sensor is a flat plane located at *z ^{w}* = *z _{min}*, which is the closest measurement depth plane of interest; the same region can be mapped to the projector DMD sensor, which creates a pixelated artificial absolute phase map Φ* _{min}*. This generated phase map Φ* _{min}* can be used to locate 2*π* discontinuities on a wrapped phase map. The detailed procedures of *z _{min}* determination and Φ* _{min}* generation can be found in [25].

Figure 2 shows the conceptual idea of phase unwrapping using the artificial absolute phase map Φ* _{min}*. Suppose when *z ^{w}* = *z _{min}*, a camera captures the region shown inside the red dashed window on the projector [see Fig. 2(a)], in which the wrapped phase *ϕ* _{1} has 2*π* discontinuities. The corresponding unwrapped phase Φ* _{min}* is shown in the red dashed box in Fig. 2(b). Figure 2(c) shows the cross sections of both phase maps, which depict that 2*π* should be added to the wrapped phase wherever the phase *ϕ* _{1} is below Φ* _{min}*. The same idea also applies when the phase *ϕ* is captured at *z ^{w}* > *z _{min}*, as illustrated in the solid blue window, where 2*π* is added to the wrapped phase *ϕ* wherever it is below Φ* _{min}*.

Now we consider a more general case, as shown in Fig. 2(d), where the captured camera image contains more fringe periods; here, a different fringe order *k*(*x, y*) should be added to the wrapped phase *ϕ* depending on its difference from Φ* _{min}*. The fringe order *k*(*x, y*) can be determined as follows:

$$k(x, y) = ceil\left[\frac{\Phi_{min}(x, y) - \phi(x, y)}{2\pi}\right], \qquad (10)$$

where the *ceil*[] operator returns the closest upper integer value.
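A pixel-wise sketch of this geometric-constraint unwrapping (function name ours): the fringe order is the smallest integer number of 2*π* shifts that lifts the wrapped phase to or above Φ* _{min}*:

```python
import numpy as np

def unwrap_with_min_phase(phi, phi_min):
    """Unwrap a wrapped phase map against the artificial minimum phase map.

    k = ceil[(Phi_min - phi) / (2*pi)] lifts each pixel into the 2*pi-wide
    band starting at Phi_min, i.e. Phi = phi + 2*pi*k >= Phi_min.
    """
    k = np.ceil((phi_min - phi) / (2 * np.pi))
    return phi + 2 * np.pi * k

# Example: a wrapped phase of -3.0 rad against Phi_min = 9.0 rad needs
# k = 2 periods, giving -3.0 + 4*pi, which lies in [9.0, 9.0 + 2*pi).
Phi = unwrap_with_min_phase(-3.0, 9.0)
```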

#### 2.4. Motion-induced error in PSP

PSP works well under the assumption that the object is quasi-static during the capture of the multiple phase-shifted fringe images. However, object movement causes measurement error if this fundamental assumption of the phase-shifting method is violated. If the sampling speed of the system is fast enough, this type of error is not obvious; when the sampling speed cannot keep up with the object movement, however, this type of error is pronounced and can be dominant.

We simulated this motion-introduced error by moving a unit sphere by 2% of the overall span (e.g., diameter of the sphere) along y direction for each additional fringe pattern capture (i.e., the sphere moves 10% for six fringe patterns). In this simulation, we adopted a two-frequency phase-shifting algorithm. Figures 3(a)–3(f) show six fringe patterns. Figure 3(g) shows the reconstructed 3D geometry. Apparently, the sphere surface is not smooth because of its movement.

To better visualize the difference between the reconstructed 3D result and the ideal sphere, we took one cross section of the sphere and overlaid it with the ideal circle, as shown in Fig. 3(h); Fig. 3(i) shows the difference between the two. Clearly, there are periodic structural errors on the object surface, which are very similar to nonlinearity error or phase shift error. However, for this simulation, the sole source of error is object movement, which we define as *motion-induced error*.
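A minimal scalar illustration (not the paper's sphere simulation) of how this error arises: when the measured phase drifts between the three captures, the three-step formula no longer sees pure −2*π*/3, 0, +2*π*/3 shifts, and a phase-dependent bias appears. The numeric values below are our own illustrative choices:

```python
import numpy as np

phi_true = 2.0                      # phase at the first exposure (illustrative value)
drift = 0.05                        # phase change caused by object motion per frame
shifts = (-2 * np.pi / 3, 0.0, 2 * np.pi / 3)
# Each frame samples the moving object slightly later, so frame i sees
# phi_true + i*drift instead of a constant phase.
I1, I2, I3 = (0.5 + 0.3 * np.cos(phi_true + i * drift + d)
              for i, d in enumerate(shifts))
phi_est = np.arctan2(np.sqrt(3) * (I1 - I3), 2 * I2 - I1 - I3)
err = phi_est - (phi_true + drift)  # bias relative to the middle-exposure phase
```

Because the bias depends on the phase itself, it repeats with the fringe period, producing the periodic structural error seen in Fig. 3(i).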

#### 2.5. Proposed hybrid absolute phase computational framework

The major type of error that this paper aims to address is the measurement error caused by object motion. As discussed in the previous subsection, the motion-induced error of the PSP method is caused by violation of the fundamental assumption of phase-shifting methods: the object remains static during the capture of the required number of phase-shifted fringe patterns. It is well known that the FTP method can extract phase information from one single-shot fringe image, which is extremely advantageous when measuring scenes with high-speed motion, yet the pixel-by-pixel absolute phase retrieval problem remains nontrivial for FTP approaches without capturing additional fringe patterns. We recently proposed to use geometric constraints of the structured light system to perform pixel-wise absolute phase unwrapping [26], yet its depth range is confined to a small range [25].

To enhance the capability of our previously proposed method [26] by substantially extending its depth range, we propose a hybrid computational framework that combines FTP with PSP. We first perform single-shot FTP and spatial phase unwrapping to produce a continuous relative phase map Φ* _{r}* for each spatially isolated object. Suppose we have an additional set of low-frequency three-step phase-shifted patterns; the phase extracted from this set of three patterns can be unwrapped by the artificial phase map Φ* _{min}* to produce a rough absolute phase map Φ* _{e}*, yet measurement errors are present owing to the object motion within the three frames. However, under the assumption that the motion-induced errors are not predominant on the entire phase map, we can still take advantage of this phase map to find the rigid fringe order shift *k _{s}* from the relative phase map to the final absolute phase map.

By using three additional phase-shifted fringe patterns with a lower frequency, the proposed method increases the depth range of our previous method [26]. For example, if the angle between the camera and projector optical axes is around *θ* = 13° and the overall projection range is 400 mm, the proposed method can handle approximately 348 mm of depth range for a noise-free system [25]. In contrast, the method proposed in [26] is confined to approximately 27 mm, which is approximately 13 times smaller than our proposed method.

Figure 4 illustrates the procedures of our proposed hybrid absolute phase retrieval framework, in which a set of four patterns is used to retrieve the absolute phase map. The first step is to use a single-shot high-frequency fringe pattern to perform FTP, in which image segmentation is used to separate each isolated object, and spatial phase unwrapping [31] is used to unwrap the phase extracted by FTP for each object to create a continuous relative phase map Φ* _{r}*. To obtain the absolute phase map Φ* _{a}*, we need to determine the constant rigid shift *k _{s}* in fringe order between the absolute phase map Φ* _{a}* and the relative phase map Φ* _{r}*:

$$\Phi_a(x, y) = \Phi_r(x, y) + 2\pi \times k_s, \qquad (11)$$
$$k_s = round\left\{\frac{\Phi_a(x, y) - \Phi_r(x, y)}{2\pi}\right\}, \qquad (12)$$

where the *round*() operator selects the closest integer number. To detect this constant rigid fringe order shift *k _{s}*, we use another set of single-period three-step phase-shifted low-frequency patterns to obtain some insight. We extract a rough absolute phase map Φ* _{e}* with motion error through three-step phase shifting plus geometric constraints from this set of three patterns; then we can use the area without significant phase errors (e.g. big phase jumps) to detect the constant rigid shift *k _{s}*.

We compute the difference map *k _{e}* in fringe order between Φ* _{r}* and Φ* _{e}* as

$$k_e(x, y) = round\left\{\frac{\Phi_e(x + \Delta u, y + \Delta v) - \Phi_r(x, y)}{2\pi}\right\}, \qquad (13)$$

where the shift (Δ*u*, Δ*v*) compensates for the object motion between adjacent fringe images; it can be roughly estimated by detecting the center pixel movement of the bounding boxes of each isolated object between different frames.

Assuming that the motion-induced error is not predominant on the *k _{e}* map, we then determine the actual constant shift *k _{s}* by finding the most common integer number on the *k _{e}* map:

$$k_s = mode[k_e(x, y)], \qquad (14)$$

where the *mode*[] operator selects the most common value. Finally, the absolute phase map Φ* _{a}* can be extracted by

$$\Phi_a(x, y) = \Phi_r(x, y) + 2\pi \times k_s. \qquad (15)$$

Once the absolute phase map is obtained, the 3D reconstruction can be performed using the system model and calibration method described in [32].
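The fringe order difference map, its mode, and the final shift [Eqs. (13)–(15)] can be sketched as follows; the motion compensation (Δ*u*, Δ*v*) is omitted for brevity, and the function name and mask argument are our own:

```python
import numpy as np

def rigid_shift_unwrap(phi_r, phi_e, mask):
    """Shift a relative FTP phase map to absolute using a rough PSP phase map.

    phi_r : continuous relative phase map of one isolated object (FTP)
    phi_e : absolute phase map with motion-induced error (PSP)
    mask  : boolean map selecting the object's pixels
    """
    # Difference map in fringe order, Eq. (13) (without the motion offset).
    k_e = np.round((phi_e[mask] - phi_r[mask]) / (2 * np.pi)).astype(int)
    # Mode of the k_e histogram, Eq. (14): pixels corrupted by motion are
    # outvoted by the majority of correct pixels.
    values, counts = np.unique(k_e, return_counts=True)
    k_s = values[np.argmax(counts)]
    # Final absolute phase, Eq. (15).
    return phi_r + 2 * np.pi * k_s
```

Because only the single most common integer is kept, a minority of motion-corrupted pixels in Φ* _{e}* does not affect the recovered shift.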

## 3. Experiments

We set up a structured light system, shown in Fig. 5, to test the effectiveness of our computational framework. The system includes a digital-light-processing (DLP) projector (model: LightCrafter 4500) and a high-speed CMOS camera (model: Phantom V9.1). The projector resolution is 912 × 1140 pixels. In all our experiments, we adopted the binary defocusing technique [33] to generate quasi-sinusoidal profile by projecting 1-bit binary pattern with projector defocusing. The projector image refreshing rate was set at 1500 Hz. We set the camera resolution at 1024 × 768 pixels with an image acquisition speed of 1500 Hz which is synchronized with pattern projection. The lens attached to the camera has a focal length of 24 mm with an aperture of f/1.8. The system is calibrated using the method described in [32].

To demonstrate the effectiveness of our proposed framework regarding motion-induced error reduction, we compare our proposed method with a PSP based method. In this research, we use the enhanced two-frequency PSP method [24] since it only requires six fringe patterns. The enhanced two-frequency PSP method essentially uses two-frequency phase-shifted patterns: the wrapped phase obtained from the low frequency patterns is unwrapped using geometric constraints (see Section 2.3), and this obtained phase map then sequentially unwraps the high frequency wrapped phase to obtain the final absolute phase map. We projected a sequence of six patterns: three phase-shifted high frequency square binary patterns with a fringe period of *T* = 18 pixels (denoted as
${I}_{1}^{h}$,
${I}_{2}^{h}$ and
${I}_{3}^{h}$), and three phase-shifted low frequency binary dithered patterns with a fringe period of *T ^{l}* = 228 pixels [34] (denoted as
${I}_{1}^{l}$,
${I}_{2}^{l}$ and
${I}_{3}^{l}$). Enhanced two-frequency PSP method uses all six patterns for 3D reconstruction, and our proposed method only uses the last four patterns (i.e.
${I}_{3}^{h}$ and
${I}_{1}^{l}-{I}_{3}^{l}$) for 3D reconstruction.

We first measured two free-falling ping-pong balls using the enhanced two-frequency PSP method [24]. Figures 6(a)–6(f) show a sequence of six continuously captured fringe images. For better visualization of the object movement during the capture of the six fringe patterns, we cropped the left ball in the six fringe images, and then drew a reference line (red) and a circle around the contour of the ball, as shown in Figs. 6(g)–6(l). It is very obvious that the object moves substantially even at such a high capture speed. Since phase-shifting methods require the movement to be small, it is difficult for a phase-shifting method to perform high-quality 3D shape measurement here. Figure 7(a) shows the retrieved absolute phase map, in which one can visually observe some motion artifacts around the boundaries of the spheres. The reconstructed 3D geometries, shown in Fig. 7(b), clearly depict significant errors (e.g. large jumps, spikes), especially around the edges of the spheres. Besides spikes, one can also observe that the object motion produces apparent artifacts along the direction of phase shifting (e.g. some vertical stripes on the surface), which are very similar to the motion-induced errors introduced in Section 2.4.

We then implemented our proposed computational framework using the last four fringe images of the entire sequence (i.e.
${I}_{3}^{h}$ and
${I}_{1}^{l}-{I}_{3}^{l}$). The first step is to perform single-shot FTP to extract a wrapped phase map. Figure 8(b) shows the wrapped phase map obtained from the single-shot fringe image
${I}_{3}^{h}$ [Fig. 8(a)] with high-frequency pattern (*T* = 18 pixels) projection. We then identified the two segmented balls and separately performed spatial phase unwrapping [31] for each ball. Figure 8(c) shows the unwrapped phase map Φ* _{r}* for the entire scene.

The next step is to obtain an absolute phase map Φ* _{e}* with motion-induced error using the PSP method. Figure 9 shows an example of Φ* _{e}* map extraction. Figure 9(a) shows one of the three-step phase-shifted fringe images ( ${I}_{1}^{l}$) with low-frequency pattern (*T ^{l}* = 228 pixels) projection. By applying three-step phase shifting (see Section 2.2), we obtain a wrapped phase map as shown in Fig. 9(b). As one can see, the motion between the three phase-shifted fringes causes apparent errors on the extracted phase map, especially around the boundaries of the balls. By applying the phase unwrapping method based on geometric constraints (see Section 2.3), we can obtain an absolute phase map with motion-induced error, as shown in Fig. 9(c). Albeit motion-induced measurement errors are present, this phase map can still be used to obtain insight into the rigid fringe order shift *k _{s}*.

The final step is to find the rigid fringe order shift *k _{s}* from the relative phase map Φ* _{r}* to the absolute phase map Φ* _{a}*. With the continuous relative phase map Φ* _{r}* [shown in Fig. 8(c)] and the absolute phase map with error Φ* _{e}* [shown in Fig. 9(c)], Eq. (13) yields a difference map in fringe order *k _{e}*, shown in Fig. 9(d). We then plot the histograms of the difference map *k _{e}* for each ball, as shown in Fig. 10. We use Eq. (14) to find the bins of peak values on each histogram and pick the corresponding integer number to be the actual rigid shift *k _{s}* for each ball. Then, we shift the relative phase Φ* _{r}* using Eq. (15) to obtain the final absolute phase map. Figure 11(a) shows the final absolute phase map obtained using our proposed method. Figure 11(b) shows the 3D geometry reconstructed from the absolute phase map. Clearly, our proposed method works well for spatially isolated objects in the presence of rapid motion, and no significant motion-induced errors appear on the reconstructed 3D geometries. The associated Visualization 1 compares the results over the entire captured sequence. The video clearly shows that the PSP method produces significant motion-induced errors, while our proposed method consistently works well.

To further compare the performance of our proposed method against the conventional two-frequency PSP method, we picked one of the two spheres (i.e. the left sphere) and performed further analysis. Figure 12 shows the comparison of 3D results between the proposed method and the PSP based method. Figures 12(a) and 12(e) show the reconstructed 3D geometries using these two methods, from which we can see that the ball is well recovered using our proposed method, yet the result obtained from the PSP based method has significant errors (e.g. big jumps, spikes), especially on the top and bottom of the sphere, caused by the vertical object motion. The object motion also produces apparent artifacts along the direction of phase shifting (e.g. vertical creases). Since the ping-pong ball has well-defined geometry (i.e. a sphere 40 mm in diameter), we then performed sphere fitting on both reconstructed 3D geometries and obtained the residual errors shown in Figs. 12(b) and 12(f). The root-mean-square (RMS) errors for the proposed method and the PSP approach are 0.26 mm and 6.92 mm respectively, which indicates that our proposed method can well reconstruct the 3D geometry of a rapidly moving ball, while the PSP method fails to provide a reasonable result.

To better illustrate the differences, we took a cross section of the sphere fitting and residual errors from both results. The corresponding plots are respectively shown in Figs. 12(c)–12(d) and Figs. 12(g)–12(h). We removed the big outliers for the PSP result in Fig. 12(b) from the cross section plots for better visualization. Note that the error structure of Fig. 12(d) is very similar to the motion-induced error from our simulation result shown in Fig. 3(i). These results again demonstrate that the reconstructed geometry obtained from our proposed framework agrees well with the actual sphere, and the error is quite small. In contrast, the result obtained from the PSP method deviates considerably from the actual sphere, and the residual error is quite large, with big artifacts on the edges of the sphere. This experiment clearly shows the significance of our proposed computational framework in terms of motion-induced error reduction.

To further evaluate the performance of our proposed computational framework, we drastically increased the number of ping-pong balls within the scene and measured the motion of all balls. Figure 13 and its associated video ( Visualization 2) demonstrate the measurement results of many free-falling ping-pong balls, where Figs. 13(a) and 13(b) respectively show a sample frame of the texture and the corresponding 3D geometries. The measurement result demonstrates that our proposed computational framework performs well under the scenes with a large number of rapidly moving spatially isolated objects. This experiment further proves the success and robustness of our proposed computational framework.

One may notice that some artifacts still appear on the reconstructed 3D geometries when the black characters on the balls show up in the captured scene. An example is shown in Fig. 14, which is a zoom-in view of the ball selected in the red bounding boxes of the pictures in Fig. 13. Some artifacts appear where the characters on the ball appear, as shown in the blue bounding boxes in Figs. 14(a)–14(b). This is caused by an inherent limitation of the FTP method: it does not function well when rich texture variation is present. To alleviate this problem, one can combine our proposed framework with the more sophisticated windowed Fourier transform [35,36] or wavelet transform profilometry [37,38].

## 4. Discussion

Our proposed computational framework has the following advantages compared to other absolute phase retrieval frameworks.

- *Resistance to measurement errors caused by rapid object motion.* Since the final absolute phase map is generated by shifting the spatially unwrapped single-shot FTP phase map, it is resistant to phase errors caused by rapid object movements, and thus reduces measurement errors induced by motion.
- *Absolute 3D shape measurement of multiple rapidly moving objects within a large depth range.* As shown in the experiments, our proposed framework is capable of recovering absolute 3D geometries for many spatially isolated objects with rapid motion, which is difficult for existing frameworks, especially when the object displacement between frames is significant. Compared with our previous method, the depth sensing range of the proposed method is approximately 13 times that achieved by our previously proposed method [26].

However, our proposed framework also has some inherent limitations, and the performance could be affected under the following conditions:

- *Measurement of complex surface geometry or texture.* Since the phase extracted from single-shot FTP ultimately determines the phase quality and thus the measurement quality, some inherent limitations of the standard FTP approach remain in our proposed method. Namely, under circumstances where there are rich local surface geometric or texture variations, the measurement quality is reduced because of the difficulty of accurately retrieving the carrier phase in FTP.
- *Existence of abrupt geometric discontinuities.* As introduced in Section 2.5, spatial phase unwrapping is involved in the first step of absolute phase retrieval. Therefore, if abrupt geometric discontinuities exist on a single object or between overlapping objects, the performance of our proposed computational framework could be affected.

## 5. Summary

In this research, we proposed a computational framework that reduces motion-induced measurement errors by combining the FTP and PSP approaches. This framework uses a high-frequency pattern to perform FTP, which extracts phase information from a single-shot fringe image; spatial phase unwrapping is then applied to each isolated object to obtain a continuous relative phase map. Finally, by referring to the absolute phase map with errors obtained from a set of low frequency phase-shifted patterns, we shift the relative phase map of each object to produce the final absolute phase map. Experiments have demonstrated the effectiveness of our computational framework for measuring multiple rapidly moving isolated objects without adding additional hardware. Compared with the previous method of its kind [26], the proposed method substantially increases the depth sensing range (i.e., by approximately 13 times).

## Funding

National Science Foundation (NSF) Directorate for Engineering (ENG) (CMMI-1523048).

## Acknowledgments

The authors would like to thank other members in their research group. In particular, we thank Jae-Sang Hyun for his assistance in hardware setup.

## References and links

**1. **J. Geng, “Structured-light 3d surface imaging: a tutorial,” Adv. Opt. Photonics **3**, 128–160 (2011). [CrossRef]

**2. **M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-d object shapes,” Appl. Opt. **22**, 3977–3982 (1983). [CrossRef]

**3. **H. Guo and P. Huang, “3-d shape measurement by use of a modified fourier transform method,” Proc. SPIE **7066**, 70660E (2008). [CrossRef]

**4. **L. Guo, X. Su, and J. Li, “Improved fourier transform profilometry for the automatic measurement of 3d object shapes,” Opt. Eng. **29**, 1439–1444 (1990). [CrossRef]

**5. **S. Zhang and S.-T. Yau, “High-speed three-dimensional shape measurement system using a modified two-plus-one phase-shifting algorithm,” Opt. Eng. **46**, 113603 (2007). [CrossRef]

**6. **K. Creath, “Phase-measurement interferometry techniques,” Prog. Opt. **26**, 349–393 (1988). [CrossRef]

**7. **P. S. Huang and S. Zhang, “Fast three-step phase-shifting algorithm,” Appl. Opt. **45**, 5086–5091 (2006). [CrossRef]

**8. **H. Guo and P. S. Huang, “Absolute phase technique for the fourier transform method,” Opt. Eng. **48**, 043609 (2009). [CrossRef]

**9. **Y. Xiao, X. Su, Q. Zhang, and Z. Li, “3-d profilometry for the impact process with marked fringes tracking,” Optoelectron. Eng. **34**, 46–52 (2007).

**10. **B. Budianto, P. Lun, and T.-C. Hsung, “Marker encoded fringe projection profilometry for efficient 3d model acquisition,” Appl. Opt. **53**, 7442–7453 (2014). [CrossRef]

**11. **Y.-Y. Cheng and J. C. Wyant, “Two-wavelength phase shifting interferometry,” Appl. Opt. **23**, 4539–4543 (1984). [CrossRef]

**12. **Y.-Y. Cheng and J. C. Wyant, “Multiple-wavelength phase shifting interferometry,” Appl. Opt. **24**, 804–807 (1985). [CrossRef]

**13. **D. P. Towers, J. D. C. Jones, and C. E. Towers, “Optimum frequency selection in multi-frequency interferometry,” Opt. Lett. **28**, 1–3 (2003). [CrossRef]

**14. **Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,” Opt. Express **19**, 5143–5148 (2011).

**15. **M. Servin, J. M. Padilla, A. Gonzalez, and G. Garnica, “Temporal phase-unwrapping of static surfaces with 2-sensitivity fringe-patterns,” Opt. Express **23**, 15806–15815 (2015). [CrossRef]

**16. **S. Zhang, “Flexible 3d shape measurement using projector defocusing: Extended measurement range,” Opt. Lett. **35**, 931–933 (2010).

**17. **J. Pan, P. S. Huang, and F.-P. Chiang, “Color-coded binary fringe projection technique for 3-d shape measurement,” Opt. Eng. **44**, 023606 (2005). [CrossRef]

**18. **Y. Wang and S. Zhang, “Novel phase coding method for absolute phase retrieval,” Opt. Lett. **37**, 2067–2069 (2012). [CrossRef]

**19. **C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. **51**, 953–960 (2013). [CrossRef]

**20. **Y. Xing, C. Quan, and C. Tay, “A modified phase-coding method for absolute phase retrieval,” Opt. Lasers Eng. **87**, 97–102 (2016). [CrossRef]

**21. **C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with complex shapes,” Opt. Express **20**, 19493–19510 (2012). [CrossRef]

**22. **Y. Wang, S. Zhang, and J. H. Oliver, “3-d shape measurement technique for multiple rapidly moving objects,” Opt. Express **19**, 5149–5155 (2011). [CrossRef]

**23. **P. Cong, Z. Xiong, Y. Zhang, S. Zhao, and F. Wu, “Accurate dynamic 3d sensing with fourier-assisted phase shifting,” IEEE J. Sel. Top. Signal Process. **9**, 396–408 (2015). [CrossRef]

**24. **J.-S. Hyun and S. Zhang, “Enhanced two-frequency phase-shifting method,” Appl. Opt. **55**, 4395–4401 (2016). [CrossRef]

**25. **Y. An, J.-S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express **24**, 18445–18459 (2016). [CrossRef]

**26. **B. Li, Y. An, and S. Zhang, “Single-shot absolute 3d shape measurement with fourier transform profilometry,” Appl. Opt. **55**, 5219–5225 (2016). [CrossRef]

**27. **K. Zhong, Z. Li, Y. Shi, C. Wang, and Y. Lei, “Fast phase measurement profilometry for arbitrary shape objects without phase unwrapping,” Opt. Lasers Eng. **51**, 1213–1222 (2013). [CrossRef]

**28. **W. Lohry, V. Chen, and S. Zhang, “Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration,” Opt. Express **22**, 1287–1301 (2014). [CrossRef]

**29. **W. Lohry and S. Zhang, “High-speed absolute three-dimensional shape measurement using three binary dithered patterns,” Opt. Express **22**, 26752–26762 (2014). [CrossRef]

**30. **S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using gobo projection,” Opt. Lasers Eng. **87**, 90–96 (2016). [CrossRef]

**31. **S. Zhang, X. Li, and S.-T. Yau, “Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction,” Appl. Opt. **46**, 50–57 (2007). [CrossRef]

**32. **B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured light system with an out-of-focus projector,” Appl. Opt. **53**, 3415–3426 (2014). [CrossRef]

**33. **S. Lei and S. Zhang, “Flexible 3-d shape measurement using projector defocusing,” Opt. Lett. **34**, 3080–3082 (2009). [CrossRef]

**34. **Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. **51**, 6631–6636 (2012). [CrossRef]

**35. **Q. Kemao, “Windowed fourier transform for fringe pattern analysis,” Appl. Opt. **43**, 2695–2702 (2004). [CrossRef]

**36. **Q. Kemao, “Two-dimensional windowed fourier transform for fringe pattern analysis: Principles, applications and implementations,” Opt. Lasers Eng. **45**, 304–317 (2007). [CrossRef]

**37. **P. Sandoz, “Wavelet transform as a processing tool in white-light interferometry,” Opt. Lett. **22**, 1065–1067 (1997). [CrossRef]

**38. **J. Zhong and J. Weng, “Spatial carrier-fringe pattern analysis by means of wavelet transform: wavelet transform profilometry,” Appl. Opt. **43**, 4993–4998 (2004). [CrossRef]