Abstract

Fourier-transform profilometry (FTP) and phase-shifting profilometry (PSP) are two mainstream fringe projection techniques widely used for three-dimensional (3D) shape measurement. The former is well known for its single-shot nature, the latter for its higher measurement resolution and precision. However, when it comes to measuring dynamic objects, neither approach can produce high-resolution, high-accuracy measurement results free from depth ambiguities and motion-related artifacts. Furthermore, for scenes consisting of both static and dynamic objects, a trade-off between measurement precision and efficiency has to be made, so a single approach can yield only suboptimal results. To this end, we propose a novel hybrid Fourier-transform phase-shifting profilometry method that integrates the advantages of both approaches. The motion vulnerability of multi-shot PSP can be overcome, or at least significantly alleviated, through the combination with single-shot FTP, while the high accuracy of PSP is preserved when the object is motionless. We design a phase-based, pixel-wise motion detection strategy that can accurately separate the moving object regions from their motionless counterparts. The final measurement result is obtained by fusing the two sets of results according to the detected regions, with FTP applied to the moving regions and PSP to the static ones. To validate the proposed hybrid approach, we develop a real-time 3D shape measurement system for measuring multiple isolated moving objects. Experimental results demonstrate that our method achieves significantly higher precision and better robustness than conventional approaches in which PSP or FTP is applied separately.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical non-contact three-dimensional (3D) shape measurement techniques have been widely used in industrial inspection, reverse engineering, medical plastic surgery, heritage digitalization, and many other fields [1, 2]. In the past few decades, many different 3D imaging technologies have been proposed, such as stereo vision [3, 4], time-of-flight [5, 6], and fringe projection profilometry (FPP) [2,7–10]. Among them, FPP has become one of the most popular 3D surface measurement technologies owing to its high measurement accuracy, simple hardware configuration, and ease of implementation. With the recent advances in high-speed imaging sensors and digital projection technology, it is now possible to achieve high-precision, high-speed real-time 3D shape measurement of dynamic scenes such as fast-moving objects and rotating or vibrating non-rigid bodies [11,12]. High-speed 3D dynamic shape measurement can be extensively applied in fields such as biomechanics, on-line inspection, solid mechanics, robot navigation, and human-computer interaction, where precise, accurate, high-speed 3D information acquisition is desirable or even a must.

As its name suggests, the FPP technique projects fringe pattern(s) onto the target object and captures the deformed fringe pattern(s) with a digital camera from a different perspective. The depth of the object surface is encoded in the phase of the distorted fringe, which can be demodulated using fringe analysis techniques. With the phase map, one can establish a unique correspondence between a camera pixel and a projector pixel, and the 3D shape of the object can then be reconstructed by optical triangulation based on the pre-calibrated geometric parameters of the FPP system.

Many fringe analysis techniques have been proposed to extract the phase distribution from the distorted fringe(s), such as phase-shifting profilometry (PSP) [13–15], Fourier-transform profilometry (FTP) [16], windowed Fourier-transform profilometry (WFP) [17,18], and wavelet-transform profilometry (WTP) [19, 20]. Among these techniques, FTP and PSP are the two mainstream fringe analysis approaches widely used for 3D shape measurement. FTP retrieves phase information from a single high-frequency fringe image, so it is well suited to snapshot 3D shape data acquisition [21]. Essentially, a band-pass filter is applied in the Fourier domain to extract the fundamental spectral component. However, FTP relies on the precondition that the fundamental frequency component, which carries the phase information of the object, is separable from the zero-order background. In practice, this precondition is easily violated when the measured surface contains sharp edges, discontinuities, and/or large surface reflectivity variations. The measurement accuracy and dynamic range can be improved by π phase-shifting FTP [22], modified FTP [23], or background-normalized FTP [24]. On the other hand, PSP typically requires a minimum of three fringe images to provide high-accuracy pixel-wise phase measurement. The main advantages of PSP techniques are their higher spatial resolution, measurement accuracy, and robustness to ambient illumination and varying surface reflectivity [25, 26]. However, when measuring dynamic scenes, motion leads to phase distortion artifacts, especially when the object motion during the interframe time gap is non-negligible. Strictly speaking, the motion-induced phase error is an intrinsic and inevitable problem of PSP, since the phase information is spread over multiple fringe images.

In recent years, dynamic 3D shape measurement using FPP has attracted a great deal of research interest, which can be roughly categorized into three directions: (1) increasing the speed of the hardware (projector and camera); (2) reducing the number of required patterns per 3D reconstruction; (3) improving the measurement quality and reducing motion artifacts. The first direction builds on the projector defocusing technique, which allows the projector to cast binary fringe patterns close to sinusoidal ones and thereby maximizes the projection speed (from 120 Hz to the kHz level [27] or even 10 kHz [24,28]). When such a high-speed projector is synchronized with a high-speed camera, the motion-induced error in the phase reconstruction is reduced accordingly, depending on the frame rate of the projector-camera pair. The second direction is to improve the measurement efficiency, i.e., to reduce the number of patterns required per measurement. Essentially, the dominant challenge affecting the measurement efficiency in both PSP and FTP is the phase ambiguity resulting from the periodicity of the sinusoidal signal. To recover the absolute phase, a common practice is to use temporal phase unwrapping (TPU) algorithms with the help of Gray-code patterns or multi-wavelength fringes [29–31]. However, a large number of additional patterns are required, which serve only for phase unwrapping while contributing nothing to the measurement accuracy. To address this issue, several composite phase-shifting schemes (e.g., dual-frequency PSP [32], bi-frequency PSP [33], and 2+2 PSP [34]) have been proposed, which can solve the phase ambiguity problem without significantly increasing the number of projected patterns. However, to guarantee the reliability of phase unwrapping, only relatively low-frequency fringes can be used, which limits the measurement accuracy. Recently, stereo phase unwrapping (SPU) methods based on geometric constraints [35,36] have been shown to solve the phase ambiguity problem without additional auxiliary patterns [37–39]. Although more than one camera is typically required, their high efficiency and practicability make SPU methods the best suited to high-speed real-time 3D shape measurement of dynamic scenes.

Besides increasing the hardware speed and reducing the number of patterns, the third direction focuses on enhancing the dynamic measurement capability by means of post-processing algorithms. Weise et al. [37] exploited least-squares fitting to estimate the motion-induced phase offset. Lu et al. [40] refined the unknown phase offset caused by the 3D movement of a rigid object and presented an iterative least-squares algorithm to estimate the phase map. Although these methods can accurately estimate motion-induced phase shifts, the compensation is homogeneous for every pixel, which is fragile when multiple objects undergo different kinds of motion. Feng et al. [41] proposed applying a homogeneous phase-shift extraction method to each segmented object to solve the nonhomogeneous motion artifact problem. This method still assumes that the phase-shift error within a single segmented object is homogeneous, so it may not work well for dynamically deformable objects. Li et al. [42] proposed a hybrid method to reduce motion-induced error by combining FTP with PSP. However, the measurement accuracy is compromised because the absolute phase is retrieved from FTP. Cong et al. [43] proposed a Fourier-assisted PSP approach. This method preserves the high precision of PSP to a certain extent, but also introduces a global error since the phase shift is obtained by subtracting two phases retrieved by FTP. Liu et al. [44] determined the motion-induced phase-shift error by leveraging the projector pinhole model. Although the algorithm effectively compensates for the phase-shift error introduced by rigid and non-rigid motion, the multiple iterations make the computation complicated and preclude application to real-time scenarios. In addition, Zhang et al. [45,46] proposed methods combining a single-frame method and a multi-frame method, in which the motion regions of the object are detected and different imaging methods are used in different regions. In contrast to other methods, this approach is more targeted, i.e., the motion compensation algorithm is performed only in the motion regions, while the high precision of the multi-frame method in the still regions is preserved. However, it can judge the motion state only locally rather than pixel by pixel, because it is susceptible to ambient light noise.

The goal of this paper is to develop a new 3D shape measurement technique that can robustly measure the 3D shapes of rigid and non-rigid objects in complex scenes consisting of both static and dynamic objects. To this end, we propose a novel hybrid Fourier-transform phase-shifting profilometry method for motion-induced error reduction. First, four patterns are projected onto the measured objects, from which the absolute phase information is simultaneously calculated by both PSP and FTP with the assistance of adaptive depth constraints, which can unambiguously unwrap the phase without additional auxiliary fringes. Then, the motion regions of the measured objects are automatically detected with a pixel-wise motion detection strategy we developed, in which the motion state is determined based on phase. Finally, the results of PSP and FTP are fused according to the detected motion regions, so that the motion-induced error of PSP is compensated for through the combination with FTP, while the high precision of PSP is preserved when the object is motionless. To demonstrate the feasibility of the proposed method, a 30 fps real-time 3D shape measurement system for measuring multiple isolated moving objects is developed. Experiments show that our method integrates the advantages of PSP and FTP to achieve higher measurement precision than traditional real-time algorithms based on PSP or FTP alone.

2. Principle

2.1. Basic principle of 3-step PSP and SPU

A typical 3D imaging system based on SPU is composed of two cameras and one projector [37]. The fringe patterns are projected onto the measured object, modulated by the object, and finally captured by the two cameras. A phase map can be obtained from the captured images, with which sub-pixel matching points are searched and the depth information of the measured object is reconstructed. Taking three-step phase-shifting fringe patterns as an example, the captured patterns can be expressed as:

$$I_1^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c)\right], \tag{1}$$
$$I_2^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c) + \frac{2\pi}{3}\right], \tag{2}$$
$$I_3^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c) + \frac{4\pi}{3}\right], \tag{3}$$
where the superscript c denotes the camera, (uc, vc) is a point in the camera, I1c, I2c and I3c represent the three captured fringe patterns, Ac is the average intensity map, Bc is the amplitude intensity map, and Φc is the absolute phase map. Because of the truncation of the arctangent function, only the wrapped phase can be obtained from Eqs. (1)–(3):
$$\phi^c(u^c, v^c) = \arctan\left(\frac{\sqrt{3}\,\left[I_2^c(u^c, v^c) - I_3^c(u^c, v^c)\right]}{2I_1^c(u^c, v^c) - I_2^c(u^c, v^c) - I_3^c(u^c, v^c)}\right), \tag{4}$$
where ϕc represents the wrapped phase. The absolute phase map and the wrapped phase map satisfy the following relation:
$$\Phi^c(u^c, v^c) = \phi^c(u^c, v^c) + 2k^c(u^c, v^c)\pi, \quad k^c(u^c, v^c) \in [0, N-1], \tag{5}$$
where kc is the fringe order, and N denotes the number of fringes. The process of obtaining the fringe orders is called phase unwrapping.
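For concreteness, a minimal NumPy sketch of the three-step wrapped-phase computation of Eq. (4) is given below; the function name and the use of `arctan2` to resolve the quadrant (instead of a bare arctangent) are our own illustrative choices, not part of the original implementation.

```python
import numpy as np

def wrapped_phase_3step(I1, I2, I3):
    """Three-step phase-shifting wrapped phase, cf. Eq. (4).

    I1, I2, I3: fringe images with phase shifts 0, 2*pi/3, 4*pi/3.
    Returns the wrapped phase in (-pi, pi].
    """
    # arctan2 resolves the quadrant that a bare arctangent in Eq. (4)
    # leaves ambiguous; the result remains 2*pi-periodic (wrapped).
    return np.arctan2(np.sqrt(3.0) * (I2 - I3), 2.0 * I1 - I2 - I3)
```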

The principle of SPU is shown in Fig. 1. An arbitrary point oc1 in Camera 1 has N possible fringe orders corresponding to N possible absolute phases, from which the horizontal coordinates of the corresponding points in the projector can be obtained. The N corresponding 3D candidates can then be retrieved via the parameter matrices derived from the calibration between Camera 1 and the projector. The projector and cameras are calibrated with the MATLAB Calibration Toolbox, and the parameters are optimized with bundle adjustment [47–49]. The retrieved N 3D candidates, like the intersection points between the blue lines from the projector and the black line from Camera 1 in Fig. 1, can be projected into Camera 2 to obtain their corresponding 2D candidates, like the red and green points in Camera 2 in Fig. 1. Among these 2D candidates, the correct matching point should have the wrapped phase most similar to that of oc1. A phase similarity check is therefore carried out to find the matching point, which also removes the phase ambiguity of oc1. After unwrapping the wrapped phase, the 3D information of the measured object can be reconstructed from the matching points in Camera 2 and the parameter matrices derived from the calibration between the two cameras. It should be noted that increasing the number of cameras can improve the stability of the phase unwrapping. For example, the 2D candidates of the second camera can be further projected onto a third camera for phase consistency checking to exclude additional incorrect candidates. For the sake of simplicity, the principle in this paper is described with a dual-camera model.
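As an illustration of the phase similarity check, the following sketch selects the fringe order for a single Camera-1 pixel. The function and parameter names are hypothetical, and bounds checks as well as the depth constraint discussed below are omitted for brevity.

```python
import numpy as np

def spu_phase_check(phi_wrapped, cand_xyz, P2, phi2_map):
    """Phase similarity check of SPU for one Camera-1 pixel (a sketch).

    phi_wrapped: wrapped phase at the Camera-1 pixel.
    cand_xyz: (N, 3) array of 3D candidates, one per possible fringe order.
    P2: 3x4 projection matrix of Camera 2; phi2_map: its wrapped phase map.
    Returns the candidate index (fringe order) whose Camera-2 phase agrees best.
    """
    # Project the N candidates into Camera 2 (homogeneous coordinates).
    xyz1 = np.c_[cand_xyz, np.ones(len(cand_xyz))]
    uvw = (P2 @ xyz1.T).T
    uv = np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    # Wrapped-phase difference, folded into [-pi, pi).
    d = phi2_map[uv[:, 1], uv[:, 0]] - phi_wrapped
    d = (d + np.pi) % (2 * np.pi) - np.pi
    return int(np.argmin(np.abs(d)))
```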

Fig. 1 The principle of SPU.

However, conventional SPU alone cannot robustly eliminate phase ambiguities when high-frequency fringes are used. The depth constraint [35, 50], which predicts the measurement volume of the system and excludes the 3D candidates outside this range, is a popular way to improve the stability of SPU. The adaptive depth constraint (ADC) strategy [38] updates the pixel-wise depth range automatically according to the real-time measurement results. For each pixel, ADC finds the maximum and minimum depths of the previous reconstruction over all pixels in a square window of side length p centered on that pixel, and then expands this interval outward by d mm to serve as the depth range of the pixel at the current time. Here p and d are two constants related to the velocity of the object, set to 10 and 20, respectively, in this paper. As the ADC method automatically provides a tighter and more accurate depth range for each point, most false 3D candidates can be excluded before the phase similarity check, which significantly improves the stability of SPU and is especially suitable for measurements in which the object moves constantly. We therefore use this method to assist SPU and improve the stability of the measurement in this work. Although PSP based on SPU can robustly remove phase ambiguity in static scenes, the motion-induced phase and unwrapping errors in dynamic scenes make it necessary to compensate for the motion-induced error of PSP.
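The window-and-expand update of ADC can be sketched as follows. The function name, the NaN handling for unreconstructed pixels, and the use of SciPy's rank filters are our own assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def adc_depth_range(prev_depth, p=10, d=20.0):
    """Pixel-wise depth range update of ADC (an illustrative sketch).

    prev_depth: depth map (mm) of the previous reconstruction, with
    np.nan at unreconstructed pixels; p: window side length (pixels);
    d: outward expansion (mm). Returns (z_min, z_max) maps used to
    reject 3D candidates before the phase similarity check.
    """
    # Local extrema of the previous depth over a p x p neighborhood;
    # NaNs are pushed to -/+inf so they never win the min/max filters.
    z_hi = maximum_filter(np.where(np.isnan(prev_depth), -np.inf, prev_depth), size=p)
    z_lo = minimum_filter(np.where(np.isnan(prev_depth), np.inf, prev_depth), size=p)
    return z_lo - d, z_hi + d
```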

2.2. Composite phase retrieval method based on FTP and PSP

In this work, we propose a new composite phase retrieval method based on FTP and PSP. Motion causes additional phase changes between frames, which breaks the basic assumption of PSP. To reduce the sensitivity to motion, PSP with as few fringe patterns as possible should be applied; therefore, the three-step phase-shifting method is chosen in this work. We use the second fringe of the three-step phase-shifting images to perform FTP (the reason for choosing the second one will be discussed below). Meanwhile, in order to improve the measurement precision of FTP, we use background-normalized Fourier-transform profilometry (BNFTP) [24], in which a pure white pattern, whose intensity equals the average intensity of the three-step phase-shifting fringes, is additionally projected. With the projection of the pure white image, the captured image can be expressed as:

$$I_0^c(u^c, v^c) = A^c(u^c, v^c). \tag{6}$$
By taking the normalized difference between I2c and I0c, the zero-frequency term as well as the effect of surface reflectivity variations can be effectively removed:
$$I_n^c(u^c, v^c) = \frac{I_2^c(u^c, v^c) - I_0^c(u^c, v^c)}{I_0^c(u^c, v^c) + b}, \tag{7}$$
where Inc denotes the normalized image, and b is a small constant to prevent division by zero. With the subtraction and normalization of the background image, the zero-order term and the effect of surface reflectivity variations are removed before the Fourier transform, and spectrum overlap in the frequency domain can be prevented or significantly alleviated. The Fourier transform is then applied to the normalized image Inc to extract the phase information. Considering both the efficiency of PSP and the precision of FTP, the fringe strategy selected in this work is a 1+3 strategy, i.e., one pure white pattern plus three three-step phase-shifting fringe patterns.
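A compact sketch of this BNFTP pipeline is given below. The carrier-frequency and window half-width parameters, as well as the simple rectangular band-pass window, are illustrative assumptions rather than the authors' exact filter design.

```python
import numpy as np

def bnftp_phase(I2, I0, carrier_u, half_width, b=1e-3):
    """Background-normalized FTP (a sketch of the BNFTP idea [24]).

    I2: second phase-shifting fringe image; I0: captured white image.
    carrier_u: fringe carrier frequency (cycles per image width);
    half_width: half-width of the band-pass window around the carrier.
    """
    In = (I2 - I0) / (I0 + b)                # Eq. (7): remove background/reflectivity
    F = np.fft.fftshift(np.fft.fft2(In))
    h, w = In.shape
    u = np.arange(w) - w // 2                # horizontal frequency axis
    # Rectangular band-pass window around the fundamental (+1) component.
    mask = np.abs(u[None, :] - carrier_u) <= half_width
    analytic = np.fft.ifft2(np.fft.ifftshift(F * mask))
    return np.angle(analytic)                # wrapped FTP phase
```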

For the sake of the precision of the fused results, it is important to select the appropriate one of the three-step phase-shifting fringe patterns for FTP and to fuse its result with that of PSP. In a dynamic scene, the fringe patterns captured by the camera can be expressed as:

$$I_{m1}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c) + \Delta\Phi_1^c(u^c, v^c)\right], \tag{8}$$
$$I_{m2}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c) + \frac{2\pi}{3}\right], \tag{9}$$
$$I_{m3}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\left[\Phi^c(u^c, v^c) + \frac{4\pi}{3} + \Delta\Phi_2^c(u^c, v^c)\right], \tag{10}$$
where ΔΦ1c and ΔΦ2c are the motion-induced phase offsets with respect to the middle image. For translational motion, ΔΦ1c and ΔΦ2c satisfy one of the following relationships [37]:
$$\Delta\Phi_1^c(u^c, v^c) > 0 > \Delta\Phi_2^c(u^c, v^c), \tag{11}$$
or
$$\Delta\Phi_1^c(u^c, v^c) < 0 < \Delta\Phi_2^c(u^c, v^c). \tag{12}$$
In this case, the phase solved by PSP fluctuates around the FTP result of the second fringe pattern, as shown in Fig. 2, where Fig. 2(a) shows the case where ΔΦ1c = π/6 and ΔΦ2c = −π/6, Fig. 2(b) the case where ΔΦ1c = π/4 and ΔΦ2c = −π/6, and the original phase shifts have been subtracted from the FTP phases. As can be seen from Fig. 2, when the object moves towards or away from the camera at a constant or varying speed, the PSP result always fluctuates around the FTP result of the second fringe pattern. In this case, the PSP result of the moving part of the object should therefore be replaced with the FTP result of the second fringe pattern.

Fig. 2 The phases solved by PSP and the phases solved by FTP of the three three-step phase-shifting fringe patterns in the case where the object moves in translation.

Rotary motion can also be regarded as a special kind of translational motion that can be split into a number of discrete motions over short time intervals. In each short time interval, the object can be divided into two parts about the center of rotation, each of which undergoes a speed-varying translational motion in the opposite direction.

For an object in reciprocating translational motion, if a captured group of fringe patterns happens to contain images with two opposite directions of motion, then ΔΦ1c and ΔΦ2c satisfy one of the following relationships:

$$\Delta\Phi_1^c(u^c, v^c) > 0, \quad \Delta\Phi_2^c(u^c, v^c) > 0, \tag{13}$$
or
$$\Delta\Phi_1^c(u^c, v^c) < 0, \quad \Delta\Phi_2^c(u^c, v^c) < 0. \tag{14}$$
In this state of motion, the phase solved by PSP fluctuates around the FTP result of the third fringe image, as shown in Fig. 3, where Fig. 3(a) shows the case where ΔΦ1c = π/4 and ΔΦ2c = π/6, and Fig. 3(b) the case where ΔΦ1c = −π/4 and ΔΦ2c = −π/6. However, a change of movement direction occurs only momentarily, and the camera rarely captures a group of patterns containing two motion directions within one reconstruction. This state of motion can therefore be safely neglected, and the second fringe pattern is finally selected to perform FTP.

Fig. 3 The phases solved by PSP and the phases solved by FTP of the three three-step phase-shifting fringe patterns in the case where the object changes direction of motion.

The entire algorithm is as follows. First, the wrapped phases are calculated by PSP and FTP, respectively. SPU, assisted by ADC, is implemented to unwrap the FTP phase, and the ambiguity of the PSP wrapped phase is then removed by means of the absolute FTP phase. Next, the Camera 1 matching points of PSP and FTP are obtained, respectively. Finally, the results of PSP and FTP are fused according to the detected motion regions of the object: for the still regions, the PSP result is retained, while for the motion regions it is replaced with the FTP result. The whole algorithm flow is shown in Fig. 4.

Fig. 4 Fusion algorithm flow diagram (the dark blue region indicates the result of PSP, the dark red region the result of FTP, and the dark green region the combined result).

The steps of the whole algorithm are summarized as follows:

  • Step 1: Obtain the wrapped phases by PSP and FTP, respectively;
  • Step 2: Unwrap the wrapped phase of FTP by SPU, with the assistance of ADC;
  • Step 3: Unwrap the wrapped phase of PSP by means of the absolute phase of FTP;
  • Step 4: Obtain the Camera 1 matching points of PSP and FTP, respectively;
  • Step 5: Fuse the matching points of PSP and FTP according to the detected motion areas and perform 3D reconstruction (a minimal sketch of Steps 3 and 5 is given after this list);
  • Step 6: Update the depth range of the adaptive depth constraint for the next cycle with the obtained depth information;
  • Step 7: Return to Step 1 and repeat the above process.
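Steps 3 and 5 can be made concrete with the following minimal sketch, assuming the wrapped PSP phase, the absolute FTP phase, and the motion mask of Section 3 are already available; the function and variable names are ours, not the authors'.

```python
import numpy as np

def fuse_psp_ftp(phi_psp, phi_ftp_abs, motion_mask):
    """Steps 3 and 5: unwrap PSP with the absolute FTP phase, then fuse.

    phi_psp: wrapped PSP phase; phi_ftp_abs: absolute FTP phase;
    motion_mask: boolean map from the motion detection of Section 3.
    """
    # Step 3: pick the fringe order that brings the wrapped PSP phase
    # closest to the absolute FTP phase.
    k = np.round((phi_ftp_abs - phi_psp) / (2 * np.pi))
    phi_psp_abs = phi_psp + 2 * np.pi * k
    # Step 5: keep PSP in still regions, FTP in moving regions.
    return np.where(motion_mask, phi_ftp_abs, phi_psp_abs)
```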

3. Motion detection and judgment

The key to the proposed algorithm is how to analyze the motion state of the object pixel by pixel. Motion can be judged in two ways: (1) directly, from changes in the 3D data; (2) indirectly, from changes in data related to the 3D data. In the first way, the 3D data at the current and previous moments can be compared to determine the motion regions of the object, which is called the 3D data comparison method (3DDCM) in this paper. It is the most direct and simple method for judging object motion. However, the process of first calculating the 3D points, then judging the motion state, and finally performing the fusion is time-consuming and thus not suitable for real-time processing. Therefore, the second way should be considered, in which the fusion of FTP and PSP can be achieved before obtaining the 3D data.

Following the second way, some researchers [45, 46] have proposed a simple frame difference method (FDM), in which the intensity change between the currently and previously captured images is used to detect whether there is any motion in the scene. However, this method judges the motion state only locally rather than pixel by pixel because of its susceptibility to ambient light noise. The measured 3D coordinates of an object change with its motion, and in an FPP system the 3D data of the measured object are closely related to its phase information. So in this paper we propose the phase frame difference method (PFDM), a motion detection method based on frame-to-frame phase differences. The specific process of judging the movement is as follows. First, the absolute value of the motion-induced phase difference ΔΦc between the FTP phases at time t and time t − Δt is calculated (the reason why we difference the FTP phases of two adjacent frames will be discussed below):

$$\Delta\Phi^c(u^c, v^c, t) = \left|\Phi^c(u^c, v^c, t) - \Phi^c(u^c, v^c, t - \Delta t)\right| = \left|\frac{d\Phi^c(u^c, v^c, t)}{dt}\,\Delta t\right|, \tag{15}$$
where dΦc(uc,vc,t)/dt represents the rate of change of the phase of point (uc, vc) at time t. Then, the phase change of each point is compared with a threshold. If the phase change of a point exceeds the threshold, the point is considered to be in motion; otherwise it is judged to be stationary:
$$\mathrm{flag}^c(u^c, v^c, t) = \begin{cases} 0, & \Delta\Phi^c(u^c, v^c, t) < Th \\ 1, & \Delta\Phi^c(u^c, v^c, t) \ge Th \end{cases}, \tag{16}$$
where Th denotes the threshold for determining motion, and flagc is the output motion state map, with 1 representing a moving point and 0 a stationary point. In order to eliminate isolated stationary points misjudged as moving and to recover isolated moving points that were missed, we perform Gaussian filtering on the output motion state map. After filtering, a point whose output is less than 1/2 is considered stationary; otherwise it is judged to be in motion.
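The PFDM decision of Eqs. (15) and (16), including the Gaussian smoothing and re-binarization step, can be sketched as follows; the Gaussian width sigma is our assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pfdm_motion_mask(phi_ftp_t, phi_ftp_prev, th, sigma=2.0):
    """Phase frame difference method (PFDM), Eqs. (15)-(16).

    phi_ftp_t, phi_ftp_prev: absolute FTP phases of two adjacent frames.
    th: motion threshold (Section 3); sigma: Gaussian width (assumption).
    """
    dphi = np.abs(phi_ftp_t - phi_ftp_prev)     # Eq. (15)
    flag = (dphi >= th).astype(float)           # Eq. (16)
    # Smooth the binary map to suppress isolated misjudged pixels,
    # then re-binarize at 1/2 as described in the text.
    return gaussian_filter(flag, sigma=sigma) >= 0.5
```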

It is now discussed why two FTP phases at adjacent times are used for the comparison. First, the FTP phase and the PSP phase cannot be compared with each other: since FTP extracts the phase in the frequency domain using a filtering window, its result is smoother than that of PSP, so even when measuring a static scene the FTP result deviates from the PSP result. Theoretically, therefore, only two PSP phases or two FTP phases can be compared. In a dynamic scene, the phase obtained by PSP is inaccurate and carries motion ripples [41], while that obtained by FTP is more precise. Taking motion with increasing phase as an example, the differences between two FTP phases at adjacent times are proportional to the phase changes, as shown in Fig. 5(a). However, when comparing PSP phases, if there is a region in which a trough of the current phase is compared with a peak of the previous phase, such as the regions within the red circles in Fig. 5(b), the phase differences in these regions are generally smaller than those in other regions and thus harder to detect. So in this work, the FTP phases at two adjacent moments are used to perform PFDM.

Fig. 5 Difference between PFDM using two FTP phases and PFDM using two PSP phases.

In order to accurately determine the motion regions, it is necessary to distinguish noise-induced from motion-induced phase changes; in other words, the threshold for motion judgment needs to be reasonably designed. The threshold determination process is shown in Fig. 6. First, we measure a flat panel in a static environment and obtain its absolute phase. We record the phase values of a fixed area (the red dashed box in Fig. 6) at n successive moments t0 … tn−1, and then average them. The resulting average phase can be regarded as the phase value measured without ambient light noise. Then the histogram of the difference between the average phase and the phase at time tn−1 is computed. Histogram areas with counts less than e are removed, such as the red areas of the histogram in Fig. 6, and the two remaining areas with the greatest differences are selected, such as the green ones in Fig. 6. Here e is a small constant, set to 10 in this work. The absolute values of these two areas are taken, and the upper limit of the larger one is used as the phase threshold, i.e., the maximum phase fluctuation caused by ambient light in the static state. As long as the phase change of a pixel is greater than this maximum static fluctuation, the pixel is considered to be moving.
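Our reading of this procedure is sketched below; the histogram bin count and the exact way the extreme kept bins are turned into a single threshold are assumptions on our part.

```python
import numpy as np

def estimate_motion_threshold(static_phases, e=10, bins=256):
    """Threshold estimation from static measurements (sketch of Fig. 6).

    static_phases: array (n, h, w) of absolute phases of a static flat
    panel at n successive moments; e: minimum histogram count kept.
    """
    mean_phase = static_phases.mean(axis=0)          # noise-free estimate
    diff = (static_phases[-1] - mean_phase).ravel()  # residual at t_{n-1}
    counts, edges = np.histogram(diff, bins=bins)
    valid = counts >= e                              # drop sparse bins
    lo = edges[:-1][valid].min()                     # most negative kept bin
    hi = edges[1:][valid].max()                      # most positive kept bin
    # Largest absolute phase fluctuation observed in the static state.
    return max(abs(lo), abs(hi))
```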

Fig. 6 Threshold determination process.

4. Simulation

In this section, we simulate motion under different noise conditions. A series of three-step phase-shifting fringe patterns with a size of 480 × 240 and a frequency of 10, together with pure white maps, are generated. To explore the motion state of the object, we adopt an improved version of Zhang's method [45,46], called the fringe frame difference method (FFDM) in this paper, the PFDM using the PSP phase, and the PFDM using the FTP phase.

FFDM uses three fringe patterns to implement FDM and takes the logical OR of the results to judge the motion (see the sketch after this paragraph). Figures 7 and 8 show the simulation results of FFDM and PFDM, respectively. It can be seen from Fig. 7(a) that FDM using one image has different sensitivities to motion at different locations; using three images alleviates but cannot solve this problem, as shown in Fig. 7(b). Figure 7(c) shows the intensity variation caused by a phase moving at a speed of π/150 per frame in a noisy environment. Figure 7(d) is another perspective of Fig. 7(c), from which the position-dependent sensitivity of FFDM to motion can again be observed. We use a threshold selection method similar to that of Section 3 to determine the motion. First, the captured images, i.e., the first patterns of the three-step fringe sequences at n consecutive moments in the static environment, are summed and averaged. The mean pattern, approximated as the ambient-noise-free image, is then subtracted from the captured image at the current moment to obtain the histogram of the difference. After removing a few edge regions of the histogram, the threshold, i.e., the maximum intensity fluctuation resulting from noise in the static state, can be obtained. We define the accuracy of motion judgment as the ratio of the number of pixels judged to be moving to the number of actual moving pixels. Figure 7(e) displays the relationship between the phase change rate and the accuracy of motion detection using FFDM under different noise conditions. Figure 8(a) shows the motion-induced change of the PSP phase in the absence of noise, where we can see that the PFDM using the PSP phase has a problem similar to FFDM. In contrast, the PFDM using the FTP phase has almost identical sensitivity to the same movement everywhere, as shown in Fig. 8(b). Figures 8(c) and 8(d) show that the distributions of phase change are uniform compared to Fig. 7(d), and that, owing to the phase's insensitivity to ambient light noise, the behavior is almost the same as in the noise-free environment. The thresholds for the PSP and FTP phases are selected respectively in the proposed way. Figures 8(e) and 8(f) show the relationship between the accuracy of motion judgment of PFDM using the PSP phase and the FTP phase and the speed of phase change under different noise levels. In comparison with Fig. 7(e), it is obvious that PFDM is superior to FFDM. Comparing Fig. 8(e) with Fig. 8(f), it can be seen that motion judgment using the FTP phase is more accurate than that using the PSP phase.
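For comparison with PFDM, a sketch of FFDM as we understand it is given below: the per-pixel OR of thresholded intensity differences over the three fringe patterns. Names and array layout are illustrative.

```python
import numpy as np

def ffdm_motion_mask(frames_t, frames_prev, th):
    """Fringe frame difference method (FFDM), as we read it.

    frames_t, frames_prev: arrays (3, h, w) holding the three
    phase-shifting images at the current and previous moments;
    th: intensity threshold estimated in the static state.
    """
    diff = np.abs(frames_t.astype(float) - frames_prev.astype(float))
    # A pixel is flagged as moving if ANY of the three patterns
    # changes by more than the noise threshold (the OR operation).
    return np.any(diff > th, axis=0)
```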

Fig. 7 Simulation results of FFDM. (a) Intensity changes of the first image of the three-step phase-shifting fringe images in a noise-free environment when the phase changes at a speed of π/150 per frame. (b) Intensity changes of three fringe images in a noise-free environment when the phase changes at a speed of π/150 per frame. (c) The intensity changes in a noisy environment when the phase changes at a speed of π/150 per frame. (d) Another perspective of (c). (e) The relationship between the accuracy of motion judgment and the speed of phase change under different noise.

Fig. 8 Simulation results of PFDM. (a) The changes of the PSP phase in a noise-free environment when the phase changes at a speed of π/150 per frame. (b) The changes of the FTP phase in a noise-free environment when the phase changes at a speed of π/150 per frame. (c) The changes of the PSP phase in a noisy environment when the phase changes at a speed of π/150 per frame. (d) The changes of the FTP phase in a noisy environment when the phase changes at a speed of π/150 per frame. (e) The relationship between the accuracy of motion judgment of PFDM using PSP phases and the speed of phase change under different noise. (f) The relationship between the accuracy of motion judgment of PFDM using FTP phases and the speed of phase change under different noise.

5. Experiment

A quad-camera real-time 3D imaging system based on the proposed method is established. The system is shown in Fig. 9, where Fig. 9(a) is the outline structure and Fig. 9(b) the internal structure of the system. In this system, three monochrome cameras are used for SPU, and a color camera is used for capturing color texture. The three monochrome cameras (Basler acA640-750um) and the color camera (Basler acA640-750uc) are labeled Camera 1, Camera 2, Camera 4, and Camera 3 in Fig. 9(b), respectively. These cameras have a maximum resolution of 640 × 480 and are equipped with 12 mm Computar lenses. The projector is a LightCrafter 4500Pro with a resolution of 912 × 1140. In our experiments, the projection speed is 100 Hz, all cameras are synchronized by the trigger signal from the projector, and 48-period phase-shifting fringe patterns are used.

Fig. 9 The quad-camera color real-time 3D imaging system.

We first compare the performance of PFDM and FFDM. Then, two complex motion scenes are designed to provide quantitative evaluation and verify the feasibility of the proposed algorithm. Besides, two experiments on complex rigid and non-rigid objects are carried out for qualitative evaluation, demonstrating the accuracy of the proposed method in judging the motion regions of complex rigid and non-rigid objects. Finally, the real-time performance of the algorithm is verified with the real-time system.

5.1. Comparison of two methods of motion judgment

In the first experiment, four different scenes are designed to compare the performance of FFDM and PFDM. In each scene, an object going from rest to motion is measured and 8 groups of images are captured, on which FFDM and PFDM are applied to estimate the motion state of the measured object; the result of 3DDCM is taken as the ground truth. In the first scene, we measure a flat plate in translational motion, as shown in Fig. 10(a). It can be seen from Fig. 11(a) that PFDM is clearly superior to FFDM. Figures 10(b) and 10(c) display the motion regions detected by the two methods, which support the same conclusion: PFDM achieves higher accuracy of motion judgment. The second scene is the rotation of a flat plate, as shown in Fig. 10(d). Figure 11(b) compares the results, and Figs. 10(e) – 10(f) show the motion regions judged by the two methods. These three figures convey the message that PFDM is still superior to FFDM for the rotational motion of rigid objects. An object with a complex surface is measured in the third scene, as shown in Fig. 10(g). The measurement results, shown in Fig. 11(c) and Figs. 10(h) – 10(i), indicate that PFDM performs better than FFDM in judging the motion of objects with complex surfaces. In the last scene, a freely moving hand is measured, as shown in Fig. 10(j). Figure 11(d) compares the accuracy of FFDM and PFDM in determining motion, and Figs. 10(k) – 10(l) display the motion regions detected by FFDM and PFDM. Clearly, for the motion of non-rigid objects, the detection accuracy of PFDM is still better than that of FFDM. The detailed measurement results are shown in Visualization 1.

Fig. 10 The motion regions determined by FFDM and PFDM (see Visualization 1 for the whole results). (a) The first measurement scene (a flat plate in translational motion). (b) The result of FFDM of the first scene. (c) The result of PFDM of the first scene. (d) The second measurement scene (a flat plate in rotational motion). (e) The result of FFDM of the second scene. (f) The result of PFDM of the second scene. (g) The third measurement scene (a complex object in translational motion). (h) The result of FFDM of the third scene. (i) The result of PFDM of the third scene. (j) The fourth measurement scene (a hand in arbitrary motion). (k) The result of FFDM of the fourth scene. (l) The result of PFDM of the fourth scene.

Fig. 11 The accuracy of motion judgment of FFDM and PFDM. (a) The result of the first scene. (b) The result of the second scene. (c) The result of the third scene. (d) The result of the fourth scene.

5.2. Quantitative evaluation

In the second experiment, two complex motion scenes are designed. The first scene contains both moving and stationary objects: a stationary ceramic plate and a precision ball in translational motion. The measurement results are shown in Fig. 12, where Fig. 12(a) is the captured background map, Fig. 12(b) shows the detected motion areas, Figs. 12(c), 12(e), 12(f), 12(g) and 12(h) are the measurement results of traditional PSP, and Figs. 12(d), 12(i), 12(j), 12(k) and 12(l) are those of the proposed method. As can be seen from Fig. 12(b), PFDM can accurately determine the motion region. For the moving ball, sphere fitting is performed on the measured data. Figures 12(e) and 12(i) show the errors between the data measured by PSP and by our method, respectively, and the fitted sphere, and Figs. 12(f) and 12(j) are the histograms of Figs. 12(e) and 12(i). The RMS measurement error of traditional PSP is 64 μm, while that of our method is 44 μm. For the non-moving plate, plane fitting is performed on the measured plate data within a certain area, shown inside the red dotted frame in Fig. 12(a). Figures 12(g) and 12(k) show the errors between the data measured by the two methods and the fitted plane, and Figs. 12(h) and 12(l) are the histograms of Figs. 12(g) and 12(k). The results show that the proposed method completely preserves the high precision of PSP for stationary objects, with a measurement precision identical to that of PSP, 37 μm. The detailed measurement results are shown in Visualization 2.

Fig. 12 Measurement results in the first complex scene (see Visualization 2 for the whole results). (a) The captured all-white map. (b) The detected motion areas. (c) The results measured by the conventional PSP. (d) The results measured by our method. (e) The error distribution of the precision ball data in (c). (g) The error distribution of the flat plate data in (c). (i) The error distribution of the data of the precision ball in (d). (k) The error distribution of the flat plate data in (d). (f), (h), (j), (l) The histograms of (e), (g), (i), (k).

In the second scene, the measured object is a rotating flat plate. For a rotating object, the portion near the center of rotation can be considered stationary, while the portions on both sides of the center are moving, which presents two different motion states. The measurement results are shown in Fig. 13, where Fig. 13(a) is the background map, Fig. 13(b) shows the detected motion areas, Figs. 13(c), 13(e) and 13(g) are the results measured by traditional PSP, and Figs. 13(d), 13(f) and 13(h) are those of the proposed method. Plane fitting is performed on the measured plate data within a certain area, shown inside the red dotted frame in Fig. 13(a). Figures 13(e) and 13(f) show the errors between the data measured by the two methods and the fitted plane, and Figs. 13(g) and 13(h) are the histograms of Figs. 13(e) and 13(f), respectively. It can be seen from Fig. 13(b) that PFDM judges the center of rotation to be stationary and the two sides of the center to be moving. The RMS measurement error of conventional PSP is 89 μm. The result of our method combines the results of PSP and FTP: the data in the red dashed box in Fig. 13(f) come from PSP, while those of the other regions come from FTP. The RMS measurement error of our method is 40 μm, better than that of traditional PSP. The detailed measurement results are shown in Visualization 3.

Fig. 13 Measurement results in the second complex scenario (see Visualization 3 for the whole results). (a) The collected background map. (b) The detected motion areas. (c) The results measured by the conventional PSP. (e) The error distribution of the flat plate data in (c). (g) The histogram of (e). (d) The results measured by our method. (f) The error distribution of the flat plate data in (d). (h) The histogram of (f).

5.3. Qualitative evaluation

In this experiment, we design two complicated scenes. In the first scene, two rigid objects with complex surfaces are measured, one of which is at rest while the other is in motion. The measurement results are shown in Fig. 14, where Figs. 14(a) and 14(c) are the results of PSP and our method, respectively. The measurement results show obvious motion ripples on the moving model in Fig. 14(a). The fusion method proposed in this work can identify the moving model and replace its PSP result with that of FTP, as shown in Figs. 14(b) and 14(c). For the non-moving model, the result of PSP is fully retained. The detailed measurement results are shown in Visualization 4.

Fig. 14 The measurement results of the first scene (see Visualization 4 for the whole results). (a) The measurement results of PSP. (b) The motion areas determined by PFDM. (c) The measurement results of our method.

In the second scene, a moving hand is measured, a non-rigid body with complex motion. Figure 15 shows the measured results, where Figs. 15(a) and 15(c) are the results of PSP and our method, respectively, and Fig. 15(b) shows the motion regions detected by PFDM. The measurement results show that the proposed method eliminates the motion-induced ripples of a non-rigid object. The detailed measurement results are shown in Visualization 5.

Fig. 15 The measurement results of the second scene (see Visualization 5 for the whole results). (a) The measurement results of PSP. (b) The motion areas determined by PFDM. (c) The measurement results of our method.

5.4. Real-time experiments

In the final experiment, the quad-camera real-time 3D imaging system based on the proposed method is used to measure a small fan and a David model in motion, by PSP and by our method, respectively. We use an HP Z230 computer (Intel Xeon E3-1226 v3 CPU, NVIDIA Quadro K2200 GPU). The visual interface is developed with Qt, and all core algorithms are written in CUDA. CUDA is a parallel computing architecture developed by NVIDIA that allows programs to access the memory of the graphics device and exploit the GPU to process data, so the entire pipeline of our algorithm can be executed solely on the GPU. To display the 3D data produced by the GPU, OpenGL (Open Graphics Library), a software interface to the graphics device used to draw pixels and vertices on the screen, is employed for fast and reliable rendering. By using the OpenGL interoperability with CUDA, the 3D data generated by the CUDA part can be shown on screen rapidly. The real-time measurement results are shown in Fig. 16. It can be observed that for the motion regions, the results of PSP exhibit obvious motion ripples, while those of our method, in which the PSP results are replaced by those of FTP, are ripple-free. A reconstruction speed of about 30 fps is achieved by our method. The real-time measurement processes and results of PSP and our method can be found in Visualization 6 and Visualization 7.

Fig. 16 The real-time measurement processes and results based on (a) PSP (see Visualization 6 for the whole process) and (b) our method (see Visualization 7 for the whole process).

6. Conclusion

In this paper, we have presented a novel hybrid Fourier-transform phase-shifting profilometry method for rigid and non-rigid objects in complex scenes containing both static and dynamic objects. First, we use a 1+3 fringe strategy to retrieve the phase information of the measured objects with both PSP and FTP, assisted by ADC, which can efficiently and unambiguously unwrap high-frequency fringes. Then, we develop a phase-based motion detection strategy to accurately determine the motion state of each pixel. Finally, based on the detected motion regions, we combine the results of PSP and FTP, i.e., compensating for the motion-induced error of PSP with FTP in the motion regions while fully retaining the result of PSP in the stationary regions. Several experiments have verified that the proposed method can obtain more precise 3D reconstructions than traditional real-time strategies based on PSP or FTP alone. Compared with other motion-compensation methods such as Lu's method [40], which can significantly eliminate the effects of 3D motion on PSP by redefining the motion-affected fringe patterns and using an iterative least-squares algorithm but is only applicable to the movement of a single rigid object, our approach is suitable for objects under both rigid and non-rigid motion, applies to more kinds of scenes, and is easier to implement.

There are several aspects of the proposed method that need further improvement, which we leave for future work. First, when an object just starts to move, the motion-induced ripple of PSP is relatively small. In this case, even though PSP is affected by the motion, its precision is still better than that of FTP. A more appropriate threshold should therefore be considered so that motion within the tolerable range of PSP can be ignored. Second, in this work we use the FTP-retrieved phase values in the motion regions to avoid the motion ripple. The resulting accuracy, though much improved compared with motion-affected PSP, is still considerably lower than that of PSP on a motionless object. It should also be noted that in the FTP phase retrieval, only one of the three fringe patterns is used while the other two are discarded. How to further improve the measurement quality by making full use of all three fringe patterns is another interesting direction for future investigation.

Funding

National Natural Science Foundation of China (61722506, 61705105, 111574152); National Key R&D Program of China (2017YFF0106403); Final Assembly ‘13th Five-Year Plan’ Advanced Research Project of China (30102070102); Equipment Advanced Research Fund of China (61404150202); Key Research and Development Program of Jiangsu Province, China (BE2017162); Outstanding Youth Foundation of Jiangsu Province of China (BK20170034); National Defense Science and Technology Foundation of China (0106173); ‘Six Talent Peaks’ project of Jiangsu Province, China (2015-DZXX-009); ‘333 Engineering’ research project of Jiangsu Province, China (BRA2016407, BRA2015294); Fundamental Research Funds for the Central Universities (30917011204, 30916011322); Open Research Fund of Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense (3091601410414); China Postdoctoral Science Foundation (2017M621747); Jiangsu Planned Projects for Postdoctoral Research Funds (1701038A).

References

1. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43, 2666–2680 (2010).

2. J. Geng, “Structured-light 3d surface imaging: a tutorial,” Adv. Opt. Photonics 3, 128–160 (2011).

3. D. Marr and T. Poggio, “A computational theory of human stereo vision,” Proc. R. Soc. Lond. B 204, 301–328 (1979).

4. S. D. Cochran and G. Medioni, “3-d surface description from binocular stereo,” IEEE Trans. Pattern Anal. Mach. Intell. 10, 981–994 (1992).

5. S. Foix, G. Alenya, and C. Torras, “Lock-in time-of-flight (tof) cameras: A survey,” IEEE Sensors J. 11, 1917–1926 (2011).

6. Y. Cui, S. Schuon, D. Chan, S. Thrun, and C. Theobalt, “3d shape scanning with a time-of-flight camera,” in 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (2010), pp. 1173–1180.

7. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48, 133–140 (2010).

8. K. L. Boyer and A. C. Kak, “Color-encoded structured light for rapid active ranging,” IEEE Trans. Pattern Anal. Mach. Intell. 1, 14–28 (1987).

9. L. Zhang, B. Curless, and S. M. Seitz, “Rapid shape acquisition using color structured light and multi-pass dynamic programming,” in First International Symposium on 3D Data Processing Visualization and Transmission (2002), pp. 24–36.

10. D. Scharstein and R. Szeliski, “High-accuracy stereo depth maps using structured light,” in 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1 (2003), pp. 195–202.

11. S. Zhang, “Recent progresses on real-time 3d shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48, 149–158 (2010).

12. S. Van der Jeught and J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Lasers Eng. 87, 18–31 (2016).

13. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).

14. V. Srinivasan, H.-C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-d diffuse objects,” Appl. Opt. 23, 3105–3108 (1984).

15. S. Feng, Y. Zhang, Q. Chen, C. Zuo, R. Li, and G. Shen, “General solution for high dynamic range three-dimensional shape measurement using the fringe projection technique,” Opt. Lasers Eng. 59, 56–71 (2014).

16. X. Su and Q. Zhang, “Dynamic 3-d shape measurement method: a review,” Opt. Lasers Eng. 48, 191–204 (2010).

17. Q. Kemao, “Two-dimensional windowed fourier transform for fringe pattern analysis: principles, applications and implementations,” Opt. Lasers Eng. 45, 304–317 (2007).

18. Q. Kemao, “Windowed fourier transform for fringe pattern analysis,” Appl. Opt. 43, 2695–2702 (2004).

19. J. Zhong and J. Weng, “Spatial carrier-fringe pattern analysis by means of wavelet transform: wavelet transform profilometry,” Appl. Opt. 43, 4993–4998 (2004).

20. L. Huang, Q. Kemao, B. Pan, and A. K. Asundi, “Comparison of fourier transform, windowed fourier transform, and wavelet transform methods for phase extraction from a single fringe pattern in fringe projection profilometry,” Opt. Lasers Eng. 48, 141–148 (2010).

21. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. 72, 156–160 (1982).

22. J. Li, X. Su, and L. Guo, “Improved fourier transform profilometry for the automatic measurement of three-dimensional object shapes,” Opt. Eng. 29, 1439–1445 (1990).

23. H. Guo and P. S. Huang, “3-d shape measurement by use of a modified fourier transform method,” Proc. SPIE 7066, 70660E (2008).

24. C. Zuo, T. Tao, S. Feng, L. Huang, A. Asundi, and Q. Chen, “Micro fourier transform profilometry (μftp): 3d shape measurement at 10,000 frames per second,” Opt. Lasers Eng. 102, 70–91 (2018).

25. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity,” J. Opt. Soc. Am. A 20, 106–115 (2003).

26. X.-Y. Su, G. Von Bally, and D. Vukicevic, “Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation,” Opt. Commun. 98, 141–150 (1993).

27. Q. Zhang and X. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express 13, 3110–3116 (2005).

28. S. Zhang, D. Van Der Weide, and J. Oliver, “Superfast phase-shifting method for 3-d shape measurement,” Opt. Express 18, 9684–9689 (2010).

29. C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Lasers Eng. 85, 84–103 (2016).

30. G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. 38, 6565–6573 (1999).

31. Z. Zhang, C. E. Towers, and D. P. Towers, “Time efficient color fringe projection system for 3d shape and color using optimum 3-frequency selection,” Opt. Express 14, 6444–6455 (2006).

32. K. Liu, Y. Wang, D. L. Lau, Q. Hao, and L. G. Hassebrook, “Dual-frequency pattern scheme for high-speed 3-d shape measurement,” Opt. Express 18, 5229–5244 (2010).

33. C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. 51, 953–960 (2013).

34. C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with complex shapes,” Opt. Express 20, 19493–19510 (2012).

35. C. Bräuer-Burchardt, C. Munkelt, M. Heinze, P. Kühmstedt, and G. Notni, “Using geometric constraints to solve the point correspondence problem in fringe projection based 3d measuring systems,” in International Conference on Image Analysis and Processing (2011), pp. 265–274.

36. X. Liu and J. Kofman, “High-frequency background modulation fringe patterns based on a fringe-wavelength geometry-constraint model for 3d surface-shape measurement,” Opt. Express 25, 16618–16628 (2017).

37. T. Weise, B. Leibe, and L. V. Gool, “Fast 3d scanning with automatic motion compensation,” in 2007 IEEE Conference on Computer Vision and Pattern Recognition (2007), pp. 1–8.

38. T. Tao, Q. Chen, S. Feng, J. Qian, Y. Hu, L. Huang, and C. Zuo, “High-speed real-time 3d shape measurement based on adaptive depth constraint,” Opt. Express 26, 22440–22456 (2018).

39. T. Tao, Q. Chen, S. Feng, Y. Hu, M. Zhang, and C. Zuo, “High-precision real-time 3d shape measurement based on a quad-camera system,” J. Opt. 20, 014009 (2017).

40. L. Lu, J. Xi, Y. Yu, and Q. Guo, “Improving the accuracy performance of phase-shifting profilometry for the measurement of objects in motion,” Opt. Lett. 39, 6715–6718 (2014).

41. S. Feng, C. Zuo, T. Tao, Y. Hu, M. Zhang, Q. Chen, and G. Gu, “Robust dynamic 3-d measurements with motion-compensated phase-shifting profilometry,” Opt. Lasers Eng. 103, 127–138 (2018).

42. B. Li, Z. Liu, and S. Zhang, “Motion-induced error reduction by combining fourier transform profilometry with phase-shifting profilometry,” Opt. Express 24, 23289–23303 (2016).

43. P. Cong, Z. Xiong, Y. Zhang, S. Zhao, and F. Wu, “Accurate dynamic 3d sensing with fourier-assisted phase shifting,” IEEE J. Sel. Top. Signal Process. 9, 396–408 (2015).

44. Z. Liu, P. C. Zibley, and S. Zhang, “Motion-induced error compensation for phase shifting profilometry,” Opt. Express 26, 12632–12637 (2018).

45. Z. Yang, Z. Xiong, Y. Zhang, J. Wang, and F. Wu, “Depth acquisition from density modulated binary patterns,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (2013), pp. 25–32.

46. Y. Zhang, Z. Xiong, and F. Wu, “Hybrid structured light for scalable depth sensing,” in 2012 19th IEEE International Conference on Image Processing (2012), pp. 17–20.

47. Y. Yin, X. Peng, A. Li, X. Liu, and B. Z. Gao, “Calibration of fringe projection profilometry with bundle adjustment strategy,” Opt. Lett. 37, 542–544 (2012).

48. Z. Zhang, “Review of single-shot 3d shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50, 1097–1106 (2012).

49. P. Wang, J. Wang, J. Xu, Y. Guan, G. Zhang, and K. Chen, “Calibration method for a large-scale structured light measurement system,” Appl. Opt. 56, 3995–4002 (2017).

50. Z. Li, K. Zhong, Y. F. Li, X. Zhou, and Y. Shi, “Multiview phase shifting: a full-resolution and high-speed 3d measurement framework for arbitrary shape dynamic objects,” Opt. Lett. 38, 1389–1391 (2013).

Supplementary Material (7)

Visualization 1: The motion regions determined by FFDM and PFDM.
Visualization 2: Measurement results in the first complex scene.
Visualization 3: Measurement results in the second complex scene.
Visualization 4: The measurement results of complex rigid objects.
Visualization 5: The measurement results of complex non-rigid objects.
Visualization 6: The real-time measurement processes and results based on PSP.
Visualization 7: The real-time measurement processes and results based on our method.



Figures (16)

Fig. 1. The principle of SPU.
Fig. 2. The phases solved by PSP and by FTP from the three three-step phase-shifted fringe patterns when the object moves in translation.
Fig. 3. The phases solved by PSP and by FTP from the three three-step phase-shifted fringe patterns when the object changes its direction of motion.
Fig. 4. Fusion algorithm flow diagram (the dark blue region indicates the result of PSP, the dark red region the result of FTP, and the dark green region the combined result).
Fig. 5. Difference between PFDM using two FTP phases and PFDM using two PSP phases.
Fig. 6. Threshold determination process.
Fig. 7. Simulation results of FFDM. (a) Intensity changes of the first of the three three-step phase-shifting fringe images in a noise-free environment when the phase changes at a speed of π/150 per frame. (b) Intensity changes of the three fringe images in a noise-free environment at the same speed. (c) The intensity changes in a noisy environment at the same speed. (d) Another perspective of (c). (e) The relationship between the accuracy of motion judgment and the speed of phase change under different noise levels.
Fig. 8. Simulation results of PFDM. (a) The changes of the PSP phase in a noise-free environment when the phase changes at a speed of π/150 per frame. (b) The changes of the FTP phase in a noise-free environment at the same speed. (c) The changes of the PSP phase in a noisy environment at the same speed. (d) The changes of the FTP phase in a noisy environment at the same speed. (e) The relationship between the accuracy of motion judgment of PFDM using PSP phases and the speed of phase change under different noise levels. (f) The relationship between the accuracy of motion judgment of PFDM using FTP phases and the speed of phase change under different noise levels.
Fig. 9. The quad-camera color real-time 3D imaging system.
Fig. 10. The motion regions determined by FFDM and PFDM (see Visualization 1 for the complete results). (a) The first measurement scene (a flat plate in translational motion). (b) The result of FFDM for the first scene. (c) The result of PFDM for the first scene. (d) The second measurement scene (a flat plate in rotational motion). (e) The result of FFDM for the second scene. (f) The result of PFDM for the second scene. (g) The third measurement scene (a complex object in translational motion). (h) The result of FFDM for the third scene. (i) The result of PFDM for the third scene. (j) The fourth measurement scene (a hand in arbitrary motion). (k) The result of FFDM for the fourth scene. (l) The result of PFDM for the fourth scene.
Fig. 11. The accuracy of motion judgment of FFDM and PFDM. (a) The result for the first scene. (b) The result for the second scene. (c) The result for the third scene. (d) The result for the fourth scene.
Fig. 12. Measurement results in the first complex scene (see Visualization 2 for the complete results). (a) The captured all-white map. (b) The detected motion areas. (c) The results measured by the conventional PSP. (d) The results measured by our method. (e) The error distribution of the precision-ball data in (c). (g) The error distribution of the flat-plate data in (c). (i) The error distribution of the precision-ball data in (d). (k) The error distribution of the flat-plate data in (d). (f), (h), (j), (l) The histograms of (e), (g), (i), (k), respectively.
Fig. 13. Measurement results in the second complex scene (see Visualization 3 for the complete results). (a) The collected background map. (b) The detected motion areas. (c) The results measured by the conventional PSP. (e) The error distribution of the flat-plate data in (c). (g) The histogram of (e). (d) The results measured by our method. (f) The error distribution of the flat-plate data in (d). (h) The histogram of (f).
Fig. 14. The measurement results of the first scene (see Visualization 4 for the complete results). (a) The measurement results of PSP. (b) The motion areas determined by PFDM. (c) The measurement results of our method.
Fig. 15. The measurement results of the second scene (see Visualization 5 for the complete results). (a) The measurement results of PSP. (b) The motion areas determined by PFDM. (c) The measurement results of our method.
Fig. 16. The real-time measurement processes and results based on (a) PSP (see Visualization 6 for the complete process) and (b) our method (see Visualization 7 for the complete process).

Equations (16)


$$I_1^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\big(\Phi^c(u^c, v^c)\big),$$
$$I_2^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\Big(\Phi^c(u^c, v^c) + \tfrac{2\pi}{3}\Big),$$
$$I_3^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\Big(\Phi^c(u^c, v^c) + \tfrac{4\pi}{3}\Big),$$
$$\phi^c(u^c, v^c) = \arctan\!\left(\frac{\sqrt{3}\,\big(I_2^c(u^c, v^c) - I_3^c(u^c, v^c)\big)}{2I_1^c(u^c, v^c) - I_2^c(u^c, v^c) - I_3^c(u^c, v^c)}\right),$$
$$\Phi^c(u^c, v^c) = \phi^c(u^c, v^c) + 2k^c(u^c, v^c)\pi, \qquad k^c(u^c, v^c) \in [0, N-1],$$
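To make the three-step phase retrieval concrete, the following is a minimal NumPy sketch (an illustration, not the authors' implementation). It uses `np.arctan2` rather than a plain arctangent of the quotient so that the quadrant is resolved and the wrapped phase covers (−π, π]; the fringe order map `k` is assumed to come from phase unwrapping (e.g., the SPU of Fig. 1).

```python
import numpy as np

def wrapped_phase(I1, I2, I3):
    """Wrapped phase phi^c from three fringe images with 2*pi/3 phase shifts.

    Implements phi = arctan( sqrt(3) * (I2 - I3) / (2*I1 - I2 - I3) ),
    with arctan2 resolving the quadrant, so the result lies in (-pi, pi].
    """
    return np.arctan2(np.sqrt(3.0) * (I2 - I3), 2.0 * I1 - I2 - I3)

def unwrapped_phase(phi, k):
    """Absolute phase Phi^c = phi^c + 2*k*pi, given a fringe order map k
    with k in [0, N-1] (obtained, e.g., by spatial phase unwrapping)."""
    return phi + 2.0 * np.pi * k
```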
$$I_0^c(u^c, v^c) = A^c(u^c, v^c).$$
$$I_n^c(u^c, v^c) = \frac{I_2^c(u^c, v^c) - I_0^c(u^c, v^c)}{I_0^c(u^c, v^c) + b},$$
$$I_{m1}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\big(\Phi^c(u^c, v^c) + \Delta\Phi_1^c(u^c, v^c)\big),$$
$$I_{m2}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\Big(\Phi^c(u^c, v^c) + \tfrac{2\pi}{3}\Big),$$
$$I_{m3}^c(u^c, v^c) = A^c(u^c, v^c) + B^c(u^c, v^c)\cos\!\Big(\Phi^c(u^c, v^c) + \tfrac{4\pi}{3} + \Delta\Phi_2^c(u^c, v^c)\Big),$$
$$\Delta\Phi_1^c(u^c, v^c) > 0 > \Delta\Phi_2^c(u^c, v^c),$$
$$\Delta\Phi_1^c(u^c, v^c) < 0 < \Delta\Phi_2^c(u^c, v^c).$$
$$\Delta\Phi_1^c(u^c, v^c) > 0, \quad \Delta\Phi_2^c(u^c, v^c) > 0,$$
$$\Delta\Phi_1^c(u^c, v^c) < 0, \quad \Delta\Phi_2^c(u^c, v^c) < 0.$$
$$\Delta\Phi^c(u^c, v^c, t) = \big|\Phi^c(u^c, v^c, t) - \Phi^c(u^c, v^c, t - \Delta t)\big| = \left|\frac{d\Phi^c(u^c, v^c, t)}{dt}\,\Delta t\right|,$$
$$\operatorname{flag}^c(u^c, v^c, t) = \begin{cases} 0, & \Delta\Phi^c(u^c, v^c, t) < \mathrm{Th}, \\ 1, & \Delta\Phi^c(u^c, v^c, t) \ge \mathrm{Th}, \end{cases}$$
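The last two equations define the phase-based, pixel-wise motion test. Below is a minimal sketch under the assumption that two consecutive unwrapped phase maps are available as NumPy arrays and that the threshold `Th` has been determined beforehand (cf. Fig. 6). The `fuse_phases` helper is hypothetical; it only illustrates the region-wise fusion idea in which moving pixels take the single-shot FTP result and static pixels keep the more precise PSP result.

```python
import numpy as np

def motion_flag(Phi_t, Phi_prev, Th):
    """Pixel-wise motion mask: 1 where |Phi(t) - Phi(t - dt)| >= Th, else 0."""
    dPhi = np.abs(Phi_t - Phi_prev)
    return (dPhi >= Th).astype(np.uint8)

def fuse_phases(Phi_psp, Phi_ftp, flag):
    """Hypothetical fusion step: FTP phase in detected motion regions,
    PSP phase elsewhere."""
    return np.where(flag == 1, Phi_ftp, Phi_psp)
```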
