
Real-time motion-induced error compensation for 4-step phase-shifting profilometry

Open Access

Abstract

Phase-shifting profilometry has been widely used in high-accuracy three-dimensional (3D) shape measurement. In dynamic scenarios, however, object motion introduces an extra phase shift and hence motion-induced error, and convenient, efficient compensation of this error remains challenging. We therefore propose a real-time motion-induced error compensation method for 4-step phase-shifting profilometry. The four phase-shifting images are divided into two groups, and two corresponding wrapped phases are calculated: one from the first three fringes and the other from the last three. Because the motion-induced error doubles the frequency of the projected fringes and there is a π/2 phase shift between adjacent frames, the average of the two phases effectively compensates the error. Furthermore, we designed a time sequence that recycles the projection fringes in a proper order, which effectively improves the efficiency of 3D reconstruction. The method performs pixel-wise error compensation, based on which we realized 50 fps real-time 3D measurement with GPU acceleration. Experimental results demonstrate that the proposed method can effectively reduce the motion-induced error.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical 3D shape measurement methods have been widely used in biomedical engineering, machine vision, industrial inspection, and other fields [1–4]. Fringe projection profilometry (FPP) [5–12] has attracted the attention of many scholars because of its high spatial resolution and high measurement accuracy. Among the FPP techniques, Fourier transform profilometry (FTP) [7,8] extracts the phase by filtering a single high-frequency fringe pattern in the frequency domain, whereas phase-shifting profilometry (PSP) [9–12] requires multiple (usually at least three) phase-shifting fringe patterns to calculate the phase map modulated by the measured object.

With the rapid development of hardware, PSP techniques based on high-speed cameras and high-speed projectors have become applicable to 3D shape measurement of dynamic scenes. However, PSP requires a fixed phase shift between the multiple fringe patterns captured by the camera; in real dynamic scenarios, any motion of the measured object introduces phase-shift error and thus measurement error [13–15].

In order to solve this problem, some scholars have combined PSP with FTP, exploiting the single-frame capability of FTP, to reduce the motion-induced error [16]. Cong et al. [17] proposed estimating the phase-shift error by using FTP to calculate the phase difference between phase-shifted fringe images. Li et al. [18,19] presented a hybrid method that combines the FTP and PSP phases to reduce the motion-induced error. Qian et al. [20] developed a pixel-wise motion detection method using FTP and PSP phases for 3D reconstruction of dynamic scenes. Guo et al. [21] proposed a dual-frequency composite grating method to identify and compensate the motion-induced error.

However, FTP filtering limits the adaptability of these hybrid approaches to complex dynamic scenes. To address this, Lu et al. proposed a manual method that places markers [22,23] and an automated object-tracking method based on the Scale Invariant Feature Transform (SIFT) [24] to track the object and compensate the measurement errors caused by its two-dimensional (2D) rigid motion. Weise et al. [25] developed a GPU-assisted motion-error reduction method for a binocular vision system. Feng et al. [14] proposed a phase-shift error estimation method that iteratively obtains the average phase-shift error within each segmented object. Liu et al. [26] compensated the motion-induced error by estimating the phase shift in the projector's imaging plane. Wang et al. [27] developed a compensation method based on defocused binary projection and additional temporal sampling. Liu et al. [28] proposed estimating the unknown phase-shift error of the 4-step method by calculating the average difference between three adjacent phase maps. Wang et al. [29] used a Hilbert transform to calculate an additional phase map and compensated the motion-induced error by averaging the two phase maps, although this requires additional processing of the errors at the fringe edges. All of these methods require tracking the object or estimating its motion, which limits computational efficiency. Recently, Feng et al. [30] used deep learning to obtain a high-accuracy phase from a single fringe pattern, and deep-learning-based 3D reconstruction can measure dynamic scenes from a single frame [31,32]. However, network training is time-consuming, and real-time processing is difficult to achieve. A motion-induced error compensation method suitable for real-time phase-shifting measurement is therefore still needed.

In this paper, we propose a real-time phase error compensation method based on the 4-step phase-shifting algorithm. The method rests on two findings: first, the motion-induced error doubles the frequency of the projected fringes, so the error can be reversed by a π/2 phase shift (1/4 fringe period); second, the 4-step phase-shifting method itself provides a π/2 phase shift between adjacent frames. Instead of calculating the phase map directly with the 4-step algorithm, we divide the four phase-shifting images into two groups. From the first three and the last three successive fringes we retrieve two wrapped phase maps, and because their phase shift is π/2, their average compensates the motion-induced error. The method performs pixel-wise calculation and compensation, so it fits the parallel-processing character of the GPU and achieves real-time, high-accuracy 3D reconstruction of moving objects. Furthermore, we designed a time sequence that recycles the projection fringes in a proper order, effectively improving the efficiency of 3D reconstruction. Based on this method, a real-time 3D shape measurement system with a reconstruction rate of 50 fps was developed. Experimental results prove the effectiveness of the proposed method.

The rest of the paper is organized as follows: Section 2 illustrates the principle of the real-time motion-induced error compensation method for 4-step phase-shifting profilometry. Section 3 presents experimental results that validate the proposed method. Section 4 summarizes and discusses the proposed method.

2. Principle

2.1 Motion-induced error for multi-step phase-shifting algorithm

The intensity distributions of the n-th fringe for a generic N-step phase-shifting algorithm can be described as,

$$I_n(x,y) = A(x,y) + B(x,y)\cos [\Phi (x,y) + 2\pi (n - 1)/N],$$
where In(x,y), n = 1, 2, …, N, is the intensity recorded by the camera; A(x,y), B(x,y), and Φ(x,y) represent the average intensity, the intensity modulation, and the phase map, respectively; and N is the number of projected fringe patterns. The phase-shifting algorithm calculates the wrapped phase ϕ(x,y) when N ≥ 3:
$$\phi (x,y) = \tan^{-1}\left[ \frac{\sum_{n = 1}^{N} I_n(x,y)\sin ( 2\pi (n - 1)/N )}{\sum_{n = 1}^{N} I_n(x,y)\cos ( 2\pi (n - 1)/N )} \right].$$
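As a concrete illustration, a minimal NumPy sketch of Eq. (2) follows (ours, not from the paper); `fringes` is assumed to be an N×H×W stack of captured images.

import numpy as np

def wrapped_phase_nstep(fringes):
    """Wrapped phase of an N-step phase-shifting stack, Eq. (2).

    fringes: array of shape (N, H, W) holding I_1..I_N captured with
    nominal phase shifts 2*pi*(n-1)/N.  Returns values in (-pi, pi].
    """
    N = fringes.shape[0]
    shifts = 2.0 * np.pi * np.arange(N) / N
    num = np.tensordot(np.sin(shifts), fringes, axes=1)  # sum_n I_n sin(2pi(n-1)/N)
    den = np.tensordot(np.cos(shifts), fringes, axes=1)  # sum_n I_n cos(2pi(n-1)/N)
    return np.arctan2(num, den)  # two-argument form resolves the quadrant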

The continuous phase map Φ(x,y) can be recovered from the wrapped phase ϕ(x,y) by a phase unwrapping algorithm with a well-determined fringe order k(x,y),

$$\Phi (x,y) = \phi (x,y) + 2\pi k(x,y).$$

The N-step phase-shifting algorithm obtains an accurate phase map ϕ(x,y) if the phase shift 2π(n−1)/N is precise; however, if the measured object is moving, each pixel of the captured images acquires an additional unknown phase-shift error εn(x,y), n = 1, 2, …, N−1, due to the object's motion. The measurement principle is shown in Fig. 1(a), and the additional phase-shift error εn(x,y) caused by motion is illustrated in Fig. 1(b). For the N-step phase-shifting algorithm, the error εn(x,y) distorts the patterns and the phase calculation as follows,

$$I_n^{\prime}(x,y) = A(x,y) + B(x,y)\cos [\Phi (x,y) + 2\pi (n - 1)/N + \varepsilon_n(x,y)],$$
$$\phi^{\prime}(x,y) = \tan^{-1}\left[ \frac{\sum_{n = 1}^{N} I_n^{\prime}(x,y)\sin ( 2\pi (n - 1)/N )}{\sum_{n = 1}^{N} I_n^{\prime}(x,y)\cos ( 2\pi (n - 1)/N )} \right].$$

Fig. 1. Schematic diagram of (a) moving object measurement and (b) motion phase-shift error in phase-shifting profilometry.

Therefore, the motion-induced error calculated by Eq. (5) is [14,29]:

$$\begin{aligned} \Delta \phi (x,y) &= \phi^{\prime}(x,y) - \phi (x,y)\\ &= \tan^{-1}\left[ \frac{\cos 2\phi \sum_{n = 1}^{N} \sin ( 2 \cdot 2\pi (n - 1)/N + \varepsilon_n ) - \sin 2\phi \sum_{n = 1}^{N} \cos ( 2 \cdot 2\pi (n - 1)/N + \varepsilon_n ) + \sum_{n = 1}^{N} \sin \varepsilon_n}{\cos 2\phi \sum_{n = 1}^{N} \cos ( 2 \cdot 2\pi (n - 1)/N + \varepsilon_n ) + \sin 2\phi \sum_{n = 1}^{N} \sin ( 2 \cdot 2\pi (n - 1)/N + \varepsilon_n ) + \sum_{n = 1}^{N} \cos \varepsilon_n} \right]. \end{aligned}$$

Equation (6) shows that the motion-induced error Δϕ doubles the frequency of the projected fringe. Based on this feature, if we shift the original phase map by 1/4 fringe period (i.e., π/2 in phase), the error reverses sign, and averaging the shifted phase map with the original one compensates the motion-induced error. Moreover, the 4-step phase-shifting method itself provides exactly this π/2 phase shift between adjacent frames. Combining these two points, a motion-induced error compensation method can be developed for 4-step phase-shifting profilometry.
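Written out, the cancellation rests on a one-line identity: because the error varies as sin 2ϕ, a quarter-period shift of ϕ flips its sign,

$$\sin \left[ 2\left( \phi + \frac{\pi}{2} \right) \right] = \sin (2\phi + \pi ) = -\sin 2\phi ,$$

so the sinusoidal error components of two phase maps separated by π/2 cancel in the mean.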

2.2 Proposed motion-induced-error compensation for 4-step phase-shifting method

For a standard 4-step phase-shifting method with π/2 phase shift, the wrapped phase can be computed using the following equation:

$$\phi (x,y) = {\tan ^{ - 1}}\left[ {\frac{{{I_4}(x,y) - {I_2}(x,y)}}{{{I_3}(x,y) - {I_1}(x,y)}}} \right].$$

The images of the 4-step phase shift I1, I2, I3, I4 can be divided into two groups: [I1, I2, I3] and [I2, I3, I4], and their corresponding wrapped phases can be calculated as:

$$\phi_1(x,y) = \tan^{-1}\left[ \frac{I_1(x,y) + I_3(x,y) - 2 I_2(x,y)}{I_3(x,y) - I_1(x,y)} \right],$$
$$\phi_2(x,y) = \tan^{-1}\left[ \frac{I_4(x,y) - I_2(x,y)}{2 I_3(x,y) - I_4(x,y) - I_2(x,y)} \right].$$

When the measured object is stationary, the two wrapped phases are equal (ϕ1 = ϕ2); when the object is moving, the phase-shift errors εn(x,y) distort the patterns and the final phase calculation as follows,

$$\left\{ \begin{array}{l} I_1^{\prime}(x,y) = A(x,y) + B(x,y)\cos [{\Phi (x,y)} ]\\ I_2^{\prime}(x,y) = A(x,y) - B(x,y)\sin [{\Phi (x,y) + {\varepsilon_1}(x,y)} ]\\ I_3^{\prime}(x,y) = A(x,y) - B(x,y)\cos [{\Phi (x,y) + {\varepsilon_1}(x,y) + {\varepsilon_2}(x,y)} ]\\ I_4^{\prime}(x,y) = A(x,y) + B(x,y)\sin [{\Phi (x,y) + {\varepsilon_1}(x,y) + {\varepsilon_2}(x,y) + {\varepsilon_3}(x,y)} ]\end{array} \right..$$

For a small phase-shift error ε, sin(ε) ≈ ε and cos(ε) ≈ 1. Substituting the distorted intensities of Eq. (10) into Eq. (8), the phase affected by the phase-shift error εn can be expressed as:

$$\begin{aligned} \phi_1^{\prime}(x,y) &= \tan^{-1}\left[ \frac{I_1^{\prime}(x,y) + I_3^{\prime}(x,y) - 2 I_2^{\prime}(x,y)}{I_3^{\prime}(x,y) - I_1^{\prime}(x,y)} \right]\\ &\approx \tan^{-1}\left[ \frac{(2 + \varepsilon_1 + \varepsilon_2)\sin \phi + 2\varepsilon_1 \cos \phi}{-2\cos \phi + (\varepsilon_1 + \varepsilon_2)\sin \phi} \right]. \end{aligned}$$

The motion-induced error Δϕ1(x,y) can be derived as:

$$\begin{aligned} \Delta \phi_1(x,y) &= \phi_1^{\prime}(x,y) - \phi_1(x,y)\\ &\approx \tan^{-1}\left[ \frac{(2 + \varepsilon_1 + \varepsilon_2)\sin \phi_1 + 2\varepsilon_1 \cos \phi_1}{-2\cos \phi_1 + (\varepsilon_1 + \varepsilon_2)\sin \phi_1} \right] - \tan^{-1}\left[ \frac{\sin \phi_1}{\cos \phi_1} \right]\\ &\approx \tan^{-1}\left[ \frac{-(\varepsilon_2 + \varepsilon_1)\sin 2\phi_1 + (\varepsilon_2 - \varepsilon_1)\cos \phi_1 - (\varepsilon_1 + 3\varepsilon_2)}{-(\varepsilon_2 + \varepsilon_1)\cos 2\phi_1 + (\varepsilon_1 - \varepsilon_2)\sin \phi_1 + (4 + \varepsilon_1 + \varepsilon_2)} \right]. \end{aligned}$$

By using the arctangent function and Taylor series approximation, the motion-induced error Δϕ1(x,y) can be further simplified as:

$$\begin{aligned} \Delta \phi_1(x,y) &\approx \frac{-(\varepsilon_2 + \varepsilon_1)\sin 2\phi_1 + (\varepsilon_2 - \varepsilon_1)\cos \phi_1 - (\varepsilon_1 + 3\varepsilon_2)}{-(\varepsilon_2 + \varepsilon_1)\cos 2\phi_1 + (\varepsilon_1 - \varepsilon_2)\sin \phi_1 + (4 + \varepsilon_1 + \varepsilon_2)}\\ &\approx \frac{\varepsilon_1 + \varepsilon_2}{4}\sin 2\phi_1 + \frac{3\varepsilon_1 + \varepsilon_2}{4}. \end{aligned}$$

From Eq. (13), it can be seen that the motion-induced error Δϕ1(x,y) of the first image group [I1, I2, I3] in the 4-step phase-shifting method still varies approximately with 2ϕ1.

Similar to the above derivation process, the motion-induced error Δϕ2(x,y) of the last image group [I2, I3, I4] in 4-step phase-shifting method can be expressed as:

$$\begin{aligned} \Delta \phi_2(x,y) &= \phi_2^{\prime}(x,y) - \phi_2(x,y)\\ &\approx \tan^{-1}\left[ \frac{2\sin \phi_2 + (2\varepsilon_1 + \varepsilon_2 + \varepsilon_3)\cos \phi_2}{(2 + \varepsilon_2 + \varepsilon_3)\cos \phi_2 - 2(\varepsilon_1 + \varepsilon_2)\sin \phi_2} \right] - \tan^{-1}\left[ \frac{\sin \phi_2}{\cos \phi_2} \right]\\ &\approx \tan^{-1}\left[ \frac{(\varepsilon_2 + \varepsilon_3)\sin 2\phi_2 + (\varepsilon_2 - \varepsilon_3)\cos 2\phi_2 - (4\varepsilon_1 + 3\varepsilon_2 + \varepsilon_3)}{(\varepsilon_2 + \varepsilon_3)\cos 2\phi_2 + (\varepsilon_3 - \varepsilon_2)\sin 2\phi_2 + (4 + \varepsilon_2 + \varepsilon_3)} \right]\\ &\approx - \frac{\varepsilon_2 + \varepsilon_3}{4}\sin 2\phi_2 + \frac{4\varepsilon_1 + 3\varepsilon_2 + \varepsilon_3}{4}. \end{aligned}$$

Equations (13) and (14) show that the motion-induced errors of the two image groups [I1, I2, I3] and [I2, I3, I4] in the 4-step phase shift both double the frequency of the projected fringe, and that for small phase-shift errors εn with ε1 ≈ ε2 ≈ ε3, the sine components of the two phase errors are approximately equal in magnitude and opposite in sign, so they approximately cancel.
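Averaging Eqs. (13) and (14) makes this explicit (our expansion, treating ϕ1 ≈ ϕ2 ≈ ϕ):

$$\frac{\Delta \phi_1 + \Delta \phi_2}{2} \approx \frac{\varepsilon_1 - \varepsilon_3}{8}\sin 2\phi + \frac{7\varepsilon_1 + 4\varepsilon_2 + \varepsilon_3}{8}.$$

For uniform motion (ε1 = ε2 = ε3) the sine term vanishes entirely, leaving only a constant phase offset that shifts the absolute phase without producing periodic ripples on the reconstructed surface.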

Since the phase errors of the two groups, [I1, I2, I3] and [I2, I3, I4], have opposite distributions, their average phase ϕc(x,y) can be used to compensate the periodic motion-induced phase error:

$$\phi_c(x,y) = [\phi_1^{\prime}(x,y) + \phi_2^{\prime}(x,y)]/2.$$
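For reference, a minimal NumPy sketch of Eqs. (8), (9), and (15) follows (ours; the paper's implementation is not published). Directly averaging two wrapped phases can be off by π where they straddle the ±π wrap, so the sketch averages on the unit circle instead, which agrees with Eq. (15) everywhere else.

import numpy as np

def wrapped_phases_4step(I1, I2, I3, I4):
    """Two 3-frame wrapped phases from one 4-step sequence, Eqs. (8)-(9)."""
    phi1 = np.arctan2(I1 + I3 - 2.0 * I2, I3 - I1)   # group [I1, I2, I3]
    phi2 = np.arctan2(I4 - I2, 2.0 * I3 - I4 - I2)   # group [I2, I3, I4]
    return phi1, phi2

def compensated_phase(phi1, phi2):
    """Average phase of Eq. (15), computed on the unit circle so the
    result stays valid where phi1 and phi2 wrap at +/-pi."""
    return np.angle(np.exp(1j * phi1) + np.exp(1j * phi2))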

To verify the performance of this proposed method, we simulated the motion-induced error of the 4-step phase-shifting method for both uniform and non-uniform velocities. In the simulation, the uniform phase-shift errors were set as ε1=ε2=ε3=0.1 rad.

The phase errors over one period are shown in Fig. 2(a), where Δϕ1 and Δϕ2 are calculated from the image groups [I1, I2, I3] and [I2, I3, I4], respectively, and have opposite distributions; the motion-induced error of the averaged phase ϕc is significantly reduced. Figure 2(b) shows the phase errors Δϕc and Δϕ of our method and of the 4-step phase-shifting method, respectively. It can be seen that the proposed method effectively reduces the phase error caused by uniform motion. Furthermore, Fig. 2(c) shows the simulation results for non-uniform motion with the phase-shift errors set to ε1 = 0.05 rad, ε2 = 0.1 rad, ε3 = 0.15 rad, and Fig. 2(d) gives the corresponding errors Δϕc and Δϕ. Even for non-uniform motion, the proposed method partly reduces the phase error.
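Continuing the sketch above, the simulation of Fig. 2 can be reproduced in outline as follows (ours), using unit-normalized fringes, the distorted intensities of Eq. (10), and the same uniform error ε1 = ε2 = ε3 = 0.1 rad; errors are measured against error-free phases computed in the same sign convention.

phi = np.linspace(0.0, 2.0 * np.pi, 1000)   # true phase over one period
A = B = 0.5
e1 = e2 = e3 = 0.1                          # uniform phase-shift errors (rad)

# Distorted intensities, Eq. (10)
I1 = A + B * np.cos(phi)
I2 = A - B * np.sin(phi + e1)
I3 = A - B * np.cos(phi + e1 + e2)
I4 = A + B * np.sin(phi + e1 + e2 + e3)
phi1, phi2 = wrapped_phases_4step(I1, I2, I3, I4)
phic = compensated_phase(phi1, phi2)

# Error-free references in the same convention (all errors set to zero;
# I1 carries no error, so it is reused directly)
J2, J3, J4 = A - B * np.sin(phi), A - B * np.cos(phi), A + B * np.sin(phi)
ref1, ref2 = wrapped_phases_4step(I1, J2, J3, J4)

err1 = np.angle(np.exp(1j * (phi1 - ref1)))   # curve Delta-phi_1 of Fig. 2(a)
err2 = np.angle(np.exp(1j * (phi2 - ref2)))   # curve Delta-phi_2
errc = np.angle(np.exp(1j * (phic - compensated_phase(ref1, ref2))))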

Fig. 2. Simulation results of motion-induced error compensation for the 4-step phase-shifting method. (a) Compensation result of our method for uniform motion. (b) Phase errors of our method and the 4-step phase-shifting method for uniform motion. (c) Compensation result of our method for non-uniform motion. (d) Phase errors of our method and the 4-step phase-shifting method for non-uniform motion.

2.3 Unwrapped phase and system calibration

To resolve the phase ambiguity of the wrapped phase ϕc of a moving object, the dual-frequency temporal phase unwrapping method is used. High- and low-frequency fringes are first projected onto an initial reference plane to obtain the wrapped phases ϕr_l and ϕr_h, and these two reference phases are then used for phase unwrapping with the approach reported in Ref. [21].

For the wrapped phases ϕlc and ϕhc of the low- and high-frequency fringes of a moving object, the unwrapped phase difference of the low-frequency fringe can be constructed as follows:

$$\Delta \Phi_{lc} = \begin{cases} \phi_{lc} - \phi_{r\_l}, & \text{if } \phi_{lc} \ge \phi_{r\_l} \\ \phi_{lc} - \phi_{r\_l} + 2\pi, & \text{if } \phi_{lc} < \phi_{r\_l} \end{cases}.$$

Finally, the high-frequency wrapped phase ϕhc can be unwrapped with the assistance of ΔΦlc, and the high-frequency phase difference ΔΦhc is expressed as follows:

$$\Delta \Phi_{hc} = \phi_{hc} + 2\pi \cdot \mathrm{round}\left( \frac{s\,\Delta \Phi_{lc} + \phi_{r\_h} - \phi_{hc}}{2\pi} \right) - \phi_{r\_h},$$
where round(·) denotes rounding to the nearest integer and s is the frequency ratio between the high- and low-frequency fringes.
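In code, Eqs. (16) and (17) reduce to a few array operations; in the sketch below (ours), `s` is assumed to be the high-to-low frequency ratio (Pl/Ph = 8 for the gratings of Section 3):

import numpy as np

def unwrap_dual_frequency(phi_lc, phi_hc, phi_r_l, phi_r_h, s=8.0):
    """Temporal phase unwrapping against a reference plane, Eqs. (16)-(17).
    All inputs are wrapped phase maps of identical shape."""
    # Eq. (16): low-frequency phase difference, forced into [0, 2*pi)
    dPhi_lc = np.where(phi_lc >= phi_r_l,
                       phi_lc - phi_r_l,
                       phi_lc - phi_r_l + 2.0 * np.pi)
    # Eq. (17): scale to the high frequency and recover the fringe order
    k = np.round((s * dPhi_lc + phi_r_h - phi_hc) / (2.0 * np.pi))
    return phi_hc + 2.0 * np.pi * k - phi_r_h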

By using the established unwrapped phase-to-height lookup table (UPLUT) [33], the high-frequency phase difference ΔΦhc is mapped to a height distribution, which finally reconstructs the 3D shape of the tested scene based on the camera calibration technique [34].
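The UPLUT itself is detailed in Ref. [33]; as a rough sketch of the idea only (ours, with hypothetical array names), each pixel stores the phase differences measured at a series of known reference-plane heights, and a measured phase is converted to height by per-pixel interpolation:

import numpy as np

def phase_to_height(dPhi_hc, lut_phase, lut_height):
    """Per-pixel phase-to-height lookup (sketch of the UPLUT idea [33]).

    dPhi_hc:    (H, W) measured high-frequency phase difference
    lut_phase:  (K, H, W) phase differences recorded at K reference planes,
                monotonically increasing along axis 0
    lut_height: (K,) known heights of those planes (e.g. 2 mm apart)
    """
    H, W = dPhi_hc.shape
    height = np.empty_like(dPhi_hc)
    for i in range(H):                      # plain loops for clarity;
        for j in range(W):                  # a real system vectorizes this
            height[i, j] = np.interp(dPhi_hc[i, j],
                                     lut_phase[:, i, j], lut_height)
    return height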

2.4 Framework of the proposed method

The proposed method focuses on the efficiency of motion-induced error compensation for 4-step phase-shifting profilometry. Therefore, we specially designed the fringe projection framework to enhance efficiency and achieve real-time 3D measurement. To clearly explain the process, Fig. 3 illustrates the whole framework.

Fig. 3. Overview of the compensation method for motion-induced error.

Step 1: The high-frequency 4-step phase-shifting patterns [Ih1, Ih2, Ih3, Ih4] and the low-frequency 4-step phase-shifting patterns [Il1, Il2, Il3, Il4] are interleaved into the dual-frequency projection sequence [Ih1, Il1, Ih2, Il2, Ih3, Il3, Ih4, Il4], which is cyclically projected onto the surface of the measured object. The camera synchronously records the series of deformed fringe patterns modulated by the tested object.

Step 2: The newly captured dual-frequency images are ordered according to their phase-shift steps, and the four high-frequency images [Ih1, Ih2, Ih3, Ih4] are divided into the groups [Ih1, Ih2, Ih3] and [Ih2, Ih3, Ih4]. After the two wrapped phases ϕh1 and ϕh2 are calculated, the high-frequency average phase ϕhc is obtained.

Step 3: Similarly, the four low-frequency images [Il1, Il2, Il3, Il4] are divided into the groups [Il1, Il2, Il3] and [Il2, Il3, Il4], and the average phase ϕlc is obtained from the two wrapped phases ϕl1 and ϕl2.

Step 4: According to Eqs. (16) and (17), the high-frequency phase difference ΔΦhc is retrieved. It is then mapped to the height distribution using the UPLUT, and the corresponding 3D shape is restored.

For 3D reconstruction of dynamic scenes, the captured images form a continuous sequence, and a more efficient cyclic strategy can be used to achieve a higher 3D frame rate. The motion-error compensation method proposed in this paper remains applicable with this recycling projection strategy. As shown in Fig. 4, for a 4-step phase-shift image sequence, starting from the 4th frame, each newly added fringe image forms a new group of four phase-shift images, for example [1,2,3,4], [2,3,4,5], [3,4,5,6]. The phase-shifting pattern order of the M-th frame is n = mod(M−1, 4) + 1, so the pattern orders of the latest four fringe patterns take one of four forms: [1234], [2341], [3412], and [4123]. Each can be further divided into a group of the first three fringes and a group of the last three, giving the four combinations [123], [234], [341], and [412], whose wrapped phases are defined as ϕ1, ϕ2, ϕ3, and ϕ4, respectively. Among them, ϕ1 and ϕ2 are calculated by Eqs. (8) and (9), and ϕ3 and ϕ4 are calculated as follows (see the sketch after Eq. (19)):

$$\phi_3(x,y) = -\tan^{-1}\left[ \frac{I_1(x,y) + I_3(x,y) - 2 I_4(x,y)}{I_3(x,y) - I_1(x,y)} \right],$$
$$\phi_4(x,y) = -\tan^{-1}\left[ \frac{I_2(x,y) - I_4(x,y)}{I_4(x,y) + I_2(x,y) - 2 I_1(x,y)} \right].$$
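Operationally, the cyclic strategy keeps the latest frame of each pattern order and, for every incoming high-frequency frame, updates exactly one of the four 3-frame wrapped phases before averaging it with the previous one. A sketch of the bookkeeping (ours), following Eqs. (8), (9), (18), and (19):

import numpy as np

def phi_group(I, n):
    """Wrapped phase of the 3-frame group ending at pattern order n;
    I maps pattern order (1..4) to the latest frame of that order."""
    if n == 3:   # group [1,2,3], Eq. (8)
        return np.arctan2(I[1] + I[3] - 2.0 * I[2], I[3] - I[1])
    if n == 4:   # group [2,3,4], Eq. (9)
        return np.arctan2(I[4] - I[2], 2.0 * I[3] - I[4] - I[2])
    if n == 1:   # group [3,4,1], Eq. (18)
        return -np.arctan2(I[1] + I[3] - 2.0 * I[4], I[3] - I[1])
    # n == 2: group [4,1,2], Eq. (19)
    return -np.arctan2(I[2] - I[4], I[4] + I[2] - 2.0 * I[1])

def process_stream(frames):
    """Yield one compensated wrapped phase per frame once 4 frames exist.
    `frames` yields high-frequency images in projection order 1,2,3,4,1,..."""
    I, prev = {}, None
    for M, frame in enumerate(frames, start=1):
        n = (M - 1) % 4 + 1          # phase-shift order of this frame
        I[n] = frame
        if M < 3:
            continue                 # need three frames for the first group
        new = phi_group(I, n)        # only this group changes per frame
        if prev is not None:
            # circular mean of the two newest 3-frame phases, cf. Eq. (15)
            yield np.angle(np.exp(1j * new) + np.exp(1j * prev))
        prev = new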

Fig. 4. Overview of the image cyclic strategy.

It should be pointed out that each time one more image is appended to the sequence, only one wrapped phase needs to be updated to calculate the average phase.

In our proposed method, with the interleaved projection of high- and low-frequency fringe patterns, every newly acquired high-frequency deformed fringe image allows a new 3D shape to be reconstructed, which reduces the motion-induced error while improving the efficiency of 3D reconstruction. From 2M acquired frames, M − (4 − 1) = M − 3 frames of 3D results can be reconstructed.

3. Experiments

To verify the performance of the proposed motion-induced error compensation method for 4-step phase-shifting profilometry, an experimental measurement system was developed, consisting of a camera (Baumer HXC40, 1280×800 pixels) equipped with a 12 mm imaging lens and a digital projector (LightCrafter 4500, 912×1140 pixels). The camera was synchronized by a trigger signal from the projector, and the projection and capturing rates were set to 100 fps in all experiments.

The two periods, Pl and Ph, of the dual-frequency gratings were set to 128 and 16 pixels, respectively. The dual-frequency 4-step phase-shifting images were projected in order onto a reference plane moved in 2 mm steps to establish the UPLUT [33] over a 140 mm measuring depth.

3.1 Accuracy analysis of moving standard spheres

To quantitatively evaluate the performance of the method, two standard spheres (diameters of 50.7991 mm and 50.7970 mm, with a center-to-center distance of 100.2537 mm) were measured while moving at an approximate speed of 20 mm/s in the depth direction, as shown in Fig. 5(a). Figure 5(b) shows one of the high-frequency deformed fringe patterns. Figure 5(c) shows the result reconstructed by the 4-step phase-shift algorithm, in which the motion-induced errors on the two spheres' surfaces are obvious. Figure 5(d) shows the result reconstructed by our method, in which the motion-induced error is distinctly compensated. The larger of the two spheres' STDs is 0.2208 mm for the 4-step phase-shift algorithm and 0.1066 mm for our method, and the larger diameter error is 0.0423 mm and 0.0207 mm, respectively. Both have been obviously reduced.

Fig. 5. Accuracy analysis for the 4-step phase shift method and our method. (a) Tested scene of two moving standard spheres. (b) Captured deformed fringe image. (c) Reconstructed result by 4-step phase shift algorithm. (d) Reconstructed result by our method.

3.2 Measurement on dynamic scenes

The second experiment further demonstrates the performance of the method in two dynamic scenes with isolated surfaces. The first scene contains two objects with different states: a statue translating in the depth direction and a static one. The approximate speed of the translating object was 20 mm/s, and the measurement results are shown in Fig. 6. Figures 6(b) and 6(c) show the results of the standard 4-step phase-shift algorithm and of our method, respectively. The camera's acquisition rate was 100 fps in this experiment. The comparison of the 3D reconstruction results of this translation process by the two methods is shown in Visualization 1 at a reduced playback speed of 25 fps. In the second scene, a comparative experiment was conducted on a rotating scene with a rotation speed of 0.1π rad/s; the restored results are shown in Fig. 7 and Visualization 2. These two experiments demonstrate that our method effectively reduces the motion-induced error of both translational and rotational motion in 4-step phase-shifting profilometry.

Fig. 6. Measurement results of a translational scene. (a) Captured deformed fringe images at different frames. (b) Reconstructed results by 4-step phase shift algorithm. (c) Reconstructed results by our method. (Visualization 1).

Fig. 7. Measurement results of a rotating scene. (a) Captured deformed fringe images at different frames. (b) Reconstructed results by 4-step phase shift algorithm. (c) Reconstructed results by our method. (Visualization 2).

In our method, according to Eqs. (8), (9), (18), and (19), the phase of each group of three deformed fringes is calculated with the cyclic projection strategy, and each high-frequency deformed image produces one more 3D shape, which makes the 3D reconstruction of a dynamic process more efficient. Although the high- and low-frequency interleaved projection doubles the acquisition time of the high-frequency images, which amplifies the motion-induced error, the effective error compensation keeps the reconstruction accuracy better than that of the traditional method. Meanwhile, the cyclic projection strategy improves the efficiency of 3D reconstruction: from 2M acquired frames, M − 3 frames of 3D results can be reconstructed. Visualization 2 shows the high-frequency deformed images and the corresponding reconstructed results at different times; the acquisition rate is 100 fps, and the playback rate is set to 10 fps.

3.3 Real-time measurement on dynamic scene

In the final experiment, we developed a software system for real-time measurement on a computer (Intel i5-7400 CPU, NVIDIA GeForce GTX 1080 GPU). The calibration parameters of the 3D measurement system were pre-calculated and stored on the GPU. The 3D reconstruction is computed on the GPU, and the results are visualized with OpenGL.

The pixel-wise compensation of the motion-induced error allows real-time 3D shape reconstruction through parallel computation. Our method reconstructs and displays 3D shapes at 50 fps with an image resolution of 1280×800 pixels. The measured scene includes a stationary petal sculpture and three rotating statues. The real-time measuring results are shown in Fig. 8 and Visualization 3. Figure 9 compares the measurement results of the 4-step PSP method and our method. Note that this comparison is only a qualitative analysis at approximately the same rotation position; it cannot provide an accurate numerical comparison because of the different sampling times.
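The paper's GPU code is not published, but the reason the method parallelizes well is visible in a sketch: the per-pixel arithmetic of Eqs. (8), (9), and (15) transfers from NumPy to the GPU essentially unchanged. The version below assumes the `cupy` package and a CUDA-capable device:

import cupy as cp

def compensated_phase_gpu(I1, I2, I3, I4):
    """Pixel-wise Eqs. (8), (9), and (15) evaluated on the GPU.
    Inputs are cupy arrays, e.g. I1 = cp.asarray(host_image)."""
    phi1 = cp.arctan2(I1 + I3 - 2.0 * I2, I3 - I1)
    phi2 = cp.arctan2(I4 - I2, 2.0 * I3 - I4 - I2)
    # Average on the unit circle to stay safe at the +/-pi wrap
    return cp.angle(cp.exp(1j * phi1) + cp.exp(1j * phi2))

Each output pixel depends only on the four input pixels at the same location, so the whole phase map reduces to independent element-wise operations, which is what makes 50 fps throughput feasible.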

Fig. 8. Real-time measurement process and results (Visualization 3).

Fig. 9. Comparison of real-time measurement results by 4-step PSP and our method.

4. Conclusion and discussion

This paper has presented a real-time motion-induced error compensation method for 4-step phase-shifting profilometry. Compared with the traditional method, our method is more accurate in dynamic 3D shape measurement, and the experimental results demonstrate its effectiveness in dynamic 3D measurement.

The proposed method has the following features:

  • By dividing the 4-step phase-shift images into two groups, the first three patterns and the last three, and averaging their phases, the measurement error introduced by motion can be reduced effectively. No additional fringe patterns need to be projected and no unknown phase shifts need to be estimated, so the method is suitable for high-speed applications.
  • By recycling the fringe images with Eqs. (8), (9), (18), and (19), the motion-induced error can be compensated at every 3D frame, and each newly captured high-frequency deformed image produces a new 3D shape, which improves the efficiency of 3D reconstruction.
  • The method performs pixel-wise compensation of the motion-induced error with a simple calculation process, making it well suited to GPU parallel computation for real-time, high-accuracy 3D measurement of dynamic objects.

However, as can be seen from Fig. 2, a small residual error remains even though the average phase effectively reduces the motion-induced fluctuations. The reason is that the two wrapped phases ϕ1 and ϕ2 in Eqs. (13) and (14) are not exactly the same but satisfy ϕ1 + ε1 = ϕ2. The motion-induced error is therefore not strictly reversed between the two phase maps, leaving tiny residual fluctuations caused by the object motion. Further research will focus on eliminating this residual error, especially for non-uniform motion scenarios.

Funding

National Natural Science Foundation of China (62075143); National Postdoctoral Program for Innovative Talents (BX2021199); Open Fund of Key Laboratory of Icing and Anti/De-icing (IADL20200308); Cooperative research project of Chunhui plan of Ministry of Education (2020703-8); Sichuan Province Science and Technology Program (2020YFG0077).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. K. R. Ford, G. D. Myer, and T. E. Hewett, “Reliability of landing 3D motion analysis: implications for longitudinal analyses,” Med. Sci. Sports Exercise 39(11), 2021–2028 (2007).

2. E. Malamas, E. Petrakis, M. Zervakis, L. Petit, and J. Legat, “A survey on industrial vision systems, applications and tools,” Image Vis. Comput. 21(2), 171–188 (2003).

3. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39(1), 8–22 (2000).

4. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128–160 (2011).

5. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010).

6. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).

7. X. Su and Q. Zhang, “Dynamic 3-D shape measurement method: a review,” Opt. Lasers Eng. 48(2), 191–204 (2010).

8. Z. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012).

9. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010).

10. S. Van der Jeught and J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Lasers Eng. 87, 18–31 (2016).

11. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: a review,” Opt. Lasers Eng. 109, 23–59 (2018).

12. C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: a comparative review,” Opt. Lasers Eng. 85, 84–103 (2016).

13. L. Lu, V. Suresh, Y. Zheng, Y. Wang, and B. Li, “Motion induced error reduction methods for phase shifting profilometry: a review,” Opt. Lasers Eng. 141, 106573 (2021).

14. S. Feng, C. Zuo, T. Tao, Y. Hu, M. Zhang, Q. Chen, and G. Gu, “Robust dynamic 3-D measurements with motion-compensated phase-shifting profilometry,” Opt. Lasers Eng. 103, 127–138 (2018).

15. L. Lu, Z. Jia, Y. Luan, and J. Xi, “Reconstruction of isolated moving objects with high 3D frame rate based on phase shifting profilometry,” Opt. Commun. 438, 61–66 (2019).

16. Q. Zhang and X. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express 13(8), 3110–3116 (2005).

17. P. Cong, Z. Xiong, Y. Zhang, S. Zhao, and F. Wu, “Accurate dynamic 3D sensing with Fourier-assisted phase shifting,” IEEE J. Sel. Top. Signal Process. 9(3), 396–408 (2015).

18. B. Li and S. Zhang, “Superfast high-resolution absolute 3D recovery of a stabilized flapping flight process,” Opt. Express 25(22), 27270–27282 (2017).

19. B. Li, Z. Liu, and S. Zhang, “Motion-induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry,” Opt. Express 24(20), 23289–23303 (2016).

20. J. Qian, T. Tao, S. Feng, Q. Chen, and C. Zuo, “Motion-artifact-free dynamic 3D shape measurement with hybrid Fourier-transform phase-shifting profilometry,” Opt. Express 27(3), 2713–2731 (2019).

21. W. Guo, Z. Wu, Y. Li, Y. Liu, and Q. Zhang, “Real-time 3D shape measurement with dual-frequency composite grating and motion-induced error reduction,” Opt. Express 28(18), 26882–26897 (2020).

22. L. Lu, J. Xi, Y. Yu, and Q. Guo, “New approach to improve the accuracy of 3-D shape measurement of moving object using phase shifting profilometry,” Opt. Express 21(25), 30610–30622 (2013).

23. L. Lu, J. Xi, Y. Yu, and Q. Guo, “New approach to improve the performance of fringe pattern profilometry using multiple triangular patterns for the measurement of objects in motion,” Opt. Eng. 53(11), 112211 (2014).

24. L. Lu, Y. Ding, Y. Luan, Y. Yin, Q. Liu, and J. Xi, “Automated approach for the surface profile measurement of moving objects based on PSP,” Opt. Express 25(25), 32120–32131 (2017).

25. T. Weise, B. Leibe, and L. V. Gool, “Fast 3D scanning with automatic motion compensation,” in 2007 IEEE Conference on Computer Vision and Pattern Recognition (2007), pp. 1–8.

26. Z. Liu, P. C. Zibley, and S. Zhang, “Motion-induced error compensation for phase shifting profilometry,” Opt. Express 26(10), 12632–12637 (2018).

27. Y. Wang, V. Suresh, and B. Li, “Motion-induced error reduction for binary defocusing profilometry via additional temporal sampling,” Opt. Express 27(17), 23948–23958 (2019).

28. X. Liu, T. Tao, Y. Wan, and J. Kofman, “Real-time motion-induced-error compensation in 3D surface-shape measurement,” Opt. Express 27(18), 25265–25279 (2019).

29. Y. Wang, Z. Liu, C. Jiang, and S. Zhang, “Motion induced phase error reduction using a Hilbert transform,” Opt. Express 26(26), 34224–34235 (2018).

30. S. Feng, Q. Chen, G. Gu, T. Tao, L. Zhang, Y. Hu, W. Yin, and C. Zuo, “Fringe pattern analysis using deep learning,” Adv. Photonics 1(02), 1–7 (2019).

31. H. Yu, X. Chen, Z. Zhang, C. Zuo, Y. Zhang, D. Zheng, and J. Han, “Dynamic 3-D measurement based on fringe-to-fringe transformation using deep learning,” Opt. Express 28(7), 9405–9418 (2020).

32. J. Qian, S. Feng, T. Tao, Y. Hu, Y. Li, Q. Chen, and C. Zuo, “Deep-learning-enabled geometric constraints and phase unwrapping for single-shot absolute 3D shape measurement,” APL Photonics 5(4), 046105 (2020).

33. W. Guo, Z. Wu, R. Xu, Q. Zhang, and M. Fujigaki, “A fast reconstruction method for three-dimensional shape measurement using dual-frequency grating projection and phase-to-height lookup table,” Opt. Laser Technol. 112, 269–277 (2019).

34. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22(11), 1330–1334 (2000).

Supplementary Material (3)

Visualization 1: Comparison of the 3D reconstruction results of the translation process by the standard 4-step phase-shift algorithm and our method.
Visualization 2: Comparison of the 3D reconstruction results of the rotating process by the standard 4-step phase-shift algorithm and our method.
Visualization 3: Real-time measurement process and results.
