This paper proposes a novel method to substantially reduce motion-introduced phase error in phase-shifting profilometry. We first estimate the motion of an object from the difference between two subsequent 3D frames. After that, by leveraging the projector’s pinhole model, we determine the motion-induced phase shift error from the estimated motion. A generic phase-shifting algorithm considering phase shift error is then utilized to compute the phase. Experiments demonstrated that the proposed algorithm effectively improved the measurement quality by compensating for the phase shift error introduced by rigid and nonrigid motion for a standard single-projector, single-camera digital fringe projection system.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
Phase-shifting profilometry (PSP) is widely employed in 3D shape measurement because of its accuracy, resolution, speed, and robustness to noise. In general, PSP uses multiple fringe images with precisely known phase shifts to accurately recover the phase, and thus the measured scene should remain stationary while these phase-shifted fringe images are being captured.
For practical PSP systems, phase shifts are not always precisely known, and the unknown phase shifts introduce measurement errors. In a conventional laser interferometry system, the phase shift error could be introduced by the displacement error of the mirror driven by a piezoelectric device. The phase-shift error, in general, could also be introduced by motion of the object, i.e., the object moves during the time of capturing the required number of phase-shifted fringe patterns for phase determination. The former introduces homogeneous phase shift error that remains the same across the entire measurement surface, while the latter introduces nonhomogeneous phase shift error that could vary from point to point.
Researchers have developed methods to address the homogeneous phase-shift error problem. Wang and Han [1] proposed to find the randomly shifted phase by iteratively solving for the phase and the actual phase shifts using least-squares optimization. This method has relatively stable and fast convergence, yet assumes that the background intensity and modulation amplitude have no pixel-to-pixel variation. By analyzing the region where the intensity difference between two images with different phase shifts is zero, Guo et al. developed a method to determine the actual phase shift. Gao et al. proposed a method to extract the actual phase shift in interferometry with arbitrary unknown phase shifts. This method first computes a rough estimate of the phase shift based on the statistical properties of the object’s phase, and then employs an iterative approach to further refine the phase-shift estimate.
Researchers have also attempted to address the more complex nonhomogeneous phase-shift error problem, especially that introduced by object motion, which is common in practice. Lu et al. [4] proposed a method to reduce artifacts caused by planar motion parallel to the imaging plane. They essentially placed a few markers on the object and analyzed the movement of the markers to estimate the rigid-body motion of the object. Lu et al. later improved their method to handle error induced by translation along the direction of object height. In this newer method, the motion is estimated using the arbitrary phase shift extraction method developed by Wang and Han [1]. However, this method only works well for homogeneous background intensity and modulation amplitude. Feng et al. [6] proposed to apply the homogeneous phase-shift extraction method of Gao et al. to the nonhomogeneous motion artifact problem by segmenting objects with different rigid shifts. However, this method still assumes that the phase shift error within a single segmented object is homogeneous. As a result, it may not work well for dynamically deformable objects, where each point can introduce a different phase shift error.
This paper proposes a novel method to substantially reduce the nonhomogeneous motion-induced phase shift error. We first estimate the motion of an object from the difference between two subsequent 3D frames. We then take advantage of the projector’s pinhole model to determine the motion-induced phase shift error from the estimated motion. A generic phase-shifting algorithm considering phase shift error is then utilized to compute the phase. Since this proposed method treats each individual point independently, we experimentally demonstrated that it substantially reduced the motion artifact for dynamically deformable objects (e.g., human facial expressions).
2.1. Phase-shifting profilometry
Assume the k-th fringe image of a generic M-step phase-shifting algorithm can be described as [7],

Ik(uc, vc) = A(uc, vc) + B(uc, vc) cos[ϕ(uc, vc) − δk],  (1)

where A(uc, vc) denotes the average intensity, B(uc, vc) the intensity modulation, ϕ(uc, vc) the phase to be determined, and δk the phase shift of the k-th fringe image. For equally spaced phase shifts,

δk = 2πk/M, k = 1, 2, …, M,  (2)

with M ≥ 3, the least-squares solution for the wrapped phase is

ϕ(uc, vc) = tan⁻¹ [ Σk Ik sin δk / Σk Ik cos δk ].  (3)
The arctangent function in Eq. (3) gives the wrapped phase within (−π, π] with 2π discontinuities. The desired continuous phase Φ(uc, vc) can be obtained by applying a phase unwrapping algorithm to determine k(uc, vc), the integer number of 2π’s to be added at each point, that is,

Φ(uc, vc) = ϕ(uc, vc) + 2π × k(uc, vc).  (4)
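As a concrete illustration, the phase retrieval above can be sketched numerically (a minimal example of our own, assuming equally spaced ideal phase shifts δk = 2πk/M and NumPy; it is not the paper’s code):

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from M >= 3 fringe images I_k = A + B*cos(phi - delta_k),
    assuming equally spaced ideal phase shifts delta_k = 2*pi*k/M."""
    I = np.asarray(images, dtype=float)
    M = I.shape[0]
    delta = 2 * np.pi * np.arange(M) / M
    num = np.tensordot(np.sin(delta), I, axes=1)  # sum_k I_k * sin(delta_k)
    den = np.tensordot(np.cos(delta), I, axes=1)  # sum_k I_k * cos(delta_k)
    return np.arctan2(num, den)  # four-quadrant arctangent, wrapped phase

# Synthetic three-step example with a known phase ramp:
phi_true = np.linspace(-2.0, 2.0, 640)
fringes = [0.5 + 0.4 * np.cos(phi_true - 2 * np.pi * k / 3) for k in range(3)]
phi_wrapped = wrapped_phase(fringes)
```

A temporal or spatial phase unwrapping step then removes the 2π discontinuities to obtain the continuous phase.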
2.2. Proposed error compensation method
Phase-shifting profilometry works well if ϕ(uc, vc) can be accurately determined using Eq. (3), which requires the phase shift δk to be precisely known. However, if the object moves between frames, the actual phase shift differs from the ideal phase shift δk by an error ϵk, i.e.,

δ′k = δk + ϵk,  (5)

where δ′k denotes the actual phase shift of the k-th fringe image.
Figure 1 illustrates how the motion of a deformable object can introduce phase shift error. The camera image point C1 corresponds to point S1 on the object surface if the object does not move, but actually corresponds to S′1 if the object moves. The corresponding projected fringe pattern points are P1 and P′1, respectively. These two points on the projector correspond to two different phase values Φ1 and Φ′1; and we define the difference between these phase values as the phase shift error,

ϵ1 = Φ′1 − Φ1.  (6)
Similarly, the phase shift error also occurs for another camera image point C2, whose corresponding object surface point is S2 if the object does not move, and S′2 if the object moves. The phase difference for the corresponding projector points P2 and P′2 is,

ϵ2 = Φ′2 − Φ2.  (7)
Apparently, ϵ1 is not always the same as ϵ2 for different image points, and thus the phase shift error introduced by object motion is nonhomogeneous.
In order to determine the phase shift error ϵ from object motion, we propose to utilize the pin-hole model of a projector [8], which describes the relation between the 3D world coordinates (xw, yw, zw) and the 2D projector coordinates (up, vp) as

s [up, vp, 1]ᵀ = A [R, t] [xw, yw, zw, 1]ᵀ,  (8)

where s is a scaling factor, A is the projector’s 3 × 3 intrinsic matrix, and [R, t] is the extrinsic matrix composed of rotation R and translation t.
From the pin-hole model described in Eq. (8), we can clearly see that if a point at (xw, yw, zw) moves by Δxw in the x direction, the corresponding point on the projector’s plane changes accordingly, from (up, vp) to (u′p, v′p). Mathematically, the relationship between the movement along the x direction and the corresponding change on the projector plane can be calculated as,

Δup = u′p − up = [p11(xw + Δxw) + p12yw + p13zw + p14] / [p31(xw + Δxw) + p32yw + p33zw + p34] − up,  (9)

where pij denotes the (i, j) entry of the 3 × 4 projection matrix formed by combining the intrinsic and extrinsic matrices in Eq. (8).
Similarly, motion along the y and z directions results in corresponding point changes on the projector plane. If the phase changes along the up direction on the projector, the phase remains constant along the vp direction, and the fringe period is λ pixels on the projector, then the overall phase shift error induced by motion can be mathematically represented as,

ϵ = 2πΔup / λ,  (10)

where Δup is the total change of the projector coordinate up caused by the motion components Δxw, Δyw, and Δzw.
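To make this mapping concrete, the following sketch (our own illustration with assumed variable names, not the paper’s code) pushes an estimated 3D displacement through a projector projection matrix and converts the resulting shift along up into a phase shift error, assuming the phase varies only along up with a fringe period of λ pixels:

```python
import numpy as np

def motion_phase_error(P, xyz, dxyz, fringe_period):
    """Phase shift error for a point moving from xyz to xyz + dxyz.

    P: 3x4 projector projection matrix (pinhole model,
       s * [up, vp, 1]^T = P @ [xw, yw, zw, 1]^T).
    Returns epsilon = 2*pi * (up_moved - up) / fringe_period.
    """
    def project_up(point):
        h = np.append(np.asarray(point, dtype=float), 1.0)  # homogeneous coords
        x = P @ h
        return x[0] / x[2]  # perspective divide yields the up coordinate

    du_p = project_up(np.asarray(xyz) + np.asarray(dxyz)) - project_up(xyz)
    return 2 * np.pi * du_p / fringe_period

# Toy projector with focal length 100 px looking down the z axis:
P = np.array([[100.0, 0.0, 0.0, 0.0],
              [0.0, 100.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
eps = motion_phase_error(P, (0.0, 0.0, 1.0), (0.1, 0.0, 0.0), fringe_period=20.0)
```

Here a 0.1-unit move in x at depth 1 shifts the projector point by 10 px, half the fringe period, giving a phase shift error of π.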
This equation indicates that the motion-induced phase-shift error for a given point (xw, yw, zw) can be determined if the motion of the point is known. In this research, we proposed to estimate the object motion from the difference between two subsequent 3D frames, i.e.,

[Δxw, Δyw, Δzw] ≈ [xw, yw, zw]t − [xw, yw, zw]t−1,  (11)

where subscripts t and t − 1 denote the current and previous 3D frames, respectively.
However, this phase shift error estimation is approximate and may not be accurate. To further improve accuracy, we iteratively apply the same method to the updated 3D reconstruction from the previous step(s) until the algorithm converges. Our experiments found that this process typically converges within two or three iterations. The estimated phase-shift error is then used to determine the actual phase shift in Eq. (5), which in turn is used to calculate the phase using Eqs. (2)–(3).
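With the per-point errors estimated, the phase can be recovered using the actual shifts δk + ϵk. Below is a minimal least-squares sketch for a single pixel (our own illustration under the standard fringe model Ik = A + B cos[ϕ − (δk + ϵk)]; function and variable names are assumptions, not the paper’s code):

```python
import numpy as np

def phase_with_shift_error(intensities, ideal_shifts, shift_errors):
    """Wrapped phase at one pixel from M >= 3 samples
    I_k = A + B*cos(phi - (delta_k + eps_k)), with the actual
    (compensated) shifts delta_k + eps_k known for that pixel."""
    d = np.asarray(ideal_shifts, float) + np.asarray(shift_errors, float)
    # Linear model: I_k = A + (B cos phi) cos(d_k) + (B sin phi) sin(d_k)
    G = np.column_stack([np.ones_like(d), np.cos(d), np.sin(d)])
    x, *_ = np.linalg.lstsq(G, np.asarray(intensities, float), rcond=None)
    return np.arctan2(x[2], x[1])  # phi from B*sin(phi) and B*cos(phi)

# Three-step example with nonzero, point-dependent shift errors:
delta = 2 * np.pi * np.arange(3) / 3
eps = np.array([0.05, -0.10, 0.20])
I = 0.5 + 0.3 * np.cos(1.2 - (delta + eps))
phi = phase_with_shift_error(I, delta, eps)
```

Because the shifts (and hence the design matrix) differ at every pixel, this solve is repeated per point; ignoring ϵk here is exactly what produces the motion ripple artifacts discussed in the experiments.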
We built a PSP system to evaluate the performance of our proposed method. This system includes a complementary metal-oxide-semiconductor (CMOS) camera (model: PointGrey Grasshopper3 GS3-U3-23S6M) with an 8 mm lens (model: Computar M0814-MP2) and a digital light processing (DLP) projection development kit (model: LightCrafter 4500). The projector’s resolution is 912 × 1140 pixels, and the camera’s resolution was set to 640 × 480 pixels. The system was calibrated using the method described by Li et al. Both the projector and the camera speeds were set at 120 Hz. We employed a three-step phase-shifting algorithm for wrapped phase retrieval and a three-frequency temporal phase-unwrapping algorithm.
We evaluated the performance of the proposed motion error compensation method by measuring a moving sphere with a diameter of 79.2 mm. For this experiment, the sphere was moving at a speed of approximately 80 mm/s. Visualization 1 shows all 3D frames of the measurement sequence. Figure 2 shows the result of one representative 3D frame. Figure 2(a) shows the raw 3D result without employing our proposed method. The sphere surface is not smooth, as expected, because the 3D shape measurement speed is not sufficient to capture the motion of the object. We then employed our proposed phase-shift error compensation algorithm, and the quality was drastically improved, as shown in Fig. 2(b).
To quantitatively evaluate the improvement using our proposed method, we compared the measurement results with an ideal sphere. Figures 2(c) and 2(d) show the corresponding error maps, i.e., the difference between the measured data and the ideal sphere. Before error compensation, the mean error is 0.482 mm and the standard deviation is 0.209 mm. In contrast, after applying our phase-shift error compensation method, the mean error is reduced to 0.032 mm and the standard deviation to 0.037 mm.
To verify that our proposed method can also work for geometry more complex than a smooth surface, we measured a surface with small, detailed features. Figure 3 shows the results of the experiment. Figure 3(a) shows a photograph of the object, and Fig. 3(b) shows the result without employing our proposed phase-shift error compensation method. The measured surface exhibits vertical stripes introduced by motion. In contrast, after applying our proposed method, the vertical stripes almost disappear, as shown in Fig. 3(c). To visually verify the performance of our proposed method against a ground truth, we measured the same object while it was stationary, and Fig. 3(d) shows the result. Visually, our proposed method produced the same result as the ground truth, demonstrating that it works well for non-smooth objects.
We also evaluated the performance of our proposed method to measure dynamically deformable objects. In this experiment, we measured human facial expressions. Figure 4 shows one representative 3D frame, and Visualization 2 shows the entire video sequence. Figure 4(a) shows the 3D reconstruction without adopting our phase-shift error compensation algorithm. This result clearly shows measurement errors (non-smooth surface geometry). We then employed our error compensation algorithm to process the same frame, and Fig. 4(b) shows the result. The motion artifacts are less obvious and the measurement quality is significantly improved. This experiment demonstrated that our proposed method can successfully alleviate motion artifacts even for dynamically deformable objects with complex surface geometry.
In addition, we noticed that our proposed method can also significantly improve texture quality. Figure 4(c) shows the texture of the frame before applying our phase-shift error compensation method, showing some vertical stripes on the image. Figure 4(d) shows the texture after applying our phase-shift error compensation method, and those stripes almost disappear. All these experimental data clearly demonstrate the effectiveness of our proposed motion error compensation method.
This paper has presented a novel method to reduce nonhomogeneous phase shift error caused by object motion. The principle behind this proposed method was elucidated. Our experimental results demonstrated that the proposed method can effectively compensate for motion-introduced phase-shift error and enhance measurement quality for objects with rigid motion as well as objects that are dynamically deformable.
National Science Foundation (NSF) (CMMI-1531048).
References and links
1. Z. Wang and B. Han, “Advanced iterative algorithm for phase extraction of randomly phase-shifted interferograms,” Opt. Lett. 29, 1671–1673 (2004). [CrossRef]
4. L. Lu, J. Xi, Y. Yu, and Q. Guo, “New approach to improve the accuracy of 3-d shape measurement of moving object using phase shifting profilometry,” Opt. Express 21, 30610–30622 (2013). [CrossRef]
6. S. Feng, C. Zuo, T. Tao, Y. Hu, M. Zhang, Q. Chen, and G. Gu, “Robust dynamic 3-d measurements with motion-compensated phase-shifting profilometry,” Opt. Laser Eng. 103, 127–138 (2018). [CrossRef]
7. H. Schreiber and J. H. Bruning, Optical Shop Testing, 3rd ed. (John Wiley & Sons, 2007).
8. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. 22, 1330–1334 (2000). [CrossRef]