Abstract

Circular fringe projection profilometry (CFPP) is a recently proposed optical three-dimensional (3D) measurement technique. Theoretically, its optimal measurement accuracy should surpass that of conventional fringe projection profilometry. In practice, the measurement accuracy is affected by many factors, and much research remains to be done before CFPP can reach its optimal precision. One of the dominant factors is the zero-phase point. Because a cotangent function is used, error near the zero-phase point is significantly amplified, which makes the overall measurement accuracy of CFPP with a coaxial layout very low. To address this critical issue, CFPP with an off-axis layout (called OCFPP for simplicity) is presented in this paper. The core theory of OCFPP is briefly introduced, and the zero-phase point detection problem that comes with OCFPP is explained. Then, two methods, one based on a two-dimensional ruler and the other based on a plane constraint, are proposed to solve this additional problem. Simulation and experiments validate the effectiveness of the proposed zero-phase point detection methods and confirm the advantage of OCFPP. This paper distinctly improves the 3D measurement capability of CFPP and lays an indispensable foundation for its practical application.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Structured-light illumination methods provide plenty of options for three-dimensional (3D) reconstruction [1]. Among them, fringe projection profilometry (FPP), in which the employed structured light is a sinusoidal fringe pattern, is most frequently used in industrial 3D measurement [2]. Generally, FPP accomplishes 3D measurement based on the triangle formed between a projected light ray and a captured light ray. The phase of the sinusoidal fringe pattern [3,4] is a key parameter marking this triangle. Owing to its relatively high accuracy, dense point cloud, good adaptability to the measurement environment, reasonable cost, and ease of use, FPP has been finding more and more applications [5–11]. To improve the 3D measurement accuracy, some fringe projection techniques employing telecentric lenses for the camera or projector have been put forward [12–14]. Despite the use of telecentric lenses, the triangle used for 3D reconstruction in these methods is the same as that of conventional FPP.

In [15], circular fringe projection profilometry (CFPP) was presented to accomplish 3D measurement. CFPP can be regarded as a variant of FPP, which was called Fourier transform profilometry when proposed by Takeda et al. [3] since a Fourier transform was employed to process the captured fringe pattern. It requires hardware similar to FPP with telecentric lenses [12–14]. A sinusoidal fringe pattern is also employed to aid the 3D measurement. Most of the phase demodulation methods [4,16–19] and phase unwrapping methods [20–27] for FPP can be used to process the captured fringe pattern in CFPP, and phase is likewise a key parameter for the 3D reconstruction. Certainly, there are some differences between CFPP and FPP. The fundamental difference lies in that CFPP accomplishes the 3D reconstruction based on the triangle formed between a projected light ray and the optical axis of the projection unit. In CFPP, a camera is leveraged to acquire the parameters required for 3D reconstruction instead of forming the triangle. CFPP thus provides a new option for full-field 3D measurement. Theoretically, its measurement accuracy should surpass that of FPP. In practice, some pivotal technical issues remain to be solved to guarantee the 3D measurement accuracy of CFPP.

When CFPP was proposed, the optical axes of the camera and projector were set to be coaxial [15]. This is beneficial for clearly describing the basic theory of CFPP. However, the coaxial layout is not the optimal choice. In CFPP, the height is recovered by multiplying the length of a line segment in space by the cotangent of a projection angle. It is unavoidable that noise exists in the calculated values of the segment length and the projection angle. When the coaxial layout is adopted, the projection angles for the region near the optical axis of the projector approach zero. This means a small noise will introduce a large error in the recovered height for the region near the optical axis. The coaxial layout thus becomes a dominant bottleneck to improving the 3D measurement accuracy of CFPP.

The solution to the above problem seems obvious. One can set the axis of the projector to be parallel to but far away from that of the camera so as to increase the projection angle, decrease the amplification of the measurement noise, and eventually improve the 3D measurement accuracy. However, once the axes of the projector and camera are set far enough apart (under this condition, the hardware layout is called off-axis), the zero-phase point of the projected circular fringe pattern falls outside the field of view of the camera. Hence, the pixel coordinates of the zero-phase point cannot be determined from the captured fringe pattern, yet 3D reconstruction is unrealizable without them. This additional issue has to be addressed before CFPP with an off-axis layout (called OCFPP for simplicity in the following) becomes feasible. Indeed, as noted in [15], all the measurement results therein were obtained with OCFPP, since CFPP with a coaxial layout (called CFPP for simplicity in the following) is unable to accurately recover the 3D profile of a measured object. However, the theory of OCFPP was not described in [15], and no solution has been reported for the issue that comes with OCFPP.

The intention of this paper is to solve the zero-phase point detection problem in OCFPP. The fundamental theory of OCFPP is described, and the merits of OCFPP over CFPP are analyzed. The origin of the zero-phase point detection problem is introduced. A method based on a two-dimensional (2D) ruler is put forward to detect the pixel coordinates of the zero-phase point. Then, a more powerful method that relies on a plane constraint is presented. Furthermore, procedures for conducting OCFPP are outlined. The validity of the proposed methods is theoretically deduced and experimentally demonstrated. This paper remedies an obvious disadvantage of CFPP and contributes to significantly improving its 3D measurement capability. In addition, OCFPP spares the beam splitter used in CFPP, so its hardware layout is simpler.

2. Principle of OCFPP

2.1 Geometric and mathematical model of OCFPP

Note that OCFPP is modified from CFPP; only the indispensable theory of OCFPP is introduced in this paper. To better understand OCFPP, it is recommended to refer to [15] first. Figure 1 shows the geometric model of OCFPP. This model is the well-known pinhole model; here, it is used to express a projection process. Point O is a point light source. It emits plenty of light, and the light field covers the measured surface. For simplicity, only the light that falls onto point M on the measured object is marked in Fig. 1. A plane whose transmittance can be adjusted pixel by pixel (for example, an LCD or DLP panel) is placed in front of the light source. OM passes through the plane at E. The light perpendicular to the plane intersects it at F. The Z axis is set to coincide with OF, and its origin is set at O. θ is called the projection angle. The perpendicular from M to the Z axis intersects it at N. In practice, M can be any point on the measured object; if its coordinates can be obtained, the 3D profile of the measured object can be recovered. According to the triangle, the z coordinate of M can be computed by

Fig. 1 Geometric model of OCFPP.

$$z = \overline{MN}\,\cot\theta. \tag{1}$$

The schematic diagram of the hardware layout of OCFPP is depicted in Fig. 2. The optical model of a projector is the same as the one in Fig. 1 [28]; hence, a digital projector is an ideal choice to realize the geometric model of OCFPP. A camera with telecentric lenses is used to assist the calculation of MN and cotθ in Eq. (1). The digital projector and camera constitute the main hardware of OCFPP. The optical axes of the camera and projector are set to be parallel and separated by a distance l. Points M and N in Fig. 2 correspond to those in Fig. 1. θmin denotes the minimum value of the projection angle during the measurement.

Fig. 2 Hardware layout of OCFPP.

As cotθ is employed during the computation of height (see Eq. (1)), noise in the calculated values of MN and θ is hugely amplified when θ is small. Suppose MN = 10 mm and the error in the calculated MN is 0.02 mm; the error rate of the calculated z for varying θ can be computed with Eq. (1), and the result is depicted in Fig. 3(a). Similarly, suppose MN = 10 mm and the error in the calculated θ is 0.01 radians; the resulting error rate of the calculated z is depicted in Fig. 3(b).

Fig. 3 Error rate of the calculated z corresponding to variant values of θ: (a) the simulated error in MN is 0.02 mm, and in θ is 0 radian; (b) the simulated error in MN is 0 mm, and in θ is 0.01 radians.

From Fig. 3, it can be seen that when the error in MN is considered separately, the measurement error rate decreases as θ increases; when the error in θ is considered separately, the measurement error rate decreases as θ increases until θ reaches a threshold (about 0.7 radians) and then increases. In both cases, when θ is small, a small error in MN or θ brings in a large measurement error. Hence, a relatively larger value of θ helps to improve the 3D measurement accuracy. This requires l to be as large as possible, provided the field of view of the camera remains within that of the projector. For CFPP, l = 0 and θmin = 0, which eventually makes the overall 3D measurement accuracy low. This is an inherent demerit of CFPP that remains to be solved; to this end, OCFPP is formally proposed in this paper.
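The noise amplification by the cotangent can be checked numerically. The sketch below uses MN = 10 mm and a θ error of 0.01 radians, as in the example above; the two sample angles (0.05 and 0.7 radians) are our own illustrative choices:

```python
import math

MN = 10.0        # mm, length of segment MN (example value from the text)
d_theta = 0.01   # rad, assumed noise in the projection angle

def z_of(theta):
    """Eq. (1): z = MN * cot(theta)."""
    return MN / math.tan(theta)

def height_error(theta):
    """Absolute height error caused by d_theta (finite difference)."""
    return abs(z_of(theta + d_theta) - z_of(theta))

near_axis = height_error(0.05)   # close to the zero-phase point
off_axis = height_error(0.70)    # a moderate projection angle

print(near_axis, off_axis)
# The same angular noise is amplified by two orders of magnitude near theta = 0.
assert near_axis > 100 * off_axis
```

This is exactly why the off-axis layout helps: by keeping every observed θ away from zero, the cot term never enters its steep region.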

When MN and cotθ are replaced by other measurable parameters, Eq. (1) becomes

$$z = \frac{\mu_c\sqrt{(x_c - x_{c0})^2 + (y_c - y_{c0})^2}}{k\,\beta\,\Phi(x_c, y_c)}, \tag{2}$$

where μc is the pixel size of the used CCD camera, (xc,yc) signifies the pixel coordinates of any point in a captured fringe pattern, (xc0,yc0) signifies the pixel coordinates of the zero-phase point in a captured fringe pattern (supposing this point exists), β is the magnification factor of the employed telecentric lenses, Φ(xc,yc) denotes the phase of a captured fringe pattern at (xc,yc), and k represents a system parameter, which can be calculated by
$$k = \frac{\mu_c\sqrt{(x_c - x_{c0})^2 + (y_c - y_{c0})^2}}{\beta\,\Delta z}\left[\frac{1}{\Phi_{\Delta z}(x_c, y_c)} - \frac{1}{\Phi_{0}(x_c, y_c)}\right], \tag{3}$$
where Δz signifies the displacement of an object for calibration, and Φ0(xc,yc) and ΦΔz(xc,yc) are the phases of fringe patterns captured at positions with 0 displacement and Δz displacement respectively.
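Equations (2) and (3) can be exercised together in a short simulation. In the sketch below all numerical values (pixel size, magnification, k, the zero-phase point, and the plane heights) are illustrative assumptions, not the paper's calibration data; the phase is generated from the forward model implied by Eq. (2):

```python
import math

# Illustrative parameter values (assumptions for this sketch):
mu_c = 3.75e-3            # mm, pixel size
beta = 0.093              # magnification of the telecentric lens
k_true = 0.001731         # system parameter to be calibrated
xc0, yc0 = 5340.0, 480.0  # virtual zero-phase point (pixels)

def radius(xc, yc):
    return math.hypot(xc - xc0, yc - yc0)

def phase(xc, yc, z):
    """Forward model implied by Eq. (2): Phi = mu_c * r / (k * beta * z)."""
    return mu_c * radius(xc, yc) / (k_true * beta * z)

# Calibration with Eq. (3): a reference plane observed at z0 and z0 + dz.
xc, yc, z0, dz = 640.0, 480.0, 600.0, 10.0
phi_0 = phase(xc, yc, z0)
phi_dz = phase(xc, yc, z0 + dz)
k_cal = mu_c * radius(xc, yc) / (beta * dz) * (1.0 / phi_dz - 1.0 / phi_0)

# Height recovery with Eq. (2) using the calibrated k.
z_rec = mu_c * radius(xc, yc) / (k_cal * beta * phi_0)

print(k_cal, z_rec)
assert abs(k_cal - k_true) < 1e-12 and abs(z_rec - z0) < 1e-6
```

Note that when the phase follows the model, Eq. (3) reproduces k exactly, which is why the calibration only needs one known displacement Δz.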

As telecentric lenses are used, the transverse coordinates of a measured object can be easily computed from its pixel coordinates by

$$\begin{cases} x = \mu_c x_c/\beta + \xi_x, \\ y = \mu_c y_c/\beta + \xi_y, \end{cases} \tag{4}$$
where ξx and ξy are constants determined only by the spatial position of the origin of the world coordinate system; indeed, they can always be adjusted to 0.

Although the hardware layout of OCFPP differs from that of CFPP, their mathematical models for 3D reconstruction appear to be the same. In fact, they are different, as will be explained in section 2.2.

2.2 Zero-phase point and its detection methods

2.2.1 Zero-phase point problem in OCFPP

To better illustrate OCFPP, the 3D model of its hardware system is shown in Fig. 4. The coded circular fringe pattern at the top left of Fig. 4 is projected onto the measured object by the projector and captured by the camera with telecentric lenses. There are two kinds of zero-phase points in OCFPP. The first is the zero-phase point in the coded circular fringe pattern, which corresponds to the center of the annuluses in the coded pattern; the other is the zero-phase point in the captured circular fringe pattern, which corresponds to the image of the center of the annuluses in the fringe pattern on the measured plane (see Fig. 4). In this paper, the zero-phase point signifies the one in the captured fringe pattern unless otherwise specified.

Fig. 4 3D model of the hardware system of OCFPP.

From Fig. 4, it is obvious that the zero-phase point is not within the captured circular fringe pattern because the center of the annuluses in the fringe pattern on the measured plane is outside the field of view of the camera. This means the pixel coordinates of the zero-phase point, (xc0,yc0), are missing in OCFPP, whereas they are known in CFPP. As can be seen from Eqs. (2) and (3), (xc0,yc0) is an indispensable parameter for both system calibration and height calculation. Therefore, detecting the virtual pixel coordinates of the zero-phase point is an urgent problem to be solved in OCFPP.

2.2.2 2D ruler based method for detecting the zero-phase point

The 2D ruler based method (2DRM) works by measuring the zero-phase point with the aid of a 2D ruler, i.e., a ruler that has scales in both the horizontal and vertical directions. During a measurement, the ruler should be arranged perpendicular to the optical axis of the projector, with its horizontal (vertical) direction parallel to the row (column) direction of the camera sensor. Its working area should be within the field of view of the camera and also crossed by the optical axis of the projector. In practice, the 2D ruler is placed like the measured plane in Fig. 4. After the hardware system and the 2D ruler are properly configured, an image of the 2D ruler is captured by the camera. In addition, a reticle, whose center corresponds to the center of the annuluses in the coded circular fringe pattern in Fig. 4, is coded and projected onto the 2D ruler.

Figure 5 depicts part of a 2D ruler used in our experiment. This 2D ruler was designed with the aid of CorelDRAW and then printed by a laser printer. The printed 2D ruler is pasted onto a hard plane, and the reticle on it is projected by the projector. The red rectangle roughly marks the field of view of the used camera. The center of the reticle, C0, corresponds to the zero-phase point of a projected circular fringe pattern in a practical measurement. The image of part of the 2D ruler is captured by the camera, and then a point C (which can be any point) in the captured image is selected. The zero-phase point detection problem is equivalent to detecting the pixel coordinates of the image of C0 captured by the camera. From Figs. 4 and 5, it is clear that the image of C0 cannot actually be captured by the camera. However, it is reasonable to assume the CCD panel of the camera and the field of view of the telecentric lenses to be large enough to bring C0 within the field of view while all the other parameters stay constant. Then, supposing the physical coordinates of C measured by the 2D ruler are (Xc,Yc), the pixel coordinates of the image of C are (xc,yc), and the physical coordinates of C0 measured by the 2D ruler are (Xc0,Yc0), it can be deduced that the pixel coordinates of the image of C0 'captured' by the camera, (xc0,yc0), should be

Fig. 5 Image of a 2D ruler with a reticle projected onto it. The interval between two adjacent lines is 5 mm; the white reticle is projected by a projector, and its center C0 is projected from the principal point of a projector (i.e., the zero-phase point of the projected circular fringe pattern); the red rectangle marks the field of view of a camera; C can be any point within the red rectangle.

$$\begin{cases} x_{c0} = x_c + \beta(X_{c0} - X_c)/\mu_c, \\ y_{c0} = y_c + \beta(Y_{c0} - Y_c)/\mu_c. \end{cases} \tag{5}$$
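The conversion of Eq. (5) is a simple offset scaled by β/μc. The sketch below uses hypothetical ruler readings and pixel coordinates; the pixel size and magnification are taken from the hardware described later in the paper:

```python
mu_c = 3.75e-3   # mm, pixel size
beta = 0.093     # telecentric magnification (so beta/mu_c = 24.8 pixels per mm)

def zero_phase_pixel(xc, yc, Xc, Yc, Xc0, Yc0):
    """Eq. (5): map the ruler reading of C0 to virtual pixel coordinates.

    (xc, yc): pixel coordinates of the selected point C;
    (Xc, Yc), (Xc0, Yc0): ruler readings (mm) of C and the reticle center C0.
    """
    xc0 = xc + beta * (Xc0 - Xc) / mu_c
    yc0 = yc + beta * (Yc0 - Yc) / mu_c
    return xc0, yc0

# Hypothetical readings: C imaged at pixel (640, 480) reads (20 mm, 15 mm)
# on the ruler, while the reticle center C0 reads (210 mm, 15 mm).
xc0, yc0 = zero_phase_pixel(640.0, 480.0, 20.0, 15.0, 210.0, 15.0)
print(xc0, yc0)   # a virtual point far outside the 1280-pixel-wide image
assert xc0 > 1280
```

As the example shows, a C0 that lies a couple of hundred millimeters from C on the ruler maps to a virtual pixel coordinate several thousand pixels outside the sensor, consistent with the zero-phase point being unobservable in the off-axis layout.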

2DRM is effective in detecting the zero-phase point for OCFPP. In practice, it has two shortcomings: first, manufacturing a high-precision 2D ruler is expensive (a 2D ruler made by a printer is not accurate); second, the required positioning of the 2D ruler is too strict to be accurately met. Inaccurate fabrication and positioning of the 2D ruler reduce the calculation precision of the zero-phase point and eventually the 3D measurement accuracy. To this end, a more accurate and easier-to-use method is presented.

2.2.3 Plane constraint based method for detecting the zero-phase point

The plane constraint based method (PCM) is built on two properties. First, when a plane is measured by OCFPP and the values of the other parameters in Eq. (2) are assumed ideal, the value of (xc0,yc0) affects the reconstructed profile of the plane: if the used value deviates from its actual value, the reconstructed profile of the measured plane will be curved. Second, the value of k in Eq. (3) is a constant and hence does not affect the planeness of the reconstructed profile.

We coded a program capable of simulating almost all the parameters in a measurement with OCFPP or CFPP. It was used to reveal the impact of the values of (xc0,yc0) and k on the recovered profile of a plane with z = 600 mm. During the simulation, the actual values of (xc0,yc0) and k were set to (4960,1) and 0.001731 respectively; Eqs. (2) and (4) were adopted to recover the 3D profile of the measured plane; the values of all parameters except (xc0,yc0) and k were set to be ideal during the 3D calculation. When k = 0.001731 and (xc0,yc0) = (4960,11) were used, the recovered 3D profile is plotted in Fig. 6(a); when k = 1 and (xc0,yc0) = (4960,1) were used, the recovered 3D profile is plotted in Fig. 6(b). It is obvious that when a false value of (xc0,yc0) is employed, the recovered 3D profile of the plane becomes a curved surface; when a false value of k is employed, the recovered 3D profile of the plane remains planar, albeit with an incorrect height.

Fig. 6 3D measurement results of a plane in a simulation when (a) the value of k = 0.001731, (xc0,yc0)=(4960,11), and (b) the value of k = 1, (xc0,yc0)=(4960,1), and values of all the other parameters are ideal. During the simulation, the profile of the measured plane is z=600 mm, the true value of k is 0.001731, and the true value of (xc0,yc0) is (4960,1).

Therefore, if a plane is measured with OCFPP, the planeness of the recovered profile is highest only when (xc0,yc0) is set to its actual value, regardless of the value of k. Suppose a plane is measured with OCFPP, the values of (xc0,yc0) are searched within a region that contains the actual location, and the value at the qth location is (xc0q,yc0q) (q = 1,2,…,Q, where Q is large enough to traverse every necessary point in the selected region; see Fig. 7). The virtual height of the measured plane corresponding to the qth searched zero-phase point can be calculated by

Fig. 7 Illustration to the search process of PCM. The rectangle signifies the region to be searched for (xc0,yc0), the red dots denote the points to be searched, and the blue reticle represents the actual location of (xc0,yc0).

$$z_{\mathrm{virtual}}^{q}(x_c, y_c) = \frac{\mu_c\sqrt{(x_c - x_{c0}^{q})^2 + (y_c - y_{c0}^{q})^2}}{\beta\,\Phi(x_c, y_c)}. \tag{6}$$

Then, plane fitting is applied to zvirtualq(xc,yc). Assume the function of the fitted plane is

$$z_{\mathrm{fit}}^{q}(x_c, y_c) = a_q x_c + b_q y_c + c_q, \tag{7}$$
where aq, bq, and cq are the coefficients worked out during the fitting.

The root mean square error (RMSE) of the plane fitting can be calculated by

$$\mathrm{RMSE}(q) = \sqrt{\frac{\sum_{x_c=1}^{\mathrm{column}}\sum_{y_c=1}^{\mathrm{row}}\left(z_{\mathrm{virtual}}^{q}(x_c, y_c) - z_{\mathrm{fit}}^{q}(x_c, y_c)\right)^2}{\mathrm{row}\cdot\mathrm{column}}}, \tag{8}$$
where row and column denote the row number and column number of the captured image respectively.

According to the foregoing deduction, the point that makes RMSE(q) smallest is the actual zero-phase point. Therefore, the detected pixel coordinates of the zero-phase point can be expressed by

$$(x_{c0}, y_{c0}) = (x_{c0}^{\bar{q}}, y_{c0}^{\bar{q}}), \tag{9}$$
where q¯ makes RMSE(q) smallest.

PCM is capable of accurately detecting the zero-phase point with only the aid of a plane. One shortcoming is that its application may be time-consuming, which can be overcome by optimizing the search strategy. To keep the theory in this paper from becoming unwieldy, the optimization of PCM is not considered here.
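The search described by Eqs. (6)-(9) can be sketched on a synthetic flat plane. All parameter values below are illustrative assumptions, and the coarse 10-pixel sampling and 20-pixel candidate grid stand in for the dense pixel-by-pixel search of the paper:

```python
import numpy as np

# Illustrative parameters for a synthetic measurement:
mu_c, beta, k_true = 3.75e-3, 0.093, 0.001731
true_c0 = (1500.0, 480.0)      # zero-phase point outside a 1280 x 960 image
z_plane = 600.0                # height of the simulated flat plane (mm)

# Sample every 10th pixel so the sketch stays fast.
yc, xc = np.mgrid[0:96, 0:128] * 10.0
r_true = np.hypot(xc - true_c0[0], yc - true_c0[1])
phi = mu_c * r_true / (k_true * beta * z_plane)   # phase of the flat plane

def fit_rmse(x0, y0):
    """Eq. (6): virtual height for candidate (x0, y0); Eqs. (7)-(8): plane-fit RMSE."""
    z = (mu_c * np.hypot(xc - x0, yc - y0) / (beta * phi)).ravel()
    A = np.column_stack([xc.ravel(), yc.ravel(), np.ones(z.size)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return float(np.sqrt(np.mean((A @ coef - z) ** 2)))

# Eq. (9): the candidate minimizing the RMSE is taken as the zero-phase point.
candidates = [(x0, y0) for x0 in np.arange(1400.0, 1601.0, 20.0)
                       for y0 in np.arange(440.0, 521.0, 20.0)]
best = min(candidates, key=lambda c: fit_rmse(*c))
print(best)
assert best == (1500.0, 480.0)
```

At the true zero-phase point the virtual height is exactly planar, so its fitting RMSE drops to numerical noise, while every displaced candidate leaves a curved residual; a coarse-to-fine refinement of the candidate grid is one natural way to speed up the search.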

2.3 Procedures for application of OCFPP

Once the zero-phase point, (xc0,yc0), is calculated with 2DRM or PCM, the application of OCFPP becomes feasible. The procedures for applying OCFPP can be summarized as follows (please refer to Fig. 4):

  • 1) Adjust the hardware of OCFPP to make the optical axes of the camera and projector parallel and adequately separated;
  • 2) Detect the pixel coordinates of the zero-phase point with 2DRM or PCM;
  • 3) Calibrate the hardware system to obtain the value of k (please refer to Eq. (3));
  • 4) Place the measured object properly, project the coded circular fringe patterns onto it with the projector, and capture the fringe patterns with the camera;
  • 5) Process the captured fringe patterns to obtain the phase;
  • 6) Calculate the 3D coordinates of the measured object with Eqs. (2) and (4).

3. Simulation and experiments

3.1 Comparison between CFPP and OCFPP

A simulation was made to demonstrate the advantage of OCFPP over CFPP. The simulation was accomplished with the aid of a program written in MATLAB. This program is able to generate the circular fringe pattern, emulate the hardware parameters, simulate the fringe pattern projection and capture process according to the set hardware parameters, process the captured fringe pattern, and complete the 3D reconstruction.

During the simulation, a plane with z = 600 mm was measured with CFPP and OCFPP respectively. All parameters except the phase were set to be ideal: Gaussian white noise with a mean of 0 and a variance of 0.003 radians was added to the simulated phase. The values of l in CFPP and OCFPP were set to 0 and 200 mm respectively; θmin in CFPP and OCFPP was around 0 radians and 0.32 radians respectively (please refer to Figs. 1 and 2). The recovered 3D profiles are plotted in Fig. 8. It can be seen that the plane recovered with CFPP is coarse, and some regions evidently deviate from the actual shape (these regions are close to the zero-phase point); in contrast, OCFPP accurately recovered the 3D profile of the measured plane.

Fig. 8 3D profiles of a plane with z = 600 mm recovered with (a) CFPP and (b) OCFPP respectively when Gaussian noise with mean of 0 and variance of 0.003 radians are added to the simulated phase.

Therefore, CFPP lacks the capability of achieving high 3D measurement accuracy. OCFPP makes up for this inherent shortcoming by moving the field of view of the camera away from the zero-phase point.

3.2 Hardware parameters and the detected zero-phase point

An OCFPP system was constructed to validate the proposed methods. In this system, the CCD camera is an MV-120SM with a resolution of 1280 × 960 and μc = 3.75 μm; the telecentric lens is a BT-2396 with a valid field of view of 51.6 mm × 38.7 mm for a 1/3" CCD and β = 0.093; and the projector is a VPL-EW276 with a resolution of 1280 × 800.

After the OCFPP system was properly arranged, 2DRM and PCM were adopted to calculate the zero-phase point, (xc0,yc0), respectively. The results are listed in Table 1. Both 5352.7 and 5339.7 are evidently larger than the maximum column resolution of the camera, 1280, which confirms that the zero-phase point indeed lies outside the captured fringe pattern. It is also obvious that the zero-phase point calculated with 2DRM differs slightly from that with PCM. As noted in section 2.2.2, the accuracy of 2DRM is affected by the positioning accuracy of the 2D ruler. Theoretically, the accuracy of PCM is higher than that of 2DRM; experimental validation is given in the following.


Table 1. Pixel coordinates of the zero-phase point calculated with 2DRM and PCM respectively

Once (xc0,yc0) is calculated, the system parameter k can be calibrated with the aid of Eq. (3). The values of k were calculated to be 0.0017611 and 0.0017563 when the zero-phase points detected with 2DRM and PCM were employed respectively.

3.3 Experimental results of OCFPP

Some representative objects were measured with the constructed OCFPP system. First, a ball was measured; this experiment was conducted to quantify the performance of 2DRM and PCM. The diameter of the ball was measured by a vernier caliper to be 39.48 mm. Figure 9 shows the image of the measured ball and one of the captured circular fringe patterns. Figure 10 exhibits the 3D profiles measured with 2DRM and PCM respectively. It is difficult to judge which method performs better from Fig. 10 alone, so sphere fitting was applied to the acquired 3D data; the results are listed in Table 2.

Fig. 9 Images of a ball (left) and a fringe pattern captured from it (right).

Fig. 10 3D profiles of the measured ball reconstructed with (a) 2DRM and (b) PCM respectively.


Table 2. Results of sphere fitting applied to 3D data obtained with 2DRM and PCM respectively

The diameters (D) obtained by the sphere fitting are 39.673 mm and 39.496 mm for the 3D data obtained with 2DRM and PCM respectively, the corresponding errors in the fitted diameters (eD) are 0.173 mm and 0.016 mm, and the corresponding standard deviations (SD) are 0.167 mm and 0.071 mm. It can be inferred that both 2DRM and PCM are effective in detecting the zero-phase point for OCFPP, while PCM performs better. Even when OCFPP with PCM is used, SD is not small enough, albeit the accuracy of the fitted diameter is relatively good. Experimental results revealed that this is because the phase accuracy in OCFPP is relatively low with the present hardware. The other factors, including phase accuracy, that affect the 3D measurement accuracy of OCFPP will be comprehensively researched in the near future.

As PCM outperforms 2DRM, OCFPP with PCM is chosen in the following experiments. A medal and a chinar leaf (Fig. 11) were measured to further check the capability of OCFPP. The medal is made of metal and has some specular regions. On its top left, there are some Chinese characters indicating the name of the competition; in the middle, there are some runways with three runners on them; the bottom part was dismissed during the measurement. The measured 3D profile of the medal is shown in Fig. 12(a). It can be seen that the 3D profiles of the Chinese characters, the runways, and the three persons are successfully reconstructed.

Fig. 11 Images of a medal (left) and a chinar leaf (right) to be measured.

Fig. 12 3D measurement results of (a) the medal and (b) the chinar leaf.

The chinar leaf was randomly collected outside our laboratory. It is withered and curly, with some evident skeleton veins. When measured, it was pasted onto a metal plane. Figure 12(b) exhibits its recovered 3D profile. The recovered profile coincides with the leaf's actual shape, and even the main skeleton veins can be distinguished in Fig. 12(b) from the profile alone. This demonstrates that OCFPP with PCM has good measurement capability.

Hence, it can be safely concluded that OCFPP improves the 3D measurement accuracy of CFPP, that both 2DRM and PCM are valid in solving the zero-phase point detection problem for OCFPP, and that PCM is preferred for its higher accuracy.

4. Summary

In this paper, further research on the recently proposed 3D measurement technique CFPP is made. Theoretical deduction reveals that CFPP is unable to accurately recover a 3D profile, especially in the region near the zero-phase point. To solve this problem, OCFPP is proposed. The key theory of OCFPP is introduced, and the origin of the zero-phase point detection problem in OCFPP is analyzed. Two methods, called 2DRM and PCM, are put forward to address this issue. 2DRM works by virtue of a 2D ruler: the zero-phase point of the circular fringe pattern is projected onto the ruler, its physical coordinates are measured with the ruler, and its pixel coordinates are then calculated from the physical coordinates. PCM calculates the zero-phase point by searching a selected region pixel by pixel: at each pixel, the virtual 3D profile of a measured plane is recovered, and the point that makes the planeness of the recovered profile best is the zero-phase point. The merits and demerits of 2DRM and PCM are pointed out, and procedures to implement OCFPP are outlined. Simulation confirms the advantages of OCFPP over CFPP. Experiments show that both 2DRM and PCM can effectively solve the zero-phase point detection problem, with PCM performing better, and that the 3D measurement performance of OCFPP is good under the current hardware configuration.

The proposed methods break an inherent bottleneck suppressing the measurement precision of CFPP and contribute to significantly advancing its 3D measurement capability. Theoretically, the measurement accuracy of OCFPP should exceed that of FPP. However, its present performance does not meet this expectation (please refer to Table 2). An important reason is that the phase accuracy in OCFPP with a commercial digital projector is distinctly lower than that in conventional FPP. The goodness of fit between the actual projection model and the pinhole model also severely affects the measurement accuracy. Furthermore, the coding strategy of the circular fringe pattern as well as the arrangement accuracy of the hardware are also controllable factors affecting the measurement accuracy of OCFPP. These factors remain to be deeply researched.

Funding

China Postdoctoral Science Foundation (2018M633500); Natural Science Foundation of Shaanxi Province (2018JQ6063); National Natural Science Foundation of China (NSFC) (61575157).

References

1. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128–160 (2011). [CrossRef]  

2. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010). [CrossRef]  

3. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983). [CrossRef]   [PubMed]  

4. V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. 23(18), 3105–3108 (1984). [CrossRef]   [PubMed]  

5. X. Chen, J. Xi, T. Jiang, and Y. Jin, “Research and development of an accurate 3D shape measurement system based on fringe projection: model analysis and performance evaluation,” Precis. Eng. 32(3), 215–221 (2008). [CrossRef]  

6. Z. Li, K. Zhong, Y. F. Li, X. Zhou, and Y. Shi, “Multiview phase shifting: a full-resolution and high-speed 3D measurement framework for arbitrary shape dynamic objects,” Opt. Lett. 38(9), 1389–1391 (2013). [CrossRef]   [PubMed]  

7. H. Zhao, X. Liang, X. Diao, and H. Jiang, “Rapid in-situ 3D measurement of shiny object based on fast and high dynamic range digital fringe projector,” Opt. Lasers Eng. 54, 170–174 (2014). [CrossRef]  

8. J. Pösch, J. Schlobohm, S. Matthias, and E. Reithmeier, “Rigid and flexible endoscopes for three dimensional measurement of inside machine parts using fringe projection,” Opt. Lasers Eng. 89, 178–183 (2017). [CrossRef]  

9. B. Li and S. Zhang, “Superfast high-resolution absolute 3D recovery of a stabilized flapping flight process,” Opt. Express 25(22), 27270–27282 (2017). [CrossRef]   [PubMed]  

10. A. Chatterjee, V. Bhatia, and S. Prakash, “Anti-spoof touchless 3D fingerprint recognition system using single shot fringe projection and biospeckle analysis,” Opt. Lasers Eng. 95, 1–7 (2017). [CrossRef]  

11. Y. Yolanda, D. López, A. Martínez García, and J. Gómez, “Apple quality study using fringe projection and colorimetry techniques,” Optik (Stuttg.) 147, 401–413 (2017). [CrossRef]  

12. K. Haskamp, M. Kästner, and E. Reithmeier, “Accurate calibration of a fringe projection system by considering telecentricity,” Proc. SPIE 8082, 80821B (2011). [CrossRef]  

13. L. Rao, F. Da, W. Kong, and H. Huang, “Flexible calibration method for telecentric fringe projection profilometry systems,” Opt. Express 24(2), 1222–1237 (2016). [CrossRef]   [PubMed]  

14. M. Wang, Y. Yin, D. Deng, X. Meng, X. Liu, and X. Peng, “Improved performance of multi-view fringe projection 3D microscopy,” Opt. Express 25(16), 19408–19421 (2017). [CrossRef]   [PubMed]  

15. H. Zhao, C. Zhang, C. Zhou, K. Jiang, and M. Fang, “Circular fringe projection profilometry,” Opt. Lett. 41(21), 4951–4954 (2016). [CrossRef]   [PubMed]  

16. M. Takeda, “Fourier fringe analysis and its application to metrology of extreme physical phenomena: a review [Invited],” Appl. Opt. 52(1), 20–29 (2013). [CrossRef]   [PubMed]  

17. Q. Kemao, “Two-dimensional windowed Fourier transform for fringe pattern analysis: principles, applications and implementations,” Opt. Lasers Eng. 45(2), 304–317 (2007). [CrossRef]  

18. J. Zhong and J. Weng, “Phase retrieval of optical fringe patterns from the ridge of a wavelet transform,” Opt. Lett. 30(19), 2560–2562 (2005). [CrossRef]   [PubMed]  

19. M. Trusiak, K. Patorski, and K. Pokorski, “Hilbert-Huang processing for single-exposure two-dimensional grating interferometry,” Opt. Express 21(23), 28359–28379 (2013). [CrossRef]   [PubMed]  

20. C. Zhang, H. Zhao, F. Gu, and Y. Ma, “Phase unwrapping algorithm based on multi-frequency fringe projection and fringe background for fringe projection profilometry,” Meas. Sci. Technol. 26(4), 045203 (2015). [CrossRef]  

21. C. Zhang, H. Zhao, and L. Zhang, “Fringe order error in multifrequency fringe projection phase unwrapping: reason and correction,” Appl. Opt. 54(32), 9390–9399 (2015). [CrossRef]   [PubMed]  

22. K. Falaggis, D. P. Towers, and C. E. Towers, “Algebraic solution for phase unwrapping problems in multiwavelength interferometry,” Appl. Opt. 53(17), 3737–3747 (2014). [CrossRef]   [PubMed]  

23. Y. Wang and S. Zhang, “Novel phase-coding method for absolute phase retrieval,” Opt. Lett. 37(11), 2067–2069 (2012). [CrossRef]   [PubMed]  

24. F. Chen and X. Su, “Phase-unwrapping algorithm for the measurement of 3D object,” Optik (Stuttg.) 123(24), 2272–2275 (2012). [CrossRef]  

25. G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. 38(31), 6565–6573 (1999). [CrossRef]   [PubMed]  

26. Y. An, J. S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express 24(16), 18445–18459 (2016). [CrossRef]   [PubMed]  

27. M. Zhao, L. Huang, Q. Zhang, X. Su, A. Asundi, and Q. Kemao, “Quality-guided phase unwrapping technique: comparison of quality maps and guiding strategies,” Appl. Opt. 50(33), 6214–6224 (2011). [CrossRef]   [PubMed]  

28. S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]  



Figures (12)

Fig. 1 Geometric model of OCFPP.
Fig. 2 Hardware layout of OCFPP.
Fig. 3 Error rate of the calculated z corresponding to different values of θ: (a) the simulated error in MN is 0.02 mm and in θ is 0 radians; (b) the simulated error in MN is 0 mm and in θ is 0.01 radians.
Fig. 4 3D model of the hardware system of OCFPP.
Fig. 5 Image of a 2D ruler with a reticle projected onto it. The interval between two adjacent lines is 5 mm; the white reticle is projected by a projector, and its center C_0 is projected from the principal point of the projector (i.e., the zero-phase point of the projected circular fringe pattern); the red rectangle marks the field of view of a camera; C can be any point within the red rectangle.
Fig. 6 3D measurement results of a plane in a simulation when (a) k = 0.001731 and (x_c0, y_c0) = (4960, 11), and (b) k = 1 and (x_c0, y_c0) = (4960, 1), with all other parameters ideal. During the simulation, the profile of the measured plane is z = 600 mm, the true value of k is 0.001731, and the true value of (x_c0, y_c0) is (4960, 1).
Fig. 7 Illustration of the search process of PCM. The rectangle signifies the region to be searched for (x_c0, y_c0), the red dots denote the points to be searched, and the blue reticle represents the actual location of (x_c0, y_c0).
Fig. 8 3D profiles of a plane with z = 600 mm recovered with (a) CFPP and (b) OCFPP, respectively, when Gaussian noise with mean 0 and variance 0.003 radians is added to the simulated phase.
Fig. 9 Images of a ball (left) and a fringe pattern captured from it (right).
Fig. 10 3D profiles of the measured ball reconstructed with (a) 2DRM and (b) PCM, respectively.
Fig. 11 Images of a medal (left) and a chinar leaf (right) to be measured.
Fig. 12 3D measurement results of (a) the medal and (b) the chinar leaf.

Tables (2)

Table 1 Pixel coordinates of the zero-phase point calculated with 2DRM and PCM respectively
Table 2 Results of sphere fitting applied to 3D data obtained with 2DRM and PCM respectively

Equations (9)


$$z = MN \cot\theta.$$
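Equation (1) is the source of the error amplification near the zero-phase point: since dz/dθ = −MN/sin²θ, a fixed angular error produces a depth error that grows rapidly as θ approaches zero. A minimal numeric sketch of this behavior (the values of MN and the angular error are hypothetical, chosen only for illustration):

```python
import math

# In z = MN * cot(theta), a small angular error d_theta produces
# |dz| ~ MN / sin(theta)^2 * d_theta, which blows up as theta -> 0
# (i.e., near the zero-phase point).
MN = 100.0        # assumed segment length in mm (hypothetical)
d_theta = 0.001   # assumed angular error in radians (hypothetical)

def depth_error(theta):
    """Depth error caused by d_theta at a given viewing angle theta."""
    z_true = MN / math.tan(theta)
    z_meas = MN / math.tan(theta + d_theta)
    return abs(z_meas - z_true)

for theta in (1.0, 0.5, 0.1, 0.01):
    print(f"theta = {theta:5.2f} rad  ->  |dz| = {depth_error(theta):10.3f} mm")
```

The printed errors grow by orders of magnitude as θ shrinks, which is why a coaxial layout (where θ → 0 around the zero-phase point) degrades the overall accuracy, and why the off-axis layout helps.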
$$z = \frac{\mu_c \sqrt{(x_c - x_{c0})^2 + (y_c - y_{c0})^2}}{k \beta\, \Phi(x_c, y_c)},$$

$$k = \frac{\mu_c \sqrt{(x_c - x_{c0})^2 + (y_c - y_{c0})^2}}{\beta\, \Delta z} \left[ \frac{1}{\Phi_{\Delta z}(x_c, y_c)} - \frac{1}{\Phi_0(x_c, y_c)} \right],$$

$$\begin{cases} x = \mu_c x_c / \beta + \xi_x \\ y = \mu_c y_c / \beta + \xi_y. \end{cases}$$

$$\begin{cases} x_{c0} = x_c + \beta (X_{c0} - X_c) / \mu_c \\ y_{c0} = y_c + \beta (Y_{c0} - Y_c) / \mu_c. \end{cases}$$

$$z_{\mathrm{virtual}}^{q}(x_c, y_c) = \frac{\mu_c \sqrt{(x_c - x_{c0}^{q})^2 + (y_c - y_{c0}^{q})^2}}{\beta\, \Phi(x_c, y_c)}.$$

$$z_{\mathrm{fit}}^{q}(x_c, y_c) = a^{q} x_c + b^{q} y_c + c^{q}.$$

$$\mathrm{RMSE}(q) = \sqrt{\frac{\sum_{x_c = 1}^{\mathrm{column}} \sum_{y_c = 1}^{\mathrm{row}} \left( z_{\mathrm{virtual}}^{q}(x_c, y_c) - z_{\mathrm{fit}}^{q}(x_c, y_c) \right)^2}{\mathrm{row} \cdot \mathrm{column}}},$$

$$(x_{c0}, y_{c0}) = \left( \overline{x_{c0}^{q}},\ \overline{y_{c0}^{q}} \right),$$
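The plane-constraint search built from Eqs. (6)–(8) can be sketched as follows: for each candidate zero-phase point, compute the virtual depth map of a flat reference plane, least-squares fit a plane to it, and keep the candidate with the smallest RMSE. This is a minimal illustration, not the paper's implementation; all numeric values (system constants, grid size, true zero-phase point) are hypothetical, and the reference-plane phase is synthesized by inverting Eq. (6).

```python
import numpy as np

# Minimal sketch of the plane-constraint method (PCM), assuming Eqs. (6)-(8).
mu_c, beta = 0.005, 1e-4                 # assumed system constants (hypothetical)
rows, cols = 60, 80
yc, xc = np.mgrid[1:rows + 1, 1:cols + 1].astype(float)

true_c0 = (40.5, 30.5)                   # hypothetical true zero-phase point
z_plane = 600.0                          # flat reference plane at z = 600 mm
# Synthesize the phase the flat plane would produce (Eq. (6), solved for phi).
phi = mu_c * np.hypot(xc - true_c0[0], yc - true_c0[1]) / (beta * z_plane)

def rmse_for(c0):
    """RMSE of a plane fit to the virtual depth map for candidate c0."""
    zv = mu_c * np.hypot(xc - c0[0], yc - c0[1]) / (beta * phi)   # Eq. (6)
    A = np.column_stack([xc.ravel(), yc.ravel(), np.ones(xc.size)])
    coef, *_ = np.linalg.lstsq(A, zv.ravel(), rcond=None)         # Eq. (7)
    return np.sqrt(np.mean((zv.ravel() - A @ coef) ** 2))         # Eq. (8)

# Grid search over candidate zero-phase points, as in the search of Fig. 7.
candidates = [(0.5 * i, 0.5 * j) for i in range(76, 86) for j in range(56, 66)]
best = min(candidates, key=rmse_for)
print("estimated zero-phase point:", best)
```

When the candidate coincides with the true zero-phase point, the virtual depth map is exactly the flat plane and the RMSE collapses toward zero, so the grid search recovers (40.5, 30.5); in practice one would search at finer steps around the coarse minimum.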
