## Abstract

Sinusoidal fringe patterns are widely used in optical profilometry; however, the traditional constant-frequency sinusoidal fringe pattern reduces 3D measurement accuracy in the defocus region. To this end, this paper presents a variable-frequency sinusoidal fringe pattern method in which the frequency is optimized according to the measurement depth. The proposed method improves the pixel matching accuracy and thus increases measurement accuracy. This paper theoretically determines the optimal frequency by analyzing the pixel matching error caused by intensity noise in a captured image; presents the online frequency optimization along the abscissa and ordinate axes of the sinusoidal fringe patterns; and details the encoding and decoding needed to use variable-frequency fringe patterns for 3D profilometry. Simulations and experiments demonstrate that the proposed method improves 3D measurement accuracy and increases measurement robustness.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

## 1. Introduction

3D surface measurement techniques have been applied in various fields [1–3]. Among these techniques, fringe projection profilometry has several advantages, such as being non-contact and having high accuracy and flexibility [4]. Phase shifting algorithms are extensively adopted in fringe projection profilometry because of their high resolution, high speed, and full-field inspection. A series of encoded sinusoidal fringe patterns is emitted from a projector, and the deformed patterns are captured by a camera to decode the relative phase from −*π* to *π* via the arctangent function. To resolve the 2*π* discontinuity for 3D surface measurement, a spatial or temporal phase unwrapping method is typically required [5]. Spatial phase unwrapping methods determine the number of 2*π* multiples to be added at each pixel by analyzing the phase map itself, usually via optimization; a number of reliability-guided spatial phase unwrapping methods have been reviewed in [6]. In general, it is difficult to use a spatial phase unwrapping method for discontinuous 3D shape measurement. Temporal phase unwrapping methods rely on capturing additional patterns in a time sequence to resolve the 2*π* discontinuity problem. Such methods usually operate pixel by pixel without optimization, and thus they are more robust and versatile. A number of temporal phase unwrapping methods have been developed over the years, including two- or multi-wavelength phase shifting [7, 8], gray coding, and spatial coding.

In a typical digital fringe projection system with a phase-shifting technique, a set of uniform-frequency sinusoidal fringe patterns is usually used, and the N-step phase shifting algorithm is extensively employed because of its ability to perform pixel-by-pixel measurement rapidly and accurately. In such a technique, generating high-quality sinusoidal fringe patterns is critical to achieving high measurement accuracy. However, commercial projectors such as those based on digital light processing (DLP) techniques typically have large apertures to maximize the light efficiency (e.g., to increase output light intensity). Consequently, all these projectors have a narrow depth-of-field and display non-distorted fringe images only on a fronto-parallel screen at the focus depth [9]. By virtue of the defocus character of projectors, binary defocusing techniques have been proposed to generate high-quality sinusoidal fringe patterns [10, 11]. The intensity of fringe patterns in a projector is iteratively optimized by dithering technology to increase the captured fringe pattern quality [12–15]. Although these methods improve the quality of the sinusoidal fringe pattern captured by the camera, pixel-by-pixel or subgroup-by-subgroup optimization based on the object depth range to be measured has not been attempted. As a result, the measurement quality may vary across the object surface. This problem is more pronounced when the object is moving, because it is then difficult to generate optimized fringe patterns for each measurement.

Generally, the defocusing effect can be represented by the breadth of the point spread function (*PSF*), which is approximated by a Gaussian filter [16], and the degree of blur is characterized by the defocus kernel. Different from camera defocus, the defocus degree of a projector is scene-independent [17]; it is related to the distance from the scene point to the projector lens center [18]. Several methods have been proposed for estimating the defocus kernel to recover the depth from the defocus-distance calibration relationship [9, 19, 20]. Ideally, if the captured images reflected from the measurement surface were noise-free and thus corresponded to perfect sinusoidal waves, the defocus *PSF*, acting as a standard Gaussian filter, would attenuate only the amplitude of the sinusoidal wave. Consequently, the phase decoded by phase shifting would equal the designed phase regardless of the frequency and step number of phase shifting. However, commercial cameras and projectors suffer from different kinds of noise, such as noise caused by temperature, exposure time, environment, and digital quantization [21]. Thus, noise is inherent to the captured images during the projection and capture processes, which affects the phase quality and thus the pixel matching accuracy.

To increase the phase quality for more precise pixel matching, this paper analyzes the pixel matching error with respect to the frequency and amplitude of a sinusoidal fringe pattern captured by a camera. It is found that the higher the frequency and the larger the amplitude of the captured sinusoidal fringe pattern, the lower the pixel matching error. On the one hand, when the frequency becomes higher, the same intensity error along the ordinate axis corresponds to a smaller pixel coordinate shift along the abscissa axis. On the other hand, when the amplitude of the sinusoidal wave is increased, the pixel coordinate shift along the abscissa axis corresponding to the same intensity error is likewise reduced. However, the defocus *PSF*, generally described by a standard Gaussian filter, is a low-pass filter; that is, a higher frequency results in a captured sinusoidal fringe pattern with a smaller amplitude. Because the amplitude is attenuated as the frequency grows, the decrease of the pixel matching error obtained by increasing the frequency is not monotonic. Therefore, there exist optimal frequencies of the sinusoidal fringe pattern for different defocus degrees in different measurement regions at various depths, and the sinusoidal fringe patterns to be projected should be designed with variable frequency to reduce the pixel matching error induced by image noise. In this case, according to the variable measurement depth, the corresponding variable defocus kernels of the projector and camera are calibrated using the methods presented in [22] and [23], respectively, and the optimal variable-frequency sinusoidal fringe pattern along the abscissa and ordinate axes in the projector is then designed to reduce the pixel matching error in the camera. Finally, a phase decoding algorithm for the variable-frequency sinusoidal fringe pattern is developed for 3D measurement.
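This tradeoff can be illustrated numerically. The sketch below uses a pixel-matching-error proxy that follows the qualitative argument above (sensitivity proportional to 1/*ω*, amplitude attenuated by the Gaussian transfer function exp(−*σ*²*ω*²/2)); the constants are illustrative and not the paper's exact expressions.

```python
import math

# Pixel-matching-error proxy: phase-to-pixel sensitivity ~ 1/omega, while the
# fringe amplitude is attenuated by the Gaussian defocus transfer function
# exp(-sigma^2 * omega^2 / 2). Larger proxy = larger matching error.
def error_proxy(omega, sigma):
    return math.exp(sigma**2 * omega**2 / 2) / omega

sigma = 2.0
omegas = [i / 1000 for i in range(1, 2001)]  # search frequencies 0.001 ... 2.0
best = min(omegas, key=lambda w: error_proxy(w, sigma))
print(best)  # minimum near 1/sigma = 0.5
```

The grid search locates the minimum at the reciprocal of the defocus kernel, which anticipates the optimal-frequency result derived in Section 2.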

Section 2 details the entire principle of the optimization method for a depth-driven variable-frequency sinusoidal fringe pattern in fringe projection profilometry. To verify our proposed methods, measurement simulations and experiments are conducted in Section 3. We summarize this research in Section 4.

## 2. Principle

To improve the pixel matching accuracy for 3D measurement, the principle of the optimized depth-driven variable-frequency sinusoidal fringe pattern in fringe projection profilometry is proposed in this section. The 3D measurement accuracy for phase shifting mainly depends on the pixel matching accuracy, which determines the correspondence between the projector and camera. The pixel matching accuracy relies on the quality of the sinusoidal fringe pattern captured by the camera, which is affected by the inevitable image noise. Therefore, the pixel matching error induced by image noise in the captured defocused image is analyzed, and then the optimal frequency, according to the defocus analysis, is presented in the frequency domain; see section 2.1. Then, taking into account the defocus of the projector and camera, the online frequency optimization for the sinusoidal fringe patterns along abscissa and ordinate axes in the projector is presented in section 2.2 to improve the pixel matching accuracy. For phase shifting, the variable-frequency sinusoidal fringe patterns are designed in section 2.3, and the decoding algorithm is developed in section 2.4.

#### 2.1. Pixel matching error induced by intensity error

Without loss of generality, the sinusoidal fringe patterns along both abscissa and ordinate axes in the projector are used to improve the accuracy and robustness of correspondence matching in this paper.

The error of 3D measurement by phase shifting in fringe projection profilometry comes directly from the pixel matching error between the camera and projector, and the pixel matching error depends upon the decoded phase, where the superscript *c* refers to the *camera*, *N* is the total number of phase-shifting steps, and *i* = 1, 2, ⋯, *N* is the sequence order of the fringe patterns. As mentioned in the introduction, the sinusoidal fringe images captured by the camera are corrupted by different kinds of noise; thus, the captured fringes are not perfectly sinusoidal waves. We define the decoded phase $\varphi ({U}_{c},{V}_{c})$ as a function of the *N* gray intensities ${I}_{1}^{c}({U}_{c},{V}_{c}),{I}_{2}^{c}({U}_{c},{V}_{c}),\cdots ,{I}_{N}^{c}({U}_{c},{V}_{c})$ of these images as follows:

$$\varphi ({U}_{c},{V}_{c})=\mathrm{arctan}\left[\frac{{\sum}_{i=1}^{N}{I}_{i}^{c}({U}_{c},{V}_{c})\mathrm{sin}(2\pi i/N)}{{\sum}_{i=1}^{N}{I}_{i}^{c}({U}_{c},{V}_{c})\mathrm{cos}(2\pi i/N)}\right].$$
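As a concrete sketch of this decoding, the standard N-step arctangent formula can be exercised on synthetic data; the sketch assumes the phase-shift convention $I_i = I' + I''\cos(\varphi - 2\pi i/N)$.

```python
import math

def decode_phase(I):
    """Standard N-step phase-shifting decode for I_i = I' + I''*cos(phi - 2*pi*i/N)."""
    N = len(I)
    s = sum(Ii * math.sin(2 * math.pi * i / N) for i, Ii in enumerate(I))
    c = sum(Ii * math.cos(2 * math.pi * i / N) for i, Ii in enumerate(I))
    return math.atan2(s, c)  # wrapped phase in (-pi, pi]

# synthesize one noise-free pixel with a known phase and verify exact recovery
phi_true = 1.234
N = 4
I = [128 + 100 * math.cos(phi_true - 2 * math.pi * i / N) for i in range(N)]
print(round(decode_phase(I), 6))  # -> 1.234
```

With noise-free intensities the designed phase is recovered exactly, which is the ideal case described above; the rest of this section quantifies what intensity noise does to this decode.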

An ideal captured fringe image carrying the phase $\varphi (x,y)$ can be expressed as

$${I}_{i}^{c}(x,y)={I}^{c\prime}+{I}^{c\prime\prime}\mathrm{cos}\left[\varphi (x,y)-2\pi i/N\right],$$

where ${I}^{c\prime}$ is the average intensity and ${I}^{c\prime\prime}$ is the amplitude. The derivative of $\varphi $ with respect to ${I}_{i}^{c}$ can therefore be obtained by differentiating Eq. (3). The intensity error Δ*I* of all pixels is bounded because of the limited emitted energy. To present the maximum decoded phase error resulting from the intensity noise, without loss of generality, the errors at the same pixel coordinates for the entire sequence of phase-shifting images are set to Δ*I* under the same capture condition, $\mathrm{\Delta}{I}_{1}^{c}=\mathrm{\Delta}{I}_{2}^{c}=\cdots =\mathrm{\Delta}{I}_{N}^{c}=\mathrm{\Delta}I$. Then, Eq. (5) is substituted into Eq. (3). Hence, the maximum decoded phase error ${\left|\mathrm{\Delta}\varphi \right|}_{\mathrm{max}}$ is proportional to $\mathrm{\Delta}I/{I}^{c\prime\prime}$.

Here, $({U}_{p},{V}_{p})$ is the matched pixel coordinate in the projector corresponding to $({U}_{c},{V}_{c})$. The absolute total phase along the ${U}_{p}$-axis in the projector image plane at $({U}_{p},{V}_{p})$ can be generally obtained as $\varphi ={\int}_{0}^{{U}_{p}}\omega \phantom{\rule{0.2em}{0ex}}d{U}_{p}$, and the derivative of $\varphi $ with respect to ${U}_{p}$ is $\frac{d\varphi}{d{U}_{p}}=\omega $. Thus, the maximum pixel matching error along the ${U}_{p}$-axis can be written as

$${\left|\mathrm{\Delta}{U}_{p}\right|}_{\mathrm{max}}=\frac{{\left|\mathrm{\Delta}\varphi \right|}_{\mathrm{max}}}{\omega},$$

where *ω* is the frequency of the sinusoidal fringes.
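A small Monte Carlo sketch (simulation parameters are illustrative) shows this relation at work: with the fringe amplitude held fixed, the same intensity noise yields a pixel matching error that shrinks as the frequency grows.

```python
import math, random

def decode_phase(I):
    """N-step phase-shifting decode for I_i = I' + I''*cos(phi - 2*pi*i/N)."""
    N = len(I)
    s = sum(Ii * math.sin(2 * math.pi * i / N) for i, Ii in enumerate(I))
    c = sum(Ii * math.cos(2 * math.pi * i / N) for i, Ii in enumerate(I))
    return math.atan2(s, c)

def mean_pixel_error(omega, amplitude, noise, trials=2000, N=8):
    """Mean |delta U_p| = |delta phi| / omega over noisy decodes (Eq. (7))."""
    rng = random.Random(0)  # fixed seed: identical noise for every call
    total = 0.0
    for _ in range(trials):
        phi = rng.uniform(-1.0, 1.0)
        I = [128 + amplitude * math.cos(phi - 2 * math.pi * i / N)
             + rng.gauss(0, noise) for i in range(N)]
        total += abs(decode_phase(I) - phi) / omega
    return total / trials

# fixed amplitude: doubling the frequency halves the pixel matching error
print(mean_pixel_error(0.2, 100, 2.0) > mean_pixel_error(0.4, 100, 2.0))  # True
```

The sketch isolates the 1/*ω* factor; the defocus attenuation of the amplitude, which works against this gain, is modeled next.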

When the projector projects the image onto an out-of-focus plane, the displayed image is blurred due to defocus. The defocus can be modeled as a Gaussian filter,

$$G({U}_{p})=\frac{1}{\sqrt{2\pi}\sigma}\mathrm{exp}\left(-\frac{{U}_{p}^{2}}{2{\sigma}^{2}}\right),$$

where *σ* is the defocus kernel and $G({U}_{p})$ and $G({V}_{p})$ are the Gaussian filters for the ${U}_{p}$ and ${V}_{p}$ directions, respectively. Theoretically, the sinusoidal fringe patterns of *N*-step phase shifting for the ${U}_{p}$-axis can be presented as

$${I}_{i}^{p}({U}_{p},{V}_{p})={I}^{p\prime}+{I}^{p\prime\prime}\mathrm{cos}\left({\omega}_{u}{U}_{p}-2\pi i/N\right),$$

where ${I}^{p\prime}$ is the average intensity, ${I}^{p\prime\prime}$ is the amplitude, and ${\omega}_{u}$ is the frequency of the fringe pattern along the ${U}_{p}$ direction in the projector; the subscript and superscript letter *p* refers to the *projector*. The output image blurred by defocus can be modelled as the convolution of the defocus Gaussian filter with the designed sinusoidal fringe pattern. Equation (10) is transferred into the frequency domain by the Fourier transform, in which the spectra of the Gaussian filters are $\mathcal{F}(G({U}_{p}))$ and $\mathcal{F}(G({V}_{p}))$, respectively. Thus the displayed image can be retrieved, and the amplitude of the captured image ${I}^{c}({U}_{c},{V}_{c})$ blurred by defocus is attenuated by $\mathcal{F}(G({U}_{p}))$; namely, the attenuation grows as ${\omega}_{u}$ is increased. Similarly, the amplitude attenuation for fringe patterns along the ${V}_{p}$-axis can be obtained. Thus, the maximum pixel matching error in Eq. (7) can be rewritten as

$${\left|\mathrm{\Delta}{U}_{p}\right|}_{\mathrm{max}}\propto \mathrm{\Delta}I\cdot {h}_{\sigma}({\omega}_{u}),\phantom{\rule{2em}{0ex}}{h}_{\sigma}(\omega )=\frac{1}{{I}^{p\prime\prime}\mathcal{F}(G({U}_{p}))\phantom{\rule{0.2em}{0ex}}\omega}.$$

Figure 1 intuitively illustrates the influence of the fringe frequency and image noise on pixel matching accuracy. From Eq. (14), Eq. (15), and Fig. 1, we can see that 1) when the frequency is higher, the attenuation term $\mathcal{F}(G({U}_{p}))$ reduces the amplitude of the fringe pattern on the display, so the phase decoding error induced by image noise is increased; and 2) when the frequency of the fringes is lower, the phase sensitivity with respect to intensity variation is augmented. Each of these two aspects reduces the SNR of the sinusoidal fringe pattern; thereby, there exists an optimized frequency that minimizes the pixel matching error.

The optimal frequency of the designed fringe pattern is obtained as

$${\omega}_{\mathrm{opt}}=\frac{1}{\sigma}$$

by solving the equation ${h}_{\sigma}{(\omega )}^{\prime}=0$, where ${h}_{\sigma}{(\omega )}^{\prime}$ is the first-order derivative of Eq. (15) with respect to *ω*. Thus, ${\left|\mathrm{\Delta}{U}_{p}\right|}_{\mathrm{max}}$ is minimized when the frequency is set to ${\omega}_{\mathrm{opt}}$, under the condition that the second-order derivative ${h}_{\sigma}{({\omega}_{\mathrm{opt}})}^{\u2033}>0$.

#### 2.2. Optimized frequency based on depth

A projector is usually produced with a large aperture; thus, the displayed image is blurred because of defocus when the display plane is at an out-of-focus depth. Following the geometric optics of a projector, shown in Fig. 2(a), when the display plane is out of focus, the image of a point with radius *r* on the display plane is no longer a point with radius *r* but a blurred circular patch with radius ${R}_{p}(u)$. The blurring kernel ${\sigma}_{p}$ of the projector is proportional to the patch radius [18],

$${R}_{p}(u)={D}_{p}\phantom{\rule{0.2em}{0ex}}u\left|\frac{1}{{f}_{p}}-\frac{1}{v}-\frac{1}{u}\right|,\phantom{\rule{2em}{0ex}}{\sigma}_{p}={a}_{p}{R}_{p}(u)+{b}_{p},$$

where ${f}_{p}$ is the focal length of the projector, *u* is the depth of the display plane with respect to the lens, ${D}_{p}$ is the radius of the projector lens, *v* is the distance from the image plane to the lens, and ${a}_{p}$ and ${b}_{p}$ are the fitted slope and intercept.

Similar to projector defocus, an out-of-focus scene point is also blurred in the camera image plane according to the camera geometric optics model (Fig. 2(b)). Thus, when a point on a measured object surface is out of focus, its image on the camera image plane is not a point but a circular patch with radius ${R}_{c}(u)$. The blur kernel of the camera defocus can be presented as

$${R}_{c}(u)={D}_{c}\phantom{\rule{0.2em}{0ex}}s\left|\frac{1}{{f}_{c}}-\frac{1}{u}-\frac{1}{s}\right|,\phantom{\rule{2em}{0ex}}{\sigma}_{c}={a}_{c}{R}_{c}(u)+{b}_{c},$$

where ${f}_{c}$ is the focal length of the camera, *u* is the depth of the scene point with respect to the lens, ${D}_{c}$ is the radius of the camera lens, *s* is the distance from the camera image plane to the lens, and ${a}_{c}$ and ${b}_{c}$ are the fitted parameters.

The projected pattern is blurred by the projector defocus and the camera defocus successively; thus, the total defocus kernel *σ* can be obtained by the superposition principle of Gaussian convolution as

$$\sigma =\sqrt{{\sigma}_{p}^{2}+{\sigma}_{c}^{2}}.$$
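The superposition rule (convolving two Gaussians gives a Gaussian whose variance is the sum of the variances) can be checked numerically; kernel sizes and *σ* values below are illustrative.

```python
import math

def gaussian(sigma, half=60):
    """Discrete, normalized 1-D Gaussian kernel sampled on integers."""
    g = [math.exp(-x * x / (2 * sigma * sigma)) for x in range(-half, half + 1)]
    s = sum(g)
    return [v / s for v in g]

def convolve(a, b):
    """Full discrete convolution of two kernels."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, av in enumerate(a):
        for j, bv in enumerate(b):
            out[i + j] += av * bv
    return out

def kernel_std(k):
    """Standard deviation of a normalized kernel about its center."""
    mid = (len(k) - 1) / 2
    return math.sqrt(sum(v * (i - mid) ** 2 for i, v in enumerate(k)))

sigma_p, sigma_c = 3.0, 4.0
total = convolve(gaussian(sigma_p), gaussian(sigma_c))
print(round(kernel_std(total), 2))  # -> 5.0  (= sqrt(3^2 + 4^2))
```

The measured width of the cascaded blur matches the quadrature sum of the two kernels, which is the total kernel used for the frequency optimization below.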

We use the well-known pinhole model to describe the camera imaging system. The projection from 3D coordinates $({x}_{c},{y}_{c},{z}_{c})$ in the camera coordinate system to 2D pixel coordinates $({U}_{c},{V}_{c})$ in the imaging plane can be presented as

$${U}_{c}={f}_{cu}\frac{{x}_{c}}{{z}_{c}}+{U}_{c0},\phantom{\rule{2em}{0ex}}{V}_{c}={f}_{cv}\frac{{y}_{c}}{{z}_{c}}+{V}_{c0},$$

where ${f}_{cu}$ and ${f}_{cv}$ are the focal lengths of the camera for the *U*-axis and *V*-axis and $({U}_{c0},{V}_{c0})$ is the principal point. The normalized camera pixel coordinates $({u}_{c},{v}_{c})$ can be calculated using Eq. (20) with the values of ${f}_{cu}$, ${f}_{cv}$, and $({U}_{c0},{V}_{c0})$ obtained by the camera calibration:

$${u}_{c}=\frac{{U}_{c}-{U}_{c0}}{{f}_{cu}},\phantom{\rule{2em}{0ex}}{v}_{c}=\frac{{V}_{c}-{V}_{c0}}{{f}_{cv}}.$$
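A quick numeric check of the pinhole projection and normalization just described; the intrinsic values are illustrative, and `project`/`normalize` are hypothetical helper names.

```python
def project(xc, yc, zc, fcu, fcv, Uc0, Vc0):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates."""
    return fcu * xc / zc + Uc0, fcv * yc / zc + Vc0

def normalize(Uc, Vc, fcu, fcv, Uc0, Vc0):
    """Normalized coordinates (u_c, v_c) recovered from pixel coordinates."""
    return (Uc - Uc0) / fcu, (Vc - Vc0) / fcv

# Illustrative intrinsics; normalization recovers x/z and y/z exactly.
fcu = fcv = 2400.0
Uc0, Vc0 = 1280.0, 1024.0
Uc, Vc = project(0.1, -0.05, 1.0, fcu, fcv, Uc0, Vc0)
uc, vc = normalize(Uc, Vc, fcu, fcv, Uc0, Vc0)
print(round(uc, 6), round(vc, 6))  # -> 0.1 -0.05
```

The normalized pair $({u}_{c},{v}_{c})$ is exactly the ray direction $({x}_{c}/{z}_{c},{y}_{c}/{z}_{c})$, which is why the depth relationship below can be written in terms of these quantities.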

The same pinhole model is applied to the projector, for which $({x}_{p},{y}_{p},{z}_{p})$ and $({U}_{p},{V}_{p})$ are the 3D coordinates and 2D pixel coordinates, respectively. By virtue of the calibrated transformation matrix between the projector and camera coordinate systems, the 3D coordinate relationship between the camera and projector coordinate systems can be expressed as Eq. (23), where ${R}_{pc}={[{r}_{ij}]}_{3\times 3}$ and ${\mathbf{t}}_{pc}={[{t}_{1}\phantom{\rule{1em}{0ex}}{t}_{2}\phantom{\rule{1em}{0ex}}{t}_{3}]}^{T}$ are the rotation matrix and translation vector from the camera coordinate system to the projector coordinate system, respectively. Then, Eq. (21) and Eq. (22) are substituted into Eq. (23) to yield the relationship (Eq. (24)) between $({x}_{c},{y}_{c},{z}_{c})$ and ${u}_{c}$, ${v}_{c}$, and ${u}_{p}$, with $H={r}_{11}{u}_{c}+{r}_{12}{v}_{c}+{r}_{13}$ and $J={r}_{31}{u}_{c}+{r}_{32}{v}_{c}+{r}_{33}$. According to Eq. (24), the error of ${z}_{c}$ can be derived, and the total composite error at $({u}_{c},{v}_{c})$ can then be obtained, where the coefficient $k({u}_{c},{v}_{c},{u}_{p})$ depends on the pixel coordinates and depth, and ${K}_{\mathrm{sin}\varphi}=\frac{{M}_{\mathrm{sin}\varphi}}{N{I}^{p}{}^{\u2033}\cdot {f}_{cu}}$ is constant for a certain $\varphi $.

As shown in Fig. 3, for the vertical sinusoidal fringes along the *U* direction, the total errors vary over the interval $[{V}_{1},{V}_{2}]$ at a certain horizontal projector coordinate ${\widehat{U}}_{p}$ because of the varying $k({u}_{c},{v}_{c},{u}_{p})$, which results from the variable matched camera coordinates $({\widehat{U}}_{c},{\widehat{V}}_{c})$ corresponding to $({\widehat{U}}_{p},{\widehat{V}}_{p})$, and because of the variable *σ* due to the variable depth. The mean value of the maximum composite error from $({\widehat{U}}_{p},{V}_{1})$ to $({\widehat{U}}_{p},{V}_{2})$ is therefore minimized to determine the optimized frequency at ${\widehat{U}}_{p}$.

The aforementioned frequency optimization is for the vertical fringes at a certain pixel coordinate ${\widehat{U}}_{p}$; the optimized frequency for horizontal fringes at ${\widehat{V}}_{p}$ can be similarly obtained. Via the analysis of the pixel matching error induced by the intensity error in section 2.1, the optimal frequency for improved accuracy is determined by the defocus kernel at various depths. Thus, an online adaptive variable-frequency sinusoidal fringe pattern in fringe projection profilometry is proposed to improve the accuracy, with the depth as the feedback. The depth can be approximated from the computer-aided design (CAD) model, a coarse measurement, or other methods.

#### 2.3. Variable-frequency shifting fringe pattern design

To alleviate the composite error over the entire measurement area, the frequency of the vertical sinusoidal fringes along the ${U}_{p}$-axis and of the horizontal sinusoidal fringes along the ${V}_{p}$-axis should be optimized and varied according to the depth. From Eq. (17), the defocus kernel varies nonlinearly with the depth when a curved surface is measured in the projector coordinate system. However, it is mathematically difficult to wrap and unwrap the phase if the frequency is strictly chosen to be the optimal frequency, i.e., the reciprocal of the nonlinearly varying defocus kernel. To simplify wrapping and unwrapping, the frequency of the sinusoidal fringe pattern is optimized along only the ${U}_{p}$ (or ${V}_{p}$) direction, whereas the equivalent defocus kernels along ${U}_{p}$ (or ${V}_{p}$) are segmented and piecewise linearly fitted within each segment, as shown in Fig. 4.

One solution to approximate the depth is to measure the object surface with the uniform-frequency sinusoidal fringe pattern. Then, the defocus kernel is acquired via the kernel-distance relationship calibration according to Eq. (17) and Eq. (18). The phase coding along the *U*-axis at ${\widehat{U}}_{p}$ is

$$\varphi ({\widehat{U}}_{p})={\int}_{0}^{{\widehat{U}}_{p}}\omega ({U}_{p})\phantom{\rule{0.2em}{0ex}}d{U}_{p},$$

where $\omega ({U}_{p})$ is the frequency at ${U}_{p}$. According to the piecewise linearly fitted *σ*, the optimal phase at ${\widehat{U}}_{p}$ can be obtained as Eq. (33), where *m* is the order of the segment in which ${\widehat{U}}_{p}$ is included; ${W}_{s}$ is the width of a segment; ${C}_{m-1}$ is the accumulated phase from pixel coordinate 0 to the end of the (*m* − 1)th segment; and ${\alpha}_{j}$ and ${\beta}_{j}$ are the fitted slope and intercept of the *j*th segment, respectively. If ${\alpha}_{m}=0$, Eq. (33) can be rewritten as Eq. (34), in which ${S}_{u}$ is defined with ${U}_{p}^{r}$, the right limit of the pixel value matched in the projector image plane. Also, the optimal phase for the horizontal fringe pattern can be similarly determined based on $\varphi ({\widehat{V}}_{p})$. When all the phases along the ${U}_{p}$- and ${V}_{p}$-axes are determined via Eq. (33) or Eq. (34), the sinusoidal fringe pattern can be designed according to Eq. (36).
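A minimal sketch of this piecewise-linear scheme: with *σ* fitted per segment and the optimal frequency *ω* = 1/*σ*, the designed phase is the integral of 1/*σ*, and numeric integration can be checked against the per-segment closed form (a logarithm when the slope *α* is nonzero, mirroring the structure of Eq. (33)). All segment values below are illustrative.

```python
import math

# Piecewise-linear defocus kernel over projector columns: sigma = alpha_j*x + beta_j
# within segment j (illustrative values).
segments = [(0.02, 1.0), (0.0, 3.0), (-0.01, 5.0)]  # (slope alpha_j, intercept beta_j)
W = 100  # segment width W_s in pixels

def sigma_at(Up):
    j = min(int(Up // W), len(segments) - 1)
    alpha, beta = segments[j]
    return alpha * (Up - j * W) + beta

def phase(Up, step=0.01):
    """phi(Up) = integral_0^Up omega(U) dU with the optimal frequency omega = 1/sigma."""
    n = round(Up / step)
    return sum(step / sigma_at(k * step) for k in range(n))

# Inside a segment with alpha != 0 the integral has a closed logarithmic form.
alpha, beta = segments[0]
closed = math.log((alpha * 50 + beta) / beta) / alpha
print(abs(phase(50) - closed) < 1e-2)  # True
```

The agreement shows why the piecewise-linear fit is convenient: each segment's phase contribution has a simple closed form, so wrapping and unwrapping remain tractable.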

_{p}#### 2.4. Variable-frequency sinusoidal fringe pattern decoding algorithm

The variable-frequency sinusoidal fringe patterns are designed according to the measured defocus kernel. In the measurement process with the adaptive variable-frequency sinusoidal fringe pattern, the pixel coordinate first matched by the dual uniform-frequency sinusoidal fringe pattern is ${\overline{U}}_{p}$. By projecting the fringe patterns with variable frequency, the wrapped phase can be calculated, and the order $Q\left({\widehat{U}}_{c},{\widehat{V}}_{c}\right)$ of 2*π* can be estimated from the pixel coordinate obtained by unwrapping the uniform fringes: the segment order *m* is determined by ${\overline{U}}_{p}$ first, and then the phase order $Q\left({\widehat{U}}_{c},{\widehat{V}}_{c}\right)$ of 2*π* is obtained by Eq. (40). Thus, the matched pixels can be acquired by solving Eq. (41). Equation (33) and Eq. (34) are taken into consideration to calculate the matched pixels in two cases.

*Case* 1: ${\alpha}_{m}\ne 0$.

*Case* 2: ${\alpha}_{m}=0$.

Using Eq. (42) and Eq. (43), the pixels are rematched by the variable-frequency sinusoidal fringe pattern to improve the pixel matching accuracy.
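The guided unwrapping idea, using the coarse uniform-frequency result to fix the 2*π* order of the fine wrapped phase, can be sketched as follows; `unwrap_with_guide` is a hypothetical helper, and the coarse phase here stands in for the uniform-frequency pre-measurement.

```python
import math

def unwrap_with_guide(phi_wrapped, phi_coarse):
    """Recover the absolute phase from a wrapped value using a coarse estimate
    accurate to well within one period (hypothetical helper)."""
    Q = round((phi_coarse - phi_wrapped) / (2 * math.pi))  # order Q of 2*pi
    return phi_wrapped + 2 * math.pi * Q

phi_true = 23.7  # absolute phase of the fine pattern (illustrative)
wrapped = math.atan2(math.sin(phi_true), math.cos(phi_true))  # wrapped to (-pi, pi]
coarse = phi_true + 0.8  # coarse guide with sub-period error
print(round(unwrap_with_guide(wrapped, coarse), 6))  # -> 23.7
```

As long as the coarse estimate errs by less than half a period, the integer order is recovered exactly and the fine wrapped phase supplies the final accuracy.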

## 3. Experiment

To verify the performance of the proposed adaptive online feedback measurement method with variable-frequency sinusoidal fringes, a fringe projection profilometry system is developed, as shown in Fig. 5, which is composed of a CMOS camera (JAI GO5000M-USB) with a 25.4 *mm* focal-length lens and a digital light processing (DLP) projector (DLP LightCrafter 4500). The resolutions of the camera and projector are 2560 × 2048 and 1140 × 912, respectively. The measurement distance of the projector is 119 mm. The system is calibrated using the method proposed in [24].

#### 3.1. Projector and camera defocus kernel calibration

For the variable-frequency fringe pattern, the optimal frequency is designed based on the measured defocus kernel. To optimize the frequency online based on the defocus kernel in Eq. (17) and Eq. (18), the parameters ${a}_{p}$, ${b}_{p}$ and ${a}_{c}$, ${b}_{c}$ should be calibrated.

The defocus blur of projection can be presented as a convolution of a *PSF* with the image to be projected. The degradation caused by the projector defocus blur is usually modeled as a 2D Gaussian with a standard deviation *σ* [20]. The defocus kernel varies when the depth of the display plane with respect to the projector is changed. The kernel of the *PSF* is estimated via correlation analysis when the projector is situated at a certain depth [22]. For the projector defocus kernel calibration, the camera is set to be in focus on the display plane; the projector defocus calibration result is shown in Fig. 6(a). The image captured by the camera is also blurred by defocus when the scene is not in focus. The camera is fixed on a moving platform and can be moved along the camera optical axis. The method proposed in [23], in which the camera defocus kernel is estimated from a single image, is employed to calibrate the defocus kernel-distance relationship of the camera. The calibration result is shown in Fig. 6(b).

The fitted projector defocus kernel-distance relationship is given in Eq. (44), and the nonlinear camera defocus kernel-distance relationship is fitted with a cubic polynomial, Eq. (45).

#### 3.2. Simulations

To verify the optimal frequency method based on the pixel matching error induced by image noise, simulation experiments are conducted in several scenarios. A theoretical plane is set perpendicular to the optical axis of the projector at ${z}_{p}$ = 355 in our simulations. The uniform-frequency sinusoidal fringe patterns are generated and blurred by a Gaussian filter with a known kernel, and then random noise is added to the blurred patterns. The plane is measured by a series of different uniform-frequency sinusoidal fringe patterns, and the root-mean-square (RMS) error of plane fitting is obtained for every frequency. Four Gaussian filters with different defocus kernels and three different random noise levels are simulated. The changes in the RMS error over frequency for a certain kernel are plotted in Fig. 7, where the abscissa axis is the frequency and the ordinate axis represents the RMS error of plane fitting.

Figure 7 reveals that the RMS error is lowest when the frequency is set to *ω* = 1/*σ* for a certain defocus kernel. Also, it is found that the optimal frequency is consistent when the magnitude of random noise is changed.

#### 3.3. Measurement of field objects with variable depth

The proposed depth-driven variable-frequency pattern in fringe projection profilometry was tested by measuring a standard white balance plane placed catty-corner with respect to the front of the projector, a standard sphere in front of the projector, and a step-like object in front of the projector. The experimental setup is shown in Fig. 8.

First, the standard white balance plane, tilted in front of the projector, was measured. The number of steps is *N* = 20, and the plane is measured by a dual-frequency phase shifting method. The period number $n{T}_{h}$ of the high-frequency fringes is set to $n{T}_{h}$ = 10, 15, 20, 25, 30, and 35 along the ${U}_{p}$-axis; the frequency of the fringe patterns along the ${V}_{p}$-axis is set in the same way. As shown in Fig. 9, the Gaussian kernels of the measurement area are estimated by Eq. (44) and Eq. (45) according to the depths ${z}_{p}$ and ${z}_{c}$ measured with a uniform-frequency fringe pattern with $n{T}_{h}$ = 20. Thus, the phase of the variable-frequency fringe pattern can be designed for the ${U}_{p}$-axis (Fig. 10(a)) and the ${V}_{p}$-axis (Fig. 10(c)). To eliminate the edge influence on the plane fitting, the evaluated area is shrunk to 85% of the shared view (SV) between the camera and projector, centered in the view.

Then, the plane is measured using uniform-frequency and variable-frequency fringe patterns. The measurement results for uniform-frequency fringe patterns with different frequencies are illustrated in Fig. 11; they reveal that the measurement error increases in the area of the plane where the degree of defocus is larger. Figures 11(d)–11(f) illustrate that the measured point cloud of the plane is distorted and that the distortion in the more blurred area (the left areas in Figs. 11(a)–11(c)) is aggravated when the frequency is increased.

The variable-frequency fringe patterns (Fig. 12(a) and Fig. 12(b)) are generated by Eq. (36) according to the phases designed for the ${U}_{p}$-axis and ${V}_{p}$-axis. The measurement results obtained using the variable-frequency fringe pattern are shown in Fig. 12(c) and Fig. 12(d). Compared with the results obtained using the uniform-frequency fringe patterns shown in Fig. 11, the amplitude of the sinusoid-like measurement error wave, represented by the standard deviation (STD) of the fitting error marked at the bottom of Figs. 11(d)–11(f) and Fig. 12(d), is apparently attenuated throughout the measurement area by the variable-frequency fringe pattern. This indicates that the variable frequency can reduce the measurement error fluctuation and improve the robustness.

The plane is measured ten times at one position, and the RMS errors of plane fitting are plotted in Fig. 13(a). The same measurement experiment is conducted at two other positions, and the results are plotted in Fig. 13(b) and Fig. 13(c).

The mean values of the RMS errors of every plane fitting measured by uniform frequency and variable frequency at the three positions are listed in Table 1, which shows that the variable-frequency method improves the measurement accuracy by at least 5% over the methods using uniform frequencies. Furthermore, the variable-frequency design method gives the frequency range that can be selected for an arbitrary object with a known coarse depth measurement.

When the evaluated area is shrunk to 70% and 55% of the shared view between the camera and projector for uniform frequencies $n{T}_{h}$ = 10, 15, 20, 25, 30, 35 and for the variable frequency, the RMS errors of plane fitting in the three plane measurements are listed in Table 2.

From Fig. 13, Table 1, and Table 2, it is found that the measurement accuracy obtained using the variable-frequency fringe pattern is higher than that obtained using any uniform-frequency fringe pattern. As shown in Table 2, when the plane fitting area to be evaluated is shrunk further toward the view center, the measurement accuracy increases but is still inferior to the accuracy obtained by varying the frequency. Therefore, Table 2 reveals that the measurement involving the variable-frequency fringe pattern is superior to all those involving uniform-frequency fringe patterns in terms of both accuracy and robustness.

Second, as shown in Fig. 8(e), a standard sphere with radius *r* = 100 mm is measured. The defocus kernel maps of the camera and projector and their total value, obtained from a coarse depth measurement, are shown in Fig. 14. The variable period number (Fig. 15(b) and Fig. 15(d)) and the variable-frequency phase (Fig. 15(a) and Fig. 15(c)) are then designed according to the proposed variable-frequency fringe pattern design method, using the defocus kernel calculated from the coarsely measured depth.

The designed variable-frequency fringe patterns along the ${U}_{p}$-axis and ${V}_{p}$-axis are shown in Fig. 16(a) and Fig. 16(b), and the measurement results are shown in Fig. 16(c) and Fig. 16(d).

The measurement results for uniform-frequency fringe patterns with different frequencies are illustrated in Fig. 17; these results also indicate that the measurement accuracy of the defocused area at the contour of the shared view decreases when the frequency increases.

The sphere is measured ten times at each of two positions, and the measurement results are shown in Fig. 18.

The mean values of the RMS errors for each uniform frequency and for the variable frequency are listed in Table 3. The results indicate that the variable-frequency method outperforms the uniform-frequency method. For ${\overline{RMS}}_{1}$, the improvement is approximately 9%. Furthermore, the variable frequency obtains the best measurement result for ${\overline{RMS}}_{2}$. These experimental results demonstrate that it is valuable to determine an optimal frequency, even for the uniform-frequency method, according to the designed range of the variable frequency.

When the evaluated area is shrunk (Srk) by 100 pixels (pxs) and 200 pixels toward the center of the shared view between the camera and projector, the RMS errors of sphere fitting at the other two positions are listed in Table 4. The results also indicate that the variable-frequency measurement method achieves the highest accuracy, superior to the best result obtained with a uniform frequency.

Third, a step-like object (Fig. 8(c)) is measured. Because of the sharp change of depth, there are some shadow areas in the shared view on the cliff surfaces between the step planes; we set the defocus kernels of these areas to be constant for the variable-frequency fringe design along both the ${U}_{p}$-axis and the ${V}_{p}$-axis. When measuring this step-like object, the shadow areas are excluded. The defocus kernel maps of the camera and projector and their total value are shown in Fig. 19(c), and the correspondingly designed variable-frequency phase and variable frequency are shown in Fig. 20.

The measurement results are shown in Fig. 21.

Also, the step-like object is measured ten times, and the three parallel planes are fitted respectively. The norm vector ${\overrightarrow{N}}_{i}(i=1,2,3)$ of the plane corresponding to the minimum RMS fitting error is selected as the reference norm vector ${\overrightarrow{N}}_{\mathrm{min}}$, and the angle deviations between the fitted norm vectors of the other two planes and the reference norm vector are calculated.

From the fitting results in Table 5, the plane fitting results for PLANE1 and PLANE2 are superior to those obtained with the uniform-frequency method. Although PLANE3 is slightly inferior to the uniform-frequency result at *nTh* = 10, owing to the fitting gap of the defocus kernel between PLANE3 and PLANE2 as well as the relatively narrow area of PLANE3, the angle deviation between PLANE1 and PLANE3 decreases by about 55%. Therefore, the variable-frequency measurement achieves the best overall accuracy in terms of the three plane-fitting RMS errors and the norm vector deviations.

## 4. Summary and discussion

This paper proposed a depth-driven variable-frequency fringe pattern to improve the overall accuracy of 3D measurement in fringe projection profilometry. To reduce the pixel matching error caused by intense noise in the captured image, the optimal frequency was derived from the depth, which determines the defocus degree of the projector and camera. Online frequency optimization along the abscissa and ordinate axes of the sinusoidal fringe pattern according to depth feedback was then presented, followed by the coding and decoding methods for variable-frequency sinusoidal fringe patterns. The experimental results demonstrate the effectiveness of the designed variable-frequency fringe pattern in improving 3D shape measurement accuracy for scenes with variable depth. Although the proposed variable-frequency method selects the optimal frequency for each measurement depth and thereby improves accuracy, some limitations remain. First, a coarse measurement of the object depth is required to design the optimal frequency, which slows down the measurement process. Second, when the depth is so large that the defocus becomes severe, the optimal frequency yields no obvious improvement, because the amplitude of the sinusoidal wave is attenuated too strongly for the phase-shifting method to determine the phase accurately.

The defocus kernel of the projector is proportional to the measured scene depth, and during the coarse measurement of the depth, additional image processing is required to exclude gross errors; some areas cannot be reconstructed at all. Moreover, the optimal frequency cannot be calculated straightforwardly for an object with drastic changes in depth or discontinuous gaps in its surface, because of the simplification of fitting the defocus kernel linearly over sections of constant length. A deeper analysis of defocus kernel fitting with nonlinear models and adaptive section lengths will be carried out in future work.
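The simplification mentioned above, fitting the defocus kernel linearly over sections of constant length, can be sketched as follows (Python with NumPy; the function name and parameters are hypothetical, and the "section" is taken here as a fixed number of depth-sorted samples):

```python
import numpy as np

def piecewise_linear_kernel_fit(depth, sigma, section_len):
    """Fit defocus kernel width sigma versus depth with first-order
    polynomials over fixed-length sections of depth-sorted samples.

    depth, sigma: 1D arrays of equal length; section_len: samples per section.
    Returns the fitted sigma values in the original sample order.
    """
    order = np.argsort(depth)
    d, s = depth[order], sigma[order]
    fitted = np.empty_like(s)
    for start in range(0, len(d), section_len):
        seg = slice(start, min(start + section_len, len(d)))
        if seg.stop - seg.start < 2:
            fitted[seg] = s[seg]  # too few points for a line; keep raw values
            continue
        k, b = np.polyfit(d[seg], s[seg], 1)  # linear fit within the section
        fitted[seg] = k * d[seg] + b
    out = np.empty_like(fitted)
    out[order] = fitted  # restore the original ordering
    return out
```

An adaptive variant would choose section boundaries at detected depth discontinuities instead of using a constant length, which is the direction suggested above.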

## Funding

National Key R&D Program of China (2016YFE0206200); National Natural Science Foundation of China (NSFC) (U1613205, 51675291); State Key Laboratory of China (SKLT2018C04); Basic Research Program of Shenzhen (JCYJ20160229123030978, JCYJ20160429161539298).

## References and links

**1. **L. Zhou, F. Zhou, D. Qu, X. Liu, and W. Lu, “Error analysis of the non-diffraction grating structured light generated by triangular prism,” Opt. Commun. **306**, 174–178 (2013). [CrossRef]

**2. **P. Jin, J. Liu, S. Liu, and X. Wang, “A new multi-vision-based reconstruction algorithm for tube inspection,” Int. J. Adv. Manuf. Technol. **93**, 1–15 (2017). [CrossRef]

**3. **Y. J. Xu, C. Chen, S. J. Huang, and Z. H. Zhang, “Simultaneously measuring 3d shape and colour texture of moving objects using IR and colour fringe projection techniques,” Opt. Lasers Eng. **61**, 1–7 (2014). [CrossRef]

**4. **J. Geng, “Structured-light 3d surface imaging: a tutorial,” Adv. Opt. Photonics **3**, 128–160 (2011). [CrossRef]

**5. **J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. **43**, 2666–2680 (2010). [CrossRef]

**6. **X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Lasers Eng. **42**, 245–261 (2004). [CrossRef]

**7. **E. Li, X. Peng, J. Xi, J. Chicharo, J. Yao, and D. Zhang, “Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3d profilometry,” Opt. Express **13**, 1561–1569 (2005). [CrossRef]

**8. **Y. Y. Cheng and J. C. Wyant, “Multiple-wavelength phase-shifting interferometry,” Appl. Opt. **24**, 804–807 (1985). [CrossRef]

**9. **Z. Li and S. Nayar, “Projection defocus analysis for scene capture and image display,” ACM Trans. Graph. **25**, 907–915 (2006). [CrossRef]

**10. **Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. **51**, 6631–6636 (2012). [CrossRef]

**11. **W. Lohry and S. Zhang, “Genetic method to optimize binary dithering technique for high-quality fringe generation,” Opt. Lett. **38**, 540–542 (2013). [CrossRef]

**12. **J. Dai and S. Zhang, “Phase-optimized dithering technique for high-quality 3d shape measurement,” Opt. Lasers Eng. **51**, 790–795 (2013). [CrossRef]

**13. **J. Dai, B. Li, and S. Zhang, “Intensity-optimized dithering technique for three-dimensional shape measurement with projector defocusing,” Opt. Lasers Eng. **53**, 79–85 (2014). [CrossRef]

**14. **J. Dai, B. Li, and S. Zhang, “High-quality fringe pattern generation using binary pattern optimization through symmetry and periodicity,” Opt. Lasers Eng. **52**, 195–200 (2014). [CrossRef]

**15. **X. X. Li, Z. J. Zhang, and C. Yang, “High-quality fringe pattern generation using binary pattern optimization based on a novel objective function,” Optik **127**, 5322–5327 (2016). [CrossRef]

**16. **G. A. Ayubi, J. A. Ayubi, J. M. D. Martino, and J. A. Ferrari, “Pulse-width modulation in defocused three-dimensional fringe projection,” Opt. Lett. **35**, 3682–3684 (2010). [CrossRef] [PubMed]

**17. **Z. Li and S. Nayar, “Projection defocus analysis for scene capture and image display,” ACM Trans. Graph. **25**, 907–915 (2006). [CrossRef]

**18. **M. Nagase, D. Iwai, and K. Sato, “Dynamic defocus and occlusion compensation of projected imagery by model-based optimal projector selection in multi-projection environment,” Virtual Real. **15**, 119–132 (2011). [CrossRef]

**19. **Y. Oyamada and H. Saito, “Estimation of projector defocus blur by extracting texture rich region in projection image,” Vaclav Skala - UNION Agency (2008).

**20. **Y. Oyamada and H. Saito, “Defocus blur correcting projector-camera system,” Lect. Notes Comput. Sci. **5259**, 453–464 (2008). [CrossRef]

**21. **K. Irie, A. E. Mckinnon, K. Unsworth, and I. M. Woodhead, “A technique for evaluation of ccd video-camera noise,” IEEE Trans. Circuits Syst. Video Technol. **18**, 280–284 (2008). [CrossRef]

**22. **Y. Oyamada and H. Saito, “Focal pre-correction of projected image for deblurring screen image,” in Proc. IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 1–8 (2007).

**23. **S. Zhuo and T. Sim, “Defocus map estimation from a single image,” Pattern Recognit. **44**, 1852–1858 (2011). [CrossRef]

**24. **H. Chen, J. Su, J. Xu, K. Chen, R. Chen, and Z. Zhang, “Accurate calibration method for camera and projector in fringe patterns measurement system,” Appl. Opt. **55**, 4293–4300 (2016). [CrossRef]