Abstract

Phase-shifting profilometry (PSP) is considered the most accurate technique for phase retrieval with fringe projection profilometry (FPP) systems. However, PSP requires that multiple phase-shifted fringe patterns be acquired, usually sequentially, which has limited PSP to static or quasi-static imaging. In this paper, we introduce multispectral 4-step phase-shifting FPP that provides 3D imaging from a single acquisition, enabling real-time profilometry applications. A single frame provides all four phase-shifted fringe patterns needed for the PSP phase retrieval algorithm. The multispectral nature of the system ensures that light does not leak between the spectral bands, a common problem in simultaneous phase-shifting with color cameras. With this new concept, custom composite patterns containing multiple fringe patterns can be acquired in a single snapshot.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Fringe projection profilometry (FPP), also known as phase measuring profilometry (PMP), is a three-dimensional (3D) shape measurement method based on structured light [1]. The shape measurement in FPP relies on phase estimates from two-dimensional (2D) fringe patterns (FPs) and triangulation analysis. The method has four steps: (1) illumination of the object with one or more FPs, (2) recording of the distorted FPs with a camera, (3) computation of the phase changes within the FPs with respect to FPs at a reference plane, and (4) conversion of the computed phase changes into a topographical elevation map of the object’s surface.

The phase map computation from the FPs can be performed with both spatial and temporal phase stepping methods. Spatial phase stepping algorithms evaluate fringe deformation at each pixel by considering the surrounding pixels. This class of algorithms is also referred to as transform-based algorithms since mathematical transforms such as the Fourier, wavelet, and Hilbert transforms are employed [2,3]. Spatial phase stepping methods can retrieve the phase from a single copy of a deformed FP, which can be advantageous for measuring time-varying surfaces. However, any shadowing caused by poor illumination unavoidably leads to uncertainty and error [4]. Alternatively, temporal phase stepping methods consider the phase of the FP at each pixel as a function of time. Since a typical 2D FP model has three unknowns (background, modulation intensity, and phase), calculation of the phase values requires at least three phase-shifted FPs. Indeed, phase-shifting profilometry (PSP) is another name for this class of algorithms. The main advantage of temporal phase stepping is the capability of pixel-wise phase measurement; however, the need for multiple acquisitions limits its application to static and quasi-static scenes.

Use of multiple FPs leads to better topographical estimates [5]. Given that projecting a sequence of FPs is time-consuming and suboptimal for applications that require fast measurements, single-shot 3D shape measurement using composite patterns (CPs) has been developed [6]. Composite patterns contain multiple FPs that are distinguishable by some means. CPs can be categorized as either grayscale or color [7], and a variety of combinations have been designed and evaluated, such as those based on combinations of FPs with different orientations [8–10], different spatial frequencies [11–15], and different spatial phases [16–21]. While many of these techniques have been developed to improve the accuracy and sensitivity of measurements for spatial phase-stepping algorithms, others have been designed to enable PSP algorithms to be applied to data collected when projecting a single structured light pattern.

With a technique based on amplitude modulation, it is possible to compile any number of patterns into a grayscale composite pattern; however, increasing the number of patterns reduces the dynamic range of individual patterns and introduces errors in the demodulation process [16,17]. On the other hand, color patterns encode the fringes by taking advantage of color in addition to the intensity of the projected light. One way to exploit this idea is to use a sinusoidal intensity FP composed with a color-coded stripe pattern [22–24]. The function of the color stripes is to identify the fringe order, while the sinusoidal intensity FP provides dense relative depths. However, these methods only provide a single FP supported with color stripes to reduce the phase retrieval error. Color information can also be used to provide multiple FPs in a single frame. One way is to mix separate FPs produced in different color channels, e.g., red, green, and blue [25–28]. These methods are subject to non-linear errors introduced by the non-linear responses of color projectors and color cameras. The errors have been discussed in the literature and compensating methods to reduce them have been proposed [29]. Nevertheless, the errors cannot be eliminated entirely even with compensation. Additionally, these techniques are sensitive to surface colors and suffer from lower SNR due to spectral division [16]. These issues, along with the color-band leakage problem, limit the accuracy of phase maps obtained with color CPs [30].

Our group recently published a method based on multispectral filter array (MFA) technology to produce multispectral CPs [31]. We described a multispectral CP that encoded multiple FPs by decomposing the light source into several spectral bands. Every band provided a unique pattern, and their combination generated a multispectral CP. The light was decomposed using an MFA made from an array of micro-sized bandpass filters manufactured on a glass substrate. Reflections of the multispectral CP from the object were detected with a multispectral snapshot camera. CPs generated in this way do not have the dynamic range limitations of grayscale methods, nor do they suffer from the limitations of color methods such as sensitivity to surface colors and leakage between RGB bands. In our previous work, the MFA determined the structure of the CP and resulted in fringe patterns in four different directions [31]. That CP provided sampling of a greater number of features on the surface of objects; however, its design required a spatial phase stepping retrieval algorithm that was sensitive to illumination artifacts such as shadows. Here, we present a new multispectral CP design that contains four phase-shifted fringe patterns whose phase map can be retrieved using a PSP processing algorithm. With this improvement, we can perform pixel-wise phase measurement using only a single image acquisition. Furthermore, the technique is less sensitive to phase errors introduced by shadows.

The manuscript is organized as follows. Section 2 describes the projection method and the phase retrieval procedure. Results of the application of the novel multispectral CP technique are presented in Section 3. Section 4 concludes the article with a discussion on the main findings, their interpretation, and limitations of the technique.

2. Methods

2.1 Setup

A custom designed multispectral filter array (MFA; Spectral Devices Inc., London, Canada) was used to generate a CP that encompassed four phase-shifted FPs. The MFA was manufactured on a glass substrate and contained a repeating pattern of four adjacent bandpass filters that were each 11 µm wide and 11.264 mm tall. The pattern was repeated 256 times across the device, resulting in a square device 11.264 mm by 11.264 mm (Fig. 1(a)). The filters were designed to have peak optical transmission at 580 nm (F1), 660 nm (F2), 735 nm (F3), and 820 nm (F4). The spectral response of the MFA is shown in Fig. 1(b). The MFA was positioned in the path of a broadband light source (Fig. 1(c)), which resulted in a multispectral CP on the surface of the object (Fig. 1(d)). The multispectral CP was recorded with a snapshot multispectral camera with spectral response identical to that of the MFA (MSC-AGRI-1-A, Spectral Devices Inc., London, Canada [32]). Pixels in the camera were arranged in a mosaic pattern such that each 2×2 block of pixels contained 4 spectrally unique pixels. Four spectral images were extracted from each raw image using a demosaicing algorithm (pixel reordering) implemented in a software tool provided by the camera manufacturer (msDemosaic, Ver 1.0, Spectral Devices Inc., London, Canada).
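For illustration, the pixel-reordering demosaic described above can be sketched as follows. This is a minimal sketch, not the vendor tool: the 2×2 superpixel layout assumed here ([[F1, F2], [F3, F4]]) is a hypothetical arrangement and may differ from the actual filter layout of the camera.

```python
import numpy as np

def demosaic_2x2(raw):
    """Split a 2x2-mosaic frame into four spectral sub-images by pixel
    reordering. The superpixel layout [[F1, F2], [F3, F4]] is assumed
    for illustration; the vendor layout may differ."""
    return (raw[0::2, 0::2], raw[0::2, 1::2],
            raw[1::2, 0::2], raw[1::2, 1::2])

# Each sub-image has half the resolution of the raw mosaic frame.
raw = np.arange(16).reshape(4, 4)
I1, I2, I3, I4 = demosaic_2x2(raw)
```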


Fig. 1. Multispectral pattern generation. (a) Schematic of the MFA (not to scale). (b) Spectral response curves for the MFA and the multispectral camera. (c) Rendering of the optical setup consisting of a halogen light source, MFA, lenses, the object, and snapshot multispectral camera. (d) Color image (Apple iPhone-6s) of the CP projected onto the surface of a white plastic object.


Each set of 4 multispectral images from a single camera exposure yielded 4 FPs, where the FP from each filter band was π/2 phase-shifted with respect to the FP from the adjacent filter band, resulting in a set of FPs suitable for PSP phase retrieval algorithms.

2.2 Phase retrieval

Each FP represents a reflection of the light pattern (here, straight fringes) from the surface of the imaged object. The fringes are distorted due to the object’s topography, and this can be mathematically modeled as [33]:

$$I({x,y} )= A({x,y} )+ B({x,y} )\cos [{\phi ({x,y} )} ]$$
where $({x,y} )$ denotes the camera pixel coordinates, and $A({x,y} )$ and $B({x,y} )$ are the background intensity and the modulation amplitude, respectively, which are associated with the object’s surface reflectivity, the ambient light, and the effects of the optics. The phase of the distorted FP is denoted $\phi ({x,y} )$, which includes the phase of the original FP plus the phase modulation caused by the object’s surface depth and height variations.

Traditional 4-step phase-shifting involves projecting 4 sinusoidal FPs in which the patterns are evenly phase-shifted, i.e., $\pi /2$. Correspondingly, the camera captures the distorted patterns as:

$${I_n}({x,y} )= A({x,y} )+ B({x,y} )\cos [{\phi ({x,y} )+ ({n - 1} )\pi /2} ]$$
where n = 1,2,3,4 are the phase-shift indices. This notation assumes that the background intensity, modulation amplitude, and object surface behavior are the same for all four acquired frames. Under this assumption, the phase can be calculated as [33]:
$$\phi = \arctan \left( {\frac{{\sin (\phi )}}{{\cos (\phi )}}} \right) = \arctan \left( {\frac{{{I_4} - {I_2}}}{{{I_1} - {I_3}}}} \right)$$

Hereinafter, the coordinates $({x,y} )$ are omitted to simplify the notation. The phase computed by this method is a wrapped version of the phase map, limited to between $- \pi $ and $\pi $ due to the properties of the arctan function. Therefore, an unwrapping process is needed to obtain the absolute phase.
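As a concrete sketch, Eq. (3) can be evaluated with `arctan2`, which returns the wrapped phase in (−π, π] directly; the synthetic single-row pattern and the values of A and B below are purely illustrative.

```python
import numpy as np

def wrapped_phase_4step(I1, I2, I3, I4):
    """Eq. (3): wrapped phase from four pi/2-shifted fringe patterns."""
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic single-row example with uniform background A and modulation B.
phi = np.linspace(-3.0, 3.0, 101)            # ground-truth wrapped phase
A, B = 0.5, 0.4
I1, I2, I3, I4 = (A + B * np.cos(phi + n * np.pi / 2) for n in range(4))
phi_hat = wrapped_phase_4step(I1, I2, I3, I4)
```

For a single axis, `np.unwrap` removes the 2π discontinuities; the 2D Goldstein algorithm used later in the paper requires a dedicated implementation.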

As mentioned earlier, the wrapped phase is computed under the assumptions that (1) the object is invariant and (2) all optical aspects of the illumination subsystem, image acquisition subsystem, and the target object are unchanged during the four acquisitions. Since we acquire all 4 FPs simultaneously using multispectral illumination and multispectral imaging, the first assumption is met, i.e., the object is stationary for all 4 FP acquisitions. The second assumption is not fully met since, with multispectral imaging, there can be significant differences between the spectral images of the FPs. This is because (1) the snapshot multispectral camera has different responses at different wavelengths and (2) the reflectivity of the object is not the same for each spectral band. In this case, we have:

$${I_n} = {A_n} + {B_n}\cos [{\phi + ({n - 1} )\pi /2} ]$$

Correspondingly, taking the $\arctan $ of the same ratio of the four phase-shifted patterns gives:

$$\begin{aligned} \arctan \left( {\frac{{{I_4} - {I_2}}}{{{I_1} - {I_3}}}} \right) &= \arctan \left( {\frac{{{A_4} - {A_2} + ({{B_4} + {B_2}} )\sin (\phi )}}{{{A_1} - {A_3} + ({{B_1} + {B_3}} )\cos (\phi )}}} \right)\\ &= \arctan \left( {\frac{{{A_N} + ({{B_N}} )\sin (\phi )}}{{{A_D} + ({{B_D}} )\cos (\phi )}}} \right) = \widehat \phi \end{aligned}$$
where ${A_N} = {A_4} - {A_2}$, ${B_N} = {B_4} + {B_2}$, ${A_D} = {A_1} - {A_3}$, and ${B_D} = {B_1} + {B_3}$. It can be seen that $\widehat \phi \ne \phi $, meaning any discrepancy in the background or modulation of the four bands will add errors to the retrieved phase map. Conversely, by removing the ${A_n}$ terms and equalizing the ${B_n}$ terms, the error can be mitigated.
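A one-pixel numeric sketch makes the bias of Eq. (5) concrete; the band-dependent $A_n$ and $B_n$ values below are arbitrary illustrative choices, not measured values.

```python
import numpy as np

phi = 1.0                        # true phase at one pixel
A = [0.50, 0.60, 0.45, 0.70]     # band-dependent backgrounds (illustrative)
B = [0.40, 0.30, 0.35, 0.25]     # band-dependent modulations (illustrative)
I = [A[n] + B[n] * np.cos(phi + n * np.pi / 2) for n in range(4)]

# Naive Eq. (3) estimator applied to spectrally unequal patterns:
phi_hat = np.arctan2(I[3] - I[1], I[0] - I[2])
bias = phi_hat - phi             # non-zero: unequal A_n, B_n distort the phase
```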

We propose a method based on the Lissajous figures and ellipse fitting (LEF) derived from [10] and [34] to decrease the aforementioned error. The Lissajous figure is a graph that plots two sinusoidal intensity profiles against each other in a 2D Cartesian coordinate system. When the profiles share the same frequency of repetition but have a non-zero phase offset, the graph is equivalent to an ellipse. By assuming that the profiles are parametrically represented as:

$${I_x} = {x_0} + {\rho _x}\cos \omega t\textrm{ and }{I_y} = {y_0} + {\rho _y}\cos ({\omega t + \psi } )$$
then based on the Pythagorean identity that ${\cos ^2}\omega t + {\sin ^2}\omega t = 1$, the ellipse can be derived as:
$$\frac{{{{({{I_x} - {x_0}} )}^2}}}{{{\rho _x}^2}} + \frac{{{{({{I_y} - {y_0}} )}^2}}}{{{\rho _y}^2}} - \frac{{2\cos \psi ({{I_x} - {x_0}} )({{I_y} - {y_0}} )}}{{{\rho _x}{\rho _y}}} = {\sin ^2}\psi$$
Therefore, considering that the captured FPs are sequentially $\pi /2$ phase-shifted representations of the same sinusoidal pattern, plotting the Lissajous figure for every adjacent pair will produce an ellipse. To normalize, we take the four adjacent pairs (Fig. 2(b)) and plot the Lissajous figure four times. For each pair, two identical and synchronized sliding windows with center $(x,y)$ and size $(w,h)$ were used to select two corresponding parts of the two FPs (Fig. 2(c)). The size of the window pair was designed to ensure that it covered at least one fringe period. Additionally, the window covered at least five pixels of the local FP, which automatically covered at least one fringe period when the fringe period was three pixels or greater. These conditions enabled the fitting of an ellipse to intensity samples enclosing at least one cycle of the local fringe pattern [34]. The intensities of the selected pixels were then plotted against each other (e.g., the Lissajous plot of the selected window pair for ${I_2}$ against ${I_1}$ is shown in Fig. 2(d)). An ellipse was fitted to the points of the Lissajous plot (Fig. 2(e)) using the least-squares method by modeling the conic form of the ellipse as shown in Eq. (8):
$${c_1}{I_x}^2 + 2{c_2}{I_x}{I_y} + {c_3}{I_y}^2 + 2{c_4}{I_x} + 2{c_5}{I_y} + {c_6} = 0$$

The background, amplitude, and phase shift parameters of the two profiles were then computed as [35]:

$$\left\{ {\begin{array}{c} {{x_0} = \frac{{{c_2}{c_5} - {c_3}{c_4}}}{\delta } \; \& \; {\rho_x} = \frac{{\sqrt { - {c_3}\Delta } }}{\delta }}\\ {{y_0} = \frac{{{c_2}{c_4} - {c_1}{c_5}}}{\delta } \; \& \; {\rho_y} = \frac{{\sqrt { - {c_1}\Delta } }}{\delta }}\\ {\psi = \arccos \left( {\frac{{ - {c_2}}}{{\sqrt {{c_1}{c_3}} }}} \right)} \end{array}} \right.\textrm{ where }\left\{ {\begin{array}{c} {\delta = \left|{\begin{array}{cc} {{c_1}}&{{c_2}}\\ {{c_2}}&{{c_3}} \end{array}} \right|}\\ {\Delta = \det \left( {\left[ {\begin{array}{ccc} {{c_1}}&{{c_2}}&{{c_4}}\\ {{c_2}}&{{c_3}}&{{c_5}}\\ {{c_4}}&{{c_5}}&{{c_6}} \end{array}} \right]} \right)} \end{array}} \right.$$


Fig. 2. Normalization process for phase extraction from four spectral fringe patterns. (a) Raw data before demosaicing. (b) Four adjacent pairs of FPs. (c) Schematic of a pair of identical windows sliding over a pair of adjacent FPs. (d) Example of intensity plots of ${I_2}$ against ${I_1}$ for a sliding window (25,25) pixels in size and centered at (100,400). (e) Ellipse fitted with the least-squares method to the data in panel (d). (f) Normalized fitted ellipse computed from the data in panel (e). (g-j) Four FPs captured with the multispectral camera for a flat surface after filtering with a Gaussian low-pass filter. (k) Intensity profiles along the dashed lines within the regions indicated in the magnified parts of (g-j). (l-o) The normalized FPs corresponding to the data in panels (g-j). (p) Intensity profiles along the dashed lines within the regions indicated in the magnified parts of (l-o).


The two normalized quadratic profiles were defined as ${I^{\prime}_x} = ({{I_x} - {x_0}} )/{\rho _x}$ and ${I^{\prime}_y} = ({{I_y} - {y_0}} )/{\rho _y}$ in which the Lissajous figure produces a unit circle with the center located at $({{{x^{\prime}}_0},{{y^{\prime}}_0}} )= ({0,0} )$ as shown in Fig. 2(f). The process was repeated for all pixels to generate pixel-wise normalization for each FP pair resulting in two normalized frames for each ${I_n}({n = 1,2,3,4} )$. A pixel-wise averaging was performed on each pair of normalized frames and resulted in the final normalized FP (Fig. 2(l-o)) suitable for phase extraction.
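The conic fit of Eq. (8), the parameter recovery of Eq. (9), and the normalization step can be sketched as follows. This is one possible implementation, not the paper's code: the homogeneous least-squares solution is taken from the SVD, and the sign convention enforced below ($c_1 > 0$) is one workable choice for resolving the scale ambiguity of the conic coefficients.

```python
import numpy as np

def lissajous_ellipse(Ix, Iy):
    """Fit the conic of Eq. (8) to paired intensity samples and recover
    the background, amplitude, and phase-shift parameters via Eq. (9)."""
    D = np.column_stack([Ix**2, 2*Ix*Iy, Iy**2, 2*Ix, 2*Iy,
                         np.ones_like(Ix)])
    c = np.linalg.svd(D)[2][-1]          # smallest right singular vector
    if c[0] < 0:                         # resolve overall sign ambiguity
        c = -c
    c1, c2, c3, c4, c5, c6 = c
    delta = c1*c3 - c2**2                # 2x2 determinant of Eq. (9)
    Delta = np.linalg.det(np.array([[c1, c2, c4],
                                    [c2, c3, c5],
                                    [c4, c5, c6]]))
    x0 = (c2*c5 - c3*c4) / delta         # centers (backgrounds)
    y0 = (c2*c4 - c1*c5) / delta
    rho_x = np.sqrt(-c3*Delta) / delta   # amplitudes (modulations)
    rho_y = np.sqrt(-c1*Delta) / delta
    psi = np.arccos(-c2 / np.sqrt(c1*c3))  # phase shift between profiles
    return x0, y0, rho_x, rho_y, psi

# Synthetic pi/2-shifted pair; the normalized profiles trace a unit circle.
t = np.linspace(0, 2*np.pi, 60, endpoint=False)
Ix = 0.50 + 0.30*np.cos(t)                   # I1-like profile
Iy = 0.55 + 0.20*np.cos(t + np.pi/2)         # I2-like profile
x0, y0, rx, ry, psi = lissajous_ellipse(Ix, Iy)
Ix_n, Iy_n = (Ix - x0)/rx, (Iy - y0)/ry      # I'_x, I'_y of Sec. 2.2
```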

Due to the design of the MFA, the FPs are expected to cover 4 phase steps in increments of $\pi /2$, yielding circular normalized Lissajous plots. In practice, however, there are always some errors in the phase shifts. Figure 3 shows examples of the normalized fitted ellipses for the four possible pairs, with the estimated phase shift among the FPs for the specific window and data represented in Fig. 2. The term ${I_m}:{I_n}$ refers to plotting ${I_n}$ against ${I_m}$. In this example, the maximum error was almost 17°, which was significant and could easily affect the phase map retrieved with Eq. (3). However, the summation of the estimated phase shifts was 359.74°, close to the ideal 360°, so the fringe map phase calculation was not appreciably affected. The discrepancies could be caused by measurement and geometric errors, including those caused by the optics used to expand the light patterns and camera lens effects [36,37].


Fig. 3. Example of the phase step estimation for all four FP pairs based on the normalized Lissajous figure.


To cope with measurement and geometric errors, we used the actual phase shift information in the phase map calculation. Since the phase shift steps were not necessarily evenly spaced, the FPs were represented as:

$${I_n} = A + B\cos [{\phi + {\psi_n}} ]$$
where ${\psi _n}$ represents the phase step for $n = 1,2,\ldots ,N({N \ge 3} )$. The phase can be calculated as [38]:
$$\phi = \arctan \left( {\frac{{\sum\limits_{k = 1}^N {\sum\limits_{l = 1}^N {\sum\limits_{n = 1}^N {{I_n}[{\cos ({{\psi_k}} )- \cos ({{\psi_l}} )} ][{\cos ({{\psi_k}} )\sin ({{\psi_l}} )+ \cos ({{\psi_n}} )\sin ({{\psi_k}} )+ \cos ({{\psi_l}} )\sin ({{\psi_n}} )} ]} } } }}{{\sum\limits_{k = 1}^N {\sum\limits_{l = 1}^N {\sum\limits_{n = 1}^N {{I_n}[{\sin ({{\psi_k}} )- \sin ({{\psi_l}} )} ]} [{\cos ({{\psi_k}} )\sin ({{\psi_l}} )+ \cos ({{\psi_n}} )\sin ({{\psi_k}} )+ \cos ({{\psi_l}} )\sin ({{\psi_n}} )} ]} } }}} \right)$$

In our case, we utilized four-step phase-shifting $({N = 4} )$, where ${\psi _1}$ was set to zero and ${\psi _2}$, ${\psi _3}$, and ${\psi _4}$ were the phase-step values estimated using Eq. (9) between ${I_1}:{I_2}$, ${I_2}:{I_3}$, and ${I_3}:{I_4}$, respectively. Therefore, for every pixel coordinate, the surrounding pixel values and the Lissajous figure principal axes gave an estimate of the actual phase shift, which was then used in Eq. (11) to retrieve the phase map more accurately.
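Eq. (11) can be implemented directly as a triple sum over the known phase steps; a minimal sketch is shown below with illustrative (uneven) step values. For noise-free data with correctly known steps, the true phase is recovered exactly.

```python
import numpy as np

def phase_arbitrary_steps(I, psi):
    """Eq. (11): least-squares wrapped phase for N >= 3 patterns with
    known, not necessarily evenly spaced, phase steps psi[n]."""
    c, s = np.cos(psi), np.sin(psi)
    N = len(psi)
    num = den = 0.0
    for k in range(N):
        for l in range(N):
            for n in range(N):
                w = c[k]*s[l] + c[n]*s[k] + c[l]*s[n]
                num = num + I[n] * (c[k] - c[l]) * w
                den = den + I[n] * (s[k] - s[l]) * w
    return np.arctan2(num, den)

# Illustrative uneven steps, as might be estimated via the Lissajous fits.
psi = np.array([0.0, 1.45, 3.20, 4.80])
phi = 0.7
I = [0.5 + 0.4*np.cos(phi + p) for p in psi]
phi_hat = phase_arbitrary_steps(I, psi)
```

The background term cancels and the estimator reduces to $\arctan(B\sin\phi / B\cos\phi)$ for any step spacing, which is why the estimated (rather than nominal) steps can be substituted directly.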

3. Results

To test the performance of the preprocessing method, we designed a simulation in two parts: first, to evaluate the performance of the equalization step, and second, to test the performance of the angle offset detection step. For the first part, we simulated a set of 4 phase-shifted fringe patterns, each containing 5 vertical fringes in a frame of 50 by 50 pixels, with added zero-mean Gaussian white noise with a variance of 0.001. The phase shift between consecutive frames was fixed at 90 degrees. The background ratio was varied from 0, indicating no background in any of the four frames, up to 20, indicating that each frame's background amplitude was 20 times that of the previous frame (with the first frame's amplitude set to a non-zero value). The modulation amplitude ratio was varied similarly, from the ideal case in which all frames had the same dynamic range up to the case in which each frame had a modulation amplitude 20 times that of the previous one. The set of 4 frames was constrained to the range [0, 1] for each iteration before being fed to the normalization process described in Fig. 2, which normalized every frame to the range [−1, 1]. Windows of (15,15) pixels were used to ensure that each window covered a full fringe. The mean square error (MSE) between the results (normalized fringe patterns with range [−1, 1]) and a set of ideal fringe patterns (90° phase-shifted sinusoidal fringe patterns with range [−1, 1]) was used as a performance indicator. Based on this indicator, the normalization method proved to be near perfect when the background and modulation amplitudes differed by less than 5 times between two nearby FPs (Fig. 4(a)).


Fig. 4. Simulations to test the performance of the pre-processing explained in the method section for (a) normalization and (b) angle offset detection.


For the second simulation, the 4 phase-shifted frames were designed the same way as in the previous simulation, but the background for all frames was set to 0, the modulation amplitude was the same for all frames, and the range was set to [−1, 1]. In each iteration, the phase shifts between adjacent frames were altered. Between the 1st and 2nd frames, the phase shift ranged from 45 degrees to 135 degrees (i.e., −45 degrees to +45 degrees off from the ideal). One-third of the designed offset was allocated to each of the other phase shifts in each iteration, to maintain a total phase shift of 360 degrees for the set of FPs. Figure 4(b) compares the measured phase shifts between the first 2 frames with the simulated phase shifts (including offsets) between them. A comparison of the measured phase shifts to the ideal values revealed that the error was always within a range of 0 to 4 degrees, which did not appreciably impact the phase measurements.

To test the performance of the proposed normalization technique with our setup, the imaged object was a flat rigid surface. Each raw output image from the multispectral camera was demosaiced into 4 spectral images using vendor-supplied software. Each spectral image contained a non-distorted FP related to a specific wavelength, and together the four FPs constituted a 4-step phase-shifted set. Figure 5 shows the phase maps retrieved with different methods: Eq. (3) applied to non-normalized FPs (low-pass filtered only), Eq. (3) applied to normalized FPs, and Eq. (11) applied to normalized FPs. Since the object was a rigid flat surface, the absolute phase map was expected to be a ramp in which every fringe produced a linear change of $2\pi $. The phase map calculated with Eq. (3) for non-normalized patterns showed significant error in the wrapped phase (Fig. 5(a)) and subsequently in the unwrapped phase (Fig. 5(b)) due to the intensity bias in the original (i.e., acquired) FPs. Applying the normalization process to the FPs before employing Eq. (3) improved the wrapping and unwrapping (Fig. 5(d) and 5(e), respectively); however, the phase maps still showed a periodic error along the normal of the fringes. This error was a result of the phase offset of the FPs and could be mitigated with the third method, where the actual phase shifts were estimated and used in Eq. (11) (Fig. 5(g-i)). Comparing the line profiles to the ideal case illustrated these errors more clearly in terms of scope and variation (Fig. 5(j)). The green line, which indicates the difference between the third method and the ideal flat surface, showed the smallest error. Nevertheless, it also showed slight variation due to the non-ideal nature of the target surface and the presence of systematic errors such as camera noise and light fluctuations. Phase unwrapping was performed with the Goldstein algorithm [2,39].


Fig. 5. Phase retrieval process for a flat object. Wrapped phase map (a), unwrapped phase map (b), and line profiles obtained using Eq. (3) with FPs before normalization (c). (d-f) Same as (a-c) after normalization over FPs. (g-i) Same as (a-c) using Eq. (11) and FPs after normalization. (j) Comparison of the profiles obtained using the different phase retrieval methods to the ideal case (flat imaged object) which shows the phase error.


The proposed normalization process based on LEF involved windowing over an area that included at least one fringe in cross-section. Since the normalization process is expected to be sensitive to discontinuities and sharp edges on the object surface, we tested the processing method on data from a target with horizontal and vertical edges. The results presented in Fig. 6 show the effect of normalization on the absolute phase maps, indicating that the normalization process improved the computed phase maps significantly compared to un-normalized FPs (compare Fig. 6(c) to Fig. 6(e)). However, small errors were discovered at the edges of features that were parallel to the fringe patterns (observable as bright vertical features in Fig. 6(f)).


Fig. 6. Normalization effect for an object with discontinuities and edges. (a) CAD model of a target consisting of five groups of bars of varying width (1 to 5 mm in steps of 1 mm). (b) Original FPs. (c) Unwrapped phase for the original FPs. (d) Normalized FPs. (e) Unwrapped phase for the normalized FPs. (f) Same as panel (e), except with background removed.


Multiple objects were measured with the multispectral system; the processed results are shown in Fig. 7. The figure shows the raw images taken by the multispectral camera, the measured wrapped phase maps, the height maps, and 3D representations of each object. The height maps were obtained by removing the background from the absolute unwrapped phase map and then converting to height. To remove the background in each experiment, the object was replaced with a reference flat plane (e.g., the object used to generate Fig. 5), and the absolute phase map of the reference was calculated and subtracted from the calculated phase map of the object. The resulting background-removed phase map was converted into a height map using a nonlinear calibration method described by Jia et al. (2007) [40]. Complex targets were selected for each experiment to demonstrate the robustness of the system. The first target was a paper cube with one corner facing the camera (Fig. 7(a-d)). The cube tested the system against an object with multiple flat faces, each with a different surface normal relative to the projector/camera baseline. The gently sloped surface feature (Fig. 7(b), left of center) was characterized by fewer fringes and was prone to under-sampling errors. On the other hand, features with steeper slopes were sampled with a greater number of fringes and less error (Fig. 7(b), top right and center right). Figure 7(e-h) shows the results for a plastic head phantom, which tested how well the system could detect fine details of an object as well as coarse features. Note that the reflectiveness of the phantom material caused glare, which led to noise in the results. The final two objects were two different gestures of a male hand: the back of the hand (Fig. 7(i-l)) and the hand holding two ping-pong balls (Fig. 7(m-p)).
The two experiments were designed to assess the capability of the system to image an object that has complex responses to the spectral bands, like human skin. As shown in Fig. 7(m), the CP reflection from the skin is less intense than that of the plastic ping-pong balls. However, even with diverse reflection conditions, the system succeeded in decomposing the FPs.
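The reference-plane subtraction and phase-to-height step can be sketched as follows. Note the hedge: the paper uses a nonlinear calibration [40]; the classic linear triangulation model shown here, and the parameter values L0, d, and f0, are illustrative placeholders only.

```python
import numpy as np

def height_from_phase(dphi, L0=500.0, d=100.0, f0=0.1):
    """Classic reference-plane triangulation model,
        h = L0 * dphi / (dphi + 2*pi*f0*d),
    with camera-to-reference distance L0 (mm), projector-camera
    baseline d (mm), and fringe frequency f0 (fringes/mm) on the
    reference plane. Parameter values are placeholders; the paper
    itself uses a nonlinear calibration [40]."""
    dphi = np.asarray(dphi, dtype=float)
    return L0 * dphi / (dphi + 2 * np.pi * f0 * d)

# Usage: subtract the reference-plane phase map first (background removal),
# then convert the phase difference to height:
#   h = height_from_phase(phi_object - phi_reference)
```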


Fig. 7. 3D imaging of complex objects. The left column shows the raw images captured by the multispectral camera before demosaicing. The second column shows the wrapped phase maps. The third column shows the color-coded height maps. The right-most column shows 3D visualizations of the height map. (a-d) A cube object, (e-h) a plastic head phantom, (i-l) back surface of a hand, and (m-p) a hand holding two ping-pong balls.


4. Discussion and conclusion

4.1 Overview

This work introduces a novel multispectral CP that delivers four phase-shifted FPs in a single frame, together with a multispectral camera that captures the frame in one snapshot. The resulting single-shot imaging system is capable of phase-shifting FPP with high temporal resolution and precise pixel-wise phase retrieval. However, the acquired FPs had different modulation and background intensities, as well as non-negligible phase offsets between the FPs, which led to errors in the phase retrieval process. We implemented a normalization method based on the Lissajous figure to eliminate the background and equalize the modulation. Moreover, a numerical calibration was performed to estimate the exact phase steps between the FPs and compensate for the effect of the phase offset error.

4.2 Major findings

The multispectral CP method presented in this paper relied on PSP algorithms to retrieve the phase map of the object. Because of this, the measurement was pixel-wise and less sensitive to fringe quality compared to spatial phase stepping methods. In addition, due to the nature of the PSP algorithms, all processing took place without reliance on human intervention. This achievement, along with the fact that our method captured 4 phase-shifted FPs simultaneously, contributed to the effectiveness of the technique.

Characterization of objects (Fig. 7) demonstrated that the method was successful in localizing errors and preventing them from spreading. Locally erroneous pixels like the shadowed area in Fig. 7(m) and the glare area in Fig. 7(e) could easily spread to other pixels in a spatial phase-stepping algorithm, but in our case the errors were kept highly localized. Furthermore, the system was insensitive to variable responsivity between FP channels even for difficult surfaces such as human skin (Fig. 7(i) and 7(m)), whose reflection properties are not the same for all spectral bands.

4.3 Comparison to previous works

In comparison to other CP platforms that allow for single-shot PSP, our method offers several advantages. As opposed to grayscale CPs based on amplitude modulation [16,17], multispectral CPs can expand the number of patterns in a single CP without compromising the dynamic range. Moreover, the data generated by our method is much easier to process since it does not require demodulation. It only requires a demosaicing procedure to separate the different FPs. Contrary to color camera approaches [27,28], multispectral cameras give access to a greater number of unique color channels. This option enables an increase in the number of image channels for each acquisition and the customization of spectral bands according to the application and the target (e.g., the system can use invisible patterns for security applications). Additionally, the bandwidth of multispectral cameras can be significantly narrower compared to standard color cameras greatly lowering crosstalk between spectral channels. Crosstalk is one of the primary sources of phase noise in color CP methods [30].

4.4 Procedure achievements

This paper presents a four-step phase-shifting FPP normalization method modeled after a well-known technique. For the case where both the background and the modulation amplitude vary by up to 500% from one FP to the next, simulation results (Fig. 4(a)) show that our method is highly effective at removing the background and retrieving the modulation amplitudes of the 4 phase-shifted frames. A method based on the parameters already calculated for the normalization step was used to estimate the exact phase shift from one FP to the next. We observed that even when the phase difference between two adjacent FPs deviated by up to 45 degrees from the expected 90 degrees, the algorithm could still detect the phase offset with an error of less than 4 degrees (Fig. 4(b)). Notably, although the current method was designed specifically for a four-step phase-shift protocol, the technique could be generalized to estimate the phase shift between any two FPs.
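As a rough illustration of phase-step estimation between two normalized FPs, the following correlation-based sketch recovers the step from the spatial average of the product of the two patterns. It is a generic estimator under idealized assumptions, not the exact method of the paper:

```python
import numpy as np

def estimate_phase_step(n1, n2):
    """Estimate the phase shift between two normalized fringe patterns.

    Assumes n1 = cos(phi) and n2 = cos(phi + delta), with background removed
    and unit modulation, and that phi sweeps many full fringe periods so
    <cos(phi) * cos(phi + delta)> ~= cos(delta) / 2.
    """
    return np.arccos(np.clip(2.0 * np.mean(n1 * n2), -1.0, 1.0))

# Check on synthetic fringes with a nominal 90-degree step.
phi = np.linspace(0, 20 * np.pi, 10000)  # 10 full fringe periods
delta = np.deg2rad(90)
step = estimate_phase_step(np.cos(phi), np.cos(phi + delta))
```

Because the cross term averages out over whole fringe periods, the estimate degrades gracefully when the actual step deviates from 90 degrees, which is consistent in spirit with the tolerance reported in Fig. 4(b).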

4.5 Limitations

An issue evident in the results presented in Fig. 7(d, h, and p) is error related to under-sampling. This error can affect the retrieved phase map when the number of phase shifts is too small to sample the surface variation [41]. The problem is especially evident on surfaces with steep slopes relative to the baseline of the camera and projector. The phase computation may also be affected by nonlinearity introduced by non-sinusoidal fringes [42]. One possible solution to both problems is to increase the density of spectral bands in the MFA. Furthermore, due to the narrow bandwidth of the MFA, light transmission was reduced compared to white-light approaches. This required a more powerful light source to project enough light onto the object for the camera to form a well-exposed image. In our experiment, we used a 100 W QTH source to generate the desired pattern at a distance of approximately 50 cm. A more powerful light source (e.g., a flashlamp or high-power pulsed LED) could yield better images and shorter exposure times. Another limitation of the system was the need for a lens in the illumination beam; the lens may deform the CP and hence the set of fringe patterns. However, this problem could potentially be mitigated by calibrating the system to estimate and remove distortion in the FPs prior to phase retrieval [43,44].
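A distortion-removal step of the kind cited in [43,44] can be sketched as inverting a simple first-order radial model by fixed-point iteration. The coefficient `k1` below is hypothetical; in practice it would be estimated during system calibration:

```python
def undistort_radial(xd, yd, k1, iters=20):
    """Invert the radial distortion model x_d = x_u * (1 + k1 * r_u^2)
    on normalized image coordinates by fixed-point iteration.

    k1 is a hypothetical calibration coefficient; real systems estimate it
    (and usually higher-order terms) during calibration.
    """
    xu, yu = xd, yd
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        xu = xd / (1.0 + k1 * r2)
        yu = yd / (1.0 + k1 * r2)
    return xu, yu

# Distort a known point with the forward model, then check that it is recovered.
k1 = 0.1
xd = 0.5 * (1.0 + k1 * (0.5**2 + 0.3**2))
yd = 0.3 * (1.0 + k1 * (0.5**2 + 0.3**2))
xu, yu = undistort_radial(xd, yd, k1)
```

The fixed-point iteration converges quickly for the small distortion coefficients typical of well-corrected lenses; applying such a correction to the FPs before phase retrieval would remove the lens-induced deformation of the CP.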

4.6 Future work

The snapshot capability of the system enables robust 3D imaging of dynamic objects, since only one camera exposure is needed to obtain the full phase-shifted data set. This also makes the system less sensitive to vibrational noise, since the exposure time can be shortened to compensate. Given that the speed of multispectral CP 3D imaging is limited only by the camera frame rate, the system should be well suited to real-time 3D measurement applications such as mobile devices, robotics, and 3D color video. Another area to be addressed is the number and density of the fringes in the CP; both can be increased to improve the sensitivity of the system and to provide broader coverage for imaging larger objects. Additionally, the profilometry results can be combined with multispectral imaging to provide spectroscopy in 3D.

Funding

Natural Sciences and Engineering Research Council of Canada.

Acknowledgments

The authors thank Spectral Devices Inc for providing the multispectral filter array and the snapshot multispectral camera for this work.

Disclosures

Jeffrey Carson and Mohamadreza Najiminaini are co-founders, directors, and shareholders in Spectral Devices Inc., which supplied the multispectral filter array and the multispectral camera for this work. Parsa Omidi and Mamadou Diop have no competing interests.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010). [CrossRef]  

2. P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare: Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX 13, 100652 (2021). [CrossRef]  

3. P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Proc. SPIE 10944, 109440Q (2019). [CrossRef]  

4. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001). [CrossRef]  

5. W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019). [CrossRef]  

6. Z. H. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012). [CrossRef]  

7. C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020). [CrossRef]  

8. L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011). [CrossRef]  

9. J. L. Flores, B. Bravo-Medina, and J. A. Ferrari, “One-frame two-dimensional deflectometry for phase retrieval by addition of orthogonal fringe patterns,” Appl. Opt. 52(26), 6537–6542 (2013). [CrossRef]  

10. M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019). [CrossRef]  

11. W.-H. Su and H. Liu, “Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities,” Opt. Express 14(20), 9178 (2006). [CrossRef]  

12. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations,” Appl. Opt. 36(22), 5347–5354 (1997). [CrossRef]  

13. C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014). [CrossRef]  

14. C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using phase partitions,” Opt. Lasers Eng. 68, 111–120 (2015). [CrossRef]  

15. G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017). [CrossRef]  

16. C. Guan, L. G. Hassebrook, and D. L. Lau, “Composite structured light pattern for three-dimensional video,” Opt. Express 11(5), 406 (2003). [CrossRef]  

17. Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011). [CrossRef]  

18. H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007). [CrossRef]  

19. S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” 5th Int. Conf. Inf. Technol. Appl. Biomed. (ITAB 2008), 171–174 (2008).

20. Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017). [CrossRef]  

21. Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012). [CrossRef]  

22. P. Fong and F. Buron, “Sensing Deforming and Moving Objects with Commercial Off the Shelf Hardware,” 101 (2006).

23. H. J. Chen, J. Zhang, D. J. Lv, and J. Fang, “3-D shape measurement by composite pattern projection and hybrid processing,” Opt. Express 15(19), 12318 (2007). [CrossRef]  

24. W.-H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express 15(20), 13167 (2007). [CrossRef]  

25. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011). [CrossRef]  

26. P. S. Huang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065 (1999). [CrossRef]  

27. J. L. Flores, J. A. Ferrari, G. García Torales, R. Legarda-Saenz, and A. Silva, “Color-fringe pattern profilometry using a generalized phase-shifting algorithm,” Appl. Opt. 54(30), 8827 (2015). [CrossRef]  

28. I. Trumper, H. Choi, and D. W. Kim, “Instantaneous phase shifting deflectometry,” Opt. Express 24(24), 27993 (2016). [CrossRef]  

29. M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020). [CrossRef]  

30. Y. Wan, Y. Cao, X. Liu, T. Tao, and J. Kofman, “High-frequency color-encoded fringe-projection profilometry based on geometry constraint for large depth range,” Opt. Express 28(9), 13043 (2020). [CrossRef]  

31. P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021). [CrossRef]  

32. “Multispectral snapshot cameras,” https://www.spectraldevices.com/products/multispectral-snapshot-cameras.

33. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018). [CrossRef]  

34. F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018). [CrossRef]  

35. T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011). [CrossRef]  

36. P. De Groot, “Phase-shift calibration errors in interferometers with spherical Fizeau cavities,” Appl. Opt. 34(16), 2856–2863 (1995). [CrossRef]  

37. Y. Takahashi, “Effect of Phase Error in Phase-Shifting Interferometer,” Appl. Mech. Mater. 888, 11–16 (2019). [CrossRef]  

38. G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019). [CrossRef]  

39. R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio Sci. 23(4), 713–720 (1988). [CrossRef]  

40. P. Jia, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007). [CrossRef]  

41. Y. Surrel, “Design of algorithms for phase measurements by the use of phase stepping,” Appl. Opt. 35(1), 51–60 (1996). [CrossRef]  

42. S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019). [CrossRef]  

43. K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016). [CrossRef]  

44. S. Xing and H. Guo, “Iterative calibration method for measurement system having lens distortions in fringe projection profilometry,” Opt. Express 28(2), 1177 (2020). [CrossRef]  

References

  • View by:

  1. J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
    [Crossref]
  2. P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare : Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX elsevier 13, 100652 (2021).
    [Crossref]
  3. P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Pract. Hologr. Displays, Mater. Appl. SPIE 10944, 109440Q (2019).
    [Crossref]
  4. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
    [Crossref]
  5. W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
    [Crossref]
  6. Z. H. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012).
    [Crossref]
  7. C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020).
    [Crossref]
  8. L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
    [Crossref]
  9. J. L. Flores, B. Bravo-Medina, and J. A. Ferrari, “One-frame two-dimensional deflectometry for phase retrieval by addition of orthogonal fringe patterns,” Appl. Opt. 52(26), 6537–6542 (2013).
    [Crossref]
  10. M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
    [Crossref]
  11. W.-H. Su and H. Liu, “Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities,” Opt. Express 14(20), 9178 (2006).
    [Crossref]
  12. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry : a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations,” Appl. Opt. 36(22), 5347–5354 (1997).
    [Crossref]
  13. C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014).
    [Crossref]
  14. C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using phase partitions,” Opt. Lasers Eng. 68, 111–120 (2015).
    [Crossref]
  15. G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
    [Crossref]
  16. C. Guan, L. G. Hassebrook, and D. L. Lau, “Composite structured light pattern for three-dimensional video,” Opt. Express 11(5), 406 (2003).
    [Crossref]
  17. Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011).
    [Crossref]
  18. H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
    [Crossref]
  19. S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” 5th Int. Conf. Inf. Technol. Appl. Biomed. ITAB 2008 conjunction with 2nd Int. Symp. Summer Sch. Biomed. Heal. Eng. IS3BHE 2008171–174 (2008).
  20. Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
    [Crossref]
  21. Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
    [Crossref]
  22. Philip Fong and F. Buron, “Sensing Deforming and Moving Objects with Commercial Off the Shelf Hardware,” 101 (2006).
  23. H. J. Chen, J. Zhang, D. J. Lv, and J. Fang, “3-D shape measurement by composite pattern projection and hybrid processing,” Opt. Express 15(19), 12318 (2007).
    [Crossref]
  24. W.-H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express 15(20), 13167 (2007).
    [Crossref]
  25. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011).
    [Crossref]
  26. P. S. Huang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065 (1999).
    [Crossref]
  27. J. L. Flores, J. A. Ferrari, G. García Torales, R. Legarda-Saenz, and A. Silva, “Color-fringe pattern profilometry using a generalized phase-shifting algorithm,” Appl. Opt. 54(30), 8827 (2015).
    [Crossref]
  28. I. Trumper, H. Choi, and D. W. Kim, “Instantaneous phase shifting deflectometry,” Opt. Express 24(24), 27993 (2016).
    [Crossref]
  29. M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
    [Crossref]
  30. Y. Wan, Y. Cao, X. Liu, T. Tao, and J. Kofman, “High-frequency color-encoded fringe-projection profilometry based on geometry constraint for large depth range,” Opt. Express 28(9), 13043 (2020).
    [Crossref]
  31. P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
    [Crossref]
  32. “Multispectral snapshot cameras,” https://www.spectraldevices.com/products/multispectral-snapshot-cameras .
  33. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
    [Crossref]
  34. F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
    [Crossref]
  35. T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011).
    [Crossref]
  36. P. De Groot, “Phase-shift calibration errors in interferometers with spherical Fizeau cavities,” Appl. Opt. 34(16), 2856–2863 (1995).
    [Crossref]
  37. Y. Takahashi, “Effect of Phase Error in Phase-Shifting Interferometer,” Appl. Mech. Mater. 888, 11–16 (2019).
    [Crossref]
  38. G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019).
    [Crossref]
  39. W. C. Goldstein RM and H. A. Zebker, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio. Sci. 23(4), 713–720 (1988).
    [Crossref]
  40. P. Jia, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007).
    [Crossref]
  41. Y. Surrel, “Measurements By the Use of Phase Stepping,” Appl. Opt. 35(1), 51–60 (1996).
    [Crossref]
  42. S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
    [Crossref]
  43. K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
    [Crossref]
  44. S. Xing and H. Guo, “Iterative calibration method for measurement system having lens distortions in fringe projection profilometry,” Opt. Express 28(2), 1177 (2020).
    [Crossref]

2021 (2)

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare : Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX elsevier 13, 100652 (2021).
[Crossref]

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

2020 (4)

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

Y. Wan, Y. Cao, X. Liu, T. Tao, and J. Kofman, “High-frequency color-encoded fringe-projection profilometry based on geometry constraint for large depth range,” Opt. Express 28(9), 13043 (2020).
[Crossref]

C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020).
[Crossref]

S. Xing and H. Guo, “Iterative calibration method for measurement system having lens distortions in fringe projection profilometry,” Opt. Express 28(2), 1177 (2020).
[Crossref]

2019 (6)

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Pract. Hologr. Displays, Mater. Appl. SPIE 10944, 109440Q (2019).
[Crossref]

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
[Crossref]

Y. Takahashi, “Effect of Phase Error in Phase-Shifting Interferometer,” Appl. Mech. Mater. 888, 11–16 (2019).
[Crossref]

G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019).
[Crossref]

2018 (2)

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

2017 (2)

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

2016 (2)

I. Trumper, H. Choi, and D. W. Kim, “Instantaneous phase shifting deflectometry,” Opt. Express 24(24), 27993 (2016).
[Crossref]

K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
[Crossref]

2015 (2)

2014 (1)

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014).
[Crossref]

2013 (1)

2012 (2)

Z. H. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012).
[Crossref]

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

2011 (4)

J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011).
[Crossref]

T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011).
[Crossref]

L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
[Crossref]

Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011).
[Crossref]

2010 (1)

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

2007 (4)

H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
[Crossref]

H. J. Chen, J. Zhang, D. J. Lv, and J. Fang, “3-D shape measurement by composite pattern projection and hybrid processing,” Opt. Express 15(19), 12318 (2007).
[Crossref]

W.-H. Su, “Color-encoded fringe projection for 3D shape measurements,” Opt. Express 15(20), 13167 (2007).
[Crossref]

P. Jia, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007).
[Crossref]

2006 (1)

2003 (1)

2001 (1)

X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
[Crossref]

1999 (1)

P. S. Huang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065 (1999).
[Crossref]

1997 (1)

1996 (1)

1995 (1)

1988 (1)

W. C. Goldstein RM and H. A. Zebker, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio. Sci. 23(4), 713–720 (1988).
[Crossref]

Alcalá Ochoa, N.

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using phase partitions,” Opt. Lasers Eng. 68, 111–120 (2015).
[Crossref]

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014).
[Crossref]

Asundi, A. K.

L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
[Crossref]

Ayubi, G. A.

G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019).
[Crossref]

Bravo-Medina, B.

Bu, J.

K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
[Crossref]

Buron, F.

Philip Fong and F. Buron, “Sensing Deforming and Moving Objects with Commercial Off the Shelf Hardware,” 101 (2006).

Cao, Y.

Y. Wan, Y. Cao, X. Liu, T. Tao, and J. Kofman, “High-frequency color-encoded fringe-projection profilometry based on geometry constraint for large depth range,” Opt. Express 28(9), 13043 (2020).
[Crossref]

Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011).
[Crossref]

Cao, Y. P.

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Carson, J. J. L.

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare : Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX elsevier 13, 100652 (2021).
[Crossref]

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Pract. Hologr. Displays, Mater. Appl. SPIE 10944, 109440Q (2019).
[Crossref]

Chen, D. L.

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Chen, H. J.

Chen, Q.

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

Chen, W.

X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
[Crossref]

Choi, H.

De Groot, P.

Diop, M.

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare : Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX elsevier 13, 100652 (2021).
[Crossref]

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Pract. Hologr. Displays, Mater. Appl. SPIE 10944, 109440Q (2019).
[Crossref]

Du, G.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Duarte, I.

G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019).
[Crossref]

Fan, N.

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

Fang, J.

Feng, S.

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

Fernandez, S.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

Ferrari, J. A.

Flores, J. L.

Fong, Philip

Philip Fong and F. Buron, “Sensing Deforming and Moving Objects with Commercial Off the Shelf Hardware,” 101 (2006).

García Torales, G.

García-Isáis, C. A.

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using phase partitions,” Opt. Lasers Eng. 68, 111–120 (2015).
[Crossref]

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014).
[Crossref]

Geng, J.

J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011).
[Crossref]

Ghim, Y. S.

M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
[Crossref]

Goldstein RM, W. C.

W. C. Goldstein RM and H. A. Zebker, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio. Sci. 23(4), 713–720 (1988).
[Crossref]

Gu, Q.

Guan, C.

Guo, H.

Hairong, Z.

S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” 5th Int. Conf. Inf. Technol. Appl. Biomed. ITAB 2008 conjunction with 2nd Int. Symp. Summer Sch. Biomed. Heal. Eng. IS3BHE 2008171–174 (2008).

Hassebrook, L. G.

He, Y.

Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011).
[Crossref]

Huang, L.

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
[Crossref]

Huang, P. S.

P. S. Huang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065 (1999).
[Crossref]

Huang, Z. F.

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Jia, P.

P. Jia, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007).
[Crossref]

Jun, J.

S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” in 5th Int. Conf. on Information Technology and Applications in Biomedicine (ITAB 2008), 171–174 (2008).

Kim, D. W.

Kinoshita, M.

Kofman, J.

König, N.

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Lau, D. L.

Legarda-Saenz, R.

Lei, Z.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Li, H.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Li, K.

K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
[Crossref]

Li, Y.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Liu, C. Y.

C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020).
[Crossref]

Liu, F.

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Liu, H.

Liu, X.

Liu, Y. Z.

H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
[Crossref]

Llado, X.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

Lu, M. T.

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Lv, D. J.

Možina, J.

T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011).
[Crossref]

Najiminaini, M.

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

Ng, C.

L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
[Crossref]

Nguyen, M. T.

M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
[Crossref]

Omidi, P.

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare: Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX 13, 100652 (2021).
[Crossref]

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Proc. SPIE 10944, 109440Q (2019).
[Crossref]

Požar, T.

T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011).
[Crossref]

Pribanic, T.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

Rhee, H. G.

M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
[Crossref]

Ri, S.

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

Salvi, J.

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

Schmitt, R.

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Shen, Y.

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

Si, S.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Silva, A.

Su, W.-H.

Su, X.

X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
[Crossref]

Su, X. Y.

H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
[Crossref]

Surrel, Y.

Takahashi, Y.

Takai, H.

Takeda, M.

Tao, T.

Trumper, I.

Trusiak, M.

Tsuda, H.

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

Wan, Y.

Y. Wan, Y. Cao, X. Liu, T. Tao, and J. Kofman, “High-frequency color-encoded fringe-projection profilometry based on geometry constraint for large depth range,” Opt. Express 28(9), 13043 (2020).
[Crossref]

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Wang, C. Y.

C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020).
[Crossref]

Wang, H.

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare: Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX 13, 100652 (2021).
[Crossref]

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Proc. SPIE 10944, 109440Q (2019).
[Crossref]

Wang, M.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Wang, Q.

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

Wu, F.

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Wu, G.

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

Wu, M.

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

Wu, Y.

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Wu, Y. C.

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Xia, P.

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

Xing, S.

Xing, X.

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

Xu, Y.

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

Yang, S.

S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” in 5th Int. Conf. on Information Technology and Applications in Biomedicine (ITAB 2008), 171–174 (2008).

Yin, W.

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

Yip, L. C. M.

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare: Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX 13, 100652 (2021).
[Crossref]

You, Y.

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

Yue, H. M.

H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
[Crossref]

Zebker, H. A.

R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio Sci. 23(4), 713–720 (1988).
[Crossref]

Zhang, D.

K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
[Crossref]

Zhang, G.

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

Zhang, J.

Zhang, S.

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

Zhang, Z. H.

Z. H. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012).
[Crossref]

Zhipei, L.

S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” in 5th Int. Conf. on Information Technology and Applications in Biomedicine (ITAB 2008), 171–174 (2008).

Zhou, C.

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Zuo, C.

W. Yin, S. Feng, T. Tao, L. Huang, M. Trusiak, Q. Chen, and C. Zuo, “High-speed 3D shape measurement using the optimized composite fringe patterns and stereo-assisted structured light system,” Opt. Express 27(3), 2411 (2019).
[Crossref]

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

Adv. Opt. Photonics (1)

J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photonics 3(2), 128 (2011).
[Crossref]

Appl. Mech. Mater. (1)

Y. Takahashi, “Effect of Phase Error in Phase-Shifting Interferometer,” Appl. Mech. Mater. 888, 11–16 (2019).
[Crossref]

Appl. Opt. (5)

J. Opt. (2)

M. Wu, N. Fan, G. Wu, S. Zhang, and F. Liu, “An inverse error compensation method for color-fringe pattern profilometry,” J. Opt. 22(3), 035705 (2020).
[Crossref]

S. Ri, Q. Wang, P. Xia, and H. Tsuda, “Spatiotemporal phase-shifting method for accurate phase analysis of fringe pattern,” J. Opt. 21(9), 095702 (2019).
[Crossref]

Meas. Sci. Rev. (1)

C. Y. Liu and C. Y. Wang, “Investigation of Phase Pattern Modulation for Digital Fringe Projection Profilometry,” Meas. Sci. Rev. 20(1), 43–49 (2020).
[Crossref]

Meas. Sci. Technol. (1)

T. Požar and J. Možina, “Enhanced ellipse fitting in a two-detector homodyne quadrature laser interferometer,” Meas. Sci. Technol. 22(8), 085301 (2011).
[Crossref]

Opt. Appl. (1)

G. Du, M. Wang, C. Zhou, S. Si, H. Li, Z. Lei, and Y. Li, “One shot profilometry using iterative two-step temporal phase-unwrapping,” Opt. Appl. 47, 97–110 (2017).
[Crossref]

Opt. Eng. (2)

P. S. Huang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38(6), 1065 (1999).
[Crossref]

P. Jia, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007).
[Crossref]

Opt. Express (8)

Opt. InfoBase Conf. Pap. (1)

L. Huang, C. Ng, and A. K. Asundi, “Dynamic 3D measurement for specular reflecting surface with monoscopic fringe reflection deflectometry,” Opt. InfoBase Conf. Pap. 19, CWC3 (2011).
[Crossref]

Opt. Laser Technol. (2)

H. M. Yue, X. Y. Su, and Y. Z. Liu, “Fourier transform profilometry based on composite structured light pattern,” Opt. Laser Technol. 39(6), 1170–1175 (2007).
[Crossref]

Y. C. Wu, Y. P. Cao, Z. F. Huang, M. T. Lu, and D. L. Chen, “Improved composite Fourier transform profilometry,” Opt. Laser Technol. 44(7), 2037–2042 (2012).
[Crossref]

Opt. Lasers Eng. (8)

C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018).
[Crossref]

Y. He and Y. Cao, “A composite-structured-light 3D measurement method based on fringe parameter calibration,” Opt. Lasers Eng. 49(7), 773–779 (2011).
[Crossref]

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using a composite fringe pattern,” Opt. Lasers Eng. 53, 25–30 (2014).
[Crossref]

C. A. García-Isáis and N. Alcalá Ochoa, “One shot profilometry using phase partitions,” Opt. Lasers Eng. 68, 111–120 (2015).
[Crossref]

Z. H. Zhang, “Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques,” Opt. Lasers Eng. 50(8), 1097–1106 (2012).
[Crossref]

X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
[Crossref]

K. Li, J. Bu, and D. Zhang, “Lens distortion elimination for improving measurement accuracy of fringe projection profilometry,” Opt. Lasers Eng. 85, 53–64 (2016).
[Crossref]

G. A. Ayubi, I. Duarte, and J. A. Ferrari, “Optimal phase-shifting algorithm for interferograms with arbitrary steps and phase noise,” Opt. Lasers Eng. 114, 129–135 (2019).
[Crossref]

Pattern Recognit. (1)

J. Salvi, S. Fernandez, T. Pribanic, and X. Llado, “A state of the art in structured light patterns for surface profilometry,” Pattern Recognit. 43(8), 2666–2680 (2010).
[Crossref]

Proc. SPIE (1)

P. Omidi, H. Wang, M. Diop, and J. J. L. Carson, “Algorithm for phase-displacement conversion from reflection digital holographic interferometry,” Proc. SPIE 10944, 109440Q (2019).
[Crossref]

Radio Sci. (1)

R. M. Goldstein, H. A. Zebker, and C. L. Werner, “Satellite radar interferometry: Two-dimensional phase unwrapping,” Radio Sci. 23(4), 713–720 (1988).
[Crossref]

Sci. Rep. (3)

M. T. Nguyen, Y. S. Ghim, and H. G. Rhee, “Single-shot deflectometry for dynamic 3D surface profile measurement by modified spatial-carrier frequency phase-shifting method,” Sci. Rep. 9(1), 3157 (2019).
[Crossref]

F. Liu, Y. Wu, F. Wu, N. König, R. Schmitt, Y. Wan, and Y. Xu, “Precise phase demodulation of single carrier-frequency interferogram by pixel-level Lissajous figure and ellipse fitting,” Sci. Rep. 8(1), 148 (2018).
[Crossref]

P. Omidi, M. Najiminaini, M. Diop, and J. J. L. Carson, “Single-shot detection of 8 unique monochrome fringe patterns representing 4 distinct directions via multispectral fringe projection profilometry,” Sci. Rep. 11(1), 10367 (2021).
[Crossref]

Sensors (1)

Y. You, Y. Shen, G. Zhang, and X. Xing, “Real-time and high-resolution 3D face measurement via a smart active optical sensor,” Sensors 17(4), 734 (2017).
[Crossref]

SoftwareX (1)

P. Omidi, L. C. M. Yip, H. Wang, M. Diop, and J. J. L. Carson, “PhaseWare: Phase map retrieval for fringe projection profilometry and off-axis digital holographic interferometry,” SoftwareX 13, 100652 (2021).
[Crossref]

Other (3)

S. Yang, Z. Hairong, L. Zhipei, and J. Jun, “Influence of sampling on composite structured light pattern,” in 5th Int. Conf. on Information Technology and Applications in Biomedicine (ITAB 2008), 171–174 (2008).

P. Fong and F. Buron, “Sensing Deforming and Moving Objects with Commercial Off the Shelf Hardware,” 101 (2006).

“Multispectral snapshot cameras,” https://www.spectraldevices.com/products/multispectral-snapshot-cameras .

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Figures (7)

Fig. 1.
Fig. 1. Multispectral pattern generation. (a) Schematic of the MFA (not to scale). (b) Spectral response curves for the MFA and the multispectral camera. (c) Rendering of the optical setup consisting of a halogen light source, MFA, lenses, the object, and snapshot multispectral camera. (d) Color image (Apple iPhone-6s) of the CP projected onto the surface of a white plastic object.
Fig. 2.
Fig. 2. Normalization process for phase extraction from four spectral fringe patterns. (a) Raw data before demosaicing. (b) Four adjacent pairs of FPs. (c) Schematic of a pair of identical windows sliding over a pair of adjacent FPs. (d) Example of intensity plots of ${I_2}$ against ${I_1}$ for a sliding window (25,25) pixels in size and centered at (100,400). (e) Ellipse fitted with the least-squares method to data in panel (d). (f) Normalized fitted ellipse computed from data in panel (e). (g-j) Four FPs captured with the multispectral camera for a flat surface after filtering with a Gaussian low-pass filter. (k) Intensity profiles along the dashed lines within the regions indicated in the magnified parts of (g-j). (l-o) The normalized FPs corresponding to data in panels (g-j). (p) Intensity profiles along the dashed lines within the regions indicated in the magnified parts of (l-o).
Fig. 3.
Fig. 3. Example of the phase step estimation for all four FP pairs based on the normalized Lissajous figure.
Fig. 4.
Fig. 4. Simulations to test the performance of the pre-processing explained in the method section for (a) normalization and (b) angle offset detection.
Fig. 5.
Fig. 5. Phase retrieval process for a flat object. Wrapped phase map (a), unwrapped phase map (b), and line profiles obtained using Eq. (3) with FPs before normalization (c). (d-f) Same as (a-c) after normalization over FPs. (g-i) Same as (a-c) using Eq. (11) and FPs after normalization. (j) Comparison of the profiles obtained using the different phase retrieval methods to the ideal case (flat imaged object) which shows the phase error.
Fig. 6.
Fig. 6. Normalization effect for an object with discontinuities and edges. (a) CAD model of a target consisting of five groups of bars of varying width (1 to 5 mm in steps of 1 mm). (b) Original FPs. (c) Unwrapped phase for the original FPs. (d) Normalized FPs. (e) Unwrapped phase for the normalized FPs. (f) Same as panel (e), except with background removed.
Fig. 7.
Fig. 7. 3D imaging of complex objects. The left column shows the raw images captured by the multispectral camera before demosaicing. The second column shows the wrapped phase maps. The third column shows the color-coded height maps. The right-most column shows 3D visualizations of the height map. (a-d) A cube object, (e-h) a plastic head phantom, (i-l) back surface of a hand, and (m-p) a hand holding two ping-pong balls.

Equations (11)


(1) $I(x,y) = A(x,y) + B(x,y)\cos[\phi(x,y)]$

(2) $I_n(x,y) = A(x,y) + B(x,y)\cos[\phi(x,y) + (n-1)\pi/2]$

(3) $\phi = \arctan\!\left(\dfrac{\sin\phi}{\cos\phi}\right) = \arctan\!\left(\dfrac{I_4 - I_2}{I_1 - I_3}\right)$

(4) $I_n = A_n + B_n\cos[\phi + (n-1)\pi/2]$

(5) $\arctan\!\left(\dfrac{I_4 - I_2}{I_1 - I_3}\right) = \arctan\!\left(\dfrac{A_4 - A_2 + (B_4 + B_2)\sin\phi}{A_1 - A_3 + (B_1 + B_3)\cos\phi}\right) = \arctan\!\left(\dfrac{A_N + B_N\sin\phi}{A_D + B_D\cos\phi}\right) = \hat{\phi}$

(6) $I_x = x_0 + \rho_x\cos\omega t \quad\text{and}\quad I_y = y_0 + \rho_y\cos(\omega t + \psi)$

(7) $\dfrac{(I_x - x_0)^2}{\rho_x^2} + \dfrac{(I_y - y_0)^2}{\rho_y^2} - \dfrac{2\cos\psi\,(I_x - x_0)(I_y - y_0)}{\rho_x\rho_y} = \sin^2\psi$

(8) $c_1 I_x^2 + 2c_2 I_x I_y + c_3 I_y^2 + 2c_4 I_x + 2c_5 I_y + c_6 = 0$

(9) $x_0 = \dfrac{c_2 c_5 - c_3 c_4}{\delta},\ \ \rho_x = \dfrac{\sqrt{-c_3\Delta}}{\delta};\qquad y_0 = \dfrac{c_2 c_4 - c_1 c_5}{\delta},\ \ \rho_y = \dfrac{\sqrt{-c_1\Delta}}{\delta};\qquad \psi = \arccos\!\left(\dfrac{-c_2}{\sqrt{c_1 c_3}}\right)$, where $\delta = \begin{vmatrix} c_1 & c_2 \\ c_2 & c_3 \end{vmatrix}$ and $\Delta = \det\!\begin{pmatrix} c_1 & c_2 & c_4 \\ c_2 & c_3 & c_5 \\ c_4 & c_5 & c_6 \end{pmatrix}$

(10) $I_n = A + B\cos[\phi + \psi_n]$

(11) $\phi = \arctan\!\left(\dfrac{\sum_{k=1}^{N}\sum_{l=1}^{N}\sum_{n=1}^{N} I_n\,[\cos\psi_k - \cos\psi_l]\,[\cos\psi_k\sin\psi_l + \cos\psi_n\sin\psi_k + \cos\psi_l\sin\psi_n]}{\sum_{k=1}^{N}\sum_{l=1}^{N}\sum_{n=1}^{N} I_n\,[\sin\psi_k - \sin\psi_l]\,[\cos\psi_k\sin\psi_l + \cos\psi_n\sin\psi_k + \cos\psi_l\sin\psi_n]}\right)$
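The four-step relation of Eqs. (2)–(3) can be checked numerically. The following is a minimal sketch assuming NumPy; the helper name `four_step_phase` is illustrative (not from the paper), and `arctan2` is used so the wrapped phase is recovered over the full $(-\pi,\pi]$ range rather than the quarter-range of a plain `arctan`:

```python
import numpy as np

def four_step_phase(I1, I2, I3, I4):
    """Wrapped phase from four pi/2-shifted fringe patterns (Eq. (3)).

    I4 - I2 is proportional to sin(phi) and I1 - I3 to cos(phi),
    so arctan2 recovers phi up to the 2*pi wrapping ambiguity.
    """
    return np.arctan2(I4 - I2, I1 - I3)

# Synthetic check: four ideal pi/2-shifted patterns with a known phase.
x = np.linspace(0, 4 * np.pi, 512)
phi = np.tile(x, (64, 1))                       # known carrier phase
A, B = 0.5, 0.4                                 # background and modulation
I = [A + B * np.cos(phi + n * np.pi / 2) for n in range(4)]
phi_wrapped = four_step_phase(*I)
# Recovered phase agrees with the known phase modulo 2*pi.
err = np.angle(np.exp(1j * (phi_wrapped - phi)))
assert np.max(np.abs(err)) < 1e-9
```

In practice the four patterns come from the four spectral bands, so the normalization of Eqs. (6)–(9) must be applied first; otherwise the band-dependent $A_n$, $B_n$ of Eq. (4) bias the estimate as in Eq. (5).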
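The conic fit of Eq. (8) and the parameter recovery of Eq. (9) can likewise be prototyped. This is a sketch under stated assumptions: NumPy only, the names `fit_ellipse_conic` and `ellipse_params` are illustrative, and the fit uses a plain SVD null-space solution rather than the authors' exact least-squares variant:

```python
import numpy as np

def fit_ellipse_conic(Ix, Iy):
    # Least-squares solution of c1*Ix^2 + 2*c2*Ix*Iy + c3*Iy^2
    # + 2*c4*Ix + 2*c5*Iy + c6 = 0 (Eq. (8)): the right singular
    # vector of the design matrix with the smallest singular value.
    D = np.column_stack([Ix**2, 2*Ix*Iy, Iy**2, 2*Ix, 2*Iy,
                         np.ones_like(Ix)])
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    c = Vt[-1]
    return c if c[0] > 0 else -c    # fix the arbitrary overall sign

def ellipse_params(c):
    # Center, semi-amplitudes, and phase step psi from the conic
    # coefficients, following Eq. (9).
    c1, c2, c3, c4, c5, c6 = c
    delta = c1*c3 - c2**2
    Delta = np.linalg.det(np.array([[c1, c2, c4],
                                    [c2, c3, c5],
                                    [c4, c5, c6]]))
    x0 = (c2*c5 - c3*c4) / delta
    y0 = (c2*c4 - c1*c5) / delta
    rho_x = np.sqrt(-c3*Delta) / delta
    rho_y = np.sqrt(-c1*Delta) / delta
    psi = np.arccos(-c2 / np.sqrt(c1*c3))
    return x0, y0, rho_x, rho_y, psi

# Synthetic Lissajous pair (Eq. (6)) with a known pi/3 phase step.
t = np.linspace(0, 2*np.pi, 200, endpoint=False)
Ix = 0.5 + 0.30*np.cos(t)
Iy = 0.4 + 0.25*np.cos(t + np.pi/3)
x0, y0, rho_x, rho_y, psi = ellipse_params(fit_ellipse_conic(Ix, Iy))
assert abs(psi - np.pi/3) < 1e-6 and abs(x0 - 0.5) < 1e-6
```

With the center $(x_0,y_0)$ and amplitudes $(\rho_x,\rho_y)$ in hand, each window of the FP pair can be normalized to zero background and unit modulation, and $\psi$ gives the actual phase step between the two bands for use in the generalized algorithm of Eq. (11).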
