We present a novel color fringe projection system to obtain the absolute 3D shape and color of objects simultaneously. Optimum 3-frequency interferometry is used to provide time-efficient analysis of the projected fringes by encoding three fringe sets of different pitch into the primary colors of a digital light projector and recording the information on a 3-chip color CCD camera. Phase shifting analysis is used to retrieve sub-wavelength phase information. Absolute phase across the field is calculated using the 3-frequency method independently at each pixel. Concurrent color data is also captured via the RGB channels of the CCD. Thus full-field absolute shape (XYZ) and color (RGB) can be obtained. In this paper we present the basis of the technique and preliminary results, having addressed the issue of crosstalk between the color channels.
©2006 Optical Society of America
In the past 20 years, optical metrology has found numerous applications in scientific and commercial fields [2, 3] owing to its inherent non-intrusive nature. One of the most widely researched topics has been the measurement of 3D surface form, with fringe projection techniques becoming popular due to their flexibility in adapting to different length scales. Sub-fringe phase measurement modulo 2π is obtained routinely using phase stepping and Fourier transform techniques [5–6]. In recent years considerable research has been performed on absolute fringe order identification. Temporal unwrapping techniques have become dominant, where a sequence of fringe patterns is projected with varying fringe frequency [7–10]. The authors recently introduced a robust optimization process for frequency selection in multi-wavelength interferometry where a geometric series of synthetic wavelengths is defined to maximize the overall process reliability [11–13]. The limited lateral resolution of commercial digital light processing (DLP) video projectors and CCD cameras restricts the maximum number of projected fringes to ~250. We have demonstrated that three projected fringe frequencies are required to determine fringe order with 6σ reliability over this number of fringes using optimum frequency selection.
Color has also been used as a means of encoding depth information for 3D shape measurement. Hausler et al. presented a color-coded triangulation method, which is simple, fast, and without moving mechanical parts. Whilst this approach uses a single frame, the dynamic range is limited to <200. Huang et al. proposed a color-encoded fringe projection technique using three patterns, one in each color channel, with a phase shift of 2π/3 between neighboring channels. The 3D surface contour information can be retrieved from a single image snapshot of the object surface. Skydan et al. presented a method with up to three fringe patterns projected on the three primary color channels from three different video projectors at different viewpoints to overcome shadowing effects on the object. However, the approaches described by Huang et al. and Skydan et al. only produce wrapped phase maps, so spatial phase unwrapping methods are required to generate the 3D shape of an object and hence these methods cannot be applied to objects with large slopes or discontinuities.
Kakunai et al. introduced a method to project two gratings simultaneously with different color and fringe pitch. The color allowed the image of each grating to be captured by separate monochrome cameras via appropriate color filters. However, this approach required the images from the two cameras to be accurately registered, which becomes increasingly complex for larger numbers of projected fringes. Pfortner et al. used a three-chip color camera to simultaneously record the interference patterns at three wavelengths generated by three different lasers in a classical interferometer, although this approach is better suited to measurements of nanometer resolution over tens to hundreds of microns.
In this paper we present a novel approach to measure the color (herein this term is used to denote the true reflectance of an object) and shape of an object simultaneously. The synergy between requiring data at three projected fringe frequencies for absolute shape measurement and there being three primary color channels (RGB) is exploited such that parallel acquisition of the fringe data is obtained, as well as giving the potential to measure surface color. By recording the data on a 3-chip color CCD, each fringe set is automatically registered. Here the colors are used as an information carrier and not as an interferometric wavelength. The number of fringes used in each channel is determined by the optimum multi-frequency process, allowing >100 fringe orders to be measured absolutely with a phase resolution of order 1/100th of a fringe. The modulation depth in each color channel provides information on the red, green and blue color content of the surface at each pixel. For test objects where there is sufficient reflected light imaged in each channel, all the data required for color and shape measurement can be obtained in four frames when the sub-fringe phase resolution is obtained by phase stepping. With Fourier transform fringe analysis the same measurement may be obtained from a single frame. The automatic control and analysis software operates such that for objects with poor reflectivity in a particular color channel a further sequence of four phase stepped images is acquired to obtain a wrapped phase map at the required fringe frequencies via the other color channels. This paper describes the setup and configuration of the optical system such that crosstalk between the color channels is mitigated. Experimental measurements are presented showing the absolute phase from a number of objects.
Figure 1 shows the layout of the optical system with a DLP video projector, a 3-chip color CCD camera, and a personal computer (PC). A color image whose RGB components are three fringe patterns with different spatial frequency is generated in the PC and projected onto an object surface by the DLP projector. As the phase stepped fringe patterns are generated in software, accurate phase shifts are obtained without miscalibration. The 3-CCD camera captures an image of the fringe pattern from an angularly displaced viewpoint compared to the projector and the images are saved to the computer for post processing. In this paper, four phase stepped images are used with a phase step of π/2 and the wrapped phase map calculated for each channel. Using the wrapped phase maps from the three channels, the absolute phase distribution is calculated via the fringe order and the optimum 3-frequency selection method.
2.1 Optimum three-frequency selection
In full-field fringe projection, the number of fringes projected, Nf, is related to the effective wavelength produced by the projector, λ, by Nf = L/λ, where L is the desired unambiguous measurement range, in this case the field of view of the projector. The optimum frequency selection process defines the numbers of projected fringes to be:

Nfi = Nf0 − Nf0^((i−1)/(n−1)),   i = 1, …, n−1,   (1)
where Nf0 and Nfi are the maximum number of fringes and the number of fringes in the ith fringe set, respectively, and n is the number of fringe sets used. When the maximum number of fringes is 100 and n = 3, Nf1 = 99 and Nf2 = 90. This approach resolves fringe order ambiguity as the beat obtained between Nf0 and Nf1 is a single fringe over the full field of view.
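As an illustrative sketch, the selection rule can be coded directly. The exponent form below is inferred from the worked values quoted here (100 → 99, 90), and the function name is our own:

```python
def optimum_fringe_numbers(nf0, n):
    """Numbers of fringes per set under optimum frequency selection:
    the base set Nf0 plus Nf_i = Nf0 - Nf0**((i - 1)/(n - 1)), i = 1..n-1.
    The exponent form is inferred from the worked example (100 -> 99, 90)."""
    sets = [nf0]
    for i in range(1, n):
        sets.append(round(nf0 - nf0 ** ((i - 1) / (n - 1))))
    return sets
```

For Nf0 = 100 and n = 3 this reproduces the fringe sets 100, 99 and 90 quoted above, and for Nf0 = 81 it gives the 81, 80, 72 set used later in the paper.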
2.2 Fringe projection
Since the fringe pattern is digitally created in the computer, fringe parameters such as pitch, phase, modulation and DC intensity are under direct control for each color channel. The red, green, and blue sinusoidal fringe patterns are generated in the computer as:

Ic(x, y) = DCc + Mc cos(2πx/pc + φ),   c = r, g, b,   (2)

where c = r, g, b corresponds to the red, green, and blue channels, respectively, Ic is the gray value, DCc is the average intensity, Mc is the fringe amplitude, pc is the fringe pitch in pixels, (x, y) are the horizontal and vertical pixel indices of the DLP, and φ is the phase shift. The three image components combine together in a color image comprising the red, green, and blue channels, as shown in Fig. 2. The 3-CCD camera captures the composite fringe pattern into three channels as:

I′c(m, n) = DC′c(m, n) + M′c(m, n) cos(Φc(m, n) + φ),   (3)

where m, n are the pixel indices of the CCD, and I′c(m, n), DC′c(m, n), M′c(m, n), and Φc(m, n) are the intensity, the average intensity, the modulation depth, and the phase of the captured fringe pattern, respectively.
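A minimal sketch of the fringe generation step in Python/NumPy follows; the default settings (81, 80, 72 fringes, DC of 110, modulation of 90) and the function name are illustrative rather than the paper's calibrated values:

```python
import numpy as np

def composite_fringe_image(width, height, n_fringes=(81, 80, 72),
                           dc=110, mod=90, phi=0.0):
    """Compose one phase-stepped RGB fringe frame, one fringe set per channel:
    I_c = DC_c + M_c * cos(2*pi*x/p_c + phi), with pitch p_c = width / Nf_c."""
    x = np.arange(width, dtype=float)
    img = np.empty((height, width, 3), dtype=np.uint8)
    for c, nf in enumerate(n_fringes):
        pitch = width / nf                            # fringe pitch in DLP pixels
        line = dc + mod * np.cos(2 * np.pi * x / pitch + phi)
        img[..., c] = np.rint(np.clip(line, 0, 255))  # broadcast down the rows
    return img
```

Four calls with phi = 0, π/2, π and 3π/2 give the phase stepped sequence sent to the DLP.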
The control software allows the average intensity and modulation depth to be defined in each color channel independently, with real time display of the captured intensity profiles from the red, green and blue channels. In the work presented here a 4-frame, 90-degree phase step algorithm was implemented and the modulation depth calculated for each color channel.
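The 4-frame algorithm can be sketched as follows; this is a standard implementation of the π/2-step formula, since the paper does not list its exact expression:

```python
import numpy as np

def four_frame_phase_and_modulation(i0, i1, i2, i3):
    """Wrapped phase and modulation depth from four frames stepped by pi/2,
    assuming I_k = DC + M*cos(phi + k*pi/2):
        phi = atan2(I3 - I1, I0 - I2),  M = 0.5*sqrt((I3-I1)^2 + (I0-I2)^2)."""
    # cast to float first so integer camera frames do not wrap on subtraction
    i0, i1, i2, i3 = (np.asarray(i, dtype=float) for i in (i0, i1, i2, i3))
    s = i3 - i1                      # equals 2*M*sin(phi)
    c = i0 - i2                      # equals 2*M*cos(phi)
    return np.arctan2(s, c), 0.5 * np.hypot(s, c)
```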
2.3 Color separation
Since the proposed method utilizes the red, green, and blue channels to hold independent information, it is important that there is minimal crosstalk between them in order to optimize the phase resolution of the wrapped maps produced. For most color CCD cameras and DLPs, the spectra of the red, green, and blue channels are designed to overlap so that there are no color-blind areas. However, the crossover wavelengths between the color bands in CCD cameras often differ from those in DLPs. Hence, the information captured in each of the three color channels is not independent.
Huang et al. proposed a method to compensate for the coupling effects between channels when projecting the same numbers of fringes with a phase shift between colors and found the compensation scheme reduced the crosstalk errors significantly. Here, the application of this technique has been explored when different numbers of fringes are projected on each channel. The use of new dielectric filters with sharp transitions between the color bands has also been investigated to minimize the crosstalk at source.
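For illustration, a linear compensation in the spirit of Huang et al.'s scheme can be sketched as a per-pixel matrix inversion. The exact formulation in their paper may differ; the orientation assumed below has rows for the projected color and columns for the detected plane, and the function name is ours:

```python
import numpy as np

def compensate_crosstalk(image_rgb, coupling):
    """Undo linear channel mixing: detected = true @ (C/100), so multiply
    each pixel's RGB vector by the inverse of the normalized coupling matrix.
    A sketch assuming purely linear, spatially uniform crosstalk."""
    c = np.asarray(coupling, dtype=float) / 100.0    # diagonal becomes 1
    pixels = image_rgb.reshape(-1, 3).astype(float)
    corrected = pixels @ np.linalg.inv(c)
    return corrected.reshape(image_rgb.shape)
```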
To quantify the coupling effects between color channels a derivative of the calculation introduced by Huang has been implemented. For example, with the red channel, four pure red fringe pattern sets with π/2 phase shift are generated and projected onto a flat white surface. Four full-color images are captured and separated into their RGB components, giving twelve grayscale images. The intensity modulations M′c(m, n), c = r, g, b from the three channels are calculated and the ratios of the spatially averaged values of M′c(m, n) between channels indicate the magnitude of the coupling effects, defined for the red channel as:

Rrc = ⟨M′c(m, n)⟩ / ⟨M′r(m, n)⟩,   c = g, b,   (4)

where ⟨·⟩ denotes the spatial average.
A similar process is used to evaluate the coupling effects for the green and blue channels. Finally, a matrix whose elements are the values from Eq. (4) expressed as percentages is defined to represent the coupling effects between channels:

    [ Crr  Crg  Crb ]
C = [ Cgr  Cgg  Cgb ]   (5)
    [ Cbr  Cbg  Cbb ]
where Cij = 100 × Rij, i, j = r, g, b, and the first suffix denotes illumination from the projector and the second suffix the color plane in the detected image. The elements along the main diagonal in Eq. (5) are 100.0, that is Crr = Cgg = Cbb = 100.0.
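The measurement above can be sketched as follows, assuming the twelve per-channel modulation maps M′ have already been computed (e.g. with a 4-frame algorithm); the function name is illustrative:

```python
import numpy as np

def coupling_matrix(modulations):
    """Coupling matrix of Eq. (5). modulations[i][j] is the modulation map
    detected in channel j when only channel i is projected (i, j = r, g, b
    mapped to 0, 1, 2). Rows: projected color; columns: detected plane."""
    c = np.empty((3, 3))
    for i in range(3):
        own = np.mean(modulations[i][i])   # spatially averaged own-channel modulation
        for j in range(3):
            c[i, j] = 100.0 * np.mean(modulations[i][j]) / own
    return c
```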
3. Experiments and results
3.1 Experimental System
The system comprises a DLP video projector, a 3-chip color CCD camera with a FireWire port, and a personal computer, as shown in Fig. 1. The projector is from BenQ (Model PB6200) with a one-chip digital micro-mirror device (DMD) and a lateral resolution of up to 1024 x 768 pixels (XGA). The red, green, and blue colors are produced by rapidly spinning a color filter wheel in the projector and synchronously modifying the state of the DMD. The 3-CCD color camera from Hitachi (Model HV-F22F) has a lateral resolution of 1360 x 1024 pixels and may be used at either 8 or 10 bit depth for each color channel. The camera has a standard zoom lens (Pentax) with focal length from 8 to 48 mm and an adjustable aperture. A personal computer (PC) provides system control. The PC graphics card is set up to drive two monitors, one for the DLP and the other for the control software and viewing the captured data.
Due to the nonlinear intensity response of the projector, the sinusoidal fringe patterns captured by the camera are distorted, reducing the potential phase resolution. A calibrated precision light meter (cal-LIGHT 400L from the Cooke Corporation) was placed in front of the projector at a working distance of 80 cm and the projected light intensity measured for pure red, green and blue input, as shown in Fig. 3. The output intensity saturates when the input gray levels exceed 220 for the red and blue channels and 240 for the green channel. Therefore, the chosen ranges of the input gray levels for the three channels are 30–220, 30–240 and 30–220, respectively.
It has been found that the three color channels have different non-linear responses, including different saturation levels, from the projector and the camera. The camera response is significantly reduced in the red channel compared to the blue and green. It was found that to optimize the overall performance of the system, the full dynamic range of the red channel must be utilized, even though this meant that the usable input grayscale range for the blue and green had to be reduced to avoid saturation, see Fig. 4(a). To calibrate the non-linear response, a sequence of constant intensity images is projected onto a white board for each color channel in turn. For each intensity 10 images are captured to average shot noise effects. Figure 4(a) shows the nonlinear response for the three color channels using data from a group of pixels near the image centre. The nonlinear response can be fitted by a fourth-order polynomial (avoiding the regions near saturation and gray levels <40). A look up table (LUT) is created by treating the captured intensities as input and the projected intensities as the output of the polynomial. The intensity range of the fringes is reduced to the region where there is reasonable variation in captured intensity, i.e. 40–230, 40–160, and 40–140 for red, green and blue, respectively. Using the calculated LUT, the captured gray levels in the three color channels have approximately linear responses, as shown in Fig. 4(b).
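The LUT construction can be sketched as below; the helper name is ours, and the per-channel usable ranges would follow the values quoted above:

```python
import numpy as np

def build_linearizing_lut(projected, captured, usable=(40, 230)):
    """LUT for one color channel: fit a 4th-order polynomial with the captured
    gray level as input and the projected level as output, then tabulate the
    projected level needed for each captured level in the usable range."""
    lo, hi = usable
    coeffs = np.polyfit(np.asarray(captured, float), np.asarray(projected, float), 4)
    levels = np.arange(lo, hi + 1, dtype=float)
    return np.round(np.clip(np.polyval(coeffs, levels), 0, 255)).astype(np.uint8)
```

Indexing the LUT by (desired captured level minus 40) then gives the gray level to send to the projector.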
3.2 Evaluation of Crosstalk
In order to explore the coupling effects between channels, separate red, green, and blue fringe patterns were generated in the PC and projected onto a white surface. Because the three channels have an intensity imbalance, the F-number of the lens, gain of the CCD and white balance were adjusted to avoid saturation in the images captured by the camera whilst maximizing the grayscale usage. For each color, four phase-shifted images (0, π/2, π, and 3π/2) were generated and projected onto the white surface. We evaluated the coupling effects for the red, green, and blue channels under three conditions: the standard DLP and camera, using additional filters, and using additional filters without the DLP built-in color filter wheel.
Using the standard DLP and camera, twelve images were captured; one of the four phase-shifted images for each projected color is displayed in Fig. 5. The first column is the captured color fringe pattern, and the second, third, and fourth columns correspond to the red, green, and blue channels, respectively. From the figure, it can be seen that for each color fringe pattern the other two channels contain a weak fringe image. As the detected channel gets further, in terms of optical wavelength, from the projected fringe color, the fringe pattern almost disappears, for example, the blue channel from a red projected fringe pattern and the red channel from a blue projected fringe pattern.
The coupling matrix in Eq. (5) gives a quantitative evaluation of the coupling effects. In order to improve the accuracy of the calculation, data was averaged over a 200 x 200 pixel area in the middle of the captured image. Without extra filters, i.e. using the built-in filters in the projector and camera, the following coupling matrix was obtained:
The coupling effects are weaker for red and blue projected fringe patterns (corresponding to the top and bottom rows of the matrix), while they are stronger for a green projected fringe pattern.
Using three additional dielectric color filters with 50% transition wavelengths of 465 nm and 650 nm for the blue and red images respectively and 540 nm centre wavelength for the green image (Comar product numbers 465 IK 50, 540 IB 50 and 650 IY 50) in front of the camera, twelve images were captured and the crosstalk matrix is now:
These results show that the coupling effects decrease dramatically when combinations of color filters are placed in front of the camera. However, this has the effect of reducing the illumination which must be compensated by increasing the gain of the camera or the aperture of the lens.
The operation of the system with only the new dielectric filters was evaluated as this represents what may be achieved with a single filter wheel and filters that are better matched to the spectral characteristics of the 3-chip CCD. For this experiment, the built-in color filter wheel in the projector was removed and each new filter inserted in turn in front of the camera. One color fringe pattern for each projected color and its three channels are displayed in Fig. 6. The coupling effects for this case are shown in the following matrix:
Again, the coupling effects for the green projected fringe pattern are larger than those for the red and blue fringe patterns. Compared to Eq. (6), the coupling effects are improved. Compared to the filter wheel in combination with the dielectric filters, Eq. (7), the results show increased crosstalk mainly in the red channel but nearly equivalent performance for the green and blue projected fringe patterns.
3.3 Phase noise evaluation
The presence of crosstalk between the color channels increases the noise in the phase measurements obtained. The phase noise has been evaluated for the three color channels in four situations: projecting the red, green, and blue fringe patterns separately with the standard DLP and camera; projecting the RGB fringe patterns simultaneously (a composite fringe pattern) with the standard DLP and camera; projecting a composite fringe pattern with the standard DLP and camera while compensating for the coupling effects using Huang’s method; and finally with the built-in color filter wheel removed but with the new dielectric color filters. Since the modulation depth affects the phase noise, the gain of the camera and the aperture of the lens were adjusted to give the captured fringe pattern similar modulation in each color channel and for each setup. Table 1 shows the phase resolution (defined here as 2π/σϕ, where σϕ is the standard deviation of the phase noise) in these situations. The phase resolutions for the standard DLP and camera with separate color projection are 153, 156, and 120 for the R, G, B channels, respectively. With composite color projection crosstalk effects are present and the phase resolutions reduce to 44, 93, and 121. The phase resolution in the red channel is reduced considerably as this channel contains the greatest amount of crosstalk (from the green channel, see the first column of the matrix in Eq. (6)). When Huang’s method was used to compensate for the coupling effects, the phase resolution for the red channel improved (75 versus 44) while it was almost the same for the green and blue channels. Therefore, the crosstalk in the red channel with composite fringe projection can be compensated using Huang’s method.
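A sketch of how the phase resolution figure can be computed follows; the paper does not specify how σϕ is estimated, so here the residual after removing a best-fit plane from the phase map of a nominally flat target is assumed to be the noise:

```python
import numpy as np

def phase_resolution(phase_map):
    """Phase resolution 2*pi/sigma_phi from an unwrapped phase map of a
    nominally flat surface: remove a best-fit plane, take the residual
    standard deviation as the phase noise sigma_phi."""
    h, w = phase_map.shape
    yy, xx = np.mgrid[0:h, 0:w]
    a = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(a, phase_map.ravel(), rcond=None)
    sigma = (phase_map.ravel() - a @ coeffs).std()
    return 2 * np.pi / sigma
```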
When the built-in color filter wheel in the projector is removed, the sinusoidal fringe pattern generated is black and white. Placing the dielectric red, green and blue filters in front of the camera in turn, the phase resolutions obtained in the three color channels are 141, 180, and 132 (see Table 1). The increase in phase resolution for the green channel compared to the standard DLP and camera setup is due to a combination of slightly increased fringe modulation and reduced camera gain. To simulate composite projection with the new filters, the appropriate crosstalk is introduced by adding together the images in the same color channel and with the same phase step from red, green and blue fringe projection. This process increases the intensity noise by a factor of √3. A measure of the phase resolution that would be obtained in composite fringe projection has been obtained by simulating the phase noise variation with respect to intensity noise for the same fringe modulation and offset as determined from the experiments. With the new dielectric filters the composite results are consistent with the separate fringe projection results, allowing for the crosstalk matrix measured, see Eq. (8). The new filters offer a significant benefit in phase noise for composite projection compared to the standard DLP and camera. Because the coupling effects between channels are small, the phase resolutions in the red and green are much better than with the standard DLP and camera: red 102 versus 44, green 177 versus 93; with performance in the blue being comparable.
3.4 Composite fringe pattern
For measuring static objects where time is not critical, the best results will be obtained using separate projection of the R, G and B fringe patterns with either the standard DLP and camera or the new filters. Given the smallest phase resolution of 120 in the blue channel, the scaling factor limit is 14.1 for optimum 3-frequency interferometry with 6σ reliability in the fringe order calculation. The limit on the maximum number of projected fringes is then Nf0 = 200, giving a dynamic range up to 24000.
For composite fringe pattern projection, all the information required can be obtained in four RGB frames by using the phase stepping algorithm and in one RGB frame by using FFT phase analysis. Huang’s approach is used to reduce the coupling effects and thereby improve the phase resolution in the red channel. Using the standard DLP and camera, the scaling factor limit is 8.8 and the maximum number of projected fringes is approximately Nf0 = 78, giving an overall dynamic range up to 5850. Therefore, using optimum 3-frequency analysis, three fringe patterns were generated with 81, 80, and 72 fringes. With the optics used, it was found that the red channel contained a significant level of chromatic distortion compared to the blue and green channels. The generation of a single beat fringe is critical for successful fringe order calculation, in this case by forming a beat between the 81 and 80 fringe patterns. Therefore, to reduce the overall effects of chromatic distortion on the fringe order calculation, the projected numbers of fringes were set to 81, 80 and 72 for the blue, green and red channels, respectively. The phase stepped composite RGB fringe patterns were generated and projected in sequence such that the phase and color information were captured in 4 frames.
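A sketch of the beat-based fringe order calculation for the 81/80/72 set follows, assuming wrapped phase maps in [0, 2π) and the geometric beat series 1, 9, 81; the reliability-weighted details of the cited optimum multi-frequency implementation are not reproduced here:

```python
import numpy as np

def unwrap_three_frequency(phi81, phi80, phi72):
    """Absolute phase of the 81-fringe pattern from three wrapped maps.
    81 - 80 gives a single-fringe beat over the full field (no ambiguity);
    81 - 72 gives a 9-fringe beat bridging the gap to the 81-fringe set."""
    two_pi = 2.0 * np.pi
    wrap = lambda p: np.mod(p, two_pi)
    beat1 = wrap(phi81 - phi80)                       # 1 fringe over the field
    beat9 = wrap(phi81 - phi72)                       # 9 fringes over the field
    m9 = np.round((9.0 * beat1 - beat9) / two_pi)     # order of the 9-fringe beat
    abs9 = beat9 + two_pi * m9
    m81 = np.round((9.0 * abs9 - phi81) / two_pi)     # order of the 81-fringe set
    return phi81 + two_pi * m81
```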
A ceramic statue was profiled by projecting the composite fringe patterns onto its surface. A white plate was placed at the back of the measurement volume to test the performance of measuring objects with step height changes. Figure 7 shows the four captured color images and their component grayscale images corresponding to the red, green, and blue channels in the second, third, and fourth columns, respectively. Each phase stepped image is shown in a separate row of the figure. The corresponding wrapped phase maps for the RGB channels are shown in Fig. 8(a)–(c). Figure 8(d) is the absolute phase map obtained by using the proposed method with optimum 3-frequency analysis and it can be seen that the phase of both the statue and the back plate has been correctly retrieved. During the process of phase calculation, pixels with a modulation less than 15 gray levels are marked as invalid and these pixels are shown in black. The noise on the right side of the unwrapped phase map is due to chromatic aberration and could be further mitigated by employing distortion correction separately on the R, G, B image channels. The modulations in the red, green and blue channels of the measured statue and the back plate were calculated, as shown in Fig. 9. Figure 10 displays a 3D representation of the shape and a pseudo color representation with appropriate lighting based on the surface gradient. The modulation images show good consistency between the color channels as expected for this object. Shadowing due to the angular offset between illumination and viewing can be overcome by incorporating data from multiple views.
Composite fringe projection offers the potential for high speed data acquisition of both shape and color. Using the phase stepping approach described, four image frames are needed. Other workers have reported high speed phase shifting techniques in fringe projection giving potential measurement rates of 100 Hz. Alternatively, Fourier transform phase analysis has the potential to produce multi-frequency phase measurement from a single RGB frame.
In this paper we have shown that combined shape and color information can be obtained in a time efficient manner by using RGB fringe projection and a 3-chip CCD camera in combination with optimum 3-frequency interferometry analysis. The effects of crosstalk between color channels have been quantified, showing that significant levels, up to 30%, are present with the standard filters in the projector and camera. It has been shown that by measuring the crosstalk a simple compensation scheme may be implemented to increase the phase resolution obtained. Further reductions in crosstalk, and correspondingly higher phase resolution, can be achieved by using new dielectric filters in the projector that offer negligible chromatic overlap.
The presence of chromatic aberration in the projector and imaging optics introduces distortions between the images obtained in the three color channels. By suitable implementation of the optimum 3-frequency approach it has been shown that reliable fringe order calculation can be obtained over the majority of the field, with fringe order calculation errors confined to the extremes of the field of view. Further work will be done in this area to compensate for the chromatic distortion.
The authors would like to thank Scottish Enterprise for funding this work through the Proof of Concept scheme, grant reference 5EN-OPT002.
References and links
1. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39, 10–22 (2000). [CrossRef]
2. M. Petrov, A. Talapov, T. Robertson, A. Lebedev, A. Zhilyaev, and L. Polonskiy, “Optical 3D digitizers: bringing life to the virtual world,” IEEE Comput. Graph. Appl. 18, 28–37 (1998). [CrossRef]
3. F. Blais, “Review of 20 years of range sensor development,” J. Electron Imaging 13, 231–240 (2004). [CrossRef]
4. K. Creath, “Phase measurement interferometry techniques,” in Progress in Optics XXVI, E. Wolf, ed. (North Holland Publ., Amsterdam, 1988).
6. X. Y. Su and W. J. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Lasers Eng. 42, 245–261 (2004). [CrossRef]
9. J. M. Huntley and H. O. Saldner, “Error-reduction methods for shape measurement by temporal phase unwrapping,” J. Opt. Soc. Am. A 14, 3188–3196 (1997). [CrossRef]
10. H. O. Saldner and J. M. Huntley, “Shape measurement by temporal phase unwrapping: comparison of unwrapping algorithms,” Meas. Sci. Technol. 8, 986–992 (1997). [CrossRef]
12. D. P. Towers, C. E. Towers, and J. D. C. Jones, “Phase Measuring Method and Apparatus for Multi-Frequency Interferometry,” International Patent Application Number PCT/GB2003/003744.
14. C. E. Towers, D. P. Towers, and J. D. C. Jones, “Absolute fringe order calculation using optimised multi-frequency selection in full-field profilometry,” Opt. Lasers Eng. 43, 788–800 (2005). [CrossRef]
16. P. S. Huang, Q. Y. Hu, F. Jin, and F. P. Chiang, “Color-encoded digital fringe projection technique for high-speed three-dimensional surface contouring,” Opt. Eng. 38, 1065–1071 (1999). [CrossRef]
18. S. Kakunai, T. Sakamoto, and K. Iwata, “Profile measurement taken with liquid-crystal gratings,” Appl. Opt. 38, 2824–2828 (1999). [CrossRef]
20. J. M. Younse, “Mirrors on a chip,” IEEE Spectrum 30, 27–31 (1993). [CrossRef]
21. P. S. Huang, C. P. Zhang, and F. P. Chiang, “High-speed 3-D shape measurement based on digital fringe projection,” Opt. Eng. 42, 163–168 (2003). [CrossRef]