
High dynamic range real-time 3D shape measurement

Open Access

Abstract

This paper proposes a method that can measure high-contrast surfaces in real time without changing camera exposures. We propose to use 180-degree phase-shifted (or inverted) fringe patterns to complement regular fringe patterns. If not all of the regular patterns are saturated, inverted fringe patterns are used in lieu of the original saturated patterns for phase retrieval; if all of the regular fringe patterns are saturated, both the original and inverted fringe patterns are used together for phase computation to reduce phase error. Experimental results demonstrate that three-dimensional (3D) shape measurement can be achieved in real time by adopting the proposed high dynamic range method.

© 2016 Optical Society of America

1. Introduction

Optically measuring a high-contrast 3D surface poses significant challenges to the optical metrology community since it is difficult to achieve high-quality measurement across the whole surface. State-of-the-art methods for high-contrast 3D surface measurement are typically referred to as high-dynamic range (HDR) techniques.

HDR 3D shape measurement techniques can be classified into two categories: 1) using multiple exposures without pre-analysis [1, 2]; and 2) finding optimal exposures through pre-analysis [3–6]. All aforementioned HDR methods coincidentally take advantage of the pixel-by-pixel nature of fringe analysis. Methods in the first category “blindly” capture a sequence of images with different exposures, select the best one for each measurement point, and then combine them into a whole 3D surface. Methods in the second category capture a sequence of images with different exposures, analyze these images to determine the optimal exposure time for measurement, and then perform measurements with the optimal exposures. Though successful, all of these approaches require acquiring many images and changing camera exposures; this poses challenges for high-speed applications, where changing camera exposure in real time is not easy and using many images is not preferable.

In the meantime, some methods have been developed to handle the 3D shape measurement of shiny surfaces, a special kind of high-contrast surface, using: polarizing filters [7, 8], template-based texture analysis [9], and the combination of measurements from different perspectives [10]. These techniques all improve measurement capability yet have drawbacks. Polarizing filters are effective in reducing specular reflections, but they are less effective if the surface is diffuse, since the polarization state is not well preserved; furthermore, such methods substantially sacrifice light intensity due to polarization. The methods that analyze surface texture properties work well for surfaces with rich texture information, yet fail if the surface is uniform. For a shiny surface, measuring from different angles can substantially reduce the shiny areas, yet such a method increases both system complexity and the burden of post-capture data processing, since neither 3D data registration nor fusion is trivial for high-accuracy measurements.

When comparing HDR methods and those methods developed for shiny surface measurement, one may notice that the former require the changing of exposures, while the latter do not. If camera exposures are properly changed, the former could provide very high quality 3D shape measurement for high contrast surfaces. However, it is difficult for such methods to achieve high-speed (e.g. real-time) 3D shape measurements since changing exposures of the system typically cannot be done instantaneously. In contrast, the latter approaches can achieve high speed since no camera exposure changes are necessary.

This paper proposes a method that does not require changing exposures to achieve real-time HDR 3D shape measurement. The fundamental idea is that, besides capturing regular fringe patterns, 180-degree phase-shifted (or inverted) fringe patterns are captured to complement the regular fringe patterns for phase retrieval. If not all of the regular patterns are saturated, the inverted fringe patterns are used in lieu of the original saturated patterns for phase retrieval; if all of the regular fringe patterns are saturated, both the original and inverted fringe patterns are combined for phase determination. Though not as robust as the previously proposed time-consuming HDR methods, the proposed method can substantially increase measurement quality for high-contrast surfaces in real time. To verify the performance of this proposed method, we developed a real-time 3D shape measurement system that uses a three-step phase-shifting algorithm. By projecting three inverted fringe patterns, we will demonstrate that a drastically higher quality 3D shape measurement can be achieved for high-contrast surfaces compared to the conventional method. By projecting and capturing fringe patterns at 160 frames per second (fps), we can achieve a 3D shape measurement speed of approximately 26 fps.

Section 2 discusses the principles behind the proposed method. Section 3 shows simulation results. Section 4 presents experimental validation, and Sec. 5 summarizes this paper.

2. Principle

Phase-shifting algorithms are extensively adopted in 3D optical metrology because of their measurement speed, accuracy, and resolution [11]. Compared to other 3D shape measurement methods, phase-shifting-based methods typically have the advantages of 1) being less sensitive to local surface reflectivity variations, 2) being able to perform 3D shape measurement per camera pixel, and 3) being less sensitive to ambient light. Almost all previously proposed HDR 3D shape measurement methods take full advantage of pixel-wise phase retrieval, which allows measurements to be carried out with different exposures for different pixels. Existing HDR methods therefore reduce exposures (e.g., by changing the exposure time of the camera, the aperture of the lens, the intensity of light, etc.) for saturated pixels. As discussed in Sec. 1, such methods are difficult to adopt in high-speed measurement systems since changing exposures of the system usually cannot be done instantaneously.

This section will explain the details about the proposed method that does not require the change of exposures.

2.1. Three-step phase-shifting algorithm

Numerous phase-shifting algorithms have been developed for different purposes, each differing in the number of required phase-shifted fringe patterns and in the phase shifts between them. The three-step algorithm is the simplest one, requiring the minimum number of fringe patterns for pixel-by-pixel phase calculation, and is thus preferable for high-speed measurement applications. For a three-step phase-shifting algorithm with equal phase shifts, the fringe patterns can be mathematically described as,

I_1(x,y) = I'(x,y) + I''(x,y)\cos[\phi(x,y) - 2\pi/3],   (1)
I_2(x,y) = I'(x,y) + I''(x,y)\cos[\phi(x,y)],   (2)
I_3(x,y) = I'(x,y) + I''(x,y)\cos[\phi(x,y) + 2\pi/3],   (3)
where I′(x, y) is the average intensity, I″(x, y) the intensity modulation, and ϕ(x, y) the phase to be solved for. Simultaneously solving the previous three equations leads to,
\phi(x,y) = \tan^{-1}\left[ \frac{\sqrt{3}\,(I_1 - I_3)}{2I_2 - I_1 - I_3} \right].   (4)
Since an arctangent function is used, the phase value obtained from Eq. (4) ranges from −π to +π with a 2π modulus. To obtain a continuous phase map, a spatial or temporal phase unwrapping algorithm can be used. The unwrapping process essentially determines the locations of the 2π discontinuities and removes the 2π jumps by adding or subtracting integer multiples of 2π. Once the continuous phase is obtained, 3D information can be reconstructed by using either a simple reference-plane-based method or the more complex geometry calibration method discussed by Li et al. [12] for (x, y, z) coordinate computation.
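For concreteness, here is a minimal numpy sketch of Eq. (4) followed by 1-D unwrapping; the function name and the simulated fringe parameters are our own illustrative choices, and np.unwrap plays the role of the unwrapping step described above (it is the 1-D analogue of MATLAB's unwrap used in Sec. 3).

```python
import numpy as np

def three_step_phase(I1, I2, I3):
    """Wrapped phase from three equally shifted fringe images, Eq. (4).

    arctan2 resolves the quadrant, so the result spans (-pi, pi].
    """
    return np.arctan2(np.sqrt(3.0) * (I1 - I3), 2.0 * I2 - I1 - I3)

# 1-D illustration: build ideal fringe cross sections and recover the phase.
x = np.linspace(0.0, 6.0 * np.pi, 2000)   # true continuous phase
Ip, Ipp = 128.0, 100.0                    # I' (average) and I'' (modulation)
I1 = Ip + Ipp * np.cos(x - 2.0 * np.pi / 3.0)
I2 = Ip + Ipp * np.cos(x)
I3 = Ip + Ipp * np.cos(x + 2.0 * np.pi / 3.0)

phi = np.unwrap(three_step_phase(I1, I2, I3))  # removes the 2*pi jumps
```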

2.2. High dynamic range (HDR) method

To properly compute the phase using Eq. (4), none of the three phase-shifted fringe patterns can be saturated (i.e., for an 8-bit camera, I1 < 255, I2 < 255, and I3 < 255). If any of the fringe patterns is saturated at a pixel, the phase obtained from Eq. (4) will carry error that propagates directly into the 3D measurement. To alleviate this problem, one could use multiple exposures, which is time consuming. Instead of using multiple exposures, we propose projecting additional inverted (or 180-degree phase-shifted) fringe patterns to complement the original three phase-shifted fringe patterns. These proposed inverted fringe patterns are mathematically described as,

I_1^{inv}(x,y) = I'(x,y) - I''(x,y)\cos[\phi(x,y) - 2\pi/3],   (5)
I_2^{inv}(x,y) = I'(x,y) - I''(x,y)\cos[\phi(x,y)],   (6)
I_3^{inv}(x,y) = I'(x,y) - I''(x,y)\cos[\phi(x,y) + 2\pi/3].   (7)
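Generating the inverted set amounts to flipping the sign of the cosine term in Eqs. (1)–(3). A sketch for building all six 8-bit projector patterns follows; the resolution and the 36-pixel fringe period echo the hardware of Sec. 4 but are otherwise arbitrary assumptions.

```python
import numpy as np

def fringe_patterns(width=1280, height=800, period=36.0):
    """Three regular (Eqs. (1)-(3)) and three inverted (Eqs. (5)-(7)) patterns.

    With I' = I'' = 127.5 the projected patterns span the full 8-bit range
    without clipping; saturation then occurs at the camera, not here.
    """
    Ip = Ipp = 127.5
    phi = 2.0 * np.pi * np.arange(width) / period        # phase along one row
    shifts = (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0)
    regular = [np.tile(Ip + Ipp * np.cos(phi + d), (height, 1)).astype(np.uint8)
               for d in shifts]
    inverted = [np.tile(Ip - Ipp * np.cos(phi + d), (height, 1)).astype(np.uint8)
                for d in shifts]
    return regular, inverted
```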
If any of the fringe patterns are saturated, instead of using the original three phase-shifted fringe patterns and Eq. (4) for phase computation, the inverted patterns can be used to improve phase quality. Theoretically, there are the following cases (a code sketch of this per-pixel selection is given after the list):
  • Only I1(x, y) is saturated. We replace I1(x, y) with I1^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{-3I_1^{inv}(x,y) + 2I_2(x,y) + I_3(x,y)}{\sqrt{3}\,[I_1^{inv}(x,y) - I_3(x,y)]} \right\}.   (8)
  • Only I2(x, y) is saturated. We replace I2(x, y) with I2^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{I_1(x,y) - I_3(x,y)}{\sqrt{3}\,[I_1(x,y) + I_3(x,y) - 2I_2^{inv}(x,y)]} \right\}.   (9)
  • Only I3(x, y) is saturated. We replace I3(x, y) with I3^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{-I_1(x,y) - 2I_2(x,y) + 3I_3^{inv}(x,y)}{\sqrt{3}\,[I_3^{inv}(x,y) - I_1(x,y)]} \right\}.   (10)
  • Only I1(x, y) and I2(x, y) are saturated. We replace I1(x, y) and I2(x, y) with I1^inv(x, y) and I2^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{I_1^{inv}(x,y) + 2I_2^{inv}(x,y) - 3I_3(x,y)}{\sqrt{3}\,[I_1^{inv}(x,y) - I_3(x,y)]} \right\}.   (11)
  • Only I1(x, y) and I3(x, y) are saturated. We replace I1(x, y) and I3(x, y) with I1^inv(x, y) and I3^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{I_3^{inv}(x,y) - I_1^{inv}(x,y)}{\sqrt{3}\,[2I_2(x,y) - I_1^{inv}(x,y) - I_3^{inv}(x,y)]} \right\}.   (12)
  • Only I2(x, y) and I3(x, y) are saturated. We replace I2(x, y) and I3(x, y) with I2^inv(x, y) and I3^inv(x, y) for phase computation using the following equation:
    \phi(x,y) = \tan^{-1}\left\{ \frac{3I_1(x,y) - 2I_2^{inv}(x,y) - I_3^{inv}(x,y)}{\sqrt{3}\,[I_3^{inv}(x,y) - I_1(x,y)]} \right\}.   (13)
  • I1(x, y), I2(x, y), and I3(x, y) are all saturated. In this case, all regular and inverted patterns are used in a least-squares sense for phase computation to minimize the phase error caused by saturation,
    \phi(x,y) = \tan^{-1}\left\{ \frac{2[B_1(x,y) - B_3(x,y)]}{\sqrt{3}\,[B_2(x,y) - B_1(x,y) - B_3(x,y)]} \right\},   (14)
    where B_1(x,y) = I_1 - I_1^{inv}, B_2(x,y) = I_2 - I_2^{inv}, and B_3(x,y) = I_3 - I_3^{inv}.
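As referenced above, the following numpy sketch implements this per-pixel case selection using Eqs. (4) and (8)–(14). The function name, the variable names J1–J3 for the inverted captures, and the ≥ 255 saturation test are our assumptions; note that each numerator/denominator pair scales sin ϕ and cos ϕ by the same positive factor, so arctan2 stays quadrant-consistent across cases.

```python
import numpy as np

SAT = 255.0  # assumed 8-bit saturation level (test: pixel value >= SAT)

def hdr_phase(I1, I2, I3, J1, J2, J3):
    """Wrapped phase via the saturation-case logic of Eqs. (8)-(14).

    I1..I3 are the regular captures, J1..J3 the inverted ones (float arrays).
    Each pixel uses the formula matching which regular images saturated.
    """
    s1, s2, s3 = I1 >= SAT, I2 >= SAT, I3 >= SAT
    r3 = np.sqrt(3.0)
    B1, B2, B3 = I1 - J1, I2 - J2, I3 - J3
    # (mask, numerator, denominator) for every saturation combination.
    cases = [
        (~s1 & ~s2 & ~s3, r3 * (I1 - I3),       2*I2 - I1 - I3),         # Eq. (4)
        ( s1 & ~s2 & ~s3, -3*J1 + 2*I2 + I3,    r3 * (J1 - I3)),         # Eq. (8)
        (~s1 &  s2 & ~s3, I1 - I3,              r3 * (I1 + I3 - 2*J2)),  # Eq. (9)
        (~s1 & ~s2 &  s3, -I1 - 2*I2 + 3*J3,    r3 * (J3 - I1)),         # Eq. (10)
        ( s1 &  s2 & ~s3, J1 + 2*J2 - 3*I3,     r3 * (J1 - I3)),         # Eq. (11)
        ( s1 & ~s2 &  s3, J3 - J1,              r3 * (2*I2 - J1 - J3)),  # Eq. (12)
        (~s1 &  s2 &  s3, 3*I1 - 2*J2 - J3,     r3 * (J3 - I1)),         # Eq. (13)
        ( s1 &  s2 &  s3, 2.0 * (B1 - B3),      r3 * (B2 - B1 - B3)),    # Eq. (14)
    ]
    num = np.select([m for m, _, _ in cases], [n for _, n, _ in cases])
    den = np.select([m for m, _, _ in cases], [d for _, _, d in cases])
    return np.arctan2(num, den)
```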

3. Simulations

Simulations were carried out to demonstrate the effectiveness of the proposed HDR algorithm. We assumed an 8-bit camera that saturates (clips) at an intensity value of 255. For visualization purposes, only 1-D cross sections were used. Figure 1(a) shows three phase-shifted fringe patterns that are clearly saturated. We adopted the three-step phase-shifting algorithm to compute the wrapped phase and then unwrapped the phase using MATLAB's unwrap function. Figure 1(b) shows the unwrapped phase, and Fig. 1(c) shows the phase error obtained by subtracting the ideal phase from the unwrapped phase. The phase root-mean-square (rms) error is approximately 0.30 rad. We then generated the inverted phase-shifted fringe patterns and computed the phase error, as shown in Figs. 1(d)–1(f). Clearly, the inverted fringe patterns are also saturated, and thus the recovered phase also has substantial error.


Fig. 1 Example of regular and inverted fringe patterns when the patterns are saturated. (a) Three phase-shifted fringe patterns; (b) unwrapped phase obtained from fringe patterns in (a); (c) phase error (rms 0.30 rad); (d) three inverted fringe patterns; (e) unwrapped phase obtained from fringe patterns in (d); (f) phase error (rms 0.30 rad).


We then employed the proposed HDR algorithm, combining these two sets of fringe patterns to compute the phase. Figure 2(a) shows the recovered phase, and Fig. 2(b) shows the phase error. The phase rms error is approximately zero (3.1 × 10⁻¹⁶ rad, i.e., at numerical precision). Clearly, if not all three patterns are saturated for a given point, the phase is accurately calculated.


Fig. 2 Recovered phase using the proposed HDR algorithm and the patterns shown in Fig. 1. (a) Unwrapped phase; (b) phase error (rms 3.1 × 10⁻¹⁶ rad).


To further test the robustness of the algorithm, we further saturated the fringe patterns to ensure that there were areas where both the regular and inverted patterns were saturated, as illustrated by the windowed areas in Fig. 3(a). Figure 3(b) shows the phase error if a conventional three-step phase-shifting algorithm is used; the phase error is very large (rms 0.43 rad). Figure 3(c) shows the phase error from our proposed HDR method. If both the inverted and regular fringe patterns are saturated for a given point, the phase error is not zero. However, this figure clearly demonstrates that even in such a case, the phase error from the HDR algorithm remains quite small, approximately 0.03 rad rms, which is more than 14 times smaller.


Fig. 3 Simulation results when both inverted and regular fringe patterns are saturated for certain pixels. (a) Example regular and inverted fringe patterns. The red windows highlight points where both the regular and inverted fringe patterns are saturated; (b) phase error using the conventional phase-shifting algorithm (rms 0.43 rad); (c) phase error using the proposed HDR algorithm (rms 0.03 rad).

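To reproduce this style of experiment, one can clip ideal fringes at the 8-bit limit and compare rms errors, reusing three_step_phase and hdr_phase from the sketches above. The overexposure level chosen here (I′ = I″ = 175) is an assumption that produces single- and double-pattern saturation but, unlike Fig. 3, never saturates all patterns at one pixel, so the HDR error should be near zero; exact numbers depend on these choices.

```python
import numpy as np

x = np.linspace(0.0, 6.0 * np.pi, 2000)          # true phase
shifts = (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0)
# Overexposed fringes: ideal peaks reach 350, so an 8-bit camera clips at 255.
I = [np.clip(175.0 + 175.0 * np.cos(x + d), 0.0, 255.0) for d in shifts]
J = [np.clip(175.0 - 175.0 * np.cos(x + d), 0.0, 255.0) for d in shifts]

conv = np.unwrap(three_step_phase(*I))           # conventional, Eq. (4)
hdr = np.unwrap(hdr_phase(*I, *J))               # proposed case selection

rms = lambda e: float(np.sqrt(np.mean((e - e.mean()) ** 2)))  # ignore DC offset
print("conventional rms phase error [rad]:", rms(conv - x))
print("HDR rms phase error [rad]:", rms(hdr - x))
```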

4. Experiments

We also verified the performance of the proposed HDR method on a hardware system that includes a digital-light-processing (DLP) projector (Model: Dell M115HD) and a charge-coupled device (CCD) camera (Model: The Imaging Source DMK 21U618). The projector has a resolution of 1280 × 800, and the camera resolution is 640 × 480. With this system, we used a temporal phase unwrapping method that combines phase shifting with binary coding [13] for complex 3D shape measurement. The binary coded patterns are used to uniquely determine the integer multiple of 2π for each pixel so that a continuous phase can be obtained from the phase-shifted patterns. We adopted the complex geometry calibration method discussed by Li et al. [12] to convert absolute phase to (x, y, z) coordinates.
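A short sketch of that temporal unwrapping step follows; a plain binary codeword is assumed for illustration (the actual pattern design follows Ref. [13], and a Gray code would decode differently).

```python
import numpy as np

def decode_fringe_order(bits):
    """Thresholded binary-coded images (boolean arrays, MSB first) -> k(x, y)."""
    k = np.zeros(bits[0].shape, dtype=np.int64)
    for b in bits:
        k = (k << 1) | b.astype(np.int64)  # shift in one code bit per pattern
    return k

def absolute_phase(phi_wrapped, k):
    """Temporal unwrapping: offset the wrapped phase by the fringe order."""
    return phi_wrapped + 2.0 * np.pi * k
```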

We first measured a flat white board to experimentally verify the proposed HDR method; Fig. 4 shows the measurement results. Figure 4(a) shows a segment of the cross section of representative fringe patterns. Apparently, these patterns are partially saturated. Figure 4(b) shows the phase error if a conventional three-step phase-shifting algorithm was used, and Fig. 4(c) shows the corresponding phase error when the proposed HDR method was employed. For this example, no pixel saturated in the regular fringe patterns was also saturated in the inverted fringe patterns, and thus accurate phase was obtained. The phase rms error is reduced from approximately 0.29 rad to 0.02 rad, a 14.5 times reduction.


Fig. 4 Measurement results of a flat white board. (a) Cross section of one regular and one inverted fringe pattern when, at each pixel, at least one of the two is unsaturated; (b) phase error (rms 0.29 rad) using the conventional three-step phase-shifting algorithm; (c) phase error using our HDR algorithm (rms 0.02 rad); (d) cross section of one regular and one inverted fringe pattern when both are saturated for some pixels; (e) phase error (rms 0.52 rad) using the conventional three-step phase-shifting algorithm; (f) phase error using our HDR algorithm (rms 0.08 rad).


We also experimented with the case where some pixels in both the regular and inverted fringe patterns were saturated. Figure 4(d) shows the fringe patterns; apparently, some pixels are saturated in both the regular and inverted patterns. Figures 4(e) and 4(f) respectively show the phase error using the regular phase-shifting algorithm and the HDR algorithm. The phase rms error is again substantially reduced: from 0.52 rad to 0.08 rad.

A more complex 3D object was tested to demonstrate the HDR capability. Figure 5(a) shows a photograph of the measured object: the surface color and contrast vary drastically from one area to another. If a conventional method is used and one hopes to measure the entire area properly, no point can be saturated. The conventional single-exposure method can therefore only properly measure certain parts of the statue, leaving the rest with low measurement quality. Figure 5(b) shows one of the fringe patterns with an exposure that ensures no saturation occurs within the image (i.e., exposure set properly for the bright part). Figure 5(e) shows the 3D result if such an exposure is used. The bright shirt area is properly measured, but the dark face area has a very large amount of noise. In contrast, if one ensures that the dark area is properly exposed, the face area can be properly measured. However, the bright shirt area then has a large measurement error, as shown in Fig. 5(f).


Fig. 5 Experimental results of a high-contrast object with different degrees of saturation (fringe period is 36 pixels). (a) Photograph of the measured object; (b) one of the regular, low-exposure phase-shifted patterns; (c) one of the inverted phase-shifted patterns; (d) one of the combined fringe patterns for the HDR algorithm; (e) 3D reconstruction by conventional methods from the low-exposure pattern (b); (f) 3D reconstruction by conventional methods from high-exposure regular patterns; (g) 3D reconstruction by conventional methods from high-exposure inverted patterns (c); (h) 3D reconstruction by the proposed HDR algorithm from high-exposure patterns (d).


We then captured another set of inverted fringe patterns with the same high exposure, shown in Fig. 5(c). Apparently, if only these inverted patterns are used for 3D reconstruction, large measurement error is still present within the bright area, as shown in Fig. 5(g). However, if we combine all of these regular and inverted high-exposure fringe patterns and apply our HDR algorithm, we can recover the high-quality 3D shape shown in Fig. 5(h).

To better visualize the differences among these algorithms, we generated close-up views of the recovered 3D images. Figure 6(a) shows the dense, dark-colored texture on the object's face area. Figure 6(b) shows the close-up view of the result shown in Fig. 5(e), where low exposure is used to ensure no saturation. Clearly, large noise is present in this dark area. Figures 6(c) and 6(d) respectively show the 3D reconstructions using the regular and inverted fringe patterns at high exposure. They are clearly much better than the low-exposure result, but have problems measuring the bright areas. Figure 6(e) shows the HDR result, which is also of high quality. One may notice lower random noise on the HDR result than on the results using either the regular or inverted patterns alone. This is a result of averaging, since we used all of the regular and inverted images and a least-squares algorithm to compute the phase. To fairly compare the HDR result to the result using lower exposure, we also generated the 3D result using the HDR algorithm at a lower exposure, as shown in Fig. 6(f). The HDR algorithm can indeed reduce random noise even for the lower exposure, as expected. These experimental results clearly demonstrate that the proposed single-exposure HDR method has a greater capability than the conventional single-exposure method to accurately measure objects with high-contrast surfaces: random noise is lower and resolution higher.


Fig. 6 Zoomed-in results of the head part from Fig. 5. (a) Photograph of the zoomed-in area; (b) 3D reconstruction by regular patterns at low exposure; (c) 3D reconstruction by regular patterns at high exposure; (d) 3D reconstruction by inverted patterns at high exposure; (e) 3D reconstruction by the HDR algorithm at high exposure; (f) 3D reconstruction by the HDR algorithm at low exposure.


To demonstrate the real-time capability, we then implemented the proposed HDR algorithm on a real-time 3D shape measurement system that uses a high-speed DLP projector (LightCrafter 4500) and a high-speed CCD camera (Model: JAI TM-6740CL). The projector resolution is 912 × 1140, and the camera resolution is 640 × 480. The projector was configured to project fringe patterns at 160 fps, and the camera was precisely synchronized with the projector to capture images at 160 fps. Since six patterns are required to recover one 3D geometry, the achieved 3D shape measurement speed was approximately 26 fps (160/6 ≈ 26.7), i.e., real time. For this system, we adopted the spatial phase unwrapping method [14] and a simple reference-plane-based method [15] to convert phase to coordinates.

We captured a dynamically changing human face, since the geometry is complex and the contrast is high (i.e., some areas are dark and some brighter). Figure 7, as well as Visualizations 1–4, shows the measurement results. Clearly, without the proposed HDR method, the measured surface is not smooth because the fringe patterns are partially saturated (forehead area) in order to measure low-reflectivity areas (hair). After adopting the proposed method, the overall surface is very smooth, as expected, with high quality across the whole image. One may also notice in Figs. 7(a) and 7(b) that the spatial phase unwrapping algorithm fails to properly unwrap the phase on the hair, within the top-right part of the figures. The same problems are less severe when using the HDR method, which might be a result of the lower noise of the HDR phase. These experimental results successfully demonstrate that our proposed method can indeed be employed within a real-time 3D shape measurement system for high-contrast surface measurements.


Fig. 7 Real-time 3D shape measurement using the proposed HDR algorithm (associated Visualizations 1–4). (a) 3D reconstruction by the conventional single-exposure method from regular patterns; (b) 3D reconstruction by the conventional single-exposure method from inverted patterns; (c) 3D reconstruction by the proposed HDR method; (d) close-up view of (a); (e) close-up view of (b); (f) close-up view of (c).


5. Conclusions

This paper has presented a high-dynamic-range 3D shape measurement method using digital fringe projection that does not require changing camera exposures. The proposed method uses 180-degree phase-shifted (or inverted) fringe patterns to complement the regular fringe patterns. If not all of the regular patterns are saturated, the inverted fringe patterns are used in lieu of the original saturated patterns for phase retrieval; if all of the regular fringe patterns are saturated, both the original and inverted fringe patterns are used together for phase computation to reduce phase error. Both simulation and experimental results demonstrate that the measurement error can be substantially reduced even for highly saturated patterns. Since only a single exposure is needed, we have demonstrated that this method can be adopted within real-time applications, although it requires six fringe patterns and thus reduces measurement speed by a factor of two.

Acknowledgments

This study was sponsored by the National Science Foundation (NSF) Directorate for Engineering (100000084) under grant number CMMI-1523048. The views expressed in this paper are those of the authors and not necessarily those of the NSF.

References and links

1. S. Zhang and S.-T. Yau, “High dynamic range scanning technique,” Opt. Eng. 48, 033604 (2009).

2. C. Waddington and J. Kofman, “Analysis of measurement sensitivity to illuminance and fringe-pattern gray levels for fringe-pattern projection adaptive to ambient lighting,” Opt. Laser Eng. 48, 251–256 (2010).

3. H. Jiang, H. Zhao, and X. Li, “High dynamic range fringe acquisition: A novel 3-D scanning technique for high-reflective surfaces,” Opt. Laser Eng. 50, 1484–1493 (2012).

4. H. Zhao, X. Liang, X. Diao, and H. Jiang, “Rapid in-situ 3D measurement of shiny object based on fast and high dynamic range digital fringe projector,” Opt. Laser Eng. 54, 170–174 (2014).

5. L. Ekstrand and S. Zhang, “Auto-exposure for three-dimensional shape measurement with a digital-light-processing projector,” Opt. Eng. 50, 123603 (2011).

6. L. Ekstrand and S. Zhang, “Automated high-dynamic range three-dimensional optical metrology technique,” in Proceedings of the ASME 2014 International Manufacturing Science and Engineering Conference (2014), paper V001T05A005 (MSEC2014-4101).

7. Y. Yamaguchi, H. Miyake, O. Nishikawa, and T. Iyoda, “Shape measurement of glossy objects by range finder with polarization optical system,” Gazo Denshi Gakkai Kenkyukai Koen Yoko (in Japanese) 200, 43–50 (2003).

8. B. Salahieh, Z. Chen, J. J. Rodriguez, and R. Liang, “Multi-polarization fringe projection imaging for high dynamic range objects,” Opt. Express 22, 10064–10071 (2014).

9. R. Kokku and G. Brooksby, “Improving 3D surface measurement accuracy on metallic surfaces,” Proc. SPIE 5856, 618–624 (2005).

10. Q. Hu, K. G. Harding, X. Du, and D. Hamilton, “Shiny parts measurement using color separation,” Proc. SPIE 6000, 6000D1 (2005).

11. D. Malacara, ed., Optical Shop Testing, 3rd ed. (John Wiley and Sons, 2007).

12. B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured light system with an out-of-focus projector,” Appl. Opt. 53, 3415–3426 (2014).

13. S. Zhang, “Flexible 3D shape measurement using projector defocusing: Extended measurement range,” Opt. Lett. 35, 931–933 (2010).

14. S. Zhang, X. Li, and S.-T. Yau, “Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction,” Appl. Opt. 46, 50–57 (2007).

15. Y. Xu, L. Ekstrand, J. Dai, and S. Zhang, “Phase error compensation for three-dimensional shape measurement with projector defocusing,” Appl. Opt. 50, 2572–2581 (2011).

Supplementary Material (4)

Visualization 1: MP4 (5730 KB)
Visualization 2: MP4 (5752 KB)
Visualization 3: MP4 (5315 KB)
Visualization 4: MP4 (2064 KB)
