Abstract

Traditional fringe-projection three-dimensional (3D) imaging techniques struggle to estimate the shape of high dynamic range (HDR) objects where detected fringes are of limited visibility. Moreover, saturated regions of specular reflections can completely block any fringe patterns, leading to lost depth information. We propose a multi-polarization fringe projection (MPFP) imaging technique that eliminates saturated points and enhances the fringe contrast by selecting the proper polarized channel measurements. The developed technique can be easily extended to include measurements captured under different exposure times to obtain more accurate shape rendering for very HDR objects.

© 2014 Optical Society of America

1. Introduction

Shape acquisition of three-dimensional (3D) objects is of significant importance for various real-world applications, including machine vision, reverse engineering, industrial inspection, and medical imaging. An economical, reliable, real-time technique that delivers such information is fringe projection imaging [1–3]. The imaging system comprises a projector-camera pair in which successive phase-shifted fringe patterns are projected onto objects, become distorted, and are then captured by the camera. These captured distorted fringes carry valuable information about the object’s depth, which can be retrieved through phase-shifting algorithms [1, 2].

However, conventional fringe projection imagers fail to recover depth data from objects of high dynamic range (HDR) where fringe visibility is greatly reduced in dark regions, bright areas, or over surfaces of large reflectivity variations. For instance, shiny metal objects reflect illuminating light specularly and saturate the camera without carrying any depth content.

Researchers have tackled these challenges through various approaches, with limited success. Employing polarizers to filter the projected and captured fringe images results in a reconstruction trade-off between complementary reflecting surfaces. A crossed polarizer-analyzer pair [4] eliminates shiny areas, but at the cost of leaving the dark zones unresolved. Conversely, a parallel polarizer-analyzer alignment [5, 6] can maintain good fringe quality in the dark regions, but not in the bright ones.

A different approach that better suits HDR cases without such a trade-off is to take multiple shots of fringe images at various exposures [7] or to automatically adapt the exposure times to fit the scene [8]. In these schemes, areas of dark intensities are picked from the long-exposure patterns while regions of bright appearance are chosen from the short-exposure ones. An alternative technique to avoid saturation and maintain good fringe quality is to adaptively adjust the projected fringe pattern intensities and then combine the captured fringes [9]. A more developed fringe acquisition approach [10] combines different camera exposures with various fringe projection intensities to guarantee good depth recovery. However, taking multiple shots or adaptively adjusting the exposure times, the projected fringe intensities, or both may not permit fast capture of dynamic scenes, limiting these techniques to slow or static scenes.

In this paper, we contribute a single-shot multi-polarization fringe projection (MPFP) algorithm that combines the advantages of most previous solutions, allowing broader applications. Unlike prior techniques, the novelty of our approach is the use of snapshot multi-polarization measurements to process HDR dynamic scenes. Additionally, the MPFP algorithm can easily exploit combined-exposure measurements if further enhancement is desired.

2. Multi-polarization fringe projection imaging algorithm

In the MPFP imaging system, shown in Fig. 1, the projected fringes are linearly polarized prior to incidence on the object and are captured after reflection by a multi-polarization camera. The employed camera has a pixelated polarizer array of four states (P0°, P45°, P90°, and P135°) attached to the sensor [11, 12]. Upon reflection, different object surfaces modulate the polarized fringes differently, leading to dissimilar measurements in the multi-polarized channels.

Fig. 1 Multi-polarization fringe projection (MPFP) imaging system.

The imaging equation of the proposed MPFP system can be mathematically described by

g_{p,k} = H_p f I_k + n_{p,k},
where the subscripts p and k represent the polarization and fringe indices, respectively; g, f, I, and n are the measurement, object, fringe, and noise vectors, respectively; and H is the impulse response projection matrix, which depends on the captured polarization state.

The aim of this imaging technique is to render the shape of HDR objects f from the captured multi-polarization distorted fringes g_{p,k} through the six steps presented in Fig. 2.
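To make the roles of the terms in the imaging equation concrete, the following is a minimal NumPy simulation of the forward model. The image sizes, per-channel gains (a scalar stand-in for H_p), noise level, and saturating quantization are illustrative assumptions, not the paper's calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, B = 5, 4, 8                      # fringes, polarization channels, bit depth
H, W = 64, 64                          # image size (illustrative)
x = np.arange(W)

# N phase-shifted sinusoidal fringe patterns I_k (unit amplitude)
I = np.stack([0.5 + 0.5 * np.sin(2 * np.pi * x / 16 + 2 * np.pi * k / N)
              for k in range(N)])                       # (N, W)
I = np.broadcast_to(I[:, None, :], (N, H, W))           # replicate over rows

f = rng.uniform(0.1, 2.0, (H, W))      # object reflectance map (HDR-like spread)

# Stand-in for H_p: a scalar gain per polarization channel
gain = np.array([0.4, 1.0, 0.7, 0.9])

g = gain[:, None, None, None] * f * I                   # (M, N, H, W)
g += rng.normal(0, 0.01, g.shape)                       # additive noise n_{p,k}
# Saturating B-bit quantization: bright pixels clip at 2^B - 1
g = np.clip(np.round(g * (2**B - 1) / g.max() * 1.5), 0, 2**B - 1)

print(g.shape, g.max())
```

The deliberate over-scaling before clipping produces saturated pixels, mimicking the specular regions the algorithm must handle.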

Fig. 2 Steps of the multi-polarization fringe projection technique for HDR objects; vectors denote image-level operations while scalars denote pixel-level operations.

The algorithm starts by extracting M raw polarized images out of the B-bit sensor measurements for each of the N sinusoidal fringe patterns. Next, steps 2 to 4 eliminate saturation and improve the fringes’ quality by finding the best polarization channel for each pixel across all fringe images. For each pixel of the kth fringe image, the index of the maximum polarized channel is identified as p_max = argmax_p{g_{p,k} for p = 1, …, M}. If g_{p_max,k} = 2^B − 1, then the maximum polarized channel is saturated at that pixel of fringe k. In this case, the saturated channel is replaced by the next-largest channel p′ across all fringe images, g_{p_max,k} = g_{p′,k} for all k = 1, …, N (repeated if multiple channels are saturated, as is common in specular reflection regions). If all channels are saturated, the algorithm may simply pick any channel, and the fringes’ distortions cannot be restored (here it is recommended to recapture the fringe images with a shorter exposure time to avoid any loss). Once saturation has been eliminated, a maximization decision map of selected channel indices is found, p* = argmax_k{g_{p_max,k} where k = 1, …, N}; this map identifies the maximum non-saturated channel for each pixel. The decision map is then used to merge the M modified polarized images of each fringe pattern into a single high-contrast image, g_k = g_{p*,k}, yielding a total of N enhanced fringe images. It is important to note that when the algorithm picks a certain polarization channel at a particular pixel, that channel is used at the corresponding pixel in all other fringes to maintain the sinusoidal modulation needed for accurate depth estimation.
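Steps 2 to 4 can be sketched in NumPy as follows. The function and array names are illustrative, and the iterative replacement of saturated channels is collapsed here into a direct selection: per pixel, the chosen channel is the largest one that stays unsaturated in every fringe, which yields the same merged result under the stated assumptions:

```python
import numpy as np

def merge_polarized_fringes(g, B=8):
    """Merge M polarized fringe stacks into N high-contrast fringe images.

    g : integer array of shape (M, N, H, W): M polarization channels,
        N phase-shifted fringe images. A sketch of steps 2-4; names
    and data layout are illustrative, not from the paper.
    """
    M, N, H, W = g.shape
    sat = 2**B - 1
    # A channel is usable at a pixel only if unsaturated in every fringe
    unsaturated = (g < sat).all(axis=1)            # (M, H, W)
    # Score each channel by its peak value across the N fringes
    score = g.max(axis=1).astype(float)            # (M, H, W)
    score[~unsaturated] = -1                       # exclude saturated channels
    # Decision map: brightest unsaturated channel per pixel. If every
    # channel is saturated, argmax falls back to channel 0 ("pick any"),
    # matching the paper's fallback for fully saturated pixels.
    p_star = score.argmax(axis=0)                  # (H, W)
    # Apply the same chosen channel at each pixel across all N fringes,
    # preserving the sinusoidal modulation needed for phase retrieval
    merged = np.take_along_axis(g, p_star[None, None], axis=0)[0]  # (N, H, W)
    return merged, p_star
```

The same-channel-for-all-fringes constraint is enforced by indexing with a single (H, W) decision map rather than choosing per fringe.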

Afterward, the phase ϕ can be retrieved from the N merged distorted fringes through a phase shifting algorithm given by the following arctan equation [5]:

ϕ = tan^{-1}( [Σ_{k=1}^{N} g_k sin(2πk/N)] / [Σ_{k=1}^{N} g_k cos(2πk/N)] ).
The final step is converting the phase information into depths that are calculated with respect to a reference plane determined at the calibration stage. Once depths are calculated, the object shape can be rendered in a 3D coordinate system.
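The phase-retrieval equation maps directly onto a few NumPy reductions. In this sketch the function name is illustrative, and np.arctan2 is used in place of a plain arctangent so the quadrant of the wrapped phase is resolved from the signs of numerator and denominator:

```python
import numpy as np

def retrieve_phase(merged):
    """Wrapped phase from N merged phase-shifted fringes.

    merged : array of shape (N, H, W), as produced by the merging step.
    Returns the per-pixel phase wrapped to (-pi, pi].
    """
    N = merged.shape[0]
    k = np.arange(1, N + 1).reshape(-1, 1, 1)      # fringe index, broadcastable
    num = (merged * np.sin(2 * np.pi * k / N)).sum(axis=0)
    den = (merged * np.cos(2 * np.pi * k / N)).sum(axis=0)
    return np.arctan2(num, den)
```

For fringes of the form g_k = A + B cos(ϕ − 2πk/N) summed over a full period, the DC term A cancels and the ratio reduces to tan ϕ, so the formula recovers the encoded phase exactly.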

3. Evaluation of multi-polarization fringe projection algorithm with various objects

To validate the performance of the MPFP technique, we tested it in three different scenarios using N = 5 successive, equally phase-shifted sinusoidal fringe patterns, M = 4 polarization channels, and a PolarCam micropolarizer camera [13] with B = 8 bit sensor resolution and a pixelated sensor array of 1208 × 1608 pixels. The sinusoidal patterns are projected by a digital light projection (DLP 3000) device with 768 × 1024 micro-mirrors. The experiments were carried out under normal room lighting conditions, with the fringe illumination strength and image acquisition time set to obtain good fringe contrast on the targeted objects.

3.1 Simple object with three different surfaces

A simple custom-made HDR object, shown in Fig. 3(a), is formed with three different material surfaces made of a black tape, a white tape, and a metal surface.

Fig. 3 Single-polarization fringe projection imaging of simple object. (a) Simple three-surface object captured by unpolarized camera. (b) Raw polarized data of first distorted fringes. (c) Fringe contrast of various polarization channels at two cross-sections of first distorted fringes (black-white-black tapes on left and metal surface on right). (d) Shape rendering of five fringe images at separate polarizations.

The four captured polarization fringe images carry different reflectance content, which is presented in Fig. 3(b) for the first fringe pattern. To gain more insight into the fringe contrast of the raw images, two cross-sections have been taken across the black-white-black tapes and the metal surface; the locations and profiles are illustrated in Figs. 3(b) and 3(c), respectively. The fringe profiles elucidate the dynamic range of each surface and the differences between the polarization channels, especially in the saturated regions.

If utilized separately, the distorted fringes belonging to the same polarization channel may not be sufficient to reconstruct a complete object’s shape, as shown by the rendered images in Fig. 3(d). Note that the specular reflections at the P0°,P90° and P135° channels prevent any depth estimation since fringes are absent; however, this is not the case in the P45° channel.

When all polarizations are evaluated together, the proposed algorithm attempts to find the best fringe visibility existing in them. Figure 4(a) shows a color-coded maximization decision map that delivers contrast-enhanced fringes after the merging process, presented in Fig. 4(b). As seen from the colored decision map and the MPFP cross-section profiles plotted in Fig. 3(c), the technique succeeded in assigning the unsaturated polarization intensity (P45°) to the bright metal surface while attaching the relatively high polarization intensity (P135°) to the black tape region. Figure 4(c) shows the phase image, which was calculated from the five enhanced fringes and used to find the depths of the targeted object. The whole 3D shape is successfully produced, as shown in Fig. 4(d), demonstrating the performance improvement of the MPFP technique over traditional fringe projection, which uses independent polarization channels, as shown in Fig. 3(d).

Fig. 4 Multi-polarization fringe projection imaging of simple object. (a) Multi-polarization decision map. (b) Merging results of first fringe images. (c) Phase retrieval. (d) Shape rendering of five enhanced fringes.

The decision map is the result of a pixel-by-pixel maximization procedure after eliminating saturation. Hence, in regions where two or more polarizers behave similarly (e.g., diffusing surfaces), the decision may alternate between neighboring pixels, leading to a noise-like appearance in the decision map with negligible impact on the depth estimation; see the white tape area at the top-center of Fig. 4(a), where the decision alternates between P45°, P90°, and P135°. Such a noisy appearance can be removed by making decisions for blocks of pixels instead of individual pixels; however, this may limit the depth resolution.

We also note that the dynamic range of the merged fringe images is extended when using four polarization channels as compared with two perpendicular ones. However, the benefit of adding further polarization channels (more degrees of freedom) diminishes, since channels at adjacent polarization angles show insignificant visual differences. Moreover, such a setup would either sacrifice spatial resolution or require sequential measurements to account for all employed polarizers.

3.2 Circuit board of various intensities

We further tested the MPFP algorithm by imaging a circuit board, Fig. 5(a), as an example of real-world objects that contain a variety of intensity levels. Again, the MPFP decision map, Fig. 5(b), successfully assigned the high reflectivity components (metal ports and capacitor tops) to the unsaturated polarization channels and most of the dark background to the brightest channel (P135°), producing high-contrast fringe images, Fig. 5(c). The rendered shape derived from the enhanced fringe images is presented in Fig. 5(d). The false shadows surrounding some components (e.g., capacitors) are not a shortcoming of the MPFP algorithm, but are produced by the components’ shadows in the original image, which blocked the projected fringes from reaching these surfaces. A little roughness may still appear on some surfaces where saturation dominates all polarization channels (e.g., bottom metal plugs in Fig. 5(d)). This occurs due to the upper truncation of sinusoidal fringes which distorts the phase calculations. Finally, the depth resolution can be further improved using a higher resolution camera lens and a denser sensor array.

Fig. 5 Multi-polarization fringe projection imaging of circuit board object. (a) Circuit board captured by unpolarized camera. (b) Decision map. (c) Merging results shown for first fringe. (d) Shape rendering.

3.3 Object at different exposures

A further challenge occurs when the object has a very high dynamic range or is unequally illuminated, so that the rendered object shape is only partially visible even when incorporating all polarization channels acquired under the same exposure. Here, the MPFP algorithm can easily be extended to include sufficient measurements captured under various exposure times and through different polarization channels. The algorithm obtains a more complete view of the object by selecting the exposure-polarization pairs that yield the maximum unsaturated pixels.

As an example, we imaged scissors, Fig. 6(a), using four polarization angles and two different exposure times. The left-side images in Figs. 6(b) and 6(c) show that the short exposure results in a poor decision map and imperfect rendering at the scissors’ handles, whereas the middle images in Figs. 6(b) and 6(c) reveal that the long exposure produces holes on the metal surfaces due to the limited decision options. When using measurements from both exposure times, MPFP selects the exposure-polarization pair that yields the maximum unsaturated value at each pixel (see Fig. 6(b), right), leading to a more complete 3D shape rendering (see Fig. 6(c), right). This illustrates the combined polarization/exposure capability of MPFP in cases of very high dynamic range or non-uniform illumination.
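This exposure extension amounts to treating each (exposure, polarization) pair as one more candidate channel in the per-pixel selection. A hedged sketch, assuming the measurements are stacked as an (E, M, N, H, W) array, with names chosen for illustration:

```python
import numpy as np

def select_exposure_polarization(g, B=8):
    """Pick, per pixel, the (exposure, polarization) pair with the largest
    response that stays unsaturated across all N fringes.

    g : integer array of shape (E, M, N, H, W): E exposure times,
        M polarization channels, N fringes. Layout is an assumption.
    """
    E, M, N, H, W = g.shape
    flat = g.reshape(E * M, N, H, W)               # treat pairs as channels
    sat = 2**B - 1
    usable = (flat < sat).all(axis=1)              # unsaturated in every fringe
    score = flat.max(axis=1).astype(float)
    score[~usable] = -1                            # exclude saturated pairs
    idx = score.argmax(axis=0)                     # (H, W), flattened pair index
    merged = np.take_along_axis(flat, idx[None, None], axis=0)[0]  # (N, H, W)
    e_map, p_map = np.divmod(idx, M)               # recover exposure / channel
    return merged, e_map, p_map
```

Reshaping the exposure and polarization axes into one candidate axis lets the single-exposure selection logic apply unchanged; divmod then splits the winning index back into its exposure and polarization components for the decision maps of Fig. 6(b).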

Fig. 6 Multi-polarization fringe projection imaging of an object under different exposures. (a) Scissors captured by unpolarized camera. (b) Decision map. (c) Shape rendering. The images shown in (b) and (c) are sorted left to right according to the utilized exposure time.

4. Conclusion

In summary, we propose a multi-polarization fringe projection imaging technique capable of delivering complete depth estimation of HDR objects. The algorithm eliminates saturated or low-contrast fringe regions by selecting different polarization measurements, or the right combination of polarization angle and exposure time, in order to maintain good fringe visibility. This leads to greater coverage of the object in the shape rendering, better measurement of object topography, and thus a more accurate rendering of the object shape.

References and links

1. S. Zhang, High-Resolution, Real-Time 3-D Shape Measurement, Ph.D. dissertation, Dept. of Mechanical Engineering (Stony Brook University, Stony Brook, NY, 2005).

2. N. Karpinsky and S. Zhang, “High-resolution, real-time 3D imaging with fringe analysis,” J. Real-Time Image Process. 7(1), 55–66 (2012). [CrossRef]  

3. X. Su and Q. Zhang, “Dynamic 3-D shape measurement method: A review,” J. Opt. Lasers Eng. 48(2), 191–204 (2010). [CrossRef]  

4. L. Wolff, “Using polarization to separate reflection components, ” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (Institute of Electrical and Electronics Engineers, 1989), pp. 363–369. [CrossRef]  

5. T. Chen, H. P. A. Lensch, C. Fuchs, and H. P. Seidel, “Polarization and phase-shifting for 3D scanning of translucent objects,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (Institute of Electrical and Electronics Engineers, 2007), pp. 1–8. [CrossRef]  

6. R. Liang, “Short wavelength and polarized phase shifting fringe projection imaging of translucent objects,” J. Opt. Eng. 53(1), 014104 (2014). [CrossRef]  

7. S. Zhang and S. T. Yau, “High dynamic range scanning technique,” J. Opt. Eng. 48(3), 033604 (2009). [CrossRef]  

8. L. Ekstrand and S. Zhang, “Autoexposure for three-dimensional shape measurement using a digital-light-processing projector,” J. Opt. Eng. 50(12), 123603 (2011). [CrossRef]  

9. C. Waddington and J. Kofman, “Saturation avoidance by adaptive fringe projection in phase-shifting 3D surface-shape measurement,” in Proceedings of Intl. Symp. on Optomechatronic Technologies, (Institute of Electrical and Electronics Engineers, 2010), pp. 1–4. [CrossRef]  

10. H. Jiang, H. Zhao, and X. Li, “High dynamic range fringe acquisition: A novel 3-D scanning technique for high-reflective surfaces,” J. Opt. Lasers Eng. 50(10), 1484–1493 (2012). [CrossRef]  

11. N. J. Brock, B. T. Kimbrough, and J. E. Millerd, “A pixelated micropolarizer-based camera for instantaneous interferometric measurements,” Proc. SPIE 8160, 81600W (2011). [CrossRef]  

12. T. Kiire, S. Nakadate, M. Shibuya, and T. Yatagai, “Three-dimensional displacement measurement for diffuse object using phase-shifting digital holography with polarization imaging camera,” Appl. Opt. 50(34), H189–H194 (2011). [CrossRef]   [PubMed]  

13. 4D Technology Corporation, PolarCam Polarization Camera, http://www.4dtechnology.com, accessed Mar. 2014.
