Abstract

In imaging systems, when specular surfaces that respond sensitively to varying illumination are imaged onto groups of CCD pixels, the resulting image often suffers from pixel saturation, producing smearing or blooming. These artifacts are serious obstacles when applying structured-light-based optical profiling methods to the shape measurement of general objects with partially specular surfaces. This paper therefore combines a phase-based profiling system with an adaptive spatial light modulator in the imaging path to measure the three-dimensional shapes of such objects with an extended dynamic range. Placing a spatial light modulator in front of the CCD camera prevents the image sensor from saturating, as the pixel transmittance is controlled over time and space by monitoring the input images and feeding back modulator control signals. Since the projected fringes are then imaged on the CCD without pixel saturation, phase information corresponding to the object's shape can be correctly extracted from the non-saturated images. The configuration of the proposed system and the transmittance control scheme are explained in detail, and the performance is verified through a series of experiments in which phase information was successfully extracted from areas that are normally unmeasurable due to saturation. The results show that the proposed shape measurement system has a more adaptive dynamic range than a conventional system.
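The adaptive attenuation described in the abstract can be pictured as a per-pixel feedback loop between the CCD and the SLM. The following is a minimal illustrative sketch, not the authors' implementation; the constants `SAT_LEVEL`, `ALPHA`, and `BETA` are assumed values chosen only for demonstration.

```python
import numpy as np

# Assumed control constants (illustrative, not from the paper)
SAT_LEVEL = 250      # CCD gray level treated as saturated (8-bit sensor)
ALPHA = 0.7          # multiplicative attenuation for saturated pixels
BETA = 0.1           # recovery rate toward full transmittance
T_MAX = 1.0          # maximum SLM transmittance

def update_transmittance(image, T):
    """One feedback step: darken SLM pixels behind saturated CCD pixels,
    slowly restore transmittance elsewhere."""
    T = T.copy()
    saturated = image >= SAT_LEVEL
    T[saturated] *= ALPHA                            # attenuate saturated regions
    T[~saturated] += BETA * (T_MAX - T[~saturated])  # recover toward T_MAX
    return np.clip(T, 0.0, T_MAX)

# Example: a frame with a specular highlight saturating its center
frame = np.full((8, 8), 120.0)
frame[3:5, 3:5] = 255.0                # saturated highlight
T = np.ones((8, 8))                    # start fully transparent
T = update_transmittance(frame, T)
print(T[4, 4], T[0, 0])                # attenuated center, unchanged background
```

Iterating this update over successive frames drives the transmittance mask toward a state in which no CCD pixel saturates, which is the precondition for valid phase extraction.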

© 2010 OSA


References


  1. G. Healey and T. Binford, “Local shape from specularity,” Int. J. Comput. Vis. 42, 62–86 (1988).
  2. Y. Ryu and H. Cho, “New Optical Measuring System for Solder Joint Inspection,” Opt. Lasers Eng. 26(6), 487–514 (1997).
  3. M. Yamamoto et al., “Surface profile measurement of specular objects by grating projection method,” Proc. SPIE 4567, 48–55 (2002).
  4. S. Nayar, K. Ikeuchi, and T. Kanade, “Surface Reflection: Physical and Geometrical Perspectives,” IEEE Trans. Pattern Anal. Mach. Intell. 13(7), 611–634 (1991).
  5. K. Gasvik, Optical Metrology (John Wiley & Sons, 2002).
  6. T. Lee and H. Erhardt, “Charge-coupled imager with dual gate anti-blooming structure,” U.S. patent 4,975,777 (1990).
  7. S. Nayar and V. Branzoi, “Adaptive Dynamic Range Imaging: Optical Control of Pixel Exposures Over Space and Time,” in Proceedings of IEEE International Conference on Computer Vision (Institute of Electrical and Electronics Engineers, Nice, France, 2003), pp. 1168–1175.
  8. J. Goodman, Introduction to Fourier Optics (Roberts & Company Publishers, 2005).
  9. S. Nayar, V. Branzoi, and T. E. Boult, “Programmable Imaging: Towards a Flexible Camera,” Int. J. Comput. Vis. 70(1), 7–22 (2006).
  10. P. Huang and S. Zhang, “3-D Optical Measurement using Phase Shifting Based Methods,” Proc. SPIE 6000, 15–24 (2005).
  11. E. Reinhard et al., High Dynamic Range Imaging: Acquisition, Display and Image-based Lighting (Morgan Kaufmann Publishers, 2005).
  12. R. Banyal and B. Prasad, “Nonlinear response studies and corrections for a liquid crystal spatial light modulator,” Pramana J. Phys. 74(6), 961–971 (2010).
  13. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision (Cambridge Univ. Press, 2001).
  14. N. Menn, Practical Optics (Elsevier Academic Press, 2004).




Figures (15)

Fig. 1

Reflectance model. When light is incident on a surface, it is reflected as a diffuse lobe and a specular lobe, and the balance between the two lobes is determined by the smoothness of the reflecting surface.

Fig. 2

Fringe image distortion due to saturation and blooming. As shown inside the red dashed circle, saturation and blooming cause a discontinuous and shrunken fringe image.

Fig. 3

Schematic diagram of the proposed system, consisting of a fringe projection part and an imaging part with an LC-SLM. L3 delivers a grating fringe image onto the object surface, and the scene of the object surface with the fringe is first imaged on the LC-SLM. The image then reaches the camera through L5. After attenuation by the LC-SLM, the intensities at the previously saturated CCD pixels fall below the saturation level.

Fig. 4

Fringe changes without and with an object. (a) With no object, the fringe was projected on the plane as parallel and regular lines. This was called the reference fringe. (b) With an object, the projected fringe on the surface was distorted.

Fig. 5

Fringe shift due to object shape. Using this geometrical relationship, the three-dimensional shape of an object can be measured from the phase information.

Fig. 6

Adaptive high-dynamic-range method suggested by Nayar et al. [7], which overcomes the camera saturation problem by attenuating the excessive light intensity with the LC-SLM.

Fig. 7

Two-lens illumination system [14]. Light is transmitted through lenses L1 and L2, and lens L3 delivers the fringe image of the grating onto the object surface. By adjusting the area of the aperture stop D1, all light is blocked except that entering the grating.

Fig. 8

Unwanted grating pattern due to the SLM pixel boundaries. The SLM window consists of liquid crystal pixels and their physical boundaries, which block incoming light. (a) SLM structure. The gray regions are individual pixels that transmit incoming light, and the black lines are the physical pixel boundaries with low transmittance. (b) Generation of the unwanted grating pattern. The incoming light is partially blocked by the pixel boundaries while passing through the SLM.

Fig. 9

Imaged unwanted grating pattern and its removal by displacing the SLM from the focal plane. (a) Imaged unwanted grating pattern. When the SLM is positioned at the common focus of the two lenses, light coming from A is blocked by a pixel boundary, whereas light from B is first imaged on an SLM pixel, passes through it, and is finally imaged on the camera. (b) Removal of the unwanted grating pattern by displacing the SLM from the focal plane. When the SLM is positioned away from the focus, part of the light bundle coming from A passes through the SLM pixels: light bundle 2 (dark gray) is still blocked by a pixel boundary, yet light bundle 1 (light gray) passes through the SLM and reaches the CCD image plane.

Fig. 10

Diagram of experimental set-up for determining optimal position of SLM. When the SLM was moved in the S direction, the unwanted grating pattern became weaker. At the same time, the image contrast initially decreased and then increased again (second highest visibility position).

Fig. 11

Fringe image contrast according to SLM displacement. The exact focal position, s = 0, produced the highest contrast, yet the unwanted grating due to the SLM pixel boundaries was also strong. When the SLM was moved 3 mm, the unwanted grating became invisible but the image contrast became very low. At 6 mm, the contrast became stronger while the unwanted grating remained invisible. At 9 mm, the contrast became weaker again. (a) s = 0 mm, highest contrast. (b) s = 3 mm, lowest contrast. (c) s = 6 mm, second-highest contrast. (d) s = 9 mm, lower contrast. (e) Visibility variation with SLM displacement in the direction of s.

Fig. 12

Experimental set-up, including fringe projection part with halogen lamp and imaging part with LC-SLM.

Fig. 13

Experimental results for three geometric objects. In the original fringe images without adaptive attenuation (2nd column), region-wide saturation occurred. When the SLM mask (4th column) was applied, fringes were obtained in the originally saturated regions, as shown in the 3rd column. Using a series of adaptive fringe images, the object shapes were correctly reconstructed, as shown in the 5th column.

Fig. 14

Experimental results for three objects. 1st column: object images without fringes; 2nd column: fringe images distorted by saturation without attenuation; 3rd column: adjusted images after LC-SLM attenuation. After the attenuation procedure, the fringes are clearly visible.

Fig. 15

Three-dimensional depth information for the 3rd measurement object, obtained by analyzing the fringe images. (a) The part inside the red dashed circle could not be analyzed due to the absence of a fringe image, as seen in the 2nd column, 3rd row of Fig. 14. When the LC-SLM attenuated the pixels corresponding to the saturated parts, the fringes were well imaged and correct three-dimensional depth information was obtained.

Equations (15)

Equations on this page are rendered with MathJax.

u = h(\tan\theta_p + \tan\theta_c)

\varphi = \frac{2\pi}{\lambda} u

I(x,y) = a(x,y) + b(x,y)\cos\left\{ \frac{2\pi}{\lambda_x}x + \frac{2\pi}{\lambda_y}y + \varphi(x,y) \right\}

h = \frac{\lambda}{2\pi(\tan\theta_p + \tan\theta_c)}\,\varphi = k\varphi

I_n(x,y) = a(x,y) + b(x,y)\cos\left\{ \frac{2\pi}{\lambda_x}x + \frac{2\pi}{\lambda_y}y + \varphi(x,y) + \frac{\pi}{2}n \right\}

n = 0,\ 1,\ 2,\ 3

\varphi(x,y) = \tan^{-1}\!\left[ \frac{I_4(x,y) - I_2(x,y)}{I_1(x,y) - I_3(x,y)} \right] - \left( \frac{2\pi}{\lambda_x}x + \frac{2\pi}{\lambda_y}y \right) + 2N\pi = \varphi_p(x,y) + 2N\pi

\mathbf{u} = H\,\mathbf{l}

u = \frac{h_{11}l + h_{12}m + h_{13}n}{h_{31}l + h_{32}m + h_{33}n}, \qquad v = \frac{h_{21}l + h_{22}m + h_{23}n}{h_{31}l + h_{32}m + h_{33}n}

\mathbf{u} = \begin{pmatrix} wu \\ wv \\ w \end{pmatrix}, \qquad H = \begin{pmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{pmatrix}, \qquad \mathbf{l} = \begin{pmatrix} l \\ m \\ 1 \end{pmatrix}

E(l,m) = T(u,v)\,L(x,y,z)

I(l,m) = \lambda E(l,m)

T_{t+1} = \alpha T_t
T_{t+1} = T_t - \beta(T_{\max} - T_t)

V(x,y) = \frac{1}{2}\sqrt{ \{ I_4(x,y) - I_2(x,y) \}^2 + \{ I_1(x,y) - I_3(x,y) \}^2 }
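The four-step phase-shifting relations above can be checked numerically. The sketch below is a minimal illustration with ideal synthetic fringes shifted by π/2 between frames; the carrier and the object-induced phase are hypothetical stand-ins, not data from the paper.

```python
import numpy as np

# Synthetic four-step fringe images: I_n = a + b*cos(theta + n*pi/2), n = 0..3
x = np.linspace(0, 4 * np.pi, 256)      # carrier phase along one image row
true_phase = 0.8 * np.sin(x / 3)        # stand-in for object-induced phase
a, b = 128.0, 100.0                     # background and modulation amplitudes
I1, I2, I3, I4 = (a + b * np.cos(x + true_phase + n * np.pi / 2)
                  for n in range(4))

# Wrapped phase from the four-step formula: tan(phi) = (I4 - I2)/(I1 - I3)
wrapped = np.arctan2(I4 - I2, I1 - I3)

# Fringe modulation V, usable to mask unreliable (e.g. saturated) pixels
V = 0.5 * np.sqrt((I4 - I2) ** 2 + (I1 - I3) ** 2)

# Subtracting the known carrier recovers the object phase up to 2*N*pi
object_phase = np.angle(np.exp(1j * (wrapped - x)))
print(np.allclose(object_phase, true_phase, atol=1e-6))   # True
```

The recovered phase equals the object phase only modulo 2π; a separate unwrapping step supplies the integer N in the equation above, and the height then follows from the linear relation h = kφ.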
