Abstract

Degradation of image quality caused by wavefront aberration of the optical system directly affects the performance of a vision system. This paper estimates the amount of aberration using wavelet transform profilometry. The method rests on the principle that aberration shifts the position of the fringe image on the image plane, and that this shift correlates with the amount of aberration. The distribution of the aberration function can therefore be extracted directly by measuring the displacement of the fringe image on the image plane. Experimental results are presented to evaluate the validity of this approach.

© 2012 Optical Society of America
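As a rough illustration of this principle (not code from the paper; the function and parameter names are hypothetical), the NumPy sketch below converts an unwrapped fringe-phase map into wavefront slopes using the fringe period d and the system constant f that appear in the relations listed under Equations, then integrates the slopes along each row to recover the aberration function up to an additive constant per row. In the paper itself the phase map is obtained by wavelet-transform profilometry; a sketch of that step is given after the Equations list.

```python
import numpy as np

def wavefront_from_phase(phase_map, d, f, dx):
    """Hypothetical sketch: convert an unwrapped fringe-phase map (radians)
    into an estimate of the wavefront aberration W, assuming the phase
    shift obeys  delta_phi = (2*pi/d) * f * dW/dxi."""
    slope = phase_map * d / (2.0 * np.pi * f)   # wavefront slope dW/dxi per pixel
    # Integrate the slope along each row (xi direction); every row is
    # recovered only up to an additive constant.
    return np.cumsum(slope, axis=1) * dx

# Synthetic check with a defocus-like wavefront W = 5 * xi**2 (arbitrary units).
d, f, dx = 0.5e-3, 0.1, 1e-4                    # fringe period, system constant, pixel pitch
xi = np.arange(256) * dx
phase = (2.0 * np.pi / d) * f * (10.0 * xi)     # phase shift produced by dW/dxi = 10*xi
W_est = wavefront_from_phase(np.tile(phase, (64, 1)), d, f, dx)
```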


References


  1. K. Rahbar and K. Faez, “Blind correction of lens aberration using modified Zernike moments,” Journal of Information and Communication Technology 2(6–7), 37–44 (2011).
  2. K. Rahbar and K. Faez, “Blind correction of lens aberration using Zernike moments,” in Proceedings of the International Conference on Image Processing (2011).
  3. S. Aritan, “Efficiency of non-linear lens distortion models in biomechanical analysis of human movement,” Measurement 43, 739–746 (2010).
  4. R. Hartley and S. B. Kang, “Parameter-free radial distortion correction with center of distortion estimation,” IEEE Trans. Pattern Anal. Mach. Intell. 29, 1309–1321 (2007).
  5. K. V. Asari, S. Kumar, and D. Radhakrishnan, “A new approach for nonlinear distortion correction in endoscopic images based on least squares estimation,” IEEE Trans. Med. Imaging 18, 345–354 (1999).
  6. Y. Meng and H. Zhuang, “What you see is what you get [self-calibrating camera lens distortion],” IEEE Robot. Autom. Magazine 11(4), 123–127 (2004).
  7. F. Devernay and O. Faugeras, “Straight lines have to be straight,” Machine Vis. Appl. 13, 14–24 (2001).
  8. Y. Nomura, M. Sagara, H. Naruse, and A. Ide, “Simple calibration algorithm for high-distortion lens camera,” IEEE Trans. Pattern Anal. Mach. Intell. 14, 1095–1099 (1992).
  9. W. Yu and Y. Chung, “A calibration-free lens distortion correction method for low cost digital imaging,” in Proceedings of the International Conference on Image Processing (2003).
  10. S. W. Park and K. S. Hong, “Practical ways to calculate camera lens distortion for real-time camera calibration,” Pattern Recogn. 34, 1199–1206 (2001).
  11. H. Farid and A. C. Popescu, “Blind removal of lens distortion,” J. Opt. Soc. Am. A 18, 2072–2078 (2001).
  12. W. Yu, “Image-based lens geometric distortion correction using minimization of average bicoherence index,” Pattern Recogn. 37, 1175–1187 (2004).
  13. M. A. Gdeisat, D. R. Burton, and M. J. Lalor, “Spatial carrier fringe pattern demodulation by use of a two-dimensional continuous wavelet transform,” Appl. Opt. 45, 8722–8732 (2006).
  14. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35, 263–284 (2001).
  15. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22, 3977–3982 (1983).
  16. T. Yatagai, “Fringe scanning Ronchi test for aspherical surfaces,” Appl. Opt. 23, 3676–3679 (1984).
  17. A. Dursun, S. Özder, and F. N. Ecevit, “Continuous wavelet transform analysis of projected fringe patterns,” Meas. Sci. Technol. 15, 1768 (2004).
  18. P. Tomassini, A. Giulietti, L. A. Gizzi, M. Galimberti, and D. Giulietti, “Analyzing laser plasma interferograms with a continuous wavelet transform ridge extraction technique: the method,” Appl. Opt. 40, 6561–6568 (2001).
  19. H. Liu, A. N. Cartwright, and C. Basaran, “Moiré interferogram phase extraction: a ridge detection algorithm for continuous wavelet transforms,” Appl. Opt. 43, 850–857 (2004).
  20. M. A. Herráez, D. R. Burton, M. J. Lalor, and M. A. Gdeisat, “Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path,” Appl. Opt. 41, 7437–7444 (2002).
  21. J. H. Bruning, D. R. Herriott, J. E. Gallagher, D. P. Rosenfeld, A. D. White, and D. J. Brangaccio, “Digital wavefront measuring interferometer for testing optical surfaces and lenses,” Appl. Opt. 13, 2693–2703 (1974).
  22. T. Yatagai and T. Kanou, “Aspherical surface testing with shearing interferometer using fringe scanning detection method,” Opt. Eng. 23, 357–360 (1984).



Figures (6)

Fig. 1.

Geometry of the conceptual scheme used to extract the wavefront aberration function.

Fig. 2.

Phase demodulation of the aberrated fringe image. (a) Predefined phase aberration function, (b) image of the fringe pattern on the exit pupil of the visual system, (c) row 300 of the fringe pattern, (d) modulus of the wavelet transform of row 100, (e) argument of the wavelet transform of row 100, (f) wrapped phase of row 100, (g) wrapped phase of the fringe pattern, and (h) unwrapped phase of the fringe pattern.

Fig. 3.

Phase demodulation of the aberrated fringe image. (a) Predefined phase aberration function, (b) image of the fringe pattern on the exit pupil of the visual system, (c) row 300 of the fringe pattern, (d) modulus of the wavelet transform of row 300, (e) argument of the wavelet transform of row 300, (f) wrapped phase of row 300, (g) wrapped phase of the fringe pattern in the ξ direction, (h) unwrapped phase of the fringe pattern in the ξ direction, (i) unwrapped phase of the fringe pattern in the δ direction, (j) total recovered aberration, and (k) absolute error of the recovered aberration function.

Fig. 4.

Variation of the root-mean-square error with the amplitude and standard deviation of the added Gaussian noise.

Fig. 5.

Experimental arrangement for estimating the phase aberration function: (a) test arrangement, (b) image of the aberrated fringes.

Fig. 6.

Phase demodulation of the aberrated fringe image. (a) Slice of the fringe pattern on the image plane of the vision system, (b) row 300 of the fringe pattern, (c) modulus of the wavelet transform of row 300, (d) argument of the wavelet transform of row 300, (e) wrapped phase of row 300, (f) wrapped phase of the fringe pattern in the ξ direction, (g) unwrapped phase of the fringe pattern in the ξ direction, (h) unwrapped phase of the fringe pattern in the δ direction, and (i) total recovered aberration.

Tables (1)


Table 1. Zernike Polynomial Coefficients Corresponding to the Seidel Aberration

Equations (13)

Equations on this page are rendered with MathJax.

$$\frac{\partial W(\xi,\delta)}{\partial \xi}=\frac{x}{f},\qquad \frac{\partial W(\xi,\delta)}{\partial \delta}=\frac{y}{f}.$$
$$f(x,\delta)=a(x,\delta)+\gamma(x,\delta)\cos\!\left[\frac{2\pi}{d}(x+\eta)\right],$$
$$f(x,\delta)=a(x,\delta)+\gamma(x,\delta)\cos\!\left[\frac{2\pi}{d}\left(f\frac{\partial W(\xi,\delta)}{\partial \xi}+\eta\right)\right],$$
$$\frac{2\pi}{d}\left(f\frac{\partial W(\xi,\delta)}{\partial \xi}\right)=2\pi m,\qquad m=0,\pm 1,\ldots$$
$$\frac{\partial W(\xi,\delta)}{\partial \xi}=\frac{d}{f}\,m.$$
$$\psi(\xi)=\pi^{-1/4}e^{ic\xi}e^{-\xi^{2}/2},$$
$$\psi_{s,b}(\xi)=\frac{1}{\sqrt{s}}\,\psi\!\left(\frac{\xi-b}{s}\right).$$
$$W(s,b)=\frac{1}{\sqrt{s}}\int \psi^{*}\!\left(\frac{\xi-b}{s}\right)g(\xi)\,d\xi.$$
$$\phi(s,b)=\tan^{-1}\!\left[\frac{\operatorname{Im}W(s,b)}{\operatorname{Re}W(s,b)}\right],$$
$$\frac{2\pi}{d}\left(f\frac{\partial W(\xi,\delta)}{\partial \xi}\right)=\tan^{-1}\!\left[\frac{\operatorname{Im}W(s,b)}{\operatorname{Re}W(s,b)}\right].$$
$$W(\xi)=\int \frac{\partial W(\xi,\delta)}{\partial \xi}\,d\xi.$$
$$W(\xi,\delta)=\sum_{j}\omega_{j}Z_{j}(\xi,\delta),$$
$$W(\xi,\delta)=\sum_{j}\omega_{j}Z_{j}(\xi,\delta)+\alpha_{j}\,\varphi_{n}(\mu,\sigma).$$
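As an illustrative sketch only (not the authors' implementation; the Morlet constant c = 6 and the 1/√s normalization are assumptions consistent with the relations reconstructed above), the following NumPy code demodulates one fringe row: it convolves the row with dilated Morlet wavelets to form W(s,b), selects the ridge of maximum modulus, and reads out the wrapped phase as tan⁻¹(Im/Re).

```python
import numpy as np

def morlet_cwt_wrapped_phase(row, scales, c=6.0):
    """Sketch of the wavelet relations listed above: continuous Morlet-wavelet
    transform of one fringe row, ridge extraction, and wrapped-phase readout.
    The value of `c` and the 1/sqrt(s) normalization are assumptions."""
    n = len(row)
    x = np.arange(n) - n // 2
    W = np.empty((len(scales), n), dtype=complex)
    for i, s in enumerate(scales):
        # Dilated Morlet wavelet; convolving the row with psi(x/s)/sqrt(s)
        # equals correlating it with psi*, because psi*(-x) = psi(x).
        psi = (np.pi ** -0.25) * np.exp(1j * c * x / s - 0.5 * (x / s) ** 2) / np.sqrt(s)
        W[i] = np.convolve(row, psi, mode="same")
    ridge = np.abs(W).argmax(axis=0)               # scale of maximum modulus per pixel
    wrapped = np.angle(W[ridge, np.arange(n)])     # tan^-1(Im/Re) along the ridge
    return wrapped
```

The wrapped phase would subsequently be unwrapped (for a full pattern, e.g. with a reliability-sorting algorithm such as that of Ref. 20) before being converted into wavefront slope and integrated.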
