Abstract

The key issue in passive autofocus is choosing a focus measure that robustly judges optical blur in defocused images. Existing focus measures are sensitive to image contrast (illumination) because they operate directly on image intensity. In this report we demonstrate a focus measure based on phase congruency. The proposed focus measure is robust to noisy imaging sensors under varying illumination conditions, and offers a good balance between defocus sensitivity and effective range. Its advantages are shown on a number of synthetic image sequences.

© 2010 OSA

References

  1. M. Subbarao, T. Choi, and A. Nikzad, “Focusing techniques,” Opt. Eng. 32(11), 2824–2836 (1993).
  2. Y. Tian, K. Shieh, and C. F. Wildsoet, “Performance of focus measures in the presence of nondefocus aberrations,” J. Opt. Soc. Am. A 24(12), B165–B173 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=josaa-24-12-B165 .
  3. M. Subbarao and G. Surya, “Depth from defocus: A spatial domain approach,” Int. J. Comput. Vis. 13(3), 271–294 (1994).
  4. A. Pentland, S. Scherock, T. Darrell, and B. Girod, “Simple range cameras based on focal error,” J. Opt. Soc. Am. A 11(11), 2925–2934 (1994), http://www.opticsinfobase.org/abstract.cfm?URI=josaa-11-11-2925 .
  5. S. K. Nayar, M. Watanabe, and M. Noguchi, “Real-time focus range sensor,” IEEE Trans. Pattern Anal. Mach. Intell. 18(12), 1186–1198 (1996).
  6. A. N. Rajagopalan and S. Chaudhuri, “A variational approach to recovering depth from defocused images,” IEEE Trans. Pattern Anal. Mach. Intell. 19(10), 1158–1164 (1997).
  7. V. Aslantas and D. T. Pham, “Depth from automatic defocusing,” Opt. Express 15(3), 1011–1023 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=oe-15-3-1011 .
  8. S. O. Shim and T. S. Choi, “Depth from focus based on combinatorial optimization,” Opt. Lett. 35(12), 1956–1958 (2010), http://www.opticsinfobase.org/abstract.cfm?URI=ol-35-12-1956 .
  9. W. Huang and Z. Jing, “Evaluation of focus measures in multi-focus image fusion,” Pattern Recognit. Lett. 28(4), 493–500 (2007).
  10. V. Aslantas and R. Kurban, “A comparison of criterion functions for fusion of multi-focus noisy images,” Opt. Commun. 282(16), 3231–3242 (2009).
  11. S. Nayar and Y. Nakagawa, “Shape from focus,” IEEE Trans. Pattern Anal. Mach. Intell. 16(8), 824–831 (1994).
  12. P. Favaro, S. Soatto, M. Burger, and S. J. Osher, “Shape from defocus via diffusion,” IEEE Trans. Pattern Anal. Mach. Intell. 30(3), 518–531 (2008).
  13. M. Subbarao and J. K. Tyan, “Selecting the optimal focus measure for autofocusing and depth-from-focus,” IEEE Trans. Pattern Anal. Mach. Intell. 20(8), 864–870 (1998).
  14. T. Aydin and Y. S. Akgul, “An occlusion insensitive adaptive focus measurement method,” Opt. Express 18(13), 14212–14224 (2010), http://www.opticsinfobase.org/abstract.cfm?URI=oe-18-13-14212 .
  15. M. C. Morrone, J. Ross, D. C. Burr, and R. A. Owens, “Mach bands are phase dependent,” Nature 324, 250–253 (1986).
  16. M. C. Morrone and R. A. Owens, “Feature detection from local energy,” Pattern Recognit. Lett. 6(5), 303–313 (1987).
  17. M. C. Morrone and D. C. Burr, “Feature detection in human vision: a phase-dependent energy model,” Proc. R. Soc. Lond. B Biol. Sci. 235(1280), 221–245 (1988).
  18. Z. Wang and E. P. Simoncelli, “Local phase coherence and the perception of blur,” in Advances in Neural Information Processing Systems, S. Thrun, L. Saul, and B. Schölkopf, eds. (MIT Press, Cambridge, MA, 2004), pp. 1435–1442.
  19. P. Kovesi, “Image features from phase congruency,” VIDERE: J. Comput. Vis. Res. 1, 1–27 (1999).
  20. D. J. Field, “Relations between the statistics of natural images and the response properties of cortical cells,” J. Opt. Soc. Am. A 4(12), 2379–2394 (1987), http://www.opticsinfobase.org/abstract.cfm?URI=josaa-4-12-2379 .
  21. Y. Tian, H. Feng, Z. Xu, and J. Huang, “Dynamic focus window selection strategy for digital cameras,” Proc. SPIE 5678, 219–229 (2005).
  22. Y. Tian, “Dynamic focus window selection using a statistical color model,” Proc. SPIE 6069, 98–106 (2006).
  23. H. Lin and K. Gu, “Depth recovery using defocus blur at infinity,” in Proceedings of the 19th International Conference on Pattern Recognition (IEEE, Tampa, FL, 2008), pp. 1–4.




Figures (8)

Fig. 1

Thin-lens imaging system model. O, I and C represent the object, image and lens center respectively. The object distance, image distance, lens focal length and lens diameter are labeled p, q, f and d respectively. The image is focused when the image sensor is positioned at A, but blurred when the image sensor is positioned at B (the diameter of the blur circle is b).

Fig. 2

Point spread functions (PSFs) for 9 different amounts of defocus. Each PSF is a Gaussian function of 11x11 pixels; the standard deviation of each PSF is one third of the blur radius for the corresponding amount of defocus. The energy of each PSF is normalized to 1.
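The PSF construction described in the caption can be sketched as follows. The exact discretization is not specified in the caption, so a simple pixel-sampled Gaussian is assumed here:

```python
import numpy as np

def gaussian_psf(blur_radius, size=11):
    """Pixel-sampled Gaussian PSF on a size x size grid.

    Per the Fig. 2 caption: sigma is one third of the blur-circle
    radius, and the kernel is normalized to unit energy. The sampling
    scheme itself is an illustrative assumption.
    """
    half = size // 2
    if blur_radius == 0:                  # in-focus case: ideal point
        psf = np.zeros((size, size))
        psf[half, half] = 1.0
        return psf
    sigma = blur_radius / 3.0
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    psf = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return psf / psf.sum()                # normalize total energy to 1

# Nine PSFs for increasing amounts of defocus (blur radius 0..8 pixels)
psfs = [gaussian_psf(r) for r in range(9)]
```

Convolving a sharp source image with each PSF yields a synthetic focus sequence of the kind used in the experiments.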

Fig. 3

Sample images with variable amounts of defocus and three types of noise.

Fig. 4

Focus measure performance with (a) no noise, (b) Poisson noise, (c) Gaussian noise and (d) Speckle noise.

Fig. 5

Sample images with variable amounts of defocus and different types of noise, further processed by a 5x5 median filter.
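The noise-and-prefilter pipeline in the caption can be sketched as below. The noise strengths (Gaussian sigma, speckle variance) are illustrative assumptions, not the values used in the experiments:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)

def add_noise(img, kind):
    """Apply one of the three noise models used in the test sequences."""
    if kind == "poisson":     # signal-dependent shot noise
        return rng.poisson(img).astype(float)
    if kind == "gaussian":    # additive white Gaussian noise (sigma assumed)
        return img + rng.normal(0.0, 5.0, img.shape)
    if kind == "speckle":     # multiplicative noise (variance assumed)
        return img * (1.0 + rng.normal(0.0, 0.1, img.shape))
    return img

img = rng.uniform(0, 255, (64, 64))       # stand-in grayscale image
noisy = add_noise(img, "gaussian")
denoised = median_filter(noisy, size=5)   # the 5x5 median prefilter
```

A focus measure would then be evaluated on `denoised` (or directly on the noisy image, in the case of the phase-congruency measure, which needs no prefiltering).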

Fig. 6

Focus measure performance with (a) no noise, (b) Poisson noise, (c) Gaussian noise and (d) Speckle noise. Except for IPC, noise in input images is reduced by a 5x5 median filter.

Fig. 7

Sample images with variable amounts of defocus, different types of noise, and variable contrast (scaling factor 0.5, offset 20), further processed by a 5x5 median filter.

Fig. 8

Focus measure performance with contrast variations (scaling factor 0.5, offset 20): (a) no noise, (b) Poisson noise, (c) Gaussian noise and (d) Speckle noise. Except for IPC, noise in input images is reduced by a 5x5 median filter.

Equations (10)


\[ \frac{1}{p} + \frac{1}{q} = \frac{1}{f} \tag{1} \]
\[ PC(x) = \max_{\overline{\varphi}(x) \in [0,\,2\pi]} \frac{\sum_n A_n \cos\left[\varphi_n(x) - \overline{\varphi}(x)\right]}{\sum_n A_n} . \tag{2} \]
\[ PC(x,y) = \frac{\sum_{i=1}^{m} \sum_{j=1}^{n} W_i(x,y) \left\lfloor A_{ji}(x,y)\,\Delta\phi_{ji}(x,y) - T_i \right\rfloor}{\sum_{i=1}^{m} \sum_{j=1}^{n} A_{ji}(x,y)} . \tag{3} \]
\[ \Delta\phi(x,y) = \cos\left[\phi_n(x,y) - \overline{\phi}(x,y)\right] - \left|\sin\left[\phi_n(x,y) - \overline{\phi}(x,y)\right]\right| . \tag{4} \]
\[ IPC(i) = \iint_{(x,y) \in \mathrm{fw}} PC(x,y)\,dx\,dy . \tag{5} \]
\[ \sigma = \frac{Rq}{3}\left(\frac{1}{f} - \frac{1}{q} - \frac{1}{p}\right) . \tag{6} \]
\[ VAR(i) = \iint \frac{\left[i(x,y) - \overline{i(x,y)}\right]^2}{\overline{i(x,y)}^{\,2}}\,dx\,dy . \tag{7} \]
\[ EIG(i) = \iint \frac{\left(\partial i/\partial x\right)^2 + \left(\partial i/\partial y\right)^2}{\overline{i(x,y)}^{\,2}}\,dx\,dy . \tag{8} \]
\[ EOS(i) = \int_0^{+\infty}\int_0^{+\infty} I^2(u,v)\,du\,dv . \tag{9} \]
\[ WBR(i) = \frac{\sum_{k=2}^{4} \iint \left[c_k(u,v)\right]^2\,du\,dv}{\iint c_1^2(u,v)\,du\,dv} . \tag{10} \]
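For reference, the intensity-based measures above (normalized variance VAR, energy of image gradient EIG, and energy of spectrum EOS) have straightforward discrete counterparts, with the integrals replaced by sums over pixels. The sketch below assumes a grayscale image with nonzero mean:

```python
import numpy as np

def var_measure(i):
    """Normalized variance: intensity spread divided by squared mean."""
    m = i.mean()
    return ((i - m) ** 2).sum() / m**2

def eig_measure(i):
    """Energy of image gradient, normalized by the squared mean intensity."""
    gy, gx = np.gradient(i.astype(float))   # central-difference gradients
    return (gx**2 + gy**2).sum() / i.mean()**2

def eos_measure(i):
    """Energy of spectrum: power summed over the first spectral quadrant."""
    I = np.fft.fft2(i)
    h, w = i.shape
    return (np.abs(I[:h // 2, :w // 2]) ** 2).sum()
```

All three increase as an image comes into focus, since defocus suppresses high-frequency content; this is what makes them usable as focus measures, and also what makes them sensitive to contrast scaling, which the phase-congruency measure avoids.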
