Abstract

A robust stereo matching method based on a comprehensive mathematical model of the color formation process is proposed to estimate the disparity map of stereo images subject to noise and photometric variations. A band-pass filter with a DoP kernel is first used to filter out the noise component of the stereo images. A log-chromaticity normalization step is then applied to eliminate the influence of lighting geometry. The remaining factors that affect the color formation process are removed during disparity estimation through a specifically designed matching cost. The performance of the developed method is evaluated through comparison with several state-of-the-art algorithms. Experimental results are presented to demonstrate the robustness and accuracy of the method.

© 2015 Optical Society of America


References


1. D. Scharstein and R. Szeliski, “A taxonomy and evaluation of dense two-frame stereo correspondence algorithms,” Int. J. Comput. Vis. 47(1–3), 7–42 (2002).
2. H. Sunyoto, W. van der Mark, and D. M. Gavrila, “A comparative study of fast dense stereo vision algorithms,” in Proceedings of IEEE Intelligent Vehicles Symposium (IEEE, 2004), pp. 319–324.
3. L. Nalpantidis, G. C. Sirakoulis, and A. Gasteratos, “Review of stereo vision algorithms: from software to hardware,” Int. J. Optomechatronics 2(4), 435–462 (2008).
4. T. Kanade, A. Yoshida, K. Oda, H. Kano, and M. Tanaka, “A stereo machine for video-rate dense depth mapping and its new applications,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 1996), pp. 196–202.
5. R. Zabih and J. Woodfill, “Non-parametric local transforms for computing visual correspondence,” in Proceedings of European Conference on Computer Vision (Springer, 1994), pp. 151–158.
6. I. L. Jung, T. Y. Chung, J. Y. Sim, and C. S. Kim, “Consistent stereo matching under varying radiometric conditions,” IEEE Trans. Multimed. 15(1), 56–69 (2013).
7. G. Finlayson, S. Hordley, G. Schaefer, and G. Y. Tian, “Illuminant and device invariant colour using histogram equalisation,” Pattern Recognit. 38(2), 179–190 (2005).
8. H. Hirschmüller and D. Scharstein, “Evaluation of stereo matching costs on images with radiometric differences,” IEEE Trans. Pattern Anal. Mach. Intell. 31(9), 1582–1599 (2009).
9. H. Hirschmüller, “Stereo processing by semiglobal matching and mutual information,” IEEE Trans. Pattern Anal. Mach. Intell. 30(2), 328–341 (2008).
10. Y. S. Heo, K. M. Lee, and S. U. Lee, “Robust stereo matching using adaptive normalized cross-correlation,” IEEE Trans. Pattern Anal. Mach. Intell. 33(4), 807–822 (2011).
11. O. Faugeras, B. Hotz, H. Mathieu, T. Viéville, Z. Y. Zhang, P. Fua, E. Theron, L. Moll, G. Berry, and J. Vuillemin, “Real time correlation-based stereo: algorithm, implementations and applications,” INRIA Research Report (1993).
12. K. Konolige, Small Vision Systems: Hardware and Implementation (Springer, 1998).
13. H. Hirschmüller, P. R. Innocent, and J. Garibaldi, “Real-time correlation-based stereo vision with reduced border errors,” Int. J. Comput. Vis. 47(1–3), 229–246 (2002).
14. M. Unser, D. Sage, and D. Van De Ville, “Multiresolution monogenic signal analysis using the Riesz–Laplace wavelet transform,” IEEE Trans. Image Process. 18(11), 2402–2418 (2009).
15. S. G. Krantz, “Calculation and estimation of the Poisson kernel,” J. Math. Anal. Appl. 302(1), 143–148 (2005).
16. M. Felsberg and G. Sommer, “The monogenic scale-space: a unifying approach to phase-based image processing in scale-space,” J. Math. Imaging Vis. 21(1), 5–26 (2004).
17. G. Finlayson and R. Xu, “Illuminant and gamma comprehensive normalisation in log RGB space,” Pattern Recognit. Lett. 24(2), 1679–1690 (2003).
18. Y. S. Heo, K. M. Lee, and S. U. Lee, “Joint depth map and color consistency estimation for stereo images with different illuminations and cameras,” IEEE Trans. Pattern Anal. Mach. Intell. 35(5), 1094–1106 (2013).
19. D. Scharstein, R. Szeliski, and H. Hirschmüller, Middlebury stereo datasets, http://vision.middlebury.edu/stereo/data/ .
20. L. Wang, R. Yang, M. Gong, and M. Liao, “Real-time stereo using approximated joint bilateral filtering and dynamic programming,” J. Real-Time Image Process. 9(3), 447–461 (2014).
21. S. Birchfield and C. Tomasi, “A pixel dissimilarity measure that is insensitive to image sampling,” IEEE Trans. Pattern Anal. Mach. Intell. 20(4), 401–406 (1998).


Figures (11)

Fig. 1 Filter results based on DoP-K: (a) image in high exposure, followed by (b) its R channel response and (d) its DoP-K filter results; (f) image in low exposure, followed by (g) its R channel response and (i) its DoP-K filter results. (c) & (h) are partially enlarged details of (b) & (g), and (e) & (j) are partially enlarged details of (d) & (i), respectively.
Fig. 2 ‘Cloth1’: (a) reference image taken under Exp2&Illu2, followed by its matching images taken under (b) Exp2&Illu1, (c) Exp1&Illu3, and (d) Exp0&Illu3; ‘Baby1’: (e) reference image taken under Exp2&Illu2, followed by its matching images taken under (f) Exp2&Illu1, (g) Exp1&Illu3, and (h) Exp0&Illu3.
Fig. 3 Evaluation regions of Cloth1: (a) reference image, (b) ground truth map, (c) texture-less region (white) and occlusion region (black), (d) depth discontinuity region (white) and occlusion region (black); evaluation regions of Baby1: (e) reference image, (f) ground truth map, (g) texture-less region (white) and occlusion region (black), (h) depth discontinuity region (white) and occlusion region (black).
Fig. 4 Disparity estimation results of Cloth1 with the proposed algorithm: (a) ground truth, followed by estimated disparity maps of images under (b) Exp2&Illu1, (c) Exp1&Illu3, and (d) Exp0&Illu3.
Fig. 5 Disparity estimation results of Baby1 with the proposed algorithm: (a) ground truth, followed by estimated disparity maps of images under (b) Exp2&Illu1, (c) Exp1&Illu3, and (d) Exp0&Illu3.
Fig. 6 Results of the tested stereo algorithms on Cloth1 under different photometric variations. Estimated disparity maps of the image pair under Exp2&Illu1 based on (a) NCC, (b) Rank transform, (c) DoP-K filtering, (d) ANCC, and (e) the proposed method; under Exp1&Illu3 based on (f) NCC, (g) Rank transform, (h) DoP-K filtering, (i) ANCC, and (j) the proposed method; under Exp0&Illu3 based on (k) NCC, (l) Rank transform, (m) DoP-K filtering, (n) ANCC, and (o) the proposed method.
Fig. 7 Results of the tested stereo algorithms on Baby1 under different photometric variations. Estimated disparity maps of the image pair under Exp2&Illu1 based on (a) NCC, (b) Rank transform, (c) DoP-K filtering, (d) ANCC, and (e) the proposed method; under Exp1&Illu3 based on (f) NCC, (g) Rank transform, (h) DoP-K filtering, (i) ANCC, and (j) the proposed method; under Exp0&Illu3 based on (k) NCC, (l) Rank transform, (m) DoP-K filtering, (n) ANCC, and (o) the proposed method.
Fig. 8 Disparity estimation errors of the tested stereo algorithms on ‘Cloth1’ in different regions. (a) PBD values in different regions of the image pair under Exp2&Illu1, (b) Exp1&Illu3, (c) Exp0&Illu3; (d) RMS errors in different regions of the image pair under Exp2&Illu1, (e) Exp1&Illu3, (f) Exp0&Illu3. In (a), (b), and (c), 1, 2, 3, 4 along the x-axis represent $P_\zeta^{\bar{T}}$, $P_\zeta^{T}$, $P_\zeta^{\bar{O}}$, and $P_\zeta^{D}$, respectively. In (d), (e), and (f), 1, 2, 3, 4 along the x-axis represent $R_\zeta^{\bar{T}}$, $R_\zeta^{T}$, $R_\zeta^{\bar{O}}$, and $R_\zeta^{D}$, respectively.
Fig. 9 Disparity estimation errors of the tested stereo algorithms on ‘Baby1’ in different regions. (a) PBD values in different regions of the image pair under Exp2&Illu1, (b) Exp1&Illu3, (c) Exp0&Illu3; (d) RMS errors in different regions of the image pair under Exp2&Illu1, (e) Exp1&Illu3, (f) Exp0&Illu3. In (a), (b), and (c), 1, 2, 3, 4 along the x-axis represent $P_\zeta^{\bar{T}}$, $P_\zeta^{T}$, $P_\zeta^{\bar{O}}$, and $P_\zeta^{D}$, respectively. In (d), (e), and (f), 1, 2, 3, 4 along the x-axis represent $R_\zeta^{\bar{T}}$, $R_\zeta^{T}$, $R_\zeta^{\bar{O}}$, and $R_\zeta^{D}$, respectively.
Fig. 10 Error comparison of ANCC and the proposed algorithm with Gaussian noise on ‘Cloth1’ under Exp0&Illu3: (a) PBD in the texture-less region, (b) PBD in the textured region, (c) PBD in the non-occlusion region, (d) RMS in the texture-less region, (e) RMS in the textured region, (f) RMS in the non-occlusion region.
Fig. 11 Error comparison of ANCC and the proposed algorithm with Gaussian noise on ‘Baby1’ under Exp0&Illu3: (a) PBD in the texture-less region, (b) PBD in the textured region, (c) PBD in the non-occlusion region, (d) RMS in the texture-less region, (e) RMS in the textured region, (f) RMS in the non-occlusion region.

Tables (2)


Table 1 Coefficient Values Used in the Experiment.


Table 2 Disparity Estimation Errors of ‘Cloth1’ and ‘Baby1’ Under Different Exposures and Illuminations Based on the Proposed Method.

Equations (21)


$$\begin{pmatrix} R(\kappa) \\ G(\kappa) \\ B(\kappa) \end{pmatrix} \rightarrow \begin{pmatrix} \tilde{R}(\kappa) \\ \tilde{G}(\kappa) \\ \tilde{B}(\kappa) \end{pmatrix} = \begin{pmatrix} \rho(\kappa)\,a\,R(\kappa)^{\gamma} + n_R(\kappa) \\ \rho(\kappa)\,b\,G(\kappa)^{\gamma} + n_G(\kappa) \\ \rho(\kappa)\,c\,B(\kappa)^{\gamma} + n_B(\kappa) \end{pmatrix}$$

$$p(z;s)=\frac{s}{2\pi\,(s^{2}+|z|^{2})^{3/2}}, \qquad P(u;s)=\int p(z;s)\exp(-j2\pi u^{T}z)\,dz=\exp(-2\pi|u|s)$$

$$p_{s_f,s_c}(z)=\frac{p_{s_f}(z)}{\sum_{z} p_{s_f}(z)}-\frac{p_{s_c}(z)}{\sum_{z} p_{s_c}(z)}$$

$$\tilde{I}_{L,p}^{\,C}(z)=\big(p_{s_f,s_c}*I_{L}\big)(z),\qquad \tilde{I}_{R,p}^{\,C}(z)=\big(p_{s_f,s_c}*I_{R}\big)(z)\qquad (C=R,G,B)$$
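A minimal NumPy/SciPy sketch of this band-pass step, assuming DoP denotes a difference of two Poisson kernels as in the equations above; the scale values, the truncation radius, and the discrete normalization are illustrative placeholders rather than the settings used in the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def poisson_kernel(s, radius=15):
    """2-D Poisson kernel p(z; s) = s / (2*pi*(s^2 + |z|^2)^(3/2)), truncated to a square support."""
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    k = s / (2.0 * np.pi * (s**2 + x**2 + y**2) ** 1.5)
    return k / k.sum()                      # normalize the truncated kernel to unit sum

def dop_kernel(s_fine, s_coarse, radius=15):
    """Difference-of-Poisson (DoP) band-pass kernel: fine-scale kernel minus coarse-scale kernel."""
    return poisson_kernel(s_fine, radius) - poisson_kernel(s_coarse, radius)

def dop_filter(img, s_fine=1.0, s_coarse=4.0, radius=15):
    """Apply the DoP band-pass filter to every color channel of an H x W x 3 image."""
    k = dop_kernel(s_fine, s_coarse, radius)
    return np.stack([convolve(img[..., c], k, mode='nearest')
                     for c in range(img.shape[-1])], axis=-1)

# Example: band-pass both images of a (random) stereo pair, channel by channel.
left, right = np.random.rand(64, 64, 3), np.random.rand(64, 64, 3)
left_bp, right_bp = dop_filter(left), dop_filter(right)
```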
$$\begin{pmatrix} \tilde{R}(\kappa) \\ \tilde{G}(\kappa) \\ \tilde{B}(\kappa) \end{pmatrix}=\begin{pmatrix} \tilde{\rho}(\kappa)\,\tilde{a}\,R(\kappa)^{\gamma} \\ \tilde{\rho}(\kappa)\,\tilde{b}\,G(\kappa)^{\gamma} \\ \tilde{\rho}(\kappa)\,\tilde{c}\,B(\kappa)^{\gamma} \end{pmatrix}$$

$$\begin{pmatrix} \tilde{R}_L(\kappa) \\ \tilde{G}_L(\kappa) \\ \tilde{B}_L(\kappa) \end{pmatrix}=\begin{pmatrix} \tilde{\rho}_L(\kappa)\,\tilde{a}_L\,R_L(\kappa)^{\gamma_L} \\ \tilde{\rho}_L(\kappa)\,\tilde{b}_L\,G_L(\kappa)^{\gamma_L} \\ \tilde{\rho}_L(\kappa)\,\tilde{c}_L\,B_L(\kappa)^{\gamma_L} \end{pmatrix}$$

$$\begin{aligned} \log(\tilde{R}_L(\kappa)) &\approx \tilde{\rho}_L(\kappa)+\tilde{a}_L+\gamma_L\log(R_L(\kappa))\\ \log(\tilde{G}_L(\kappa)) &\approx \tilde{\rho}_L(\kappa)+\tilde{b}_L+\gamma_L\log(G_L(\kappa))\\ \log(\tilde{B}_L(\kappa)) &\approx \tilde{\rho}_L(\kappa)+\tilde{c}_L+\gamma_L\log(B_L(\kappa)) \end{aligned}$$

$$\overline{RGB}_L(\kappa)=\tilde{\rho}_L(\kappa)+\frac{\tilde{a}_L+\tilde{b}_L+\tilde{c}_L}{3}+\gamma_L\,\frac{\log\!\big(R_L(\kappa)\,G_L(\kappa)\,B_L(\kappa)\big)}{3}$$

$$\begin{aligned} \tilde{R}_L(\kappa) &\approx (2\tilde{a}_L-\tilde{b}_L-\tilde{c}_L)/3+\gamma_L\log\!\Big[R_L(\kappa)\big/\sqrt[3]{R_L(\kappa)G_L(\kappa)B_L(\kappa)}\Big]=\tilde{\alpha}_L^{R}+\gamma_L K_L^{R}(\kappa)\\ \tilde{G}_L(\kappa) &\approx (2\tilde{b}_L-\tilde{a}_L-\tilde{c}_L)/3+\gamma_L\log\!\Big[G_L(\kappa)\big/\sqrt[3]{R_L(\kappa)G_L(\kappa)B_L(\kappa)}\Big]=\tilde{\alpha}_L^{G}+\gamma_L K_L^{G}(\kappa)\\ \tilde{B}_L(\kappa) &\approx (2\tilde{c}_L-\tilde{b}_L-\tilde{a}_L)/3+\gamma_L\log\!\Big[B_L(\kappa)\big/\sqrt[3]{R_L(\kappa)G_L(\kappa)B_L(\kappa)}\Big]=\tilde{\alpha}_L^{B}+\gamma_L K_L^{B}(\kappa) \end{aligned}$$

$$\begin{aligned} \tilde{R}_R(\kappa+d_\kappa) &\approx \tilde{\alpha}_R^{R}+\gamma_R K_R^{R}(\kappa+d_\kappa)\\ \tilde{G}_R(\kappa+d_\kappa) &\approx \tilde{\alpha}_R^{G}+\gamma_R K_R^{G}(\kappa+d_\kappa)\\ \tilde{B}_R(\kappa+d_\kappa) &\approx \tilde{\alpha}_R^{B}+\gamma_R K_R^{B}(\kappa+d_\kappa) \end{aligned}$$
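The log-chromaticity normalization above amounts to subtracting the mean of the three log channels from each log channel, i.e., dividing each channel by the geometric mean of (R, G, B) before taking the logarithm, which cancels the per-pixel lighting-geometry factor. A small NumPy sketch, with a clipping epsilon added only to keep the logarithm finite (not part of the paper's formulation):

```python
import numpy as np

def log_chromaticity(img, eps=1e-6):
    """Return K^C = log(C / (R*G*B)^(1/3)) for each channel C of an H x W x 3 image.

    Equivalently: each log channel minus the mean of the three log channels,
    which removes the lighting-geometry term shared by R, G, and B at every pixel.
    """
    log_rgb = np.log(np.clip(img, eps, None))        # log R, log G, log B
    mean_log = log_rgb.mean(axis=-1, keepdims=True)  # (log R + log G + log B) / 3
    return log_rgb - mean_log                        # K^R, K^G, K^B

# Example on an arbitrary positive image; the three normalized channels sum to zero by construction.
img = np.random.rand(4, 4, 3) + 0.1
K = log_chromaticity(img)
print(np.allclose(K.sum(axis=-1), 0.0))   # True
```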
$$w_L(\kappa;\varsigma)=\exp\!\left(-\frac{\big\|\tilde{I}_{L,p}^{\,C}(\kappa)-\tilde{I}_{L,p}^{\,C}(\varsigma)\big\|}{\sigma_c}\right)\exp\!\left(-\frac{\|\kappa-\varsigma\|}{\sigma_g}\right)$$
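A sketch of this bilateral-style support weight over a square window centered on κ; the σ_c and σ_g values and the L1 color distance used here are placeholders rather than the paper's exact choices.

```python
import numpy as np

def support_weights(window, sigma_c=10.0, sigma_g=17.5):
    """Support weight w(kappa; varsigma) for every pixel varsigma in a (2r+1)x(2r+1)x3 window
    centered on kappa: color-similarity term times spatial-proximity term."""
    h, w = window.shape[:2]
    cy, cx = h // 2, w // 2
    yy, xx = np.mgrid[0:h, 0:w]
    color_dist = np.abs(window - window[cy, cx]).sum(axis=-1)   # || I(kappa) - I(varsigma) || (L1)
    space_dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)       # || kappa - varsigma ||
    return np.exp(-color_dist / sigma_c) * np.exp(-space_dist / sigma_g)

# Example: weights for a 7x7 window of a random 3-channel patch; the center pixel always gets weight 1.
w = support_weights(np.random.rand(7, 7, 3))
print(w[3, 3])
```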
$$\mathrm{Cost}_{s_f,s_c}^{\,C}(\kappa;d_\kappa)=\frac{\displaystyle\sum_{\varsigma_L\in W_L(\kappa),\ \varsigma_R\in W_R(\kappa+d_\kappa)}\Big[w_L(\kappa;\varsigma_L)\,I_{L,p}^{\,C}(\kappa;\varsigma_L)\times w_R(\kappa+d_\kappa;\varsigma_R)\,I_{R,p}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big]}{\sqrt{\displaystyle\sum_{\varsigma_L\in W_L(\kappa)}\Big[w_L(\kappa;\varsigma_L)\,I_{L,p}^{\,C}(\kappa;\varsigma_L)\Big]^{2}}\times\sqrt{\displaystyle\sum_{\varsigma_R\in W_R(\kappa+d_\kappa)}\Big[w_R(\kappa+d_\kappa;\varsigma_R)\,I_{R,p}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big]^{2}}}$$

$$I_{L,p}^{\,C}(\kappa;\varsigma_L)=\tilde{I}_{L,p}^{\,C}(\kappa)-\Big(\sum_{\varsigma_L\in W_L(\kappa)}\tilde{I}_{L,p}^{\,C}(\kappa;\varsigma_L)\Big)\Big/(m\times n)$$

$$I_{R,p}^{\,C}(\kappa+d_\kappa;\varsigma_R)=\tilde{I}_{R,p}^{\,C}(\kappa+d_\kappa)-\Big(\sum_{\varsigma_R\in W_R(\kappa+d_\kappa)}\tilde{I}_{R,p}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big)\Big/(m\times n)$$

$$\mathrm{Cost}_{s_f,s_c}^{\,C}(\kappa;d_\kappa)=\frac{\displaystyle\sum_{\varsigma_L\in W_L(\kappa),\ \varsigma_R\in W_R(\kappa+d_\kappa)}\Big[w_L(\kappa;\varsigma_L)\,K_{L}^{\,C}(\kappa;\varsigma_L)\times w_R(\kappa+d_\kappa;\varsigma_R)\,K_{R}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big]}{\sqrt{\displaystyle\sum_{\varsigma_L\in W_L(\kappa)}\Big[w_L(\kappa;\varsigma_L)\,K_{L}^{\,C}(\kappa;\varsigma_L)\Big]^{2}}\times\sqrt{\displaystyle\sum_{\varsigma_R\in W_R(\kappa+d_\kappa)}\Big[w_R(\kappa+d_\kappa;\varsigma_R)\,K_{R}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big]^{2}}}$$

$$K_{L}^{\,C}(\kappa;\varsigma_L)=K_{L}^{\,C}(\kappa)-\Big(\sum_{\varsigma_L\in W(\kappa)}K_{L}^{\,C}(\kappa;\varsigma_L)\Big)\Big/(m\times n)$$

$$K_{R}^{\,C}(\kappa+d_\kappa;\varsigma_R)=K_{R}^{\,C}(\kappa+d_\kappa)-\Big(\sum_{\varsigma_R\in W(\kappa+d_\kappa)}K_{R}^{\,C}(\kappa+d_\kappa;\varsigma_R)\Big)\Big/(m\times n)$$

$$d_\kappa=\arg\min_{d_\kappa\in\Omega_d}\Big\{\alpha\,\mathrm{Cost}_{s_f,s_c}^{\,R}(\kappa;d_\kappa)+\beta\,\mathrm{Cost}_{s_f,s_c}^{\,G}(\kappa;d_\kappa)+\chi\,\mathrm{Cost}_{s_f,s_c}^{\,B}(\kappa;d_\kappa)\Big\}$$
$$\mathrm{Cost}_{s_f,s_c}^{\,C}(\kappa;d_\kappa)=\mathrm{Cost}_{s_f,s_c}^{\,C}(\kappa;d_\kappa)+\lambda\sum_{\kappa\in W}\tfrac{1}{2}\,d_\kappa\qquad(d_\kappa\in\Omega_d)$$
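Putting the cost equations together, the sketch below evaluates a weighted zero-mean NCC-style score for one pixel and one candidate disparity on the normalized channels K, combines the three channels with fixed weights standing in for α, β, χ, and selects the disparity by winner-take-all. The window size, channel weights, and the uniform support weights used in the demo are placeholders (the bilateral weights sketched earlier could be substituted), and the smoothness term of the last equation above is omitted.

```python
import numpy as np

def weighted_zncc(pl, pr, wl, wr, eps=1e-12):
    """Weighted zero-mean NCC between two equally sized single-channel windows."""
    pl = pl - pl.mean()                       # subtract the plain window mean (sum / (m*n))
    pr = pr - pr.mean()
    num = np.sum(wl * pl * wr * pr)
    den = np.sqrt(np.sum((wl * pl) ** 2)) * np.sqrt(np.sum((wr * pr) ** 2)) + eps
    return num / den

def pixel_cost(K_left, K_right, wl, wr, y, x, d, win=7, ch_weights=(1/3, 1/3, 1/3)):
    """Channel-weighted cost at pixel (y, x) for disparity d; negated so lower is better."""
    r = win // 2
    pl = K_left[y - r:y + r + 1, x - r:x + r + 1]           # left window around kappa
    pr = K_right[y - r:y + r + 1, x - d - r:x - d + r + 1]  # right window around kappa shifted by d
    score = sum(cw * weighted_zncc(pl[..., c], pr[..., c], wl, wr)
                for c, cw in enumerate(ch_weights))
    return -score                                           # arg-min over d then picks the best match

# Winner-take-all demo with uniform support weights.
K_left, K_right = np.random.rand(40, 60, 3), np.random.rand(40, 60, 3)
wl = wr = np.ones((7, 7))
costs = [pixel_cost(K_left, K_right, wl, wr, y=20, x=30, d=d) for d in range(16)]
best_d = int(np.argmin(costs))
```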
$$R_\zeta=\sqrt{\frac{1}{N_\zeta}\sum_{(x,y)}\big|d_C(x,y)-d_G(x,y)\big|^{2}}$$

$$P_\zeta=\frac{1}{N_\zeta}\sum_{(x,y)}\Big(\big|d_C(x,y)-d_G(x,y)\big|>\delta_d\Big)$$
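The two evaluation measures above, the RMS disparity error R_ζ and the percentage of bad disparities P_ζ over an evaluation region ζ, can be computed directly; the threshold used below is only a common choice, not necessarily the δ_d used in the paper.

```python
import numpy as np

def disparity_errors(d_est, d_gt, mask, delta=1.0):
    """RMS error R_zeta and bad-pixel fraction P_zeta over the region selected by `mask`.

    d_est, d_gt : estimated and ground-truth disparity maps (same shape)
    mask        : boolean array marking the evaluation region zeta
    delta       : bad-pixel threshold delta_d (assumed value)
    """
    diff = np.abs(d_est[mask] - d_gt[mask])
    rms = np.sqrt(np.mean(diff ** 2))   # R_zeta
    pbd = np.mean(diff > delta)         # P_zeta as a fraction (multiply by 100 for percent)
    return rms, pbd

# Example with synthetic maps and a full-image evaluation region.
d_gt  = np.random.randint(0, 32, size=(10, 10)).astype(float)
d_est = d_gt + np.random.randn(10, 10)
rms, pbd = disparity_errors(d_est, d_gt, mask=np.ones((10, 10), dtype=bool))
```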
