Abstract

In an underwater imaging system, a perspective camera is often placed outside a tank or in a waterproof housing with a flat glass window. Refraction occurs when a light ray passes through the water–glass and glass–air interfaces, rendering the conventional multiple-view geometry based on the single-viewpoint (SVP) camera model invalid. While most recent underwater vision studies focus on the challenging topic of calibrating such systems, no previous work has systematically studied the influence of refraction on underwater three-dimensional (3D) reconstruction. This paper demonstrates the possibility of using the SVP camera model in underwater 3D reconstruction through a theoretical analysis of refractive distortion and through simulations. The performance of the SVP camera model in multiview underwater 3D reconstruction is then evaluated quantitatively. The experimental results reveal a surprising, useful, yet overlooked fact: the SVP camera model with radial distortion correction and focal-length adjustment can compensate for refraction and achieve high accuracy in multiview underwater 3D reconstruction (within 0.7 mm for an object of dimension 200 mm), comparable to the results of land-based systems. This observation justifies the use of the SVP camera model for reconstructing reliable 3D scenes in underwater applications. Our results can be used to guide the selection of system parameters in the design of an underwater 3D imaging setup.

© 2012 Optical Society of America


References

  1. R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision, 2nd ed. (Cambridge University, 2004).
  2. N. Snavely, S. M. Seitz, and R. Szeliski, “Modeling the world from internet photo collections,” Int. J. Comput. Vis. 80, 189–210 (2008).
  3. V. Chari and P. Sturm, “Multiple-view geometry of the refractive plane,” in Proceedings of British Machine Vision Conference (BMVA, 2009).
  4. Y. Chang and T. Chen, “Multi-view 3D reconstruction for scenes under the refractive plane with known vertical direction,” in Proceedings of International Conference on Computer Vision (IEEE, 2011).
  5. P. C. Y. Chang, J. C. Flitton, K. I. Hopcraft, E. Jakeman, D. L. Jordan, and J. G. Walker, “Improving visibility depth in passive underwater imaging by use of polarization,” Appl. Opt. 42, 2794–2803 (2003).
  6. W. Hou, S. Woods, E. Jarosz, W. Goode, and A. Weidemann, “Optical turbulence on underwater image degradation in natural environments,” Appl. Opt. 51, 2678–2686 (2012).
  7. T. Treibitz, Y. Y. Schechner, C. Kunz, and H. Singh, “Flat refractive geometry,” IEEE Trans. Pattern Anal. Mach. Intell. 34, 51–65 (2012).
  8. P. Sturm and J. P. Barreto, “General imaging geometry for central catadioptric cameras,” in Proceedings of European Conference on Computer Vision (Springer, 2008), pp. 609–622.
  9. Z. Kukelova, M. Bujnak, and T. Pajdla, “Closed-form solutions to minimal absolute pose problems with known vertical direction,” in Proceedings of Asian Conference on Computer Vision (Springer, 2010), pp. 216–229.
  10. A. Sedlazeck and R. Koch, “Calibration of housing parameters for underwater stereo-camera rigs,” in Proceedings of British Machine Vision Conference (BMVA, 2011), paper 118.
  11. G. Telem and S. Filin, “Photogrammetric modeling of underwater environments,” ISPRS J. Photogramm. Remote Sens. 65, 433–444 (2010).
  12. A. Agrawal, S. Ramalingam, Y. Taguchi, and V. Chari, “A theory of multi-layer flat refractive geometry,” in IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2012), pp. 3346–3353.
  13. S. Ramalingam, P. Sturm, and S. K. Lodha, “Theory and calibration algorithms for axial cameras,” in Proceedings of Asian Conference on Computer Vision (Springer, 2006), pp. 704–713.
  14. R. Swaminathan, M. D. Grossberg, and S. K. Nayar, “A perspective on distortions,” in IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2003), pp. 594–601.
  15. Y. Swirski, Y. Y. Schechner, B. Herzberg, and S. Negahdaripour, “CauStereo: range from light in nature,” Appl. Opt. 50, F89–F101 (2011).
  16. L. Bartolini, L. De Dominicis, M. Ferri de Collibus, G. Fornetti, M. Guarneri, E. Paglia, C. Poggi, and R. Ricci, “Underwater three-dimensional imaging with an amplitude-modulated laser radar at a 405 nm wavelength,” Appl. Opt. 44, 7130–7135 (2005).
  17. C. Kunz and H. Singh, “Hemispherical refraction and camera calibration in underwater vision,” in Proceedings of MTS/IEEE Oceans (IEEE, 2008), pp. 1–7.
  18. S. M. Seitz, B. Curless, J. Diebel, D. Scharstein, and R. Szeliski, “A comparison and evaluation of multi-view stereo reconstruction algorithms,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2006), pp. 519–528.
  19. G. Glaeser and H.-P. Schrocker, “Reflections on refractions,” J. Geom. Graph. 4, 1–18 (2000).
  20. B. Triggs, P. F. Mclauchlan, R. I. Hartley, and A. W. Fitzgibbon, “Bundle adjustment—a modern synthesis,” in Proceedings of the International Workshop on Vision Algorithms: Theory and Practice (Springer, 1999), pp. 153–177.
  21. http://www.povray.org.
  22. http://graphics.stanford.edu/data.
  23. Y. Furukawa and J. Ponce, “Accurate, dense, and robust multiview stereopsis,” IEEE Trans. Pattern Anal. Mach. Intell. 32, 1362–1376 (2010).
  24. C. Strecha, W. von Hansen, L. Van Gool, P. Fua, and U. Thoennessen, “On benchmarking camera calibration and multi-view stereo for high resolution imagery,” in Proceedings of IEEE Conference on Computer Vision and Pattern Recognition (IEEE, 2008), pp. 1–8.
  25. P. J. Besl and N. D. McKay, “A method for registration of 3-D shapes,” IEEE Trans. Pattern Anal. Mach. Intell. 14, 239–256 (1992).
  26. D. G. Lowe, “Distinctive image features from scale-invariant keypoints,” Int. J. Comput. Vis. 60, 91–110 (2004).
  27. M. Muja and D. G. Lowe, “Fast approximate nearest neighbors with automatic algorithm configuration,” in Proceedings of International Conference on Computer Vision Theory and Applications (INSTICC, 2009).
  28. B. K. P. Horn, “Closed-form solution of absolute orientation using unit quaternions,” J. Opt. Soc. Am. A 4, 629–642 (1987).
  29. D. Chetverikov, D. Svirko, D. Stepanov, and P. Krsek, “The trimmed iterative closest point algorithm,” in Proceedings of International Conference on Pattern Recognition (IEEE, 2002), Vol. 3, pp. 545–548.




Figures (9)

Fig. 1.

(a) An example showing the refraction effect: an image of a checkerboard submerged in a water-filled tank, captured by a camera placed in front of the tank's flat glass interface. Straight lines are bent in the image due to the refraction of light. (b) 2D illustration of a general underwater imaging system: a camera in a waterproof housing with a flat glass interface observing a straight line. See Table 1 for a detailed description of the notation.

Fig. 2.

(a) Refractive distortion versus image coordinate under different system configurations. (b) Refractive distortion (denoted as original E1) versus image coordinate under a general configuration (da=100, dg=10, dw=1000, θc=10°, θo=30°), together with approximations to the refractive distortion using two methods based on the SVP camera model. For all imaging setups used in this figure, the focal length f is 800 pixels, and da, dg, and dw are measured in pixels. See text for details.

Fig. 3.

Image acquisition setups used in our study. See text for details.

Fig. 4.

Sample images of real objects captured in air. The sizes (L×W×H, in mm) are: (a) shell, 165×108×95; (b) well, 229×152×165; and (c) arch, 178×76×165.

Fig. 5.

Visual comparison of images captured under different system configurations. (a) Image captured in air. (b) Underwater image with (da=1000, dg=100, θc=0°). (c) Overlay of the two images in (a) and (b). Overlays of the image in (b) with the underwater images captured with (da=10,000, dg=100, θc=0°), (da=1000, dg=5000, θc=0°), and (da=1000, dg=100, θc=30°) are shown in (d), (e), and (f), respectively.

Fig. 6.

Quantitative evaluation of underwater 3D reconstruction using the synthetic datasets listed in Table 2. (a)–(c) Results on dataset SC1; (d)–(f) results on SC2; (g)–(i) results on SC3; (j)–(l) results on SC4. See text for details.

Fig. 7.

Comparison of 3D reconstruction using different methods. The results of NRF and RDist+FAdj on dataset cf(16/1000/100/0°) are shown in the first and second rows, respectively; the results of FAdj and RDist+FAdj on dataset cf(32/1000/100/15°) are shown in the third and last rows, respectively. From left to right: the initial pose of the reconstructed point cloud (green) with respect to the ground-truth model (blue); a side view of the reconstructed point cloud and the ground-truth model after applying the S2D point-cloud alignment algorithm; a top view of the same; the mesh generated from underwater 3D reconstruction without texture mapping; the texture-mapped mesh generated from underwater 3D reconstruction. (Regions of interest are highlighted.)

Fig. 8.

Distribution of 3D errors for reconstruction on real underwater images. The 3D error of a reconstructed 3D point is defined by Eq. (23); errors larger than 1% of the size of the ground-truth model are collected in the last bin.

Fig. 9.

Comparison of 3D reconstruction using different methods. The first four rows show the results of RDist+FAdj on the datasets shell_Ring, well_Ring, arch_Ring, and arch_Sparse, respectively. From left to right: a sample underwater image from the dataset; the initial pose of the reconstructed point cloud (green) with respect to the ground-truth model (blue); a side view of the reconstructed point cloud and the ground-truth model after applying the new S2D point-cloud alignment algorithm; the mesh generated from underwater 3D reconstruction without texture mapping; the texture-mapped mesh generated from underwater 3D reconstruction. The last row shows the results of NRF on dataset well_Ring: the initial pose of the reconstructed point cloud; two images of the point-cloud alignment; two images of the generated mesh. (Regions of interest are highlighted.)

Tables (4)


Table 1. Notations Used in the Underwater Imaging System Shown in Fig. 1(b)


Table 2. Synthetic Datasets and System Configurations (da, dg are measured in pixels)


Table 3. Real Datasets and the System Configurations (da and dw are measured in millimeters)


Table 4. Performance of the SVP Camera Model on Real Images

Equations (26)

Equations on this page are rendered with MathJax.

\[
\begin{cases}
r_g = d_a \tan\!\left(\theta_c + \arctan\dfrac{u_a}{f}\right)\\
z_g = d_a.
\end{cases}
\tag{1}
\]

\[
n_a \sin\theta_a = n_g \sin\theta_g,
\tag{2}
\]

\[
\begin{cases}
\sin\theta_a = \dfrac{r_g}{\sqrt{r_g^2 + d_a^2}}\\
\sin\theta_g = \dfrac{r_w - r_g}{\sqrt{(r_w - r_g)^2 + d_g^2}}.
\end{cases}
\tag{3}
\]

\[
\begin{cases}
r_w = r_g + \dfrac{n_a r_g d_g}{\sqrt{d_a^2 n_g^2 - n_a^2 r_g^2 + n_g^2 r_g^2}}\\
z_w = d_a + d_g.
\end{cases}
\tag{4}
\]

\[
\begin{cases}
n_g \sin\theta_g = n_w \sin\theta_w\\
\sin\theta_w = \dfrac{r_o - r_w}{\sqrt{(r_o - r_w)^2 + (d_w - r_o \sin\theta_o)^2}},
\end{cases}
\tag{5}
\]

\[
\begin{cases}
r_o = A_o\\
z_o = d_a + d_g + d_w - A_o \sin\theta_o,
\end{cases}
\tag{6}
\]

\[
r_o = \frac{r_w B^2 + n_g (d_w - r_w \sin\theta_o)(r_w - r_g) B - d_w n_g^2 (r_w - r_g)^2 \sin\theta_o}{B^2 - n_g^2 (r_w - r_g)^2 \sin^2\theta_o},
\tag{7}
\]

\[
B = \sqrt{d_g^2 n_w^2 + (n_w^2 - n_g^2)(r_g - r_w)^2}.
\tag{8}
\]
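The ray-tracing equations above can be simulated directly. The following Python sketch (our own illustrative code, not the paper's implementation; the function name and the default refractive indices n_a = 1.0, n_g = 1.5, n_w = 1.33 are assumptions) traces one image ray through the air, glass, and water layers via Snell's law:

```python
import math

def trace_flat_port(u_a, f, d_a, d_g, d_w, theta_c=0.0,
                    n_a=1.0, n_g=1.5, n_w=1.33):
    """Trace one image ray through air -> glass -> water (2D sketch).

    Returns (r_g, r_w, theta_w): the radial ray coordinates on the glass
    and water interfaces and the ray angle in water.  Variable names
    follow the equations above; the function itself is illustrative.
    """
    # Ray angle in air, tilted by the camera/interface angle theta_c
    theta_a = theta_c + math.atan(u_a / f)
    r_g = d_a * math.tan(theta_a)
    # Snell's law at the air-glass interface
    sin_g = n_a * math.sin(theta_a) / n_g
    theta_g = math.asin(sin_g)
    r_w = r_g + d_g * math.tan(theta_g)
    # Snell's law at the glass-water interface
    sin_w = n_g * sin_g / n_w
    theta_w = math.asin(sin_w)
    return r_g, r_w, theta_w
```

With u_a = 0 and θ_c = 0 the ray passes undeviated; the closed-form expression for r_w above provides an independent check of the traced values.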
\[
u_p = p(x_o) = f \tan\!\left(\arctan\frac{r_o}{z_o} - \theta_c\right),
\tag{9}
\]

\[
E_1(u_a, \Theta_c, \Theta_o) = |u_a - u_p|.
\tag{10}
\]

\[
u_a = p(u_p) = u_p\left(1 + k_1\Big(\frac{u_p}{f}\Big)^2 + k_2\Big(\frac{u_p}{f}\Big)^4\right),
\tag{11}
\]

\[
k_2 u_p^5 + k_1 f^2 u_p^3 + f^4 u_p - f^4 u_a = 0.
\tag{12}
\]

\[
E(u_a, \Theta_s) = |u_a - u_p|.
\tag{13}
\]
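Recovering the undistorted coordinate u_p from an observed u_a means finding the real root of the quintic above; for k_1, k_2 ≥ 0 the polynomial is strictly increasing, so that root is unique. A minimal sketch using numpy's root finder (the helper name is our own, not from the paper):

```python
import numpy as np

def undistort_radial(u_a, f, k1, k2):
    """Invert u_a = u_p (1 + k1 (u_p/f)^2 + k2 (u_p/f)^4) for u_p.

    Solves k2*u_p^5 + k1*f^2*u_p^3 + f^4*u_p - f^4*u_a = 0 and returns
    the (unique, for k1, k2 >= 0) real root.  Illustrative sketch only.
    """
    coeffs = [k2, 0.0, k1 * f**2, 0.0, f**4, -f**4 * u_a]
    roots = np.roots(coeffs)
    # Keep roots whose imaginary part is negligible relative to magnitude
    real = roots[np.abs(roots.imag) < 1e-6 * np.maximum(1.0, np.abs(roots))].real
    return real[np.argmin(np.abs(real - u_a))]
```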
\[
\mathop{\arg\min}_{\{\Theta_s\}} \sum_{1 \le i \le M} \bigl| E_1(u_a^i, \Theta_c, \Theta_o) - E(u_a^i, \Theta_s) \bigr|^2.
\tag{14}
\]

\[
\mathop{\arg\min}_{\{\Theta_s, T_2\}} \sum_{1 \le i \le M} \bigl| E_2(u_a^i, \Theta_c, \Theta_o) - E(u_a^i, \Theta_s) \bigr|^2,
\tag{15}
\]

\[
E_2(u_a^i, \Theta_c, \Theta_o) = \bigl| u_a^i - p(T_2(x_o^i)) \bigr|
\tag{16}
\]

\[
T_2(x_o) = s R_2 x_o + t_2,
\tag{17}
\]

\[
\mathop{\arg\min}_{\{\Theta_s, y_o\}} \sum_{1 \le i \le M} \bigl| E_3(u_a^i, x_o^i) - E(u_a^i, \Theta_s) \bigr|^2,
\tag{18}
\]

\[
E_3(u_a^i, x_o^i) = \bigl| u_a^i - p(x_o^i) \bigr|.
\tag{19}
\]
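The idea behind these objectives, absorbing the refractive distortion into SVP parameters, can be illustrated with a simplified linear stand-in: in normalized coordinates x = u/f, fit x_obs ≈ c1 x + c3 x³ + c5 x⁵, where c1 plays the role of a focal-length adjustment and c3, c5 of radial-distortion coefficients. This is our own sketch, not the paper's nonlinear optimization over Θ_s:

```python
import numpy as np

def fit_svp_compensation(x_ideal, x_observed):
    """Least-squares fit of x_obs ~= c1*x + c3*x^3 + c5*x^5.

    x_ideal, x_observed: normalized image coordinates (u/f) of the ideal
    SVP projection and the refracted observation.  A linear stand-in for
    the nonlinear fit over the SVP parameters; names are ours.
    """
    x = np.asarray(x_ideal, dtype=float)
    # Design matrix of odd powers: refraction is a radially symmetric,
    # odd function of the image coordinate.
    A = np.stack([x, x**3, x**5], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(x_observed, dtype=float),
                                 rcond=None)
    return coeffs  # (c1, c3, c5)
```

Working in normalized coordinates keeps the design matrix well conditioned; fitting raw pixel coordinates with fifth powers would not be.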
\[
\mathrm{Desc}(X) = \frac{1}{N_t}\sum_{i=1}^{N_t} \mathrm{Desc}(x_i),
\tag{20}
\]

\[
T_3(Y) = s R_3 Y + t_3,
\tag{21}
\]

\[
\mathrm{Eff}(M, D) = \frac{|D'|}{|D|},
\tag{22}
\]

\[
d(Y_i, M) = \min_{1 \le j \le |M|} \bigl\| T(Y_i) - X_j \bigr\|_2
\tag{23}
\]

\[
\mathrm{Comp}(M, D) = \frac{|M'|}{|M|},
\tag{24}
\]

\[
M' = \bigl\{ X_k \;\big|\; 1 \le k \le |M|,\; d(X_k, D) < \delta \cdot \mathrm{BB}(M) \bigr\}.
\tag{25}
\]

\[
\mathrm{Acc}(M, D) = \frac{1}{|D|}\sum_{i=1}^{|D|} d(D_i, M)^2,
\tag{26}
\]
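The accuracy, completeness, and efficiency measures above can be sketched with brute-force nearest-neighbour queries. This is our own illustrative helper for small clouds; the paper's pipeline uses descriptor matching and S2D alignment, which it omits, and it assumes the clouds are already aligned:

```python
import numpy as np

def point_cloud_metrics(M, D, delta_bb):
    """Compare a reconstructed cloud D against a ground-truth cloud M.

    M, D: (N, 3) arrays of aligned 3D points.  delta_bb: the distance
    threshold (delta times the bounding-box size in the paper's
    notation).  Brute-force nearest neighbours; illustrative only.
    """
    M = np.asarray(M, dtype=float)
    D = np.asarray(D, dtype=float)
    # Pairwise distances: dists[i, j] = ||D[i] - M[j]||
    dists = np.linalg.norm(D[:, None, :] - M[None, :, :], axis=2)
    d_DM = dists.min(axis=1)   # each reconstructed point -> model
    d_MD = dists.min(axis=0)   # each model point -> reconstruction
    acc = float(np.mean(d_DM**2))            # mean squared distance
    comp = float(np.mean(d_MD < delta_bb))   # fraction of model covered
    eff = float(np.mean(d_DM < delta_bb))    # fraction of D near model
    return acc, comp, eff
```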
