Abstract

We report on the application of multi-frame super-resolution (SR) to sampling-limited imagery that models space objects (SOs). The difficulties of multi-frame image processing of SOs include abrupt illumination changes and complex in-scene SO motion. These conditions adversely affect the accuracy of the motion estimation required for resolution enhancement. We analyze the motion estimation errors in terms of an optical flow (OF) interpolation error metric and show how object tracking accuracy depends on brightness changes and on the pixel displacement between subsequent images. Despite the inaccuracies of motion estimation, we demonstrate spatial acuity enhancement of the pixel-limited resolution of modeled SO motion imagery by applying an SR algorithm that accounts for OF errors. In addition to visual inspection, the image resolution improvement attained in the experiments is assessed quantitatively; a 1.8× resolution enhancement is demonstrated.
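The OF interpolation error metric mentioned above can be illustrated with a minimal sketch: warp the second frame back toward the first using the estimated flow field and measure the residual intensity difference. This is an illustrative implementation only, not the authors' code; the function names (`warp_with_flow`, `interpolation_error`) are hypothetical, and it assumes a dense per-pixel flow stored as (dx, dy) displacements.

```python
import numpy as np

def warp_with_flow(frame, flow):
    """Backward-warp `frame` by a dense flow field using bilinear sampling.
    flow[y, x] = (dx, dy) displacement in pixels."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Source coordinates displaced by the estimated flow, clipped to the image
    sx = np.clip(xs + flow[..., 0], 0, w - 1)
    sy = np.clip(ys + flow[..., 1], 0, h - 1)
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    x1, y1 = np.minimum(x0 + 1, w - 1), np.minimum(y0 + 1, h - 1)
    fx, fy = sx - x0, sy - y0
    # Bilinear blend of the four neighboring samples
    return (frame[y0, x0] * (1 - fx) * (1 - fy) + frame[y0, x1] * fx * (1 - fy)
            + frame[y1, x0] * (1 - fx) * fy + frame[y1, x1] * fx * fy)

def interpolation_error(frame_a, frame_b, flow_ab):
    """RMS difference between frame_a and frame_b warped back by the flow.

    A small value means the estimated flow explains the observed motion well;
    brightness changes between frames inflate it even when the flow is exact.
    """
    residual = warp_with_flow(frame_b, flow_ab) - frame_a
    return float(np.sqrt(np.mean(residual ** 2)))
```

For a perfect flow estimate and constant brightness the metric is zero; abrupt illumination changes raise it regardless of flow accuracy, which is why it serves as a proxy for motion estimation reliability here.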

© 2014 Optical Society of America


References


  1. J. Murray-Krezan, W. C. Inbody, P. D. Dao, A. Dentamaro, D. Fulcoly, and S. A. Gregory, “Algorithms for automated characterization of three-axis stabilized GEOS using non-resolved optical observations,” Advanced Maui Optical and Space Technologies Conference Technical Papers (2012).
  2. R. Hindsley, J. T. Armstrong, H. Schmitt, and E. Baines, “Small glints as an aid for imaging geosats using an optical Michelson interferometer,” J. Appl. Remote Sens. 7, 073549 (2013).
  3. H. Ding, L. Xudong, and H. Zhao, “An approach for autonomous space object identification based on normalized AMI and illumination invariant MSA,” Acta Astronaut. 84, 173–181 (2013).
  4. B. Horn and B. Schunck, “Determining optical flow,” Artif. Intell. 17, 185–203 (1981).
  5. A. Bruhn, J. Weickert, and C. Schnoerr, “Lucas/Kanade meets Horn/Schunck: combining local and global optic flow methods,” Int. J. Comput. Vis. 61, 1–21 (2005).
  6. T. Brox, A. Bruhn, N. Papenberg, and J. Weickert, “High accuracy optical flow estimation based on a theory for warping,” in Proceedings of the 8th European Conference on Computer Vision (2004), Vol. 4, pp. 25–36.
  7. D. Sun, S. Roth, and M. J. Black, “Secrets of optical flow estimation and their principles,” in Proceedings of the International Conference on Computer Vision and Pattern Recognition (2010), pp. 2432–2439.
  8. H. Zimmer, A. Bruhn, J. Weickert, L. Valgaerts, A. Salgado, B. Rosenhahn, and H.-P. Seidel, “Complementary optic flow,” Lect. Notes Comput. Sci. 5681, 207–220 (2009).
  9. S. Huang and R. Y. Tsai, “Multi-frame image restoration and registration,” Adv. Comput. Vis. Image Process. 1, 317–339 (1984).
  10. S. C. Park, M. K. Park, and M. G. Kang, “Super-resolution image reconstruction: a technical overview,” IEEE Signal Process. Mag. 20, 21–36 (2003).
  11. H. Zimmer, A. Bruhn, and J. Weickert, “Freehand HDR imaging of moving scenes with simultaneous resolution enhancement,” in Proceedings of Eurographics, Llandudno, UK, April 11–15, 2011.
  12. S. P. Belekos, N. P. Galatsanos, and A. K. Katsaggelos, “Maximum a posteriori video super-resolution using a new multichannel image prior,” IEEE Trans. Image Process. 19, 1451–1464 (2010).
  13. M. Protter, M. Elad, H. Takeda, and P. Milanfar, “Generalizing the nonlocal-means to super-resolution reconstruction,” IEEE Trans. Image Process. 18, 36–51 (2009).
  14. H. Takeda, P. Milanfar, M. Protter, and M. Elad, “Super-resolution without explicit subpixel motion estimation,” IEEE Trans. Image Process. 18, 1958–1975 (2009).
  15. D. Mitzel, T. Pock, T. Schoenemann, and D. Cremers, “Video super resolution using duality based TV-L1 optical flow,” Lect. Notes Comput. Sci. 5748, 432–441 (2009).
  16. A. V. Kanaev and C. W. Miller, “Multi-frame super-resolution algorithm for complex motion patterns,” Opt. Express 21, 19850–19866 (2013).
  17. R. C. Hardie and K. J. Barnard, “Fast super-resolution using an adaptive Wiener filter with robustness to local motion,” Opt. Express 20, 21053–21073 (2012).
  18. S. Baker, D. Scharstein, J. P. Lewis, S. Roth, M. J. Black, and R. Szeliski, “A database and evaluation methodology for optical flow,” Int. J. Comput. Vis. 92, 1–31 (2011), and references therein.
  19. H. Haussecker and H. Spies, “Motion,” in Handbook of Computer Vision and Applications, B. Jähne, H. Haussecker, and P. Geissler, eds. (Academic, 1999), Vol. 2, pp. 336–338.
  20. A. Bruhn and J. Weickert, “A confidence measure for variational optic flow methods,” in Geometric Properties for Incomplete Data (Springer, 2006), pp. 283–298.
  21. C. Kondermann, R. Mester, and C. Garbe, “A statistical confidence measure for optical flows,” Lect. Notes Comput. Sci. 5304, 290–301 (2008).
  22. J. Kybic and C. Nieuwenhuis, “Bootstrap optical flow confidence and uncertainty measure,” Comput. Vis. Image Underst. 115, 1449–1462 (2011).
  23. D. Kondermann, S. Abraham, G. Brostow, W. Förstner, S. Gehrig, A. Imiya, B. Jähne, F. Klose, M. Magnor, H. Mayer, R. Mester, T. Pajdla, R. Reulke, and H. Zimmer, “On performance analysis of optical flow algorithms,” Lect. Notes Comput. Sci. 7474, 329–355 (2012), and references therein.
  24. A. V. Kanaev and C. W. Miller, “Confidence measures of optical flow for multi-frame image reconstruction,” in Computational Optical Sensing and Imaging (COSI), 2012 OSA Technical Digest Series (Optical Society of America, 2012), postdeadline paper CW2C.2.
  25. K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Image denoising by sparse 3D transform-domain collaborative filtering,” IEEE Trans. Image Process. 16, 2080–2095 (2007).
  26. “Photography–Electronic Still-Picture Cameras–Resolution Measurements,” International Organization for Standardization (2011).
  27. K. Dabov, A. Foi, and K. Egiazarian, “Image restoration by sparse 3D transform-domain collaborative filtering,” Proc. SPIE 6812, 681207 (2008).


D. Sun, S. Roth, and M. J. Black, “Secrets of optical flow estimation and their principles,” in Proceedings of International Conference on Computer Vision & Pattern Recognition (2010), pp. 2432–2439.

Supplementary Material (1)

Media 1: AVI (5374 KB)



Figures (23)

Fig. 1. Observed brightness of a communication satellite, DTV-9s, as a function of phase angle relative to the Sun. Note the several orders of magnitude change in brightness [1].

Fig. 2. Experimental setup of the rotating metal box experiment.

Fig. 3. Image sequence of the rotating metal box. As the box rotates relative to the camera and light source, glints are observed. (Media 1)

Fig. 4. OF estimated between frames 59 and 60 (0.5 deg rotation) using the WB algorithm: (a) x component, (b) y component.

Fig. 5. OF estimated between frames 50 and 60 (0.5 deg rotation per frame) using the WB algorithm: (a) x component, (b) y component.

Fig. 6. OF estimated between frames 59 and 60 (0.5 deg rotation) using the CNL algorithm: (a) x component, (b) y component.

Fig. 7. OF estimated between frames 50 and 60 (0.5 deg rotation per frame) using the CNL algorithm: (a) x component, (b) y component.

Fig. 8. OF error estimation between frames 50 and 70: (a) EB, (b) WR.

Fig. 9. OF estimated between frames 240 and 241 (0.25 deg rotation) using the WB algorithm: (a) x component, (b) y component.

Fig. 10. OF estimated between frames 240 and 241 (0.25 deg rotation) using the CNL algorithm: (a) x component, (b) y component.

Fig. 11. OF estimated between frames 240 and 244 (0.25 deg rotation per frame) using the WB algorithm: (a) x component, (b) y component.

Fig. 12. OF estimated between frames 240 and 244 (0.25 deg rotation per frame) using the CNL algorithm: (a) x component, (b) y component.

Fig. 13. OF error estimation between frames 230 and 250, numbered 1 to 21: (a) EB, (b) WR.

Fig. 14. 4X interpolated frame 60.

Fig. 15. 4X super-resolved frame 60 (50 iterations) using frames 56–64 with (a) DUDE processing with the WB OF algorithm, (b) DUDE processing with two-way WB OF computation, (c) DUDE processing with CNL OF, and (d) DUDE processing with two-way CNL OF computation.

Fig. 16. 4X super-resolved frame 60 [50 iterations, except for (b), with 30 iterations for the large image and 50 iterations for the close-up image] using frames 53–67 with (a) DUDE processing with WB OF, (b) DUDE processing with two-way WB OF computation, (c) DUDE processing with CNL OF computation, and (d) DUDE processing with two-way CNL OF computation.

Fig. 17. 4X super-resolved frame 60 using frames 50–70 with (a) DUDE processing with the WB OF algorithm (30 iterations for the large image, 50 iterations for the close-up image), (b) DUDE processing with two-way WB OF computation (30 iterations for the large image, 50 iterations for the close-up image), (c) DUDE processing with CNL OF computation (50 iterations), and (d) DUDE processing with two-way CNL OF computation (50 iterations).

Fig. 18. 4X super-resolved frame 60 using the NLS SR algorithm with frames (a) 56–64, (b) 53–67, and (c) 50–70.

Fig. 19. (a) 4X interpolated frame 240. (b) 4X super-resolved frame 240 using frames 230–250 with the NLS algorithm. DUDE processing with two-way (c) WB and (d) CNL OF computation (50 iterations).

Fig. 20. Spatial frequency spectrum for the original image (solid purple line), the original image interpolated to a ¼-pixel grid (dashed green line), enhancement with the SRWB algorithm applied to 21 frames of data with 50 iterations (red circles), enhancement with the SRCNL algorithm applied to 21 frames of data with 30 iterations (blue triangles), and NLS results with a similarity block size of 7 pixels and σ=5.9 (black dots). A black dashed line indicates the camera Nyquist frequency.

Fig. 21. Spatial frequency spectra for the SRWB algorithm with varied iteration counts: 50 (red circles), 40 (magenta squares), and 30 (blue triangles). A black dashed line indicates the Nyquist frequency.

Fig. 22. Spatial frequency spectra for the SRCNL algorithm with varied iteration counts: 50 (red circles), 40 (magenta squares), and 30 (blue triangles). A black dashed line indicates the Nyquist frequency.

Fig. 23. Spatial frequency spectra for the SRWB and SRCNL algorithms. For each result, 11 frames are used with 2° (blue circles) and 0.5° (purple diamonds) frame-to-frame rotation, for total rotations of 20° and 5°, respectively.

Equations (12)


$$I_1(\mathbf{r}) = I_2(\mathbf{r} + \mathbf{u}).$$
$$E(\mathbf{u}) = \left\| I_1(\mathbf{r}) - I_2(\mathbf{r}+\mathbf{u}) \right\| + \gamma \left\| \nabla I_1(\mathbf{r}) - \nabla I_2(\mathbf{r}+\mathbf{u}) \right\| + \lambda\, R(I_1, I_2, \mathbf{u}),$$

$$E_{\mathrm{WB}}(\mathbf{u}) = \left\| I_1(\mathbf{r}) - I_2(\mathbf{r}+\mathbf{u}) \right\|_{L1} + \gamma \left\| \nabla I_1(\mathbf{r}) - \nabla I_2(\mathbf{r}+\mathbf{u}) \right\|_{L1} + \alpha \left\| \nabla \mathbf{u} \right\|_{L1},$$

$$E_{\mathrm{CNL}}(\mathbf{u}) = \rho_{0.45}\!\left( I_1(\mathbf{r}) - I_2(\mathbf{r}+\mathbf{u}) \right) + \theta\, \rho_{0.45}(\nabla \mathbf{u}) + \vartheta \left\| \mathbf{u} - \hat{\mathbf{u}} \right\|_{L2} + \varphi \sum_{\mathbf{r}} \sum_{\mathbf{r}' \in N_{\mathbf{r}}} w_{\mathbf{r},\mathbf{r}'} \left| \hat{\mathbf{u}}_{\mathbf{r}} - \hat{\mathbf{u}}_{\mathbf{r}'} \right|,$$

$$w_{\mathbf{r},\mathbf{r}'} \propto \exp\!\left[ -\frac{\left| \mathbf{r} - \mathbf{r}' \right|^2}{2\sigma_1^2} - \frac{\left( I(\mathbf{r}) - I(\mathbf{r}') \right)^2}{2\sigma_2^2} \right] \frac{o(\mathbf{r}')}{o(\mathbf{r})},$$
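The bilateral-style weighting of the nonlocal term can be made concrete with a short Python sketch. It computes normalized weights over a square neighborhood, falling off with spatial distance (sigma1) and intensity difference (sigma2); the occlusion factor o(r')/o(r) is omitted here (assumed 1), and the function name and parameter defaults are illustrative rather than the paper's settings.

```python
import numpy as np

def nonlocal_weights(img, r, neighborhood=2, sigma1=7.0, sigma2=7.0):
    """Bilateral-style weights w_{r,r'} over a square neighborhood of pixel r.

    Weights decay with spatial distance (sigma1) and intensity difference
    (sigma2), then are normalized to sum to 1.  The occlusion ratio from the
    full model is omitted (assumed 1 everywhere).
    """
    y, x = r
    h, w = img.shape
    weights = {}
    for dy in range(-neighborhood, neighborhood + 1):
        for dx in range(-neighborhood, neighborhood + 1):
            yy, xx = y + dy, x + dx
            if 0 <= yy < h and 0 <= xx < w:
                spatial = (dy * dy + dx * dx) / (2.0 * sigma1 ** 2)
                photometric = (img[y, x] - img[yy, xx]) ** 2 / (2.0 * sigma2 ** 2)
                weights[(yy, xx)] = np.exp(-spatial - photometric)
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}
```

On a uniform image the photometric term vanishes, so the weight at r itself is the largest and the map reduces to a pure spatial Gaussian.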
$$EE = \sqrt{ \left( u_x - u_x^G \right)^2 + \left( u_y - u_y^G \right)^2 },$$

$$AE = \frac{180}{\pi} \arccos \frac{ u_x u_x^G + u_y u_y^G + 1 }{ \sqrt{ \left( u_x^2 + u_y^2 + 1 \right) \left( (u_x^G)^2 + (u_y^G)^2 + 1 \right) } },$$

$$IE = \sqrt{ \overline{ \left[ I^G - I(\mathbf{r} + \mathbf{u}\,t) \right]^2 } },$$
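For concreteness, the endpoint error EE and angular error AE above can be evaluated with a brief NumPy sketch (function names are illustrative; the interpolation error IE is omitted since it requires the warped ground-truth image):

```python
import numpy as np

def endpoint_error(u, v, ug, vg):
    """Per-pixel endpoint error between estimated flow (u, v)
    and ground-truth flow (ug, vg)."""
    return np.sqrt((u - ug) ** 2 + (v - vg) ** 2)

def angular_error_deg(u, v, ug, vg):
    """Per-pixel angular error in degrees, using the standard
    homogeneous flow vectors (u, v, 1) and (ug, vg, 1)."""
    num = u * ug + v * vg + 1.0
    den = np.sqrt((u ** 2 + v ** 2 + 1.0) * (ug ** 2 + vg ** 2 + 1.0))
    # Clip guards against arccos arguments slightly outside [-1, 1]
    # caused by floating-point rounding.
    return np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
```

Both functions broadcast over whole flow fields, so averaging the returned arrays gives the scene-level EE and AE statistics.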
$$\psi_{wr} = \left[ I_1(\mathbf{r}) - I_2(\mathbf{r} + \mathbf{u}) \right]^2.$$
$$I_L^n = D B W_n I_H + e_n, \qquad n = 1, \ldots, N,$$

$$I_H^{(k+1)} = I_H^{(k)} + \Delta t \left( \sum_{n=1}^{N} U_n^{\mathrm{ref}} \left( W_n^{\mathrm{ref}} \right)^T B^T D^T U_{\mathrm{ref}}^{n} \left( I_L^n - D B W_{\mathrm{ref}}^{n} I_H^{(k)} \right) + \lambda\, \mathrm{div}\!\left( \left[ \Psi\!\left( \left( \nu_1^T \nabla I_H^{(k)} \right)^2 \right) \nu_1 \nu_1^T + \nu_2 \nu_2^T \right] \nabla I_H^{(k)} \right) \right),$$

$$U = G_\sigma * \exp\!\left( -\frac{\psi_{wr}}{a} \right).$$
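A minimal sketch of the warping-residual confidence map, assuming the OF confidence weight U is the Gaussian-smoothed exponential of the squared warping residual as in the last two equations; the sigma and a values and the function name are illustrative, not the paper's settings:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def confidence_map(I1, I2, u, v, sigma=1.5, a=100.0):
    """Confidence U = G_sigma * exp(-psi_wr / a).

    psi_wr is the squared residual after warping I2 back to I1 by the
    estimated flow (u horizontal, v vertical); larger residuals yield
    lower confidence, and Gaussian smoothing regularizes the map.
    """
    h, w = I1.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    # Bilinear warp of I2 along the estimated flow.
    I2_warped = map_coordinates(I2, [yy + v, xx + u], order=1, mode='nearest')
    psi_wr = (I1 - I2_warped) ** 2
    return gaussian_filter(np.exp(-psi_wr / a), sigma)
```

With identical frames and zero flow the residual vanishes, so the map is 1 everywhere; pixels where the flow fails to explain the brightness change are down-weighted toward 0.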
