Abstract

Silhouettes arise in a variety of imaging scenarios. Pristine silhouettes are often degraded by blurring, detector sampling, and detector noise. We present a maximum a posteriori estimator for the restoration of parameterized facial silhouettes. Extreme dealiasing and dramatic superresolution, well beyond the diffraction limit, are demonstrated through the use of strong prior knowledge.

© 2014 Optical Society of America

Figures (14)

Fig. 1.

Example silhouettes. (a) Classic facial silhouette constructed as a paper cutout. (b) Silhouette created by casting a shadow onto an approximately uniform background. Note the information in the shadow that is not evident in the direct image of the object.

Fig. 2.

Silhouettes formed by relatively dark objects occluding a bright background. (a) Solar transit of the Space Shuttle and the International Space Station, image courtesy of Thierry Legault, http://www.astrophoto.fr. (b) Aircraft occluding the moon, image courtesy of Kurt Gleichman.

Fig. 3.

Multiple frames of silhouettes of a walker occluding a background. Such silhouettes can be used for gait recognition and also enable image compression or compressive sensing.

Fig. 4.

Steps in parameterizing a facial silhouette from a gray-level profile image.

Fig. 5.

Principal-component spectrum for parameterized facial silhouettes, normalized to indicate proportion of variance. The red line indicates a power-law fit to the spectrum.
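For readers who want to reproduce this kind of spectrum, the following is a minimal sketch (not the authors' code). It assumes the training silhouettes have already been parameterized as fixed-length coordinate vectors; the eigenvalues of their sample covariance are normalized to a proportion-of-variance spectrum, and a power law is fit by least squares in log-log space. The array shapes and the synthetic example data are illustrative assumptions.

```python
# Sketch of the analysis behind a principal-component spectrum such as Fig. 5
# (assumed workflow, not the authors' code): eigen-decompose the sample
# covariance of the silhouette parameter vectors and fit a power law.
import numpy as np

def pca_spectrum(alphas):
    """alphas: (n_faces, n_params) array of parameterized silhouettes."""
    centered = alphas - alphas.mean(axis=0)
    cov = centered.T @ centered / (alphas.shape[0] - 1)   # sample covariance K_alpha
    eigvals = np.linalg.eigvalsh(cov)[::-1]               # eigenvalues, descending
    return eigvals / eigvals.sum()                        # proportion of variance

def power_law_fit(spectrum):
    """Least-squares fit of spectrum[k-1] ~ c * k**(-p) in log-log space."""
    k = np.arange(1, spectrum.size + 1)
    slope, log_c = np.polyfit(np.log(k), np.log(spectrum), 1)
    return np.exp(log_c), -slope                          # (c, exponent p)

# Illustration with synthetic parameter vectors (placeholder for real data):
rng = np.random.default_rng(0)
alphas = rng.standard_normal((200, 60)) / np.arange(1, 61)
c, p = power_law_fit(pca_spectrum(alphas))
print(f"fitted power-law exponent: {p:.2f}")
```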

Fig. 6.

Mechanisms of degradation in silhouette imaging. From left to right: the ideal continuous silhouette of the average face represented on a 512×1024 grid; the silhouette blurred by the solar extent for a shadow cast 1.5 m from the face; the 32×64 discrete array resulting from detector sampling; and the same array with additive read noise.
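As a concrete illustration of this degradation chain, here is a minimal sketch under stand-in operators: a Gaussian blur in place of the solar-extent blur, 16×16 block averaging to go from the 512×1024 representation grid to a 32×64 detector array, and additive Gaussian read noise. The function and parameter names are hypothetical, not the paper's.

```python
# Minimal sketch of the degradation chain in Fig. 6 (stand-in operators, not
# the paper's exact blur model): blur, detector aggregation, read noise.
import numpy as np
from scipy.ndimage import gaussian_filter

def degrade(silhouette, blur_sigma=8.0, block=16, read_noise_sigma=0.05, rng=None):
    """silhouette: binary array on the representation grid, e.g. 512x1024."""
    rng = np.random.default_rng(0) if rng is None else rng
    blurred = gaussian_filter(silhouette.astype(float), blur_sigma)  # blur stand-in
    h, w = blurred.shape
    # Detector sampling: average over block x block pixel aggregates.
    sampled = blurred.reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    return sampled + rng.normal(0.0, read_noise_sigma, sampled.shape)  # read noise

# Illustration: an elliptical stand-in object on a 512x1024 grid becomes a
# 32x64 noisy data array.
y, x = np.mgrid[:512, :1024]
truth = (((y - 256) / 200.0) ** 2 + ((x - 512) / 350.0) ** 2 < 1.0).astype(float)
print(degrade(truth).shape)   # (32, 64)
```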

Fig. 7.

Maximum-likelihood (ML) estimation performance for CNR=10. The difference image indicates that the estimate is excellent and the NASD is 1.42 representation-grid pixels. For reference, a single representation pixel is approximately 0.25 mm for an average face size.

Fig. 8.

ML estimation of silhouette for CNR=3.16 and 1.73. In these low-CNR regimes, the ML estimates develop caricature features that do not match the true silhouette.

Fig. 9.

Maximum-likelihood (ML) and maximum a posteriori (MAP) estimates of facial silhouettes for three CNR levels. The data associated with these CNR levels are shown in Figs. 7 and 8. The true silhouette and the average facial silhouette are also provided for comparison.

Fig. 10.

Demonstration of practical MAP estimation for modest misregistration when the full field of view (FOV) is used. The central panel shows CNR=10 data misregistered such that the first (below-chin) and last (hairline) landmark points are translated by (2.0, 1.5) and (0.3, 0.7) detector pixels, respectively. The joint estimate is significantly better than the naïve estimate (misregistration ignored) and comparable to the estimate made with registered data. The yellow borders represent the FOV and the ROI over which the NASD (in representation pixels) was computed for the data and the difference images, respectively.

Fig. 11.

Estimation as the FOV varies, with no registration errors. The upper row shows the data (CNR=10) and the varying FOVs. The lower row shows the corresponding error images and the NASD values (in representation pixels), computed over a common ROI equivalent to the fourth-largest FOV. The estimates degrade very slowly with decreasing FOV until the FOV becomes smaller than the ROI, after which estimation accuracy degrades rapidly.

Fig. 12.

Practical MAP estimation with dramatic misregistration and reduced FOV. The true silhouette and corresponding data have been misregistered (translated, rotated, and scaled) relative to the model by translating the first and last landmark points, as shown. The rectangular box indicates the FOV over which the data were used for estimation. The estimated facial silhouette is relatively accurate within the region of the FOV, with an NASD of 2.14 representation pixels.

Fig. 13.

Demonstration of extreme dealiasing enabled by strong prior knowledge. The crisp silhouette is constructed on a 256×512 representation grid, and the corresponding Fourier-domain representation (magnitude of the DFT, displayed with log transformation and scaling, common for all panels, selected to emphasize relevant features) shows that object spatial frequencies are nonzero out to the edge of the array. Blur from a circular disk imposes a sombrero-function modulation on the Fourier representation. Additional blur from the 8×8 aggregation kernel imposes an additional 2D sinc modulation. The bold squares (in blue) represent the Nyquist frequency for the undersampled data, equal to 1/8th the Nyquist frequency for the crisp silhouette. The Fourier representation of the undersampled data (enlarged in the figure by a factor of 8) is a mixture of many overlapping Fourier components. The silhouette estimated from the noiseless undersampled data is virtually indistinguishable from the true (crisp) silhouette. When noise is added, silhouette estimation degrades gracefully.
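The aliasing bookkeeping in this caption can be checked numerically. The sketch below uses an assumed disk radius and a synthetic elliptical object (not the paper's data): a fine-grid object is blurred by a circular disk (sombrero-like modulation of its spectrum), smoothed by an 8×8 aggregation filter (2D sinc modulation), and decimated by 8 in each direction; the DFT of the decimated data then equals the modulated fine-grid DFT folded into the coarse Nyquist cell.

```python
# Numerical check of the aliasing argument in Fig. 13 (assumed kernel sizes
# and a synthetic object, not the paper's parameters).
import numpy as np
from scipy.ndimage import uniform_filter

def disk_blur(img, radius=6):
    """Circular convolution with a uniform disk; its OTF is the sombrero modulation."""
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    disk = ((y - h // 2) ** 2 + (x - w // 2) ** 2 <= radius ** 2).astype(float)
    disk /= disk.sum()
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(disk))))

block = 8
y, x = np.mgrid[:256, :512]
crisp = (((y - 128) / 90.0) ** 2 + ((x - 256) / 160.0) ** 2 < 1.0).astype(float)

fine = uniform_filter(disk_blur(crisp), size=block)   # disk blur, then 8x8 aggregation (sinc modulation)
coarse = fine[::block, ::block]                       # detector sampling (decimation by 8)

# The DFT of the decimated image equals the fine-grid DFT folded modulo the
# coarse Nyquist cell (scaled by the decimation factor in each dimension):
F = np.fft.fft2(fine)
folded = F.reshape(block, 256 // block, block, 512 // block).sum(axis=(0, 2)) / block ** 2
print(np.allclose(np.fft.fft2(coarse), folded))       # True
```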

Fig. 14.

Demonstration of dramatic superresolution enabled by strong prior knowledge. Blur is caused by imaging with an unaberrated optical system and convolving with an 8×8 aggregation filter. Accordingly, the Fourier representation of the sampled data (enlarged in the figure by a factor of 8) displays a clear cutoff frequency. The bold squares (in blue) represent the Nyquist frequency for the sampled data, equal to 1/8th the Nyquist frequency for the crisp silhouette. The silhouette estimated from the noiseless sampled data is virtually indistinguishable from the true (crisp) silhouette. When noise is added, silhouette estimation degrades gracefully.

Equations (10)

\[ p_\alpha(\alpha) = \bigl[(2\pi)^M \det(\hat{K}_\alpha)\bigr]^{-1/2} \exp\!\Bigl[-\tfrac{1}{2}(\alpha-\bar{\alpha})^T \hat{K}_\alpha^{-1}(\alpha-\bar{\alpha})\Bigr] \]
\[ = \bigl[(2\pi)^M \det(\hat{K}_\alpha)\bigr]^{-1/2} \exp\!\Bigl[-\tfrac{1}{2}\bigl[A^T(\alpha-\bar{\alpha})\bigr]^T \Lambda^{-1}\bigl[A^T(\alpha-\bar{\alpha})\bigr]\Bigr], \]
\[ d(x) = \mathcal{D}_x\!\left\{ \int f(x';\alpha)\, s(x-x')\, dx' \right\} + n(x) \]
\[ = g(x) + n(x), \]
\[ \mathrm{CNR} \equiv \frac{E_{\mathrm{direct}} - E_{\mathrm{shadow}}}{\sigma_n}, \]
\[ \hat{\alpha}_{\mathrm{ML}} = \operatorname*{arg\,min}_{\alpha} \sum_x \left( d(x) - \mathcal{D}_x\!\left\{ \int f(x';\alpha)\, s(x-x')\, dx' \right\} \right)^{\!2}. \]
\[ \hat{\alpha}_{\mathrm{MAP}} = \operatorname*{arg\,max}_{\alpha} \bigl\{ \ln[\operatorname{pr}(d\,|\,\alpha)] + \ln[\operatorname{pr}(\alpha)] \bigr\} \]
\[ = \operatorname*{arg\,min}_{\alpha} \left\{ \sum_x \left[ d(x) - \mathcal{D}_x\!\left\{ \int f(x';\alpha)\, s(x-x')\, dx' \right\} \right]^2 + \eta \bigl[A^T(\alpha-\bar{\alpha})\bigr]^T \Lambda^{-1} \bigl[A^T(\alpha-\bar{\alpha})\bigr] \right\}, \]
\[ \alpha \equiv \bigl[\alpha_p^T,\ \alpha_d^T\bigr]^T, \]
\[ \hat{\alpha}_{\mathrm{PMAP}} = \operatorname*{arg\,min}_{\alpha} \left\{ \sum_x \left[ d(x) - \mathcal{D}_x\!\left\{ \int f(x';\alpha)\, s(x-x')\, dx' \right\} \right]^2 + \eta \bigl[A^T(\alpha_p-\bar{\alpha}_p)\bigr]^T \Lambda^{-1} \bigl[A^T(\alpha_p-\bar{\alpha}_p)\bigr] \right\}, \]
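The MAP and practical-MAP (PMAP) objectives above reduce to a least-squares data misfit plus an η-weighted quadratic penalty on the principal-component coordinates of the parameter vector. The sketch below is a minimal illustration of that objective under assumed interfaces: forward_model is a hypothetical callable standing in for the sampled, blurred silhouette g(x; α), and A, lam, and alpha_bar stand in for the PCA eigenvector matrix, eigenvalues, and mean. The bound-constrained quasi-Newton optimizer (L-BFGS-B) is one reasonable choice, not necessarily the authors' exact implementation.

```python
# Minimal sketch of the MAP objective in the equations above (hypothetical
# helper names; forward_model, A, lam, alpha_bar are assumed to come from the
# silhouette parameterization and the training ensemble).
import numpy as np
from scipy.optimize import minimize

def map_objective(alpha, data, forward_model, A, lam, alpha_bar, eta):
    """Negative log-posterior up to constants: data misfit + weighted PCA prior."""
    residual = data - forward_model(alpha)        # d(x) - g(x; alpha)
    misfit = np.sum(residual ** 2)
    beta = A.T @ (alpha - alpha_bar)              # principal-component coordinates
    prior = beta @ (beta / lam)                   # [A^T(a - abar)]^T Lambda^-1 [...]
    return misfit + eta * prior

def map_estimate(data, forward_model, A, lam, alpha_bar, eta, alpha0):
    res = minimize(map_objective, alpha0,
                   args=(data, forward_model, A, lam, alpha_bar, eta),
                   method="L-BFGS-B")
    return res.x
```

Setting η = 0 recovers the ML estimator, and applying the penalty only to the profile parameters α_p while still estimating the remaining parameters α_d (presumably registration/nuisance parameters) gives the PMAP form in the last equation.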
