Abstract

Estimation of the averaged point-spread function (PSF) of an image acquisition system is important for many computer vision applications, including edge detection and depth from defocus. This paper compares several mathematical models of the PSF and presents an improved measurement technique that enables subpixel estimation of 2D functions. New methods for noise suppression and uneven-illumination modeling were incorporated. The PSF was computed from an ensemble of edge-spread function (ESF) measurements. The generalized Gaussian was shown to be an 8 times better fit to the estimated PSF than the Gaussian and a 14 times better fit than the pillbox model.

© 2008 Optical Society of America
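
To make the model comparison reported in the abstract concrete, the following Python sketch (not the authors' code) fits pillbox, Gaussian, and generalized Gaussian models to a synthetic 1D PSF estimate and reports the mean-squared error (MSE) of each fit. The synthetic data, parameter values, and NumPy/SciPy usage are illustrative assumptions, not the paper's experimental pipeline.

    # Sketch: compare pillbox, Gaussian, and generalized Gaussian fits to an
    # estimated 1D PSF profile by mean-squared error (synthetic data and
    # assumed parameter values; not the paper's measurement pipeline).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import gamma

    def pillbox(x, s):
        # h_p(x) = 1/(2s) for |x| <= s, 0 otherwise
        return np.where(np.abs(x) <= s, 1.0 / (2.0 * s), 0.0)

    def gaussian(x, s):
        # Zero-mean Gaussian PSF with standard deviation s
        return np.exp(-0.5 * (x / s) ** 2) / (np.sqrt(2.0 * np.pi) * s)

    def gen_gaussian(x, s, p):
        # Generalized Gaussian: p = 2 is the Gaussian, large p tends to a flat top
        return (p ** (1.0 - 1.0 / p) / (2.0 * s * gamma(1.0 / p))
                * np.exp(-np.abs(x) ** p / (p * s ** p)))

    # Synthetic "measured" PSF: a generalized Gaussian (p = 1.6) plus noise.
    rng = np.random.default_rng(1)
    x = np.linspace(-10.0, 10.0, 201)
    psf_meas = gen_gaussian(x, 2.0, 1.6) + rng.normal(scale=0.002, size=x.size)

    # The pillbox is discontinuous in s, so fit its half-width by a grid search.
    s_grid = np.linspace(0.5, 6.0, 551)
    mse_pillbox = min(np.mean((pillbox(x, s) - psf_meas) ** 2) for s in s_grid)

    def fit_mse(model, p0, **kwargs):
        # Least-squares fit of a smooth model, then report its MSE.
        popt, _ = curve_fit(model, x, psf_meas, p0=p0, **kwargs)
        return np.mean((model(x, *popt) - psf_meas) ** 2)

    mse_gauss = fit_mse(gaussian, [2.0])
    mse_gen = fit_mse(gen_gaussian, [2.0, 1.5],
                      bounds=([0.1, 0.2], [10.0, 10.0]))

    print(f"pillbox MSE:              {mse_pillbox:.3e}")
    print(f"Gaussian MSE:             {mse_gauss:.3e}")
    print(f"generalized Gaussian MSE: {mse_gen:.3e}")

With a purely Gaussian or purely pillbox-shaped input the ranking would differ; the sketch only shows how an MSE-based comparison of the three PSF models can be organized.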






Figures (20)

Fig. 1. Migration of samples onto the ESF.
Fig. 2. Simple model of the optical system with the image plane on the left.
Fig. 3. Normalized-magnitude PSFs for a 16 mm, f/4 lens: (a) focused, (b) defocused.
Fig. 4. ESF with a pillbox PSF where σ = 5 (solid curve) and the ideal step edge (dashed curve).
Fig. 5. ESF when the PSF is a Gaussian with σ = 5 (solid curve) and the unevenly illuminated ideal step edge (dashed curve).
Fig. 6. Generalized Gaussian PSFs where (left) p = 1, σ = 5 and (right) p = 4, σ = 5.
Fig. 7. Ideal steps (dashed curves) and the ESFs (solid curves) assuming generalized Gaussian PSFs with (left) p = 1, σ = 5 and (right) p = 4, σ = 5.
Fig. 8. Example of the windowed image.
Fig. 9. Five-point numerical differentiation results for f/2.8, z = 0.725 m, angle = 0°, with the ESF shown on the left and the PSF on the right.
Fig. 10. Actual ESF (dashed curve) and Fermi–Dirac fitted ESF (solid curve) results for f/2.8, z = 0.725 m, angle = 0° (left), and the PSF (right).
Fig. 11. Regularized numerical differentiation results (right) for α = 10 (dashed curve), α = 100 (dash–dotted), and α = 1000 (solid).
Fig. 12. Standard deviation against depth when fitting (left) a Gaussian PSF and (right) a generalized Gaussian PSF.
Fig. 13. Power of the generalized Gaussian against depth.
Fig. 14. Two-dimensional PSF assuming a Gaussian model for z = 0.725 m and f/2.8, where x and y are in pixels.
Fig. 15. Two-dimensional PSF assuming a generalized Gaussian model for z = 0.725 m and f/2.8, where x and y are in pixels.
Fig. 16. Two-dimensional PSF assuming a pillbox model for z = 0.725 m and f/2.8, where x and y are in pixels.
Fig. 17. Two-dimensional PSF assuming a Gaussian model for z = 0.414 m and f/2.8, where x and y are in pixels.
Fig. 18. Two-dimensional PSF assuming a generalized Gaussian model for z = 0.414 m and f/2.8, where x and y are in pixels.
Fig. 19. Two-dimensional PSF assuming a pillbox model for z = 0.414 m and f/2.8, where x and y are in pixels.
Fig. 20. Comparison between PSFs for the Gaussian (dashed curve) and generalized Gaussian (solid curve).

Tables (3)

Table 1. MSE Results for f/2.8 as a Function of Depth to Lightbox
Table 2. MSE Results for All Three Apertures, from Best to Worst
Table 3. Average MSE (×10⁻³) for Each Method

Equations (15)


$$ r = \frac{v_0 u - F(v_0 + u)}{2 f u}, $$
$$ h(x) = \left| \int A(\xi)\, e^{j\theta(\xi)}\, e^{-j 2\pi \xi x / (\lambda F)}\, d\xi \right|^2, $$
$$ \theta(x) = \frac{\pi}{\lambda} \left( \frac{1}{u} + \frac{1}{v} - \frac{1}{F} \right) x^2, $$
$$ h(x) = \int_{\lambda_1}^{\lambda_2} \left| \int_{-r}^{+r} e^{\,j \frac{\pi}{\lambda} \left[ -\frac{2 \xi x}{F} + \left( \frac{1}{u} + \frac{1}{v} - \frac{1}{F} \right) \xi^2 \right]} \, d\xi \right|^2 d\lambda. $$
$$ s(x) = (m_1 x + c_1)\, u(-x + x_0) + (m_2 x + c_2)\, u(x - x_0), $$
$$ h_p(x) = \frac{1}{2\sigma} \left[ u(x + \sigma) - u(x - \sigma) \right], $$
$$ g_p(x) = \begin{cases} m_1 x + c_1, & x - x_0 < -\sigma, \\ \dfrac{1}{4\sigma} \Big\{ \left[ 2 c_1 + m_1 (x + x_0 - \sigma) \right] (-x + x_0 + \sigma) + (x - x_0 + \sigma) \left[ 2 c_2 + m_2 (x + x_0 + \sigma) \right] \Big\}, & -\sigma \le x - x_0 \le \sigma, \\ m_2 x + c_2, & \sigma < x - x_0, \end{cases} $$
$$ g_{\mathrm{FD}}(x) = \sum_{i=1}^{N} \left[ \frac{a_i}{1 + \exp\!\left( \frac{x - b_i}{c_i} \right)} \right] + d, $$
$$ h_{\mathrm{FD}}(x) = \frac{\partial g_{\mathrm{FD}}(x)}{\partial x} = -\sum_{i=1}^{N} \left\{ \frac{a_i \exp\!\left( \frac{x - b_i}{c_i} \right)}{c_i \left[ 1 + \exp\!\left( \frac{x - b_i}{c_i} \right) \right]^2} \right\}. $$
$$ h_g(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left[ -\frac{1}{2} \frac{(x - \bar{x})^2}{\sigma^2} \right]. $$
$$ g_g(x) = \frac{1}{2} \left( \frac{(-m_1 + m_2)\,\sigma \sqrt{2}}{\sqrt{\pi}} \exp\!\left[ -\frac{(x - \bar{x})^2}{2\sigma^2} \right] + (m_1 x + c_1) \left\{ 1 - \operatorname{erf}\!\left[ \frac{x - \bar{x}}{\sigma\sqrt{2}} \right] \right\} + (m_2 x + c_2) \left\{ 1 + \operatorname{erf}\!\left[ \frac{x - \bar{x}}{\sigma\sqrt{2}} \right] \right\} \right), $$
$$ \operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x \exp(-t^2)\, dt. $$
$$ h_G(x) = \frac{p^{1 - \frac{1}{p}}}{2 \sigma\, \Gamma\!\left( \frac{1}{p} \right)} \exp\!\left( -\frac{1}{p} \frac{|x - \bar{x}|^p}{\sigma^p} \right), $$
$$ g_G(x) = \frac{p^{1 - \frac{1}{p}}}{2 \sigma\, \Gamma\!\left( \frac{1}{p} \right)} \int_{x - x_0}^{\infty} \exp\!\left( -\frac{1}{p} \frac{|\xi|^p}{\sigma^p} \right) \left[ m_1 (x - \xi) + c_1 \right] d\xi + \frac{p^{1 - \frac{1}{p}}}{2 \sigma\, \Gamma\!\left( \frac{1}{p} \right)} \int_{-\infty}^{x - x_0} \exp\!\left( -\frac{1}{p} \frac{|\xi|^p}{\sigma^p} \right) \left[ m_2 (x - \xi) + c_2 \right] d\xi. $$
$$ Y(u) = \alpha \int_0^L |u'(x)|\, dx + \frac{1}{2} \int_0^L \left( \int_0^x u(z)\, dz - y(x) \right)^2 dx, $$
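
As a worked illustration of the Fermi–Dirac edge-spread model g_FD(x) and its analytic derivative h_FD(x) above, the following Python sketch (not the authors' code) fits a two-term (N = 2) model to a synthetic noisy ESF and differentiates the fitted model to obtain a PSF profile. The synthetic edge, the choice N = 2, and the starting values are assumptions made for the example.

    # Sketch: fit a two-term Fermi-Dirac (logistic-sum) model to a noisy ESF and
    # obtain the PSF profile from the analytic derivative of the fit.
    # Synthetic data and starting values are illustrative assumptions.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erf

    def esf_fd(x, a1, b1, c1, a2, b2, c2, d):
        # g_FD(x) = sum_i a_i / (1 + exp((x - b_i)/c_i)) + d, with N = 2 terms
        return (a1 / (1.0 + np.exp((x - b1) / c1))
                + a2 / (1.0 + np.exp((x - b2) / c2)) + d)

    def psf_fd(x, a1, b1, c1, a2, b2, c2, d):
        # h_FD(x) = d g_FD / dx, each logistic term differentiated analytically
        # (the offset d does not appear in the derivative).
        out = np.zeros_like(x)
        for a, b, c in ((a1, b1, c1), (a2, b2, c2)):
            e = np.exp((x - b) / c)
            out += -a * e / (c * (1.0 + e) ** 2)
        return out

    # Synthetic ESF: an ideal step blurred by a Gaussian PSF (sigma = 2 pixels)
    # plus additive noise, standing in for a measured, registered edge profile.
    rng = np.random.default_rng(0)
    x = np.linspace(-20.0, 20.0, 401)
    sigma_true = 2.0
    esf_clean = 0.5 * (1.0 + erf(x / (sigma_true * np.sqrt(2.0))))
    esf_noisy = esf_clean + rng.normal(scale=0.01, size=x.size)

    # Fit the two-term Fermi-Dirac model (negative a_i give a rising edge here).
    p0 = [-0.5, 0.0, 2.0, -0.5, 0.0, 3.0, 1.0]
    popt, _ = curve_fit(esf_fd, x, esf_noisy, p0=p0, maxfev=20000)

    # Differentiate the fit and normalize the resulting PSF profile to unit area.
    psf = psf_fd(x, *popt)
    dx = x[1] - x[0]
    psf /= psf.sum() * dx
    mean = np.sum(x * psf) * dx
    print("estimated PSF standard deviation (pixels):",
          np.sqrt(np.sum((x - mean) ** 2 * psf) * dx))

For noisier data, minimizing the functional Y(u) above regularizes the numerical derivative directly instead of relying on a parametric ESF fit.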
