Abstract

We describe a novel formulation of the range recovery problem based on computation of the differential variation in image intensities with respect to changes in camera position. This method uses a single stationary camera and a pair of calibrated optical masks to measure this differential quantity directly. We also describe a variant based on changes in aperture size. The subsequent computation of the range image involves simple arithmetic operations and is suitable for real-time implementation. We present the theory of this technique and show results from a prototype camera that we have constructed.

© 1998 Optical Society of America


References


  1. B. K. P. Horn, Robot Vision (MIT Press, Cambridge, Mass., 1986).
  2. B. D. Lucas, T. Kanade, “An iterative image registration technique with an application to stereo vision,” presented at the 7th International Joint Conference on Artificial Intelligence, Vancouver, B.C., Canada, 1981.
  3. E. P. Simoncelli, E. H. Adelson, D. J. Heeger, “Probability distributions of optical flow,” in Proceedings of the Conference on Computer Vision and Pattern Recognition (IEEE Computer Society Press, Los Alamitos, Calif., 1991), pp. 310–315.
  4. A. P. Pentland, “A new sense for depth of field,” IEEE Trans. Pattern Anal. Mach. Intell. PAMI-9, 523–531 (1987).
  5. M. Subbarao, “Parallel depth recovery by changing camera parameters,” in Proceedings of the International Conference on Computer Vision (IEEE Computer Society Press, Los Alamitos, Calif., 1988), pp. 149–155.
  6. Y. Xiong, S. Shafer, “Depth from focusing and defocusing,” in Proceedings of the DARPA Image Understanding Workshop (IEEE Computer Society Press, Los Alamitos, Calif., 1988), pp. 967–976.
  7. M. Subbarao, G. Surya, “Depth from defocus: a spatial domain approach,” Int. J. Comput. Vis. 13, 271–294 (1994).
  8. S. K. Nayar, M. Watanabe, M. Noguchi, “Real-time focus range sensor,” IEEE Trans. Pattern Anal. Mach. Intell. 18, 1186–1198 (1995).
  9. M. Watanabe, S. Nayar, “Minimal operator set for passive depth from defocus,” in Proceedings of the Conference on Computer Vision and Pattern Recognition (IEEE Computer Society Press, Los Alamitos, Calif., 1996), pp. 431–438.
  10. E. R. Dowski, W. T. Cathey, “Single-lens single-image incoherent passive-ranging systems,” Appl. Opt. 33, 6762–6773 (1994).
  11. D. G. Jones, D. G. Lamb, “Analyzing the visual echo: passive 3-D imaging with a multiple aperture camera,” (Department of Electrical Engineering, McGill University, Montreal, Canada, 1993).
  12. E. H. Adelson, J. Y. A. Wang, “Single lens stereo with a plenoptic camera,” IEEE Trans. Pattern Anal. Mach. Intell. 14, 99–106 (1992).
  13. W. Teoh, X. D. Zhang, “An inexpensive stereoscopic vision system for robots,” in Proceedings of the International Conference on Robotics (IEEE Computer Society Press, Los Alamitos, Calif., 1984), pp. 186–189.
  14. Y. Nishimoto, Y. Shirai, “A feature-based stereo model using small disparities,” in Proceedings of the Conference on Computer Vision and Pattern Recognition (IEEE Computer Society Press, Los Alamitos, Calif., 1987), pp. 192–196.
  15. A. Goshtasby, W. A. Gruver, “Design of a single-lens stereo camera system,” Pattern Recogn. 26, 923–937 (1993).
  16. H. Farid, “Range estimation by optical differentiation,” Ph.D. dissertation (University of Pennsylvania, Philadelphia, Pa., 1997).
  17. R. W. Floyd, L. Steinberg, “An adaptive algorithm for spatial grey scale,” Proc. Soc. Inf. Disp. 17, 75–77 (1976).
  18. H. Farid, E. P. Simoncelli, “Optimally rotation-equivariant directional derivative kernels,” in Computer Analysis of Images and Patterns (Springer-Verlag, Berlin, 1997), pp. 207–214.
  19. E. P. Simoncelli, H. Farid, “Direct differential range estimation from aperture derivatives,” in Proceedings of the European Conference on Computer Vision (Springer-Verlag, Berlin, 1996), Vol. II, pp. 82–93.
  20. E. P. Simoncelli, H. Farid, “Single-lens range imaging using optical derivative masks,” U.S. patent 5,703,677, December 30, 1997; international patent pending (filed November 13, 1996).
  21. H. Farid, E. P. Simoncelli, “A differential optical range camera,” presented at the 1996 OSA Annual Meeting, Rochester, N.Y., October 20–26, 1996.




Figures (9)

Fig. 1

Geometry for a binocular stereo system with pinhole cameras. The variable V parameterizes the position of the camera pinholes. According to the brightness constancy constraint, the intensity of a point in the world, as recorded by the two pinhole cameras, should be the same.

Fig. 2

Illustration of direct differential range determination for a single point source. Images of a point light source are formed with two different optical masks, corresponding to the function M(u) and its derivative M′(u). In each case the image formed is a scaled and dilated copy of the mask function (by a factor α). Computing the spatial derivative of the image formed under mask M(u) produces an image that is identical to the image formed under the derivative mask M′(u), except for a scale factor α. Thus α may be estimated as the ratio of the two images. Range is then computed from α by using the relationship given in Eq. (6).
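The estimation chain in this caption can be sketched numerically. The following is a minimal 1-D simulation assuming a Gaussian mask and hypothetical geometry (the values of d, f, and Z are illustrative, not taken from the paper):

```python
import numpy as np

# Minimal 1-D sketch of the caption's estimation chain (illustrative numbers):
# a point source imaged through a mask M and through its derivative mask M',
# with the dilation factor alpha recovered as a least-squares ratio.
d, f_len = 1.2, 1.0          # sensor distance and focal length (hypothetical)
Z_true = d / 0.7             # true depth, chosen so that alpha = 0.5
alpha = 1.0 - d / f_len + d / Z_true

x = np.linspace(-4.0, 4.0, 2001)
dx = x[1] - x[0]
M = lambda u: np.exp(-u**2 / 2.0)        # Gaussian mask M(u)
Mp = lambda u: -u * np.exp(-u**2 / 2.0)  # its derivative M'(u)

I = (1.0 / alpha) * M(x / alpha)         # image formed under mask M
Iv = -(1.0 / alpha) * Mp(x / alpha)      # image formed under the derivative mask
Ix = np.gradient(I, dx)                  # spatial derivative of the first image

# Iv and Ix differ only by the factor alpha; estimate it by least squares,
# then invert alpha = 1 - d/f + d/Z for the range.
alpha_hat = -np.sum(Iv * Ix) / np.sum(Ix**2)
Z_hat = d / (alpha_hat - 1.0 + d / f_len)
```

With noise-free synthetic images the ratio estimate recovers α (and hence Z) to within the accuracy of the finite-difference derivative.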

Fig. 3

Prototype camera. Top, fast-switching liquid-crystal spatial light modulator (LC SLM) employed as an optical attenuation mask. Bottom, our range camera, consisting of an off-the-shelf CCD camera and the LC SLM sandwiched between a pair of plano-convex lenses. The target consists of a piece of paper with a random texture pattern.

Fig. 4

Gaussian aperture masks. The top two images are of a Gaussian mask M(u, w) and its partial derivative ∂M(u, w)/∂u. The bottom two images are of two nonnegative aperture masks, M1(u, w) and M2(u, w), computed from the upper pair of masks by means of Eq. (28).
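The nonnegative-mask construction described in this caption can be sketched as follows (1-D for brevity; the weights β and γ are illustrative choices, not values from the paper). Each physical mask is a weighted sum of the Gaussian and its derivative, with the weights chosen so that both masks stay nonnegative over the aperture; the pure M and M′ images are then recovered by inverting the linear combination:

```python
import numpy as np

u = np.linspace(-2.0, 2.0, 401)          # aperture coordinate
M = np.exp(-u**2 / 2.0)                  # Gaussian mask M(u)
Mp = -u * M                              # derivative M'(u)

b1, g1 = 1.0, 0.4                        # hypothetical weights
b2, g2 = 1.0, 0.4
M1 = b1 * M + g1 * Mp                    # = M(u) * (1 - 0.4 u): nonnegative on [-2, 2]
M2 = b2 * M - g2 * Mp                    # = M(u) * (1 + 0.4 u): nonnegative on [-2, 2]

# Invert the linear combination to recover M and M' exactly.
M_rec = (g2 * M1 + g1 * M2) / (g2 * b1 + g1 * b2)
Mp_rec = (b2 * M1 - b1 * M2) / (g1 * b2 + g2 * b1)
```

The design constraint is that γ/β must be small enough that β − γ·u stays positive over the aperture extent, since a transmissive mask cannot realize negative values.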

Fig. 5

Calibration of an LCD mask and a CCD sensor. (a) Normalized light transmittance (in cd/m², as measured with a photometer) through constant masks set to each of the four LCD values. (b) Normalized light transmittance measured through each of 32 uniform dithered and gamma-corrected masks, averaged over five trials. If our dithering and gamma correction were perfect, the measurements (circles) would lie along a unit-slope line (dashed line). (c) Normalized CCD pixel intensity of a point light source as imaged through a series of 32 uniform, dithered optical masks (with gamma correction) averaged over five trials and spatially integrated over a 5×5 pixel neighborhood. If both the optical mask and the imaging sensor were linear, then these measurements (circles) would lie along a unit-slope line (dashed line).
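The dithering step mentioned in this caption can be sketched as error-diffusion quantization. This is a generic Floyd–Steinberg implementation (reference 17), not the authors' calibration code; the 4-level default mirrors the four LCD values above:

```python
import numpy as np

def floyd_steinberg(img, levels=4):
    """Quantize img (values in [0, 1]) to `levels` gray levels by error
    diffusion with the Floyd-Steinberg weights 7, 3, 5, 1 (each /16).

    Each pixel's quantization error is pushed onto not-yet-visited
    neighbors, so local averages of the output track the input."""
    out = img.astype(float).copy()
    h, w = out.shape
    step = 1.0 / (levels - 1)
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = round(old / step) * step   # nearest allowed gray level
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

ramp = np.tile(np.linspace(0.0, 1.0, 64), (16, 1))   # smooth test mask
dithered = floyd_steinberg(ramp)
```

Every output pixel is one of the allowed levels, while the spatial average of the dithered mask approximates the continuous-valued mask, which is what the calibration in panel (b) is checking.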

Fig. 6

(a), (b) 1-D slices of the image of a point light source taken through a pair of nonnegative Gaussian-based masks, I1 and I2; (c) 1-D slices of the linear combination of the measurements [see Eq. (30)]; (d) 1-D slices of the resulting images Ix (solid curve) and -Iv (dashed curve). These images should be related to each other by a scale factor of α [see Eq. (10)].

Fig. 7

Recovered range maps computed by optical viewpoint differentiation for a pair of frontal-parallel surfaces at a distance of (a) 11 and (b) 17 cm from the camera. The computed range maps have a mean of 10.9 and 17.0 cm with a standard deviation of 0.27 and 0.75 cm, respectively.

Fig. 8

Recovered range maps computed by optical aperture size differentiation for a pair of frontal-parallel surfaces at a distance of (a) 11 and (b) 17 cm from the camera. The computed range maps have a mean of 11.0 and 17.0 cm with a standard deviation of 0.06 and 0.16 cm, respectively.

Fig. 9

Recovered range maps computed by optical viewpoint differentiation for (a) a slanted surface oriented approximately 30 deg relative to the sensor plane, with the center of the plane at a depth of 14 cm, and (b) a pair of occluding surfaces at depths of 11 and 17 cm.

Equations (39)


\[ f(x;\, v) = I\!\left( x - \frac{v d}{Z(x)} \right), \]
\[ I_v(x) \equiv \left. \frac{\partial f(x;\, v)}{\partial v} \right|_{v=0} = -\frac{d}{Z(x)}\, I'(x), \]
\[ I_x(x) \equiv \left. \frac{\partial f(x;\, v)}{\partial x} \right|_{v=0} = I'(x). \]
\[ I_v(x) = -\frac{d}{Z(x)}\, I_x(x). \]
\[ I(x) = \frac{1}{\alpha} M\!\left( \frac{x}{\alpha} \right), \]
\[ \alpha = 1 - \frac{d}{f} + \frac{d}{Z}, \]
\[ f(x;\, v) = \frac{1}{\alpha} M\!\left( \frac{x}{\alpha} - v \right), \]
\[ I_v(x) \equiv \left. \frac{\partial}{\partial v} f(x;\, v) \right|_{v=0} = -\frac{1}{\alpha} M'\!\left( \frac{x}{\alpha} \right), \]
\[ I_x(x) \equiv \left. \frac{\partial}{\partial x} f(x;\, v) \right|_{v=0} = \frac{1}{\alpha^2} M'\!\left( \frac{x}{\alpha} \right). \]
\[ I_x(x) = -\frac{1}{\alpha} I_v(x). \]
\[ E(\alpha) = \sum_{x \in P} \left[ I_v(x) + \alpha I_x(x) \right]^2, \]
\[ \alpha = -\frac{ \sum_{x \in P} I_v(x)\, I_x(x) }{ \sum_{x \in P} I_x^2(x) }. \]
\[ \alpha = -\frac{ \sum_{x \in P} I_v(x)\, I_x(x) }{ \sum_{x \in P} \left[ I_x(x) \right]^2 + \sigma^2 }. \]
\[ E(\alpha) = \sum_{(x,\,y) \in P} \left[ I_u(x, y) + \alpha I_x(x, y) \right]^2 + \left[ I_w(x, y) + \alpha I_y(x, y) \right]^2. \]
\[ \alpha = -\frac{ \sum_{(x,\,y) \in P} I_u(x, y)\, I_x(x, y) + I_w(x, y)\, I_y(x, y) }{ \sum_{(x,\,y) \in P} \left[ I_x(x, y) \right]^2 + \left[ I_y(x, y) \right]^2 + \sigma^2 }. \]
\[ f(x;\, A) = \frac{1}{A \alpha} M\!\left( \frac{x}{A \alpha} \right). \]
\[ I_A(x) \equiv \left. \frac{\partial f(x;\, A)}{\partial A} \right|_{A=1} = -\frac{1}{\alpha} M\!\left( \frac{x}{\alpha} \right) - \frac{x}{\alpha^2} M'\!\left( \frac{x}{\alpha} \right) = -\frac{1}{\alpha} \left[ M\!\left( \frac{x}{\alpha} \right) + \frac{x}{\alpha} M'\!\left( \frac{x}{\alpha} \right) \right]. \]
\[ J(x) = -\frac{x}{\alpha^2} M\!\left( \frac{x}{\alpha} \right), \]
\[ J_x(x) = -\frac{1}{\alpha^2} \left[ M\!\left( \frac{x}{\alpha} \right) + \frac{x}{\alpha} M'\!\left( \frac{x}{\alpha} \right) \right], \]
\[ M(u;\, A) = \frac{1}{A} \exp\!\left( -\frac{u^2}{2 A^2} \right), \]
\[ I_A(x) = \frac{1}{\alpha} M''\!\left( \frac{x}{\alpha} \right), \]
\[ I_{xx}(x) = \frac{1}{\alpha^3} M''\!\left( \frac{x}{\alpha} \right) = \frac{1}{\alpha^2} I_A(x). \]
\[ E(\alpha^2) = \sum_{(x,\,y) \in P} \left\{ I_A(x, y) - \alpha^2 \left[ I_{xx}(x, y) + I_{yy}(x, y) \right] \right\}^2, \]
\[ \alpha^2 = \frac{ \sum_{(x,\,y) \in P} I_A(x, y) \left[ I_{xx}(x, y) + I_{yy}(x, y) \right] }{ \sum_{(x,\,y) \in P} \left[ I_{xx}(x, y) + I_{yy}(x, y) \right]^2 + \sigma^2 }. \]
\[ f(x;\, v) = \int \mathrm{d}x_p\, \frac{1}{\alpha(x_p)} M\!\left( \frac{x - x_p}{\alpha(x_p)} - v \right) L(x_p), \]
\[ I_v(x) \equiv \left. \frac{\partial}{\partial v} f(x;\, v) \right|_{v=0} = -\int \mathrm{d}x_p\, \frac{1}{\alpha(x_p)} M'\!\left( \frac{x - x_p}{\alpha(x_p)} \right) L(x_p), \]
\[ I_x(x) \equiv \left. \frac{\partial}{\partial x} f(x;\, v) \right|_{v=0} = \int \mathrm{d}x_p\, \frac{1}{\alpha^2(x_p)} M'\!\left( \frac{x - x_p}{\alpha(x_p)} \right) L(x_p). \]
\[ \hat{\alpha}(x) \equiv -\frac{I_v(x)}{I_x(x)} = \frac{ \int \mathrm{d}x_p\, \alpha(x_p)\, \frac{1}{\alpha^2(x_p)} M'\!\left( \frac{x - x_p}{\alpha(x_p)} \right) L(x_p) }{ \int \mathrm{d}x_p\, \frac{1}{\alpha^2(x_p)} M'\!\left( \frac{x - x_p}{\alpha(x_p)} \right) L(x_p) }. \]
\[ M_1(u) = \beta_1 M(u) + \gamma_1 M'(u), \]
\[ M_2(u) = \beta_2 M(u) - \gamma_2 M'(u), \]
\[ M(u) = \frac{ \gamma_2 M_1(u) + \gamma_1 M_2(u) }{ \gamma_2 \beta_1 + \gamma_1 \beta_2 }, \]
\[ M'(u) = \frac{ \beta_2 M_1(u) - \beta_1 M_2(u) }{ \gamma_1 \beta_2 + \gamma_2 \beta_1 }. \]
\[ I(x) = \frac{ \gamma_2 I_1(x) + \gamma_1 I_2(x) }{ \gamma_2 \beta_1 + \gamma_1 \beta_2 }, \]
\[ I_v(x) = -\frac{ \beta_2 I_1(x) - \beta_1 I_2(x) }{ \gamma_1 \beta_2 + \gamma_2 \beta_1 }, \]
\[ \frac{1}{16} \begin{bmatrix} - & \ast & 7 \\ 3 & 5 & 1 \end{bmatrix}, \]
\[ \hat{M}(u)\, H(u) = \frac{ \mathrm{d}\left[ M(u)\, H(u) \right] }{ \mathrm{d}u }, \]
\[ \frac{\partial Z}{\partial D} = \frac{d b}{D^2} = \frac{Z^2}{d b}. \]
\[ \hat{\alpha} = -\frac{ I_v + \Delta_v }{ I_x + \Delta_x } = -\frac{ -(1/\alpha)\, M' + \Delta_v }{ (1/\alpha^2)\, M' + \Delta_x } = \frac{ \alpha - (\alpha^2 \Delta_v)/M' }{ 1 + (\alpha^2 \Delta_x)/M' }. \]
\[ \Delta Z \propto \frac{Z^2}{d b} \left( 1 - \frac{d}{f} + \frac{d}{Z} \right)^2 \Delta_{\{v,\,x\}}. \]
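As a numerical check on the Gaussian aperture-size variant, here is a short 1-D sketch (synthetic point source, illustrative α value): for a Gaussian mask the aperture derivative of the image equals α² times its second spatial derivative, so α² follows from a least-squares ratio of the two measured images.

```python
import numpy as np

# 1-D numerical check of the Gaussian aperture-derivative relation
# I_A(x) = alpha^2 * I_xx(x) for a point source (illustrative alpha).
alpha = 0.5
x = np.linspace(-4.0, 4.0, 4001)
dx = x[1] - x[0]

M = lambda u: np.exp(-u**2 / 2.0)                    # Gaussian mask M(u)
Mpp = lambda u: (u**2 - 1.0) * np.exp(-u**2 / 2.0)   # second derivative M''(u)

I = (1.0 / alpha) * M(x / alpha)        # image at nominal aperture (A = 1)
I_A = (1.0 / alpha) * Mpp(x / alpha)    # derivative of the image w.r.t. aperture size
I_xx = np.gradient(np.gradient(I, dx), dx)   # second spatial derivative

# Least-squares estimate of alpha^2 from the two images.
alpha2_hat = np.sum(I_A * I_xx) / np.sum(I_xx**2)
```

Since I_xx = (1/α³)M″(x/α) and I_A = (1/α)M″(x/α), the ratio recovers α² up to finite-difference error; range then follows from α as in the viewpoint case.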
