The geometrical theory of diffraction is an extension of geometrical optics which accounts for diffraction. It introduces diffracted rays in addition to the usual rays of geometrical optics. These rays are produced by incident rays which hit edges, corners, or vertices of boundary surfaces, or which graze such surfaces. Various laws of diffraction, analogous to the laws of reflection and refraction, are employed to characterize the diffracted rays. A modified form of Fermat’s principle, equivalent to these laws, can also be used. Diffracted wave fronts are defined, which can be found by a Huygens wavelet construction. There is an associated phase or eikonal function which satisfies the eikonal equation. In addition, complex or imaginary rays are introduced. A field is associated with each ray, and the total field at a point is the sum of the fields on all rays through the point. The phase of the field on a ray is proportional to the optical length of the ray from some reference point. The amplitude varies in accordance with the principle of conservation of energy in a narrow tube of rays. The initial value of the field on a diffracted ray is determined from the incident field with the aid of an appropriate diffraction coefficient. These diffraction coefficients are determined from certain canonical problems. They all vanish as the wavelength tends to zero. The theory is illustrated by applying it to diffraction by an aperture in a thin screen, diffraction by a disk, etc. Agreement is shown between the predictions of the theory and various other theoretical analyses of some of these problems. Experimental confirmation of the theory is also presented. The mathematical justification of the theory on the basis of electromagnetic theory is described. Finally, the applicability of this theory, or a modification of it, to other branches of physics is explained.
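As a minimal numerical sketch of the ideas above, the snippet below evaluates the standard GTD diffraction coefficient for the canonical half-plane (edge) problem, the simplest of the canonical problems mentioned in the abstract, and checks that it vanishes as the wavelength tends to zero (the coefficient scales like 1/√k, i.e. like √λ). The function name, argument names, and the choice of the half-plane as the illustrative canonical problem are this sketch's assumptions, not notation from the paper itself.

```python
import numpy as np

def half_plane_diffraction_coefficient(k, phi, phi0, soft=True):
    """GTD diffraction coefficient for a half-plane edge (canonical problem).

    k    : wavenumber, k = 2*pi / wavelength
    phi  : observation angle, measured from the half-plane (radians)
    phi0 : angle of incidence, measured the same way (radians)
    soft : True for the soft (Dirichlet) boundary condition,
           False for the hard (Neumann) one

    The prefactor is proportional to k**(-1/2), so the coefficient
    vanishes as the wavelength tends to zero, as stated in the abstract.
    """
    prefactor = -np.exp(1j * np.pi / 4) / (2.0 * np.sqrt(2.0 * np.pi * k))
    term_minus = 1.0 / np.cos((phi - phi0) / 2.0)   # secant of half the angle difference
    term_plus = 1.0 / np.cos((phi + phi0) / 2.0)    # secant of half the angle sum
    if soft:
        return prefactor * (term_minus - term_plus)
    return prefactor * (term_minus + term_plus)

# Quadrupling k (quartering the wavelength) halves |D|: |D| ~ k**(-1/2).
d1 = half_plane_diffraction_coefficient(1.0, phi=1.0, phi0=2.0)
d4 = half_plane_diffraction_coefficient(4.0, phi=1.0, phi0=2.0)
print(abs(d4) / abs(d1))  # -> 0.5
```

The initial field on a diffracted ray is then the incident field at the edge multiplied by this coefficient, with the subsequent amplitude decay along the ray fixed by energy conservation in a narrow tube of rays.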
© 1962 Optical Society of America