To test the sensitivity of a microscope in distinguishing variations of optical thickness in an object, it is useful to have test materials somewhat analogous to the diatoms normally used to test lateral resolving power. Such a test object can be made from the edge of a wedge-shaped sheet of collodion whose thickness varies continuously between a few microns and a few angstroms. Collodion is dissolved in amyl acetate and the solution poured over a glass plate; upon drying, a thick film is formed. The collodion is moistened by breathing on it, and one half of the film is then torn away, forming a sharp edge. A drop of pure amyl acetate is placed at this edge; the collodion redissolves, diffusing into the drop, and when the drop dries a collodion wedge is formed. The collodion is torn again, the new tear running from thick to thin through the wedge and perpendicular to the first tear. The phase retardation of the wedge-shaped collodion edge is measured by means of a three-beam interference microscope.1 With such a “phase-wedge,” a commercial phase-contrast objective shows a path difference of 80 Å with rather good contrast.
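The relation between film thickness and the quoted path difference can be sketched numerically. The following is an illustration only, not a calculation from the paper: it assumes a refractive index of about 1.51 for dried collodion and air (n = 1.00) as the surrounding medium, with the standard relation OPD = (n_film − n_medium) · t.

```python
import math

N_COLLODION = 1.51  # assumed refractive index of dried collodion (illustrative)
N_AIR = 1.00        # surrounding medium taken as air

def path_difference(thickness_m, n_film=N_COLLODION, n_medium=N_AIR):
    """Optical path difference of a thin film: OPD = (n_film - n_medium) * t."""
    return (n_film - n_medium) * thickness_m

def phase_shift(thickness_m, wavelength_m=550e-9):
    """Phase retardation in radians: phi = 2*pi*OPD/lambda (green light assumed)."""
    return 2 * math.pi * path_difference(thickness_m) / wavelength_m

# An 80 Å (8 nm) path difference then corresponds to a film roughly 16 nm thick:
t = 8e-9 / (N_COLLODION - N_AIR)
print(f"thickness ≈ {t * 1e9:.1f} nm, phase ≈ {phase_shift(t):.4f} rad")
```

On these assumed values the 80 Å path difference reported for the phase-contrast objective corresponds to a collodion film only some 16 nm thick, i.e., a phase retardation of well under a hundredth of a wavelength.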
The phase-contrast method has the disadvantage that any detail of homogeneous thickness in an object shows an uneven brightness. The edges of such a detail are marked by the well-known halos which are an inevitable characteristic of phase-contrast images. Their appearance depends on the width of the phase ring in the objective and on the lateral extent of the homogeneous detail. A second form of test object utilizes a piece of collodion sheet of uniform thickness, torn so that its width varies from zero to some hundreds of microns. This test object, here called a “phase-point,” reveals the effect of the lateral extent of homogeneous phase details.
© 1957 Optical Society of America