Based on the hyperbolic dispersion relation of a strongly anisotropic medium, we propose pyramid-shaped hyperlenses (PSHLs) consisting of multilayers of planar silver and dielectric films for three-dimensional (3D) subdiffraction imaging at optical wavelengths. Numerical simulations using the finite-difference time-domain method demonstrate that the PSHLs can resolve eight point sources with nanoscale separations distributed in a 3D domain (arranged in different hexahedral structures). Our results imply potential applications of the hyperlens in real-time biomolecular imaging, nanolithography, and sensing.
©2009 Optical Society of America
Conventional optical microscopy suffers from the diffraction limit because the evanescent waves that carry subwavelength information decay before reaching the imaging plane. Although scanning near-field optical microscopy can detect subwavelength details, its major constraint is the need to scan the sample point by point, so a whole image cannot be formed in real time [1-2]. A new imaging concept called the superlens has recently received much attention for its ability to amplify evanescent waves with a negative-index material and project subdiffraction images without scanning [4-5], but the amplified image cannot be processed by conventional optics because it remains in the near field. Obtaining a direct optical far-field image that includes subwavelength features is of great importance in many applications. A device termed the optical far-field superlens (FSL) has been shown to achieve this goal, and one-dimensional (1D) subdiffraction images can consequently be reconstructed in the far field [6-7]. Most recently, building on the FSL, Xiong et al. theoretically demonstrated two-dimensional (2D) subdiffraction imaging by rotating a metamaterial FSL; there, however, the images are obtained by retrieving the angular spectrum of the field, so the scheme should not be regarded as direct imaging. Another exciting technology, the so-called optical hyperlens, also shows remarkable progress toward this goal [9-15], but all reported systems are limited to imaging 1D objects because they can only amplify linear objects oriented in a proper direction. Compared with metal-dielectric multilayer structures, metallic-wire-based structures have recently attracted tremendous research interest for their ability to achieve far-field subdiffraction imaging over a broad frequency band with low losses [16-18]. However, three-dimensional (3D) far-field subdiffraction imaging has not yet been reported.
In this article, on the basis of the hyperbolic dispersion relation of a strongly anisotropic medium, we design pyramid-shaped hyperlenses (PSHLs) for 3D subdiffraction imaging at optical wavelengths, and we demonstrate the feasibility of the systems for far-field superresolution imaging using finite-difference time-domain (FDTD) simulations. It should be noted that, compared with a hemispherical shape similar to a conventional lens, our proposed pyramidal shape may be more easily fabricated with current technology.
2. Hyperlens theory
Most recently, it was suggested that layered metal-dielectric media show a hyperbolic dispersion and can support the propagation of electromagnetic waves with very large wave vectors [9-10, 15]. If the thickness of each layer is much smaller than the light wavelength, we can treat this layered material as a uniform medium with effective dielectric constants [9-10, 15]

εθ = (εmd1 + εdd2)/(d1 + d2),  εr = εmεd(d1 + d2)/(εmd2 + εdd1),  (1)

where εm, εd, d1, and d2 are the electric permittivities and thicknesses of the metal and dielectric, respectively, and εθ and εr represent the permittivities of the layered metal-dielectric medium in the tangential and radial directions. The dispersion relation of an electromagnetic wave in the system is represented in cylindrical coordinates by

kr²/εθ + kθ²/εr = ω²/c²,

where kr and kθ are the wave vectors of light in the radial and tangential directions, and ω and c are the angular frequency and speed of light, respectively. Since noble metals usually have εm < 0 and dielectrics have εd > 0 at optical frequencies, their combination can result in either elliptic or hyperbolic dispersion. When εθ and εr are designed to have opposite signs, this dispersion relation represents a hyperbola and can support the propagation of light with high-spatial-frequency components, which carry the subwavelength features of objects [9, 10]. Figure 1 shows a scheme of this hyperbolic dispersion. One can see that, in this structure, the allowed wave vectors lie only in certain directions, within the region bounded by the asymptotes of the hyperbola. The Poynting vectors of light are not parallel to the wave vectors and lie within a cone bounded by the two extreme Poynting vectors, which are perpendicular to the two asymptotes. If the parameters εm, εd, d1, and d2 satisfy εmd1 + εdd2 → 0, all the Poynting vectors shown in Fig. 1 will be in the normal direction of the layered materials [10, 12, 19-21]. In other words, a field distribution in one plane will be transferred without distortion to arbitrary other planes along rays parallel to the axis of the layered media. Hence, hyperlenses of cylindrical or ridged structure can transfer amplified subwavelength features of objects to the far field via propagating waves [9-11].
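The effective-medium picture above can be checked numerically. The sketch below (illustrative only) evaluates the standard layered-media effective permittivities of Refs. [9-10, 15] for the Ag/dielectric parameters quoted later in the article (εm = −2.4012 + j0.2488 and εd = 2.4 at λ = 365 nm, d1 = d2 = 4 nm) and confirms both the near-zero tangential permittivity (canalization) and the opposite signs required for hyperbolic dispersion:

```python
# Effective-medium check for the Ag/dielectric stack at lambda = 365 nm.
# Permittivity values are taken from the text; the formulas are the
# standard layered-media effective permittivities (article Refs. [9-10, 15]).
eps_m = -2.4012 + 0.2488j   # Ag relative permittivity
eps_d = 2.4                 # dielectric relative permittivity
d1 = d2 = 4e-9              # layer thicknesses (m)

# Tangential and radial effective permittivities
eps_theta = (eps_m * d1 + eps_d * d2) / (d1 + d2)
eps_r = eps_m * eps_d * (d1 + d2) / (eps_m * d2 + eps_d * d1)

print(eps_theta)  # real part near zero: canalization condition eps_m*d1 + eps_d*d2 -> 0
print(eps_r)

# Opposite signs of Re(eps_theta) and Re(eps_r) give hyperbolic dispersion:
#   k_r**2 / eps_theta + k_theta**2 / eps_r = (omega / c)**2
hyperbolic = eps_theta.real * eps_r.real < 0
print(hyperbolic)  # True
```

With these values Re(εθ) ≈ −0.0006, so the chosen thicknesses indeed place the stack very close to the canalization regime.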
3. PSHLs structures and imaging properties
Figures 2(a)-2(b) illustrate the proposed PSHLs, constructed by alternately stacking planar Ag films (green) and dielectric films (yellow) [side view (a) of the PSHLs made of two symmetrical halves (b)] with d1 = d2 = 4 nm. Figures 2(c)-2(d) schematically show the spatial distribution of eight point sources (black spots) in a 3D domain and their image points (red spots); α (= ∠BAO) and β (= ∠AOA′) represent the slope angle and the dihedral angle of two opposite planes, respectively. It should be noted that, from the geometric relation ∠BAA′ = 90° − β/2 + α, one can see that if α > β/2, then ∠BAA′ exceeds 90°. In this case the shape shown in Fig. 2(c) will not be a pyramid as in Fig. 2(a), since its top is larger than its bottom. However, such a structure will not affect the magnification of the eight point sources shown in Figs. 2(c)-2(d). For simplicity, we will refer to our proposed structure as PSHLs in all cases. In our 3D FDTD simulations, the spatial and temporal steps are set to Δx = Δy = Δz = 2 nm and Δt = Δx/2c, the wavelength of the exciting light is λ = 365 nm, and the relative dielectric constants of the Ag and dielectric layers at this wavelength are εm = −2.4012 + j0.2488 and εd = 2.4, respectively.
We first consider the 2D subdiffraction imaging properties of the PSHLs when only four point sources 1, 3, 5, and 7 are placed on the opposite inner surfaces. Under the condition εmd1 + εdd2 → 0, their images 1′, 3′, 5′, and 7′ will be transferred to the oblique cut planes along the normal direction of the layered media. Considering points 1 and 5, their image separation at the oblique cut plane is l1′5′ = l15/cos(α). Since cos(α) < 1, the image separation can be amplified beyond the diffraction limit. The magnification (defined as l1′5′/l15) is P15 = 1/cos(α). For points 1 and 3, the image separation depends on β, l13, and l11′. From the geometry, one obtains the magnification P13 = l1′3′/l13 = 1 + 2l11′cos(β/2)/l13. A similar result for points 5 and 7 gives P57 = l5′7′/l57 = 1 + 2l55′cos(β/2)/l57. Figure 3(b) shows the simulated amplitude distribution of the magnetic field ∣Hz∣ of light emitted from points 1, 3, 5, and 7 on the output surfaces of the PSHLs for α = 45°, β = 120°, and lOA = 346 nm. In this case, we first specify lO1 = 230 nm and lO5 = 84 nm, from which we can derive the distances between the point sources: l15 = l57 = l37 = 146 nm ≈ 0.4λ, l13 = 398 nm ≈ 1.1λ, l55′ = l77′ = 262 nm ≈ 0.72λ, and l11′ = l33′ = 116 nm ≈ 0.32λ. The corresponding image separations on the output surfaces are amplified to l1′5′ = l3′7′ = 206 nm ≈ 0.56λ, l5′7′ = 408 nm ≈ 1.1λ, and l1′3′ = 514 nm ≈ 1.4λ, so the distances between two adjacent images are beyond the diffraction limit. The corresponding magnifications are P15 = P37 = 1.4, P57 = 2.8, and P13 = 1.3.
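The geometric magnification relations above can be checked directly against the quoted values. A minimal sketch (distances in nm, taken from the text for α = 45° and β = 120°):

```python
import math

# Verify the geometric magnifications quoted in the text
# for alpha = 45 deg, beta = 120 deg (all distances in nm).
alpha = math.radians(45)
beta = math.radians(120)

l_15, l_13, l_57 = 146.0, 398.0, 146.0
l_11p, l_55p = 116.0, 262.0   # object-to-image distances l_11' and l_55'

P_15 = 1.0 / math.cos(alpha)                          # same-surface pair (1, 5)
P_13 = 1.0 + 2.0 * l_11p * math.cos(beta / 2) / l_13  # opposite-surface pair (1, 3)
P_57 = 1.0 + 2.0 * l_55p * math.cos(beta / 2) / l_57  # opposite-surface pair (5, 7)

print(round(P_15, 1), round(P_13, 1), round(P_57, 1))  # 1.4 1.3 2.8
print(round(l_15 * P_15))                              # l_1'5' = 206 nm
```

The computed values reproduce the magnifications (1.4, 1.3, 2.8) and the amplified separation l1′5′ ≈ 206 nm reported in the text.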
Considering the other four point sources 1, 2, 3, and 4, placed on a plane perpendicular to the one containing points 1, 3, 5, and 7 [see Figs. 2(c)-2(d)], we present in Fig. 3(c) the calculated amplitude distributions of the magnetic field ∣Hz∣ of the four images on the output surfaces of the system. Here we set lO1 = 98 nm while all other parameters are the same as those used in Fig. 3(b); the remaining distances between the point sources are then l12 = l23 = l34 = l41 = 120 nm ≈ λ/3 and l11′ = 248 nm ≈ 0.68λ. One can see that the amplified image spots are symmetrically distributed on the different output surfaces. The separation between two adjacent image points is l1′2′ = l2′3′ = l3′4′ = l4′1′ = 295 nm ≈ 0.8λ, corresponding to a magnification factor of P12 = P23 = P34 = P41 ≈ 2.5. Similarly, the other four symmetrical point sources 5, 6, 7, and 8 can also be imaged by the system with amplified separations.
From the above results, we deduce that the PSHLs can realize 3D subdiffraction imaging. Figures 4(a)-4(b) show the side and top views of the calculated gray-scale distributions (∣Hz∣) of the images on the output surfaces when the eight point sources 1-8, with nanoscale separations, are distributed in a 3D domain [Figs. 2(c)-2(d)]. In the calculations, we choose lO1 = 264 nm and lO5 = 119 nm while all other parameters are the same as those used in Fig. 3(a). The distances between the point sources are then l12 = 324 nm ≈ 0.89λ, l56 = l15 = 146 nm ≈ 0.4λ, l11′ = 82 nm ≈ 0.25λ, and l55′ = 228 nm ≈ 0.62λ. From the figure, one can see that the eight image spots are symmetrically distributed on the corresponding output surfaces. The separations between two adjacent images are l1′2′ = 406 nm ≈ 1.1λ, l1′5′ = 206 nm ≈ 0.56λ, and l5′6′ = 306 nm ≈ 0.84λ, corresponding to magnification factors of P12 = 1.3, P56 = 2.1, and P15 = 1.4, respectively.
From the geometry of the PSHLs, we can deduce the following relations:
Then, we can get the corresponding magnifications from Eq. (2):
From Eq. (3), we can see that the magnifications between nearest-neighbor point sources differ. One case is that of two adjacent point sources distributed on the same inner surface (for instance, points 1 and 5); the magnification is then P15 = 1/cos(α), dependent only on α, so a larger magnification requires a larger α. The other case is that of two adjacent point sources located on neighboring inner surfaces (for instance, points 1 and 2, or 5 and 6); the magnification P12 (P56) then depends on lOA, lO1 (lO5), α, and β. To obtain a larger magnification P12 (P56), one has to increase lOA or α, or decrease β or lO1 (lO5).
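The dependence of the same-surface magnification on the slope angle is easy to tabulate; the short sketch below illustrates how P15 = 1/cos(α) grows with α (a purely numerical illustration of the trend stated above):

```python
import math

# P_15 = 1/cos(alpha): the same-surface magnification grows with slope angle.
for deg in (30, 45, 60, 75):
    print(deg, round(1.0 / math.cos(math.radians(deg)), 2))
# 30 -> 1.15, 45 -> 1.41, 60 -> 2.0, 75 -> 3.86
```

In practice α cannot be made arbitrarily large, since (as discussed in the resolution analysis below) the geometry also controls cross-talk between adjacent image points.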
Figures 4(c)-4(d) respectively show the side and top views of the simulated gray-scale distributions (∣Hz∣) of light emitted from the eight point sources, with the same 3D distribution as in Figs. 4(a)-4(b), when the PSHLs are removed. The images of two nearest-neighbor point sources on the same output surfaces merge into a single light spot and cannot be completely resolved.
4. Resolution of PSHLs
To estimate the resolving power of the 3D imaging system, we have to consider the cross-talk between adjacent objects. For example, as the distance between point sources 1 and 5 becomes smaller, the light intensity at image point 5′ will include a contribution from source 1, which may degrade the resolution of the system. We can calculate the field distribution of light on the image surface of the PSHLs when only one point source is placed on the inner surface of the system. We employ the optical modulation MAB = (IB − IA)/(IA + IB) to represent the image contrast, where IB (IA) represents the light intensity at image point B′ when only B (A) is placed on the inner surface of the system. We obtain an image contrast M51 of about 0.95 when points 1 and 5 are simultaneously placed on the input surface of the PSHLs, as in the case of Fig. 4(a) but without point sources 2-4 and 6-8. As point source 1 moves toward 5, I1 becomes stronger and hence M51 decreases monotonically. Similarly, we can calculate the optical modulations M56 and M57 by the same method. Figures 5(a)-5(c) show the dependence of the optical modulations M51, M56, and M57 and the image distances l5′1′, l5′6′, and l5′7′ on the object distances l51, l56, and l57 as β changes from 80° (—) to 100° (– –) and 120° (– ∙ –) with α = 45°, respectively. One can see that the optical modulation decreases monotonically with decreasing object distance. For example, as l51 changes to 40 nm and 30 nm, the corresponding M51 is about 0.45 and 0.21, respectively. From the optical modulation criterion [14, 23], we see that when two point sources are simultaneously placed at positions 1 and 5 of the inner surface of the system, the images 1′ and 5′ can be clearly distinguished even when the distance between the two object points is as small as l51 = 30 nm.
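The behavior of the modulation MAB = (IB − IA)/(IA + IB) can be illustrated with a simple toy model. The sketch below is not the FDTD field, but an assumed Gaussian image spot of width w: IA is the tail of spot A′ evaluated at B′, and IB is the peak of B′. It reproduces the qualitative trend described above (M → 1 for well-separated spots, M → 0 as they approach):

```python
import numpy as np

# Illustrative cross-talk model (NOT the simulated FDTD fields): each image
# is modeled as a Gaussian spot of width w_nm. I_B is the peak of spot B' at
# its own center; I_A is the tail of spot A' leaking onto B'.
def modulation(separation_nm, w_nm=100.0):
    I_B = 1.0                                    # peak intensity of B'
    I_A = np.exp(-(separation_nm / w_nm) ** 2)   # cross-talk from A' at B'
    return (I_B - I_A) / (I_B + I_A)

for s in (300, 200, 100, 50):
    print(s, round(modulation(s), 2))
# the modulation decreases monotonically as the spots approach
```

The spot width w_nm is a free parameter of this toy model; in the article the actual modulation curves are extracted from the simulated |Hz| distributions of Figs. 5(a)-5(c).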
Figures 5(d)-5(f) show the dependence of the optical modulations M51, M56, and M57 and the image distances l5′1′, l5′6′, and l5′7′ on the object distances l51, l56, and l57 for β = 90° as α changes from 60° (—) to 45° (– –) and 30° (– • –), respectively. It can be seen that the cross-talk between nearest-neighbor point sources increases monotonically as the object distances decrease, which means that there exists a resolution limit for each pair of nearest-neighbor point sources. For instance, with β and α set to 120° and 45°, respectively, the resolution limit of the system for points 5 and 6 is about 35 nm, whereas that for points 1 and 5 is about 30 nm, because an optical modulation of less than 0.2 is too small to be distinguishable [14, 23]. In addition, the image separation depends linearly on the corresponding object separation, which is consistent with the magnification formulas given above.
5. Imaging range of object points
In the above sections, we have only considered point sources placed on the inner surface of the PSHLs. From an application point of view, however, the point sources may lie away from the inner surface, in which case the resolution of the system may differ. In the following, we consider the effect of the distance between the sources and the inner surface on the resolution by arranging the eight point sources in other 3D structures. First, considering the case of Figs. 4(a)-4(b) in which points 5, 6, 7, and 8 are fixed while points 1, 2, 3, and 4 depart vertically and equidistantly from their respective inner surfaces, we show in Figs. 6(a)-6(b) the calculated gray-scale distributions (|Hz|) of light from the sources on the output surface of the PSHLs and the corresponding profiles of |H|² along the straight line 1′5′ when points 1, 2, 3, and 4 are 20 nm away from the inner surface (l12 = l23 = l34 = l41 = 297 nm > 140 nm). From the figure, we can see that the images of the eight point sources are clearly distinguishable. The larger the distances of points 1, 2, 3, and 4 from their respective inner surfaces, the weaker the light intensities of the images at 1′, 2′, 3′, and 4′. Figures 6(c)-6(d) show the corresponding gray-scale distributions of ∣Hz∣ on the output surface of the PSHLs and the profiles of ∣H∣² along the straight line 1′5′ when points 1, 2, 3, and 4 are 70 nm away from their respective inner surfaces (l12 = l23 = l34 = l41 = 228 nm > 140 nm). Obviously, in this case, the images of sources 1 and 5 cannot be distinguished by the system, since the ratio of the saddle intensity to the left-peak intensity [see Fig. 6(d)] is 0.85, larger than the Rayleigh criterion of 0.81. This means that, when the distances between points 1, 2, 3, and 4 and the inner surface exceed 70 nm, the eight point sources cannot be independently imaged by the system.
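The saddle-to-peak test used above can be sketched with the same kind of toy spot model as before. The code below (illustrative only, not the simulated |H|² profiles) sums two equal Gaussian spots separated by d and compares the midpoint (saddle) intensity with the peak intensity against the 0.81 Rayleigh-type threshold quoted in the text:

```python
import numpy as np

# Saddle-to-peak criterion: two equal Gaussian spots separated by d_nm are
# "resolved" when the midpoint intensity falls below 0.81 of the peak.
# Illustrative spot model only; the article applies this test to FDTD data.
def saddle_to_peak(d_nm, w_nm=100.0):
    x = np.linspace(-400.0, 400.0, 4001)
    I = (np.exp(-((x - d_nm / 2) / w_nm) ** 2)
         + np.exp(-((x + d_nm / 2) / w_nm) ** 2))
    return I[len(x) // 2] / I.max()   # saddle intensity / peak intensity

print(round(saddle_to_peak(250), 2))  # well separated: ratio below 0.81, resolved
print(round(saddle_to_peak(120), 2))  # close pair: single merged peak, unresolved
```

For this model, spots closer than about sqrt(2)·w merge into a single maximum (ratio 1.0), while widely separated spots give a deep saddle, mirroring the resolved/unresolved cases of Figs. 6(b) and 6(d).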
Keeping the eight point sources distributed as a regular hexahedron with side length 140 nm but gradually reducing β from 120° to 60°, the distance between points 1, 2, 3, and 4 and their respective inner surfaces of the PSHLs becomes monotonically smaller. As a result, the light intensity at image points 1′, 2′, 3′, and 4′ becomes stronger. Figures 7(b)-7(c) show the gray-scale distributions (∣Hz∣) of the images on one of the output surfaces of the PSHLs and the corresponding profiles of ∣H∣² along the straight line 1′5′ when β is set to 80° while all other parameters are the same as those used in Fig. 4. From the figures, we see that the images of sources 1 and 5 cannot be distinguished by the system. Figures 7(d)-7(e) show the gray-scale distributions (∣Hz∣) of the images on one of the output surfaces of the PSHLs and the corresponding profile of the ∣H∣² intensity distribution along the straight line 1′5′ when β changes to 60°. The result shows that image points 1′ and 5′ can be clearly distinguished. This means that, by choosing a proper β, the proposed system can resolve eight point sources distributed in various 3D structures.
The above results indicate that the proposed PSHLs can indeed resolve eight point sources with nanoscale separations distributed in various 3D structures. However, the system has some limitations. First, it does not achieve fully general 3D subdiffraction imaging, since it can resolve only a limited number of point sources in space. For instance, when point sources 1 and 5 are placed as schematically shown in Fig. 8, where the line connecting 1 and 5 is perpendicular to line OA, the distance between image points 1′ and 5′ equals that between points 1 and 5, implying that points 1 and 5 cannot be resolved in the far field. In contrast, a spherical hyperlens may truly realize 3D subdiffraction imaging, since it may magnify many more point sources simultaneously. Second, a limitation of hyperlenses based on metal-dielectric media is the fixed working wavelength for a given system, because εm, εd, d1, and d2 can satisfy εmd1 + εdd2 → 0 only at a fixed wavelength. In this article, we chose the working wavelength λ = 365 nm because far-field subdiffraction imaging based on metal-dielectric media at this wavelength has recently been experimentally demonstrated. However, our proposed structure can also work at other wavelengths, especially visible wavelengths, which is beneficial for bioimaging.
In conclusion, we have designed PSHLs for 3D far-field subdiffraction imaging. FDTD simulations of the imaging properties demonstrate that eight point sources with subdiffraction separations in a 3D domain can be magnified to separations that can be further manipulated by conventional far-field optics. These 3D subdiffraction imaging capabilities give the PSHLs great potential in nanoscale optical imaging, real-time biomolecular imaging, sensing, and nanolithography.
This work is supported by the National Basic Research Program (Grant 2007CB935300) and the National Natural Science Foundation of China (Grants 60736041 and 10774116).
References and links
1. E. A. Ash and G. Nicholls, “Super-resolution Aperture Scanning Microscope,” Nature (London) 237, 510 (1972). [CrossRef]
2. J. Koglin, U. C. Fischer, and H. Fuchs, “Material contrast in scanning near-field optical microscopy at 1–10 nm resolution,” Phys. Rev. B 55, 7977 (1997). [CrossRef]
6. S. Durant, Z. Liu, J. Steele, and X. Zhang, “Theory of the transmission properties of an optical far-field superlens for imaging beyond the diffraction limit,” J. Opt. Soc. Am. B 23, 2383 (2006). [CrossRef]
10. A. Salandrino and N. Engheta, “Far-field subdiffraction optical microscopy using metamaterial crystals: Theory and simulations,” Phys. Rev. B 74, 075103 (2006). [CrossRef]
12. Z. Jacob, L. V. Alekseyev, and E. Narimanov, “Semiclassical theory of the hyperlens,” J. Opt. Soc. Am. A 24, A52 (2007). [CrossRef]
14. L. Chen, X. Y. Zhou, and G. P. Wang, “V-shaped metal-dielectric multilayers for far-field subdiffraction imaging,” Appl. Phys. B 92, 127 (2008). [CrossRef]
15. J. G. Hu, P. Wang, Y. H. Lu, H. Ming, C. C. Chen, and J. X. Chen, “Sub-diffraction-Limit Imaging in Optical Hyperlens,” Chin. Phys. Lett. 25, 4439 (2008). [CrossRef]
16. P. Ikonen, C. Simovski, S. Tretyakov, P. Belov, and Y. Hao, “Magnification of subwavelength field distributions at microwave frequencies using a wire medium slab operating in the canalization regime,” Appl. Phys. Lett. 91, 104102 (2007). [CrossRef]
18. G. Shvets, S. Trendafilov, J. B. Pendry, and A. Sarychev, “Guiding, Focusing, and Sensing on the Subwavelength Scale Using Metallic Wire Arrays,” Phys. Rev. Lett. 99, 053903 (2007). [CrossRef] [PubMed]
20. D. Schurig and D. R. Smith, “Sub-diffraction imaging with compensating bilayers,” New J. Phys. 7, 162 (2005). [CrossRef]
21. P. A. Belov and Y. Hao, “Subwavelength imaging at optical frequencies using a transmission device formed by a periodic layered metal-dielectric structure operating in the canalization regime,” Phys. Rev. B 73, 113110 (2006). [CrossRef]
22. P. B. Johnson and R. W. Christy, “Optical Constants of the Noble Metals,” Phys. Rev. B 6, 4370 (1972). [CrossRef]
23. J. R. Meyer-Arendt, Introduction to Classical and Modern Optics, 2nd ed., pp. 208–211.
24. M. Born and E. Wolf, Principles of Optics (6th edition, Pergamon, Oxford, 1980).