Photorealistic ray tracing methods have been developed that show how devices such as imperfect invisible spheres and invisibility cloaks would appear if actually constructed and placed in outdoor environments. The methods allow photorealistic depiction of devices with gradient indices of refraction and with birefringence or trirefringence in non-Cartesian coordinate systems (and hence accurately handle ray splitting and beam walkoff). The resulting images, which, as will be shown, can be rendered in real time to produce animations, allow subjective assessment of the performance of optical instruments such as invisibility devices in the environments in which they are ultimately intended to be used.
©2010 Optical Society of America
When many people envision an invisibility device such as an optical cloak in action, outdoor use is often imagined. Futuristic stealth vehicles, invisible aircraft, or even personal camouflage used on the ground are concepts that fire the imagination. Current metamaterial technology is far from these goals, but it is a worthwhile endeavor to show just how far by depicting how such devices would look if they could actually be built in the future versus how they would look if built with the best of today's materials. In this letter, photorealistic depictions of such metamaterial-based optical instruments in action are presented. It is not just ideal operation with ideal (and not-yet-invented) materials that is depicted, but also operation with imperfect, realistic materials that could potentially be fabricated. To the best of the author's knowledge, this is the first report of photorealistic ray tracing that can not only handle anisotropic (birefringent) materials in non-Cartesian coordinate systems with spatially gradient permittivity profiles, but can also incorporate real camera footage and scenery. Fast methods that allow real-time ray tracing and the creation of animations incorporating the above-mentioned effects are also discussed.
This letter is organized into two sections that describe the ray tracing methods. The first describes ray tracing in an outdoor scene involving constant-index (n < 1) materials and three-dimensional (3D) wireframe object models; this is essentially background material, but with stunning and pedagogically useful results. The second section extends the technique to gradient index materials with anisotropy and beam walkoff in order to properly depict a spherically birefringent invisible object designed with transformation optics.
2. 3D ray tracing with n < 1 objects in a natural environment
3D ray tracing is not at all new. It is routinely used to create special effects in cinematography, in video games, and in graphics design. A beautiful, photorealistic example of a set of glasses filled with “water” and “negative-index water” was demonstrated in 2006 with use of the free software known as POV-Ray. That software allows accurate photon-mapped ray tracing through solid objects of constant refractive index, even if the index is below 1. With realistic metamaterials, however, dispersion is often present, and POV-Ray unfortunately permits neither customized dispersion curves nor ray tracing through gradient index objects. It cannot handle anisotropy or polarized ray tracing. Its interface is limited and it is not commonly used professionally, but the results of that work showed that what is possible can be stunning. Inspired by that previous work, the author has created a tool that is integrable with graphics software (such as Maya, V-Ray, and 3D Studio Max) well known to graphics designers and building architects. One resulting capability is that 3D architectural models could potentially incorporate negative-index materials, or materials with n < 1, in a natural environment. The physics is not new; this was merely a programming task, but the resulting images are interesting and form a suitable introduction to the more sophisticated methods described in the next section.
Figure 1 illustrates the case of a swimming pool filled with water of various “indices”. It is pedagogically useful to see what “negative-index water” would look like, and dark lines have been added to Fig. 1(d) to show the location of the bottom inside edges of the swimming pool, which would never be visible without a negative-index material. Indeed, it is clear that if a person were swimming in such a pool, the swimmer's body would appear to glide above the plane of the pool in an illusion. Figure 1(c) shows the situation of total external reflection; the artificial sky is visible in the reflection from the pool's surface. In this case light was assumed to be unpolarized, so an average of the Fresnel curves for both polarizations was used to calculate the reflectivity at the water/air interface as a function of incidence angle. Pseudorandom bump maps create the waves and ripples on the surface of the water.
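The unpolarized reflectivity lookup described above can be sketched as follows. This is a minimal illustration under the same assumptions (a planar interface, incidence from air), not the renderer's actual code:

```python
import numpy as np

def unpolarized_reflectance(n1, n2, theta_i):
    """Average of the s- and p-polarized Fresnel power reflectances
    for a ray striking a planar n1/n2 interface at angle theta_i (radians)."""
    s = n1 * np.sin(theta_i) / n2
    if s >= 1.0:                        # beyond the critical angle: total reflection
        return 1.0
    theta_t = np.arcsin(s)              # Snell's law
    ci, ct = np.cos(theta_i), np.cos(theta_t)
    rs = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)   # s-polarized amplitude
    rp = (n2 * ci - n1 * ct) / (n2 * ci + n1 * ct)   # p-polarized amplitude
    return 0.5 * (rs**2 + rp**2)        # unpolarized light: average the two powers

# Normal incidence on ordinary water (n = 1.33): about 2% reflected.
print(unpolarized_reflectance(1.0, 1.33, 0.0))
# "Total external reflection" for an n < 1 medium: beyond the critical
# angle (here 30 degrees), seen from the air side everything reflects.
print(unpolarized_reflectance(1.0, 0.5, np.radians(60.0)))
```

The same curve, tabulated once per medium, is all a renderer needs to weight the reflected and refracted contributions at each surface hit.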
3. Photorealistic ray tracing in anisotropic, gradient-permittivity materials
Gradient index 3D ray tracing has been demonstrated in a few situations, such as Eaton lenses and carpet cloaks, and probably in a few specific cases involving architectural fiber optic lighting. (The commercially available software Zemax can handle gradient index ray tracing in a few specific cases, e.g. in certain optical fiber index profiles.) In this study, we build upon previous methods to create animations of optical instruments in outdoor situations while handling polarized ray tracing in anisotropic, gradient-permittivity devices. Because of its symmetry, an invisible sphere provides a useful device with which to describe polarized ray tracing methods. The index profile for a non-trivial invisible sphere of unity radius, in which all ray trajectories orbit the origin once, is given by Eq. (1), where the index n is a function of the radius r; Eq. (1) can be derived from mechanical analogues of the eikonal equation [4–6]. A problematic feature of this particular optical instrument is the singularity at the origin r = 0, where the index n approaches infinity. That singularity can be transmuted by introducing a small amount of birefringence near the origin, as described in detail in refs [6,7]. Creating birefringence would ordinarily result in ray splitting, but this can be avoided by forcing every component i of the permeability to equal the corresponding component of the permittivity (μ_i = ε_i), which is the procedure followed there. Since the resulting structure would behave exactly like the original structure (invisible) when viewed from the outside, it is not a particularly worthwhile object to study with photorealistic ray tracing. Additionally, although the singularity would be transmuted in that case, the device itself would be far from feasible to build because of the required magnetic response.
To create a device more amenable to potential fabrication at optical wavelengths, correct behavior for one of the two incident polarizations can be sacrificed in order to obtain an all-dielectric device. This is possible because one polarization (TM) along any plane passing through the sphere's origin is unaffected by nonzero μ_r, μ_θ, and ε_φ, which can therefore be defined at will. The other three components of permeability and permittivity (ε_r, ε_θ, and μ_φ) are then rescaled so that μ_φ = 1, following the same procedure but for the opposite polarization and without any resulting surface reflections. (An earlier experiment demonstrated operation of an Eaton lens at microwave frequencies and therefore selected a different polarization.) After the above transmutation, optimization, and rescaling operations are carried out, the result is that ε_r, ε_θ, and ε_φ are simply the squares of the corresponding components of the following modified refractive index tensor in spherical coordinates, where n(r) is defined in Eq. (1):

n_r = n(R) dR/dr,   n_θ = n_φ = n(R) R/r.   (2)

The function R(r) in Eq. (2) is the transmuting function, the construction of which is described in refs [6,7]. It is possible to choose an appropriate function R(r) such that the singularity is transmuted and the resulting index is always finite and greater than 1 in all components; for the invisible sphere, a possible such function is given in Eq. (3) [7]. This is a device that could potentially be fabricated with existing materials, the unavoidable (but finite) anisotropy at the origin notwithstanding. Forcing the use of dielectric materials as described disturbs the ray trajectories for one polarization, as illustrated in Fig. 2.
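At the level of geometrical optics, the reason the TM trajectories survive an all-dielectric rescaling can be sketched as follows (a sketch assuming the standard TM reduction for a ray confined to a plane through the origin, with fields H_φ, E_r, E_θ; this reasoning is not reproduced from refs [6,7]):

```latex
\frac{k_r^{2}}{\varepsilon_{\theta}\,\mu_{\varphi}}
+ \frac{k_{\theta}^{2}}{\varepsilon_{r}\,\mu_{\varphi}}
= \frac{\omega^{2}}{c^{2}} .
```

TM rays therefore depend only on the products ε_r μ_φ and ε_θ μ_φ, so the substitution ε_r → ε_r μ_φ, ε_θ → ε_θ μ_φ, μ_φ → 1 leaves those trajectories unchanged while making the medium purely dielectric.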
The ray trajectories shown in Fig. 2 were calculated by integrating Hamilton's equations of motion for each polarized ray. To place this object in a natural environment with photorealistic quality, a large sample of ray trajectories for the disturbed polarization corresponding to Fig. 2(c) was pre-calculated with software capable of numerical integration, and an empirical function was created that relates the impact parameter of any ray impinging on the surface of the sphere to its appropriate turning angle. The “impact parameter” b refers to the shortest distance between the sphere's origin and the straight line along which a ray impinges. The “turning angle” refers to the amount each ray is turned about the origin. In Fig. 2(b), the turning angle is 360° for every ray shown, each with a different impact parameter. Once the empirical relationship is established for Fig. 2(c), internal Hamiltonian ray calculations are no longer needed, since the macroscopic behavior (ray input vs. ray output) is known, and sufficient data allow reasonable interpolation for uncalculated impact parameters.
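The Hamiltonian integration used for the pre-calculation can be illustrated with a short script. This is a hedged sketch rather than the author's actual code: since the invisible-sphere profile of Eq. (1) is implicit, the Luneburg profile n² = 2 − r² is assumed here purely as a testable stand-in. With the Hamiltonian H = (|p|² − n²)/2, Hamilton's equations reduce to dx/dσ = p, dp/dσ = ∇(n²)/2, and every ray entering a Luneburg lens parallel to an axis should focus at the rim point on the far side:

```python
import numpy as np

def trace_ray(x0, p0, grad_n2, sigma_end, steps=2000):
    """Integrate Hamilton's equations dx/ds = p, dp/ds = grad(n^2)/2
    (from H = (|p|^2 - n(x)^2)/2, with H = 0 along a physical ray)
    using the classic fourth-order Runge-Kutta scheme."""
    h = sigma_end / steps
    s = np.concatenate([np.asarray(x0, float), np.asarray(p0, float)])

    def rhs(state):
        x, p = state[:2], state[2:]
        return np.concatenate([p, 0.5 * grad_n2(x)])

    for _ in range(steps):
        k1 = rhs(s)
        k2 = rhs(s + 0.5 * h * k1)
        k3 = rhs(s + 0.5 * h * k2)
        k4 = rhs(s + h * k3)
        s = s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return s[:2], s[2:]

# Assumed test profile: Luneburg lens, n^2 = 2 - r^2, so grad(n^2) = -2x.
grad_n2 = lambda x: -2.0 * x

b = 0.5                                    # impact parameter of the test ray
x0 = np.array([-np.sqrt(1 - b**2), b])     # entry point on the unit sphere
p0 = np.array([1.0, 0.0])                  # |p| = n = 1 at the boundary
x_end, p_end = trace_ray(x0, p0, grad_n2, sigma_end=np.pi / 2)
print(x_end)   # lands at the rim focus (1, 0) regardless of b
```

Repeating such runs over many impact parameters yields exactly the kind of table from which the empirical impact-parameter-to-turning-angle function can be interpolated.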
Once the empirical relationship between impact parameter and turning angle is established, ray tracing proceeds as follows. Rays are traced in a time-reversed way from a virtual camera, fanning outwards through a view frustum. Rays that intersect the background scene need no further action, and the pixel is assigned the appropriate color. In cases where rays intersect the curved surface of the invisible sphere, the impact parameter is calculated and then translated to a turning angle. The turning angle is applied in the plane containing the camera, the impact point, and the sphere center in order to find the ray exit point. The exit ray then intersects some point of the background, which determines the pixel color to assign at the view frustum. This process is repeated for each polarization and the per-pixel results averaged. Because numerical integration is required only once, to establish the empirical relationship, ray tracing is very fast and can in fact be done in real time to create on-the-fly animations of the invisible sphere discussed.
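The per-ray lookup step can be sketched in a few lines of vector algebra. The helper below is hypothetical (it is not the author's renderer, and `turning_angle(b)` stands in for the precomputed interpolation); it assumes one convention in which the turning angle is the total angle swept about the center, so the deflection is measured relative to the straight chord, which itself sweeps 2·arccos(b/R):

```python
import numpy as np

def deflect_ray(o, d, center, radius, turning_angle):
    """Return the exit direction for a ray (origin o, unit direction d)
    hitting a spherically symmetric device of the given radius, using a
    precomputed turning-angle-vs-impact-parameter function."""
    o = np.asarray(o, float)
    d = np.asarray(d, float)              # must be unit length
    oc = np.asarray(center, float) - o
    t = np.dot(oc, d)
    if t <= 0.0:
        return d                          # device is behind the ray
    m = oc - t * d                        # from closest-approach point to center
    b = np.linalg.norm(m)                 # impact parameter
    if b >= radius:
        return d                          # ray misses the device
    if b < 1e-12:
        return d                          # central ray: treated as undeflected here
    m_hat = m / b
    # deflection relative to the straight chord through the sphere
    chi = turning_angle(b) - 2.0 * np.arccos(b / radius)
    # rotate d by chi toward the center, within the plane spanned by d and m_hat
    return np.cos(chi) * d + np.sin(chi) * m_hat

# Sanity check: if the turning angle equals the straight-chord sweep
# 2*arccos(b/R), the ray should exit undeflected.
d = np.array([1.0, 0.0, 0.0])
out = deflect_ray([-5.0, 0.5, 0.0], d, [0.0, 0.0, 0.0], 1.0,
                  lambda b: 2.0 * np.arccos(b))
print(out)   # equals d: zero deflection
```

Per pixel this costs one dot product, one norm, and one table lookup, which is what makes the real-time rendering described above plausible.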
Figure 3 shows a background outdoor scene (formed into a panorama that surrounds the virtual camera) in Singapore’s Chinese Garden and the same scene with the modified invisible sphere. It is evident that the sphere is really only “half” invisible because the trajectories are correct for only one polarization in the dielectric-only device. The resulting animation/image capture in Fig. 3(b) is convincingly photorealistic. It is conceivable that it could actually be built and be broadband as shown (1 < n < 9.46 for all components). Further optimization of the transmuting function would not be able to yield anything better than 1 < n < 3 for all components, but even so this appears within practical reach. The scene in Fig. 3 consists only of the invisible sphere and the background panorama. No 3D object models are included, unlike the scene in Fig. 1.
4. How would these figures compare to real photographs?
The camera is assumed to be a pinhole camera, infinitely small, with no diffraction effects. This gives an infinite depth of focus in Fig. 1 and Fig. 3. There is also a parallax error in Fig. 3(b), caused by the way the panorama was created: a large number (about 90) of photographs were taken from a camera mounted on a tripod and digitally stitched together to create the panorama that surrounds the virtual camera [10]. Depth information was not recorded. As long as the separation between the virtual camera and the invisible sphere is small relative to the distance separating either of them from the nearest object in the background, the parallax error is negligible. This is why a scene with faraway objects was chosen rather than, for example, an interior room view with protruding features. Since the objects in Fig. 1 are truly 3D objects in the model, such parallax errors would not occur there, but the additional computational load of the 3D vertex calculations (instead of a pseudo-2D surface lookup as in Fig. 3) would probably preclude real-time rendering on current processors. In addition, if objects such as mirrors were included near the invisible sphere, or if realistic lighting models or dispersion effects were to be included, photons would have to be traced in the opposite direction (from source to camera) so that brightness is properly determined. Since many photons would not reach the camera, significantly higher computational resources would be required to capture enough of them at the view frustum to achieve photorealism. Nevertheless, the real-time rendering method described here can handle many interesting devices, such as proposed optical cloaking devices [11,12]. Once materials feasible enough to fabricate such structures are designed or discovered, how those structures would behave in a natural environment could be characterized with this software.
Real time, photorealistic ray tracing of metamaterial-based devices in natural environments has been demonstrated. When assessing the capabilities of invisibility devices such as the modified invisible sphere here, i.e. devices that can potentially be constructed with materials that are within practical reach, it would be worthwhile to see how the devices actually look and behave in realistic environments.
Artwork contributions by D. S. Hanggoro and T. Ngo and a grant from the Singapore Ministry of Education/National University of Singapore (R263000414112/133) are acknowledged.
References and links
2. A. J. Danner, “Photorealistic rendering of metamaterials, gradient index devices, and polarization-dependent invisibility cloaks,” presented at Frontiers in Optics (Optical Society of America Annual Meeting), San Jose, California, 11–15 October, 2009.
6. T. Tyc and U. Leonhardt, “Transmutation of singularities in optical instruments,” New J. Phys. 10, 1–8 (2009).
7. A. J. Danner and U. Leonhardt, “Lossless design of an Eaton lens and invisible sphere by transformation optics with no bandwidth limitation,” presented at the Conference on Lasers and Electro-optics (CLEO), Baltimore, Maryland, 1–5 June 2009.
10. M. Brown and D. Lowe, “Automatic panoramic image stitching using invariant features,” Int. J. Comput. Vis. 74(1), 59–73 (2007).
11. U. Leonhardt and T. Tyc, “Broadband invisibility by non-Euclidean cloaking,” Science 323(5910), 110–112 (2009).