The field of near-eye see-through devices has recently received significant media attention and financial investment. However, the devices demonstrated to date suffer from significant practical limitations resulting from the conventional optics on which they are based. Potential manufacturers seek to overcome these limitations using novel optical schemes. In this paper, we propose such a potentially disruptive optical technology for this application. Conceptually, our optical scheme is situated at the interface of geometric incoherent refractive imaging and radiative coherent diffractive imaging. The generation of an image occurs as a result of data transmission through a two-dimensional network of optical waveguides that addresses a distribution of switchable holographic elements. The device acts as a wavefront generator, and the eye is the only optical system in which the image is formed. In the following, we describe the device concept and characteristics, as well as the results of initial simulations.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. INTRODUCTION
The fields of augmented reality (AR), mixed reality (MR), and virtual reality (VR) have recently been subject to renewed interest due to the large opportunities offered by smartphone applications. As an extension of this personal, everyday device, smart glasses could allow users to interact directly with their favorite applications without looking at or touching a screen. Such smart glasses could generate new applications in relation to the surrounding analog world (AR) or to a digitized world, real or virtual (MR or VR).
To support and anticipate customer expectations, smart glass concepts and devices have been proposed with the help of graphic designers. Impressive marketing material has created a discrepancy between customer perception and the actual technological capabilities of these devices. A thin, light, aesthetically pleasing pair of glasses with low power consumption showing a bright and contrasted image with high resolution and a wide field of view is unfortunately still a dream.
Most development on smart glasses for AR applications is based on a conventional imaging scheme built on the following steps:
- (1) Sensing: perception of the surrounding environment by sensors [camera].
- (2) Processing: computation of the digital image in relation to the surrounding field of view [micro-processor].
- (3) Generating: a display creates the analog image [microdisplay].
- (4) Transforming: the real image is transformed into a virtual image, seen at a large distance by the viewer [optical system].
- (5) Propagating: the photons produced in the image creation process are brought to the eye [free-space or waveguide propagation].
- (6) Combining: the virtual image is superimposed on to the surrounding scenery [semireflective, grating, or holographic elements].
These technological functions are difficult to integrate in a compact way, and most of the devices produced for AR applications are still closer to a smart helmet than to smart glasses.
Based on this analysis, we have sought an unconventional design that takes as its starting point the idealized image of lightweight, discreet smart glasses presented to consumers, and that uses alternative technologies to achieve a thin, light, and bright see-through device.
We found that the difficulties encountered in the optical system design are due to steps 3 to 6, which concern the manipulation of the image from the display to the eye. To circumvent these difficulties, an obvious solution is to directly emit, in front of the eye, the wavefronts related to the image. This led us to develop a new kind of display that mixes integrated photonics and digitalized holography [1,2]: integrated photonics brings light from the light sources to the eye as a data transfer system, and holography transforms these data into the wavefronts of the image to be projected on the retina.
In Section 2, we describe our display concept. In Section 3, we introduce the concept of the self-focusing effect allowing image formation on the retina. We then present the two main technological constituents of the system in Sections 4 and 5: the integrated photonics light distribution and the digitalized holography, respectively. Sections 6 and 7 describe estimates of the imaging properties and device power consumption, respectively.
2. GENERAL CONCEPT
Image formation in the eye is described in Fig. 1(a): the eye transforms a wavefront coming from the source image and focuses it on the retina. The coordinates of the resulting image point on the retina are given by the angle of the wavefront that reaches the cornea.
A conventional display emits light at the level of the image pixels. In a near-eye configuration the spherical wavefront generated by each pixel has a curvature that the eye is not able to correct [Fig. 1(b)]. Even if a wavefront has the correct angular orientation, the resulting image point on the retina is blurred. In order to form the image, an optical system is introduced that reduces the curvature of the spherical wavefront and produces the planar wavefront with the correct angular coordinates [Fig. 1(c)].
The use of an optical system increases the volume of the optical device and leads to severe constraints in terms of field of view and eye box. Lens-free near-eye displays are possible alternatives that allow the generation of planar wavefronts without the use of a single-axis optical system. An example proposed by Maimone et al. uses an integral imaging concept [3]. An array of lenses or pinholes creates a distribution of elementary planar wavefronts with the given angular coordinates [Fig. 1(d)]. The size of the optical system is decreased due to the reduction in size of the optics. The lens or pinhole array plays the role of a pupil expander and allows the eye box constraint to be relaxed. Another solution, proposed by Sun et al., consists of the direct emission of a complex holographic wavefront [4]. The device uses a phased array to generate the phase distribution of the image to be formed on the retina [Fig. 1(e)].
These two solutions rely on a segmentation of the wavefront. The first case uses an incoherent wavefront angular distribution built on a conventional display. The second case uses a coherent wavefront phase distribution built on an unconventional integrated photonics device.
We propose an intermediate solution, situated at the interface of geometric incoherent refractive imaging and radiative coherent diffractive imaging. The wavefront distribution is built in a coherent way by the use of integrated photonics and holography. However, the resulting wavefront is not considered as a complex phase function calculated from the Fourier transform of the image. It is rather built as an incoherent geometrical combination of elementary coherent wavefronts resulting from the image angular point coordinates.
The principle of our concept is described in Fig. 1(f): the display emits a distribution of spherical wavefronts with a given angular orientation. These wavefronts are mutually coherent, so that a planar wavefront is generated in accordance with the Huygens–Fresnel principle. The eye can then focus the generated planar wavefront on the retina at the specified angular coordinate. As the only optical system needed to form the image on the retina is the eye itself, we can imagine a highly integrated architecture as the basis of the display device.
An artist’s view of our device is given in Fig. 2. Each emitting point that generates an elementary spherical wavefront is addressed by a waveguide that brings the image data to the outcoupling region. At this location, a switchable outcoupling grating extracts the propagated light in the vicinity of a holographic element. This reflective elementary hologram can be considered as an oriented Bragg grating that defines the angular orientation of the spherical wavefront.
The surface of the glass is covered by a complex waveguide design that addresses the emissive points distribution (EPD). A distribution of outcoupling electrodes allows the activation/deactivation of the wavefront emission in order to refresh the image formation on the retina. Light that forms the image is generated by an amplitude modulated laser array that is coupled to the waveguide distribution.
If the device is made in transparent materials, with small refractive index variation, we can expect an overall transparency that could allow see-through applications.
The operating principle of the device is described in Fig. 3. In this simple case we show the rendering of a basic image of 3 × 3 pixels. In Fig. 3(a) we have an exploded view of the device: (1) the laser array produces the light used to form the image; (2) the routing waveguides direct the light to the waveguide output distribution; (3) the waveguide output distribution addresses the EPD; (4) the outcoupling grating layer extracts light from guided to free-space optics; (5) the electrode layer enables light extraction at specified locations; and (6) the hologram layer defines the light direction and fixes the EPD phase coherence.
Our simple illustration shows the letter “T” scanning image formation in three steps [Figs. 3(b) and 3(c)]. In step 1, three angular directions corresponding to the first three image pixels are emitted. The three lasers emit coherent light at three different output power levels at a green wavelength. In the drawing, each green laser is associated with a specific color (red, blue, yellow). A first set of electrodes (gray) is activated, and each light beam guided from a laser is emitted at the glass surface on an EPD of the corresponding color. We have represented only one emitted beam for each laser, but each laser, i.e., each angular direction, is associated with a set of emissive holographic points in red, blue, or yellow. At this step the display projects on the retina three points corresponding to the first three angular directions [Fig. 3(b)].
In step 2 a second set of three image pixels is projected. Another set of amplitudes is given to the three laser sources, and another set of electrodes (pink) is activated in order to address the corresponding angular holographic EPD.
The same process is repeated in step 3 to project the last three image pixels. Persistence of vision is used to recover the whole image [Fig. 3(c)].
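The scanning scheme above can be sketched as a small toy model. The 3 × 3 “T” pattern and the mapping used here (each electrode set selects one image row per scan step, and each laser carries the amplitude of one pixel in that row) are illustrative assumptions, not the actual device addressing:

```python
import numpy as np

# Toy model of the scanning image formation of Fig. 3: each scan step
# activates one electrode set (one image row here), and the laser
# amplitudes set the pixel values of that row.  Persistence of vision
# corresponds to summing the contributions of all steps.
target = np.array([[1, 1, 1],
                   [0, 1, 0],
                   [0, 1, 0]], dtype=float)   # the letter "T", 3 x 3 pixels

retina = np.zeros_like(target)
for step in range(3):                  # one electrode set per step
    amplitudes = target[step]          # per-laser amplitudes for this step
    retina[step] += amplitudes         # emitted wavefronts focus on row 'step'

print(retina)
```

After the three steps, the accumulated retinal image reproduces the target pattern, which is the point of the time-multiplexed addressing.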
This ambitious concept is based on technological steps that have to be investigated and demonstrated theoretically and practically. One of the first issues concerns the ability to form an image in relation to the Huygens–Fresnel principle. We describe this image-forming method as the self-focusing effect.
3. SELF-FOCUSING EFFECT
A. Theoretical Analysis
The self-focusing effect has already been introduced and demonstrated experimentally by Hong et al. in the field of optical data storage [5]. The authors have shown the ability to focus a laser beam through the combination of phase-adjusted laser beams. More recently, this concept has found new applications in LIDAR devices.
In our case the self-focusing effect can be described by considering both a Gaussian beam model and the phenomenon of multiple interference. Figure 4(a) shows the basic concept of multiple beams focusing into the eye. Our display is located at a distance $d$ from the eye. The eye is described as a thin lens of focal length $f$. The retina is located at a distance $d'$ from the surface of the eye. The display emits a collection of Gaussian beams from the emissive points of the EPD. The number of emissive points contributing to a given image point is limited by the entrance pupil Π of diameter $\Phi_p$, as shown in Fig. 4(b).
The particularity of our concept is that every beam emitted from a given set of emissive points propagates along the same wave vector $\vec{k}$. Geometrically, after passing through the eye lens, all these parallel beams converge to the same point $P$, located in the focal plane of the eye lens.
We take the approximation of an eye that can be described as a thin lens. According to the Gaussian beam formulation, a waist located at the distance $d$ is imaged by the eye at a distance

$$d' = f + \frac{f^2 (d - f)}{(d - f)^2 + z_R^2}, \qquad (1)$$

where $z_R$ is the Rayleigh range of the beam. Equation (1) shows that if the display is positioned close to the object focal plane of the eye ($d \approx f$), the distance $d'$ is close to the focal length $f$. The wavefronts converging on the point $P$ are then close to the waist location and can be considered as plane waves with wave vector $\vec{k}$. The behavior of the beam superposition at the point $P$ can then be described as multiple interfering planar waves. We describe the field of the $i$th planar wave as

$$E_i(\vec{r}) = A_i \, e^{\,j(\vec{k}\cdot\vec{r} + \varphi_i)}. \qquad (2)$$

The interference energy function is given by the squared modulus of the sum of the $N$ beams passing through the eye lens pupil aperture, so that

$$I(\vec{r}) = \left| \sum_{i=1}^{N} A_i \, e^{\,j(\vec{k}\cdot\vec{r} + \varphi_i)} \right|^2. \qquad (3)$$
As a demonstration and for a better understanding of the phenomenon, we pursue our theoretical analysis with a 1D emissive distribution in the plane $(x, z)$, as shown in Fig. 5. The wave vector is expressed in relation to the propagation angle $\theta$ as $\vec{k} = k(\sin\theta, \cos\theta)$, with $k = 2\pi/\lambda$, and Eq. (3) is then expressed as a function of the emissive point coordinates $x_i$ and of the transverse coordinate $x'$ in the focal plane.
The resulting interference function at the focal plane can be simplified to show a maximum of energy at the geometrical image coordinate $x' = f\tan\theta$.
This analysis also shows that if the emissive points are distributed in a periodic grid of period $\Lambda$, that is, if $x_i = i\Lambda$, then other maxima of energy occur, in the paraxial approximation, at the coordinates

$$x'_m \approx f\left(\tan\theta + m\,\frac{\lambda}{\Lambda}\right), \qquad m \in \mathbb{Z}. \qquad (12)$$
Equation (12) expresses the diffraction orders of the periodic EPD structure. It shows that the self-focusing effect can effectively concentrate the energy at the targeted location, but also at periodic resonances. The self-focusing effect could thus lead to a sharp image on the retina but, as the image is duplicated, a periodic EPD forbids effective imaging. To avoid this resonance effect, one solution is to introduce randomness into the EPD, as shown in Fig. 4(b).
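The replica structure predicted by Eq. (12), and its suppression by randomness, can be checked with a short numerical sketch. The wavelength, focal length, period, and number of emitters below are assumed values chosen only for illustration:

```python
import numpy as np

# Check of the Eq.-(12)-type replicas for an on-axis image point
# (theta = 0): a periodic EPD of period Lam produces a secondary maximum
# at x' = f*lam/Lam, while randomizing the positions suppresses it.
lam = 532e-9          # wavelength (m), assumed
f = 17e-3             # eye focal length (m), assumed
k = 2 * np.pi / lam
theta = 0.0           # on-axis image point for simplicity
Lam = 200e-6          # EPD period (m), assumed
n = np.arange(-10, 11)

def intensity_at(x_positions, xp):
    # interference sum of Eq. (3) evaluated at focal-plane coordinate xp
    ph = k * x_positions * (np.sin(theta) - xp / f)
    return abs(np.exp(1j * ph).sum()) ** 2

x_periodic = n * Lam
rng = np.random.default_rng(1)
x_random = (n + rng.uniform(-0.5, 0.5, n.size)) * Lam  # randomized positions

x_order1 = f * lam / Lam            # location of the m = 1 replica
I_per = intensity_at(x_periodic, x_order1)
I_rnd = intensity_at(x_random, x_order1)
print(I_per, I_rnd)
```

For the periodic grid, all phasors realign at the first diffraction order and the replica reaches the full coherent intensity; for the randomized grid the phasors add with random phases and the replica collapses toward the incoherent background level.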
B. Numerical Analysis
We have simulated the self-focusing effect using the combined formalism of Gaussian beams and multiple beam interference. Equation (3) is used iteratively to sum the contributions of an EPD.
We consider a Gaussian beam propagation to describe the amplitude of the beam from the display to the retina. The amplitude of the field in the plane of the display is given, in a first approximation, by a Gaussian function of waist $w_0$:

$$E(x) = E_0 \exp\!\left(-\frac{x^2}{w_0^2}\right). \qquad (13)$$

The beam propagates from the display to the eye and forms an image of waist $w'$ close to the retina:

$$w' = \frac{\lambda f}{\pi w_0}. \qquad (14)$$

The simulated intensity function is calculated from the summation of the planar beams given in Eq. (3), with a Gaussian beam intensity weighting given by Eqs. (13) and (14).
We evaluate the interference figure as a function of the period $\Lambda$ of the EPD and compare the periodic distribution with a semirandom distribution, defined by a “randomly periodic” law of the form $x_i = \Lambda(i + \rho_i)$, where $\rho_i$ is a random variable drawn in $[0, 1)$.
Figure 6(a) shows the case of a large EPD period: only one point of the EPD lies within the eye lens aperture (red dot on the left figure). The resulting intensity function on the retina shows only the single Gaussian beam contribution: a blurred signal of waist $w'$ given by Eq. (14).
In Fig. 6(b) the EPD period is decreased to 200 μm. We compare the periodic distribution, which places five points within the eye lens aperture (red dots), with the randomly periodic distribution, which places four points within it (green dots). The resulting intensity functions show the periodic self-focusing effect (center) and the random self-focusing effect (right). Figures 6(c) and 6(d) show cases with smaller EPD periods.
The choice of parameters in Fig. 6 allows a good view of the phenomenon, as the waist $w'$ and the focused spot are visible on the same graphic. In the case of a periodic EPD, the interfering beams generate a focus that is replicated with the period given by Eq. (12). When randomness is introduced into the EPD, the diffraction orders vanish and a speckle pattern forms, allowing a single focused spot of the image to be created. The size of the emissive zone fixes the grain size of the speckle distribution, while the size of the eye lens aperture fixes the radius of the focus, which tends toward the diffraction limit $1.22\,\lambda f/\Phi_p$. Figure 7 compares the normalized intensity cross section of the self-focusing signal, in the case of a randomly periodic EPD, with the theoretical Airy function for a diffracting pupil aperture of diameter $\Phi_p$.
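The blurred single-beam signal of Fig. 6(a) can be estimated numerically, assuming the standard focal-plane Gaussian relation $w' = \lambda f / (\pi w_0)$ and an illustrative emissive-point waist:

```python
import math

# Numerical check of the focal-plane Gaussian beam relation w' = lam*f/(pi*w0),
# valid when the display waist sits in the object focal plane of the eye.
# w0 is an assumed emissive-point waist, not a value taken from the paper.
lam = 532e-9                      # wavelength (m)
f = 17e-3                         # eye focal length (m), assumed
w0 = 25e-6                        # emissive-point waist (m), assumed
w_img = lam * f / (math.pi * w0)  # blurred spot waist on the retina (m)
print(w_img)                      # about 1.15e-4 m, i.e. ~115 um
```

A single emissive point therefore produces a retinal spot two orders of magnitude larger than the diffraction limit, which is why the coherent combination of many points is needed.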
C. Display Analysis
We evaluate the efficiency of the self-focusing effect by measuring the signal-to-noise ratio (SNR), defined as the ratio between the energy of the central focused spot and the energy of the surrounding background, with parameters more consistent with our concept.
The SNR calculated for periodic and randomly periodic EPDs is shown in Fig. 8. For a better comparison, the periodic case is extended along the period axis.
As expected, the results show that randomness greatly improves the SNR and that the choice of the EPD has a strong impact on the imaging process. For a given period $\Lambda$, the choice of the EPD random sequence can modify the SNR over several orders of magnitude.
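A minimal sketch of such an SNR comparison is given below, using an assumed peak-to-strongest-sidelobe definition (the exact SNR definition used in the paper may differ) and illustrative parameter values:

```python
import numpy as np

# SNR sketch for the 1D self-focusing model: SNR is taken here as the
# focused peak intensity divided by the strongest maximum outside the
# central lobe.  All parameter values are assumptions for illustration.
lam = 532e-9
f = 17e-3
k = 2 * np.pi / lam
Lam = 200e-6
n = np.arange(-10, 11)

# focal-plane scan covering the m = +/-1 replica locations
xp = np.linspace(-1.5 * f * lam / Lam, 1.5 * f * lam / Lam, 4001)

def snr(x_pos):
    ph = -k * np.outer(x_pos, xp) / f          # on-axis point (theta = 0)
    I = np.abs(np.exp(1j * ph).sum(axis=0)) ** 2
    central = np.abs(xp) < 5e-6                # central focused lobe
    return I[central].max() / I[~central].max()

rng = np.random.default_rng(2)
snr_periodic = snr(n * Lam)
snr_random = snr((n + rng.uniform(-0.5, 0.5, n.size)) * Lam)
print(snr_periodic, snr_random)
```

The periodic grid gives an SNR close to 1 (the replicas are as bright as the focus), while the randomized grid raises the SNR well above 1, in line with the trend of Fig. 8.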
As shown in Fig. 8, we need a small EPD period to increase the SNR and improve the image rendering. However, for a given emissive point size, the number of available EPDs decreases with the EPD period. As the number of available EPDs is directly related to the number of pixels of the projected image, we must manage a compromise between the quality and the resolution of the image. This compromise differs from the standard space–bandwidth constraint (for a given display size, sharpness and resolution increase together) and underlines the unconventional aspect of our approach: increasing the image resolution degrades the image rendering.
The SNR is a first step in the characterization of the self-focusing process. The effective impact of the EPD choice on the image quality, related to resolution sharpness or contrast constraints, is currently under investigation and will be published soon.
The choice of the EPD configuration is also a research topic strongly related to technological constraints. The degree of randomness introduced in the EPD must be consistent with the technological solutions used to bring light to the surface of the display. In particular, it must take into account the limitations induced by the waveguide design.
4. WAVEGUIDE DISTRIBUTION ARRAY
The technological principle considered to bring light to the EPD is described in Fig. 9. A waveguide distribution array guides the energy that fixes the amplitude of the emitted signal. The location of the emitted beam is determined by the intersection between the waveguide and the activated outcoupling electrode. These intersections between the activated electrodes and the waveguides define the EPDs. As shown in Fig. 3, several EPDs are activated at the same time, as different waveguides are addressed by different lasers.
Figure 9 shows two waveguides addressed by two lasers. An electrode extracts the two signals toward two holographic elements (hoels). The two hoels, which belong to the EPDs related to two image point angular coordinates, reflect the signals in the corresponding angular directions.
The device parameters are given in Figs. 10 and 11. The waveguides are designed to propagate a single mode in the visible range, at the wavelength of maximum hologram efficiency (around 532 nm for the polymer holographic material considered here). One promising technology for the manufacturing of waveguides operating in the visible range is silicon nitride. Typical values for the waveguide thickness and width at these wavelengths are 200 nm and 300 nm, respectively. The distance between the waveguides is chosen to limit the coupling to neighboring waveguides; a typical value of 1.5 μm can be considered. The SiN waveguides can be manufactured on transparent glass and covered by a cladding.
The outcoupling grating is etched in the cladding above the waveguide. A typical grating period for a wavelength of 532 nm is around 400 nm.
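The ~400 nm period quoted above is consistent with the first-order grating outcoupling condition. The effective index below is an assumed value for a thin SiN waveguide at 532 nm, used only for this sanity check:

```python
import math

# First-order grating outcoupling condition:
#   n_eff - sin(theta_out) = lam / Lambda_g
# For near-vertical extraction (theta_out = 0) this reduces to
# Lambda_g = lam / n_eff.  n_eff = 1.33 is an assumed effective index.
lam = 532e-9
n_eff = 1.33
theta_out = 0.0                                # vertical extraction
Lambda_g = lam / (n_eff - math.sin(theta_out))
print(Lambda_g)                                # 4.0e-7 m, i.e. 400 nm
```

Any change in the targeted extraction angle or in the effective index shifts the required period according to the same phase-matching relation.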
The design of the waveguides, including the coupling/propagation constraints due to the particular random EPD choice, is currently being investigated theoretically and experimentally.
The electrode that activates the outcoupling grating can be made of a liquid crystal layer, as proposed by Buss et al. The width of the electrode fixes the length of the outcoupling grating [Fig. 11(a)]. Typical lengths could be of the order of a few microns. The electrodes are separated by a given distance. The total number of emissive points for the whole EPD family is given by these parameters in relation to the eye pupil aperture [Fig. 11]. In contrast to holographic data storage, each hologram does not relate to complex data information but to a single angular reference. Each hoel can then be interpreted as an elementary directional Bragg element.
5. DIRECTIONAL HOLOGRAPHIC ELEMENTS
A. General Principle
The principle of light coupling from the waveguide to the hoel has been described in Fig. 9. Here, we present in Fig. 11 the principle of the phase adjustment that ensures the efficiency of the interference produced by the directional hoels.
Figure 11(a) shows a lateral section of the device. A guided wave is extracted at two locations and is reflected by two hoels belonging to the same EPD. The length of a hoel is related to the size of the electrode but is not limited by the interelectrode distance; as with its width, the hoel can exceed this distance, so that neighboring hoels can overlap. In our simulations we have chosen a hoel radius that defines the waist $w_0$ of the emitted beam.
As presented in Section 3.A, each EPD must be phase adjusted in order to self-focalize on a specific location of the retina plane. The phase shift between the extracted beams described in Fig. 11(a) is related to the optical path shift and to the grating distribution. It can hardly be controlled by a nanoscale resolution mask design over the whole device surface. Instead, phase adjustment is guaranteed by the intrinsic nanoscale resolution of the 3D hoel recording process.
The recording process is described in Fig. 11(b). The guided laser light plays the role of the reference beam, and a free-space beam coming from the same laser is used as the object beam. This beam is segmented into a multitude of elementary beams coming from a given angular direction. Activation of the outcoupling electrodes allows the creation of the interference pattern between the reference and object beams, which is recorded by the hoels of a given EPD.
When the reference beam is coupled from the waveguide, the conjugate object beam is generated in a reflective mode, as shown in Fig. 11(a). The phase adjustment is automatically recorded from the original object beam.
B. Recording Setup
In order to record the hoel distribution corresponding to a given EPD, a specific recording setup has to be built. Figure 12 shows the basic concept of the setup. An optical fiber is used to split a laser beam into a reference and an object beam. The object beam is collimated by a two-lens optical system from a point source given by the first fiber extremity. The first lens L1 images the fiber extremity onto the object focal plane of the second lens L2. The position of this image fixes the angular direction of the collimated beam. An aperture mask corresponding to the EPD is located at the image focal point of the first lens and is imaged onto the hologram layer by the second lens L2. The object beam that impacts the hologram layer is collimated and phase adjusted in relation to the optical fiber position, and is segmented into a collection of beam spots defined by the aperture mask. The image of the aperture mask is aligned with the EPD that is activated by the electrodes and by the optical coupling of the second optical fiber into a selected waveguide distribution (shown in the inset of Fig. 12).
The recording setup shown in Figs. 11(b) and 12 uses a collimated planar object wavefront. It is designed to form an image at infinity for each eye. Modifying the longitudinal position of the fiber in the recording setup allows the device to form an image at a fixed, given viewing distance by modifying the curvature of the object wavefront. In this case, a symmetric off-axis pixel distribution for each eye should be used to reduce the vergence–accommodation conflict (VAC) that usually limits conventional smart glasses approaches. Our concept allows viewing an image at different plane locations with limited VAC. However, the manufacturing process fixes the plane location for a given display device.
Numerous questions remain regarding the technological process for the realization of the holographic device. Uncertainties around the recording duration, the material behavior, and the possible replication methods have to be investigated to validate the potential commercial interest of the concept. However, the impressive achievements of commercially feasible terabit-scale hologram storage and hologram printers imply a technical maturity that should be applicable to our approach [13,14].
We are currently evaluating the hoel recording process and have presented some initial design considerations. A collaboration with a supplier of polymeric holographic materials has been initiated and should soon allow us to record the first hoel distribution in order to validate the self-focusing effect in an image-forming process.
6. IMAGING PROPERTIES
A. Sharpness/Contrast Issue
The sharpness of an image formed by an optical system is generally characterized by its modulation transfer function (MTF). This function gives the efficiency of the system for the rendering of a spatial frequency with a given contrast. The MTF can be calculated from the Fourier transform of the point spread function (PSF) that is the impulse intensity distribution.
Figure 7 shows the PSF of an optimal self-focusing effect (green curve). The calculation takes into account a perfect lens as the optical imaging system and the signal in its central part is very close to the theoretical diffraction limit. A more realistic approach needs to take into account the specific optical characteristics of the human eye and this poses some specific issues due to the human vision process.
The eye is a complex optical system that does not follow ideal diffraction theory. Aberrations in the pupil periphery degrade the MTF as the eye pupil diameter increases. On the other hand, it has been shown that a coherent laser interfering imaging process can alleviate the peripheral pupil aberration distortion and improve the PSF. The effective sharpness of the self-focusing effect in the eye is thus an open question and needs to be studied with modern eye models and physiological experiments in a coherent imaging process.
The measurement of the ANSI contrast is also a means of evaluating the efficiency of an imaging system. It consists of forming a checkerboard image and measuring the ratio of intensity between the dark and white cells. In the case of self-focusing imaging, the PSF can be divided into two parts: a thin central spot and a large noisy speckle contribution, as shown in Fig. 7. This leads us to introduce a double Gaussian model in which the thin central Gaussian spot contributes to the imaging process while the large Gaussian noise reduces the overall contrast. Improving the contrast requires limiting the impact of the large Gaussian contribution. This is done by an optimal EPD design that improves the energy ratio between the two Gaussian contributions, and by limiting the number of pixels required to form an image (typically by the use of a nonadjacent pixel distribution, as shown below). The characterization of our system in terms of ANSI contrast simulation will be described in an upcoming paper.
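A minimal 1D sketch of this double Gaussian model shows how the contrast of a checkerboard degrades as the speckle pedestal grows. All widths and energy fractions below are assumed values, not fitted device parameters:

```python
import numpy as np

# Double-Gaussian PSF model: a narrow imaging spot plus a wide speckle
# pedestal.  'eta' is the (assumed) fraction of energy in the narrow spot;
# the ANSI-style bright/dark ratio of a checkerboard drops as the wide
# contribution grows.
x = np.linspace(-1.0, 1.0, 4001)            # retina coordinate, arbitrary units

def psf(eta, s_narrow=0.01, s_wide=0.5):
    g1 = np.exp(-x**2 / (2 * s_narrow**2))  # thin central spot
    g2 = np.exp(-x**2 / (2 * s_wide**2))    # wide speckle pedestal
    p = eta * g1 / g1.sum() + (1 - eta) * g2 / g2.sum()
    return p / p.sum()

# 1D "checkerboard": alternating bright/dark bands
scene = (np.floor(x / 0.25) % 2 == 0).astype(float)

def ansi_contrast(eta):
    img = np.convolve(scene, psf(eta), mode="same")
    bright = img[scene == 1].mean()
    dark = img[scene == 0].mean()
    return bright / dark

c_good = ansi_contrast(0.9)   # most energy in the narrow spot
c_poor = ansi_contrast(0.3)   # strong speckle pedestal
print(c_good, c_poor)
```

The bright/dark ratio stays well above 1 when the narrow spot dominates and collapses toward 1 as the pedestal carries more of the energy, which is the mechanism the EPD optimization is meant to control.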
B. Resolution/Image Rendering Issue
As mentioned in Section 3, the rendering of the image is related to the number of emissive points of a given EPD. This number depends on the eye pupil aperture size and on the choice of the EPD function, in particular on the EPD period $\Lambda$. The image rendering, given by the number of emissive points per EPD, and the image resolution, given by the number of available EPDs, are diverging parameters: reducing $\Lambda$ increases the image quality but reduces the resolution.
Defining a consistent set of device parameters gives a total number of pixels that corresponds to a low conventional image resolution.
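These scaling considerations can be made concrete with order-of-magnitude counts. The pupil diameter and the minimum emissive-point pitch below are assumptions, not device specifications:

```python
import math

# Illustrative rendering/resolution tradeoff.  For an eye pupil of
# diameter phi_p, an EPD of period Lam places about pi/4 * (phi_p/Lam)^2
# coherent emitters inside the pupil (image rendering), while a minimum
# pitch delta between emissive points allows about (Lam/delta)^2 distinct
# EPDs, i.e. addressable pixels (image resolution).
phi_p = 4e-3          # pupil diameter (m), assumed
delta = 5e-6          # minimum emissive-point pitch (m), assumed

results = []
for Lam in (100e-6, 200e-6, 400e-6):
    n_emitters = math.pi / 4 * (phi_p / Lam) ** 2
    n_pixels = (Lam / delta) ** 2
    results.append((Lam, n_emitters, n_pixels))
    print(Lam, round(n_emitters), round(n_pixels))
```

Moving along the period axis trades emitters per EPD (rendering) against the number of EPDs (resolution), which is exactly the compromise described in the text.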
We have simulated in Fig. 13(a) a retinal projection based on a low-resolution image projected over a given field of view (FOV). This artistic view highlights the differences between our retinal projection concept and a conventional display. The image is formed by separated luminous dots rather than by adjacent pixels. The angular distance that separates the dots is not necessarily uniform and can be adapted to the content of a given region of the FOV. In the example, the text is projected on the retina with an angular distance of 3 arcmin between the dots, and this value is increased to 4.5 arcmin for the GPS pictogram.
This analysis leads to low pixel number values. However, even if the resolution of the available image is low, the total number of pixels can be optimized by selecting specific regions of the FOV. This possibility underlines once again the unconventional nature of the concept: the perceived FOV, traditionally given by the product of the resolution and the angular pixel increment, can be increased here for a constant total number of pixels. The dynamic image addressing for a specific region of the FOV with a specific resolution is, however, fixed for a given device. Each EPD can be modulated in emission power but not in angular reference.
In terms of image rendering, the dotted aspect of the projected image can modify the perceived resolution and improve the result in comparison to a conventional display, due to half-toning visual effects. As an illustration, we compare the same text coded at the same resolution for an unconventional dot-pattern display [Fig. 13(b)] and for a conventional adjacent-pixel display [Fig. 13(c)].
The imaging properties are currently being investigated in an extensive study to evaluate the impact of the speckle noise. This theoretical study will be supported by experimental evaluations incorporating visual tests.
7. POWER CONSIDERATIONS
To conclude the technological review of our concept, we focus on power considerations to check whether the device is consistent with the objective of near-eye integration.
We target the projection of a full bright image over a circular FOV of 15°. The image is characterized by its brightness. We calculate the power required in relation to the etendue of the eye [Fig. 14]. For typical display brightness values, the power that enters the eye is between 1 μW and 10 μW.
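A sketch of this etendue estimate, with assumed luminance and pupil values, reproduces the quoted order of magnitude:

```python
import math

# Order-of-magnitude estimate of the power entering the eye.  Assumed
# values: a 15 deg circular FOV, a 4 mm pupil, a display luminance of
# 1000 cd/m^2, and the photopic conversion at 532 nm (~0.88 * 683 lm/W).
half_fov = math.radians(7.5)
omega = 2 * math.pi * (1 - math.cos(half_fov))   # FOV solid angle (sr)
pupil_area = math.pi * (2e-3) ** 2               # 4 mm pupil (m^2)
etendue = pupil_area * omega                     # m^2 * sr

luminance = 1000.0                               # cd/m^2, assumed
flux_lm = luminance * etendue                    # luminous flux (lm)
power_w = flux_lm / (683.0 * 0.88)               # optical power at 532 nm (W)
print(power_w)                                   # on the order of 1 uW
```

Scaling the assumed luminance by a factor of 10 spans the 1 μW to 10 μW range quoted in the text.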
The etendue of the beam seen from the eye is equal to the etendue of the eye seen from the beam. This etendue conservation fixes the diameter of the emitting surface on the display.
The total amount of optical power emitted from the display is deduced from the power entering the eye and from the ratio between the emitting surface and the eye pupil aperture [Fig. 14].
Taking into account the overall device efficiency, this leads to the optical power that must be emitted by the lasers. In a conventional display this power is not optimized, as it does not depend on the projected image. In contrast, our emissive display adapts its power to the content, so that for projecting text information, as in the case of Fig. 13, an electric power far lower than 10 mW can be expected.
Another power consideration is that of eye safety. The image is projected on the retina in a scanning mode (Fig. 3) and must not create a hazard for the retina. Laser safety is based on the calculation of the maximum permissible exposure (MPE) on the cornea. For the case of a collimated laser beam, a blinking reflex of 0.25 s, and a pupil diameter of 7 mm, the MPE is about 25 W/m² and corresponds to a laser power limit of about 1 mW.
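The safety margin implied by these numbers can be checked with simple arithmetic. The MPE value is the order of magnitude for a 0.25 s visible exposure, and the device power is the upper bound of the etendue estimate above:

```python
import math

# Laser safety margin sketch.  For a 0.25 s visible exposure, the MPE is
# on the order of 25 W/m^2; averaged over a 7 mm pupil this corresponds
# to roughly 1 mW.  We compare it with the ~10 uW upper bound of the
# power entering the eye estimated in Section 7.
mpe_irradiance = 25.4                      # W/m^2, 0.25 s visible exposure
pupil_area = math.pi * (3.5e-3) ** 2       # 7 mm pupil (m^2)
power_limit = mpe_irradiance * pupil_area  # close to 1 mW

device_power = 10e-6                       # W, upper bound from Section 7
margin = power_limit / device_power
print(power_limit, margin)
```

Even before accounting for the per-pixel duty cycle of the scanning mode, the power entering the eye sits about two orders of magnitude below the MPE-derived limit.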
In our display, the power that enters the eye remains below 10 μW, so the exposure of the cornea for one pixel during 0.25 s is far below the MPE limit.
The device thus seems to present no hazard for the eye. However, the assumption of the blinking reflex can be debated, and long-term laser exposure will have to be studied in more detail.
8. CONCLUSION
We present a complete theoretical overview of an unconventional imaging concept that could allow the development of a near-eye integrated transparent display. We describe the self-focusing effect, which could allow image formation by retinal projection in a lens-free device configuration. The concept is simulated, and first results on evaluation metrics such as the focus SNR give an initial insight into the device's characteristics. The waveguide design and the hoel concept are introduced, and we give first perspectives on image rendering, device manufacturability, and power consumption.
Initial limitations are identified in terms of image rendering and commercial implementation. Image resolution is constrained by the waveguide integration and the self-focusing efficiency. The use of holographic elements restricts projection to a monochromatic image, and a fast holographic recording process has yet to be demonstrated. These limitations are balanced by the new opportunities opened by this unconventional imaging approach. In particular, the ability to locally adapt the image resolution within a discontinuous field of view could enable interesting applications.
More generally, this research can be seen as a fundamental reflection on the new opportunities for retinal projection opened up by recent technological achievements in integrated photonics and holography. As our laboratory is strongly involved in conventional microdisplay design and manufacturing for AR/VR/MR applications [21], such investigations may anticipate potential technological evolutions.
We thank Prof. O. Haeberlé of the Laboratoire MIPS, Université de Haute-Alsace, for fruitful discussions on diffraction and holographic issues.
1. C. Martinez, “Image projection device,” U.S. patent 2015/0370073 A1 (December 24, 2015).
2. C. Martinez, V. Krotov, D. Fowler, and O. Haeberle, “Lens-free near-eye intraocular projection display, concept and first evaluation,” in Imaging and Applied Optics, OSA Technical Digest (Optical Society of America, 2016), paper CW1C.5.
3. A. Maimone, D. Lanman, K. Rathinavel, K. Keller, D. Luebke, and H. Fuchs, “Pinlight displays: wide field of view augmented reality eyeglasses using defocused point light sources,” ACM Trans. Graph. 33, 89 (2014). [CrossRef]
4. J. Sun, E. Timurdogan, A. Yaacobi, E. Shah Hosseini, and M. R. Watts, “Large-scale nanophotonic phased array,” Nature 493, 195–199 (2013). [CrossRef]
5. S. S. Hong, B. K. Horn, D. M. Freeman, and M. S. Mermelstein, “Lensless focusing with subwavelength resolution by direct synthesis of the angular spectrum,” Appl. Phys. Lett. 88, 261107 (2006). [CrossRef]
6. M. Heck, “Highly integrated optical phased arrays: photonic integrated circuits for optical beam shaping and beam steering,” Nanophotonics 6, 93–107 (2017). [CrossRef]
7. M. Born and E. Wolf, Principles of Optics, 7th ed. (Cambridge University, 1999).
8. A. W. Lohmann, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, “Space-bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A 13, 470–473 (1996). [CrossRef]
9. A. Z. Subramanian, P. Neutens, A. Dhakal, R. Jansen, T. Claes, X. Rottenberg, F. Peyskens, S. Selvaraja, P. Helin, B. Du Bois, K. Leyssens, S. Severi, P. Deshpande, R. Baets, and P. Van Dorpe, “Low-loss singlemode PECVD silicon nitride photonic wire waveguides for 532–900 nm wavelength window fabricated within a CMOS pilot line,” IEEE Photon. J. 5, 2202809 (2013). [CrossRef]
10. T. Buß, C. L. C. Smith, and A. Kristensen, “Electrically modulated transparent liquid crystal-optical grating projection,” Opt. Express 21, 1820–1829 (2013). [CrossRef]
11. G. Barbastathis, M. Levene, and D. Psaltis, “Shift multiplexing with spherical reference waves,” Appl. Opt. 35, 2403–2417 (1996). [CrossRef]
12. G. Wetzstein, “Light field, focus-tunable, and monovision near-eye displays,” SID Symp. Dig. Tech. Pap. 47, 358–360 (2016). [CrossRef]
13. L. Hesselink, S. S. Orlov, and M. C. Bashaw, “Holographic data storage systems,” Proc. IEEE 92, 1231–1280 (2004). [CrossRef]
14. H. Bjelkhagen and D. Brotherton-Ratcliffe, Ultra-Realistic Imaging, Advanced Techniques in Analogue and Digital Colour Holography (CRC Press, 2013).
15. C. Martinez, V. Krotov, and D. Fowler, “Holographic recording setup for integrated see-through near-eye display evaluation,” in Imaging and Applied Optics, OSA Technical Digest (Optical Society of America, 2017), paper JTu5A.36.
16. F. W. Campbell and R. W. Gubisch, “Optical quality of the human eye,” J. Physiol. 186, 558–578 (1966). [CrossRef]
17. B. A. Wandell, Foundations of Vision (Sinauer Associates, 1995), p. 54.
18. V. Krotov, C. Martinez, and O. Haeberlé, “Imaging performance analysis of a lens-free near to eye display,” in Imaging and Applied Optics, OSA Technical Digest (Optical Society of America, 2017), paper JTu5A.5.
19. R. LiKamWa, Z. Wang, A. Carroll, F. X. Lin, and L. Zhong, “Draining our glass: an energy and heat characterization of Google Glass,” in 5th Asia-Pacific Workshop on Systems (APSYS) (2014).
20. F. C. Delori, R. H. Webb, and D. H. Sliney, “Maximum permissible exposures for ocular safety (ANSI 2000), with emphasis on ophthalmic devices,” J. Opt. Soc. Am. A 24, 1250–1265 (2007). [CrossRef]
21. F. Templier, L. Dupré, S. Tirano, M. Marra, V. Verney, F. Olivier, B. Aventurier, D. Sarrasin, F. Marion, T. Catelain, F. Berger, L. Mathieu, B. Dupont, and P. Gamarra, “75-1: Invited paper: GaN-based emissive microdisplays: a very promising technology for compact, ultra-high brightness display systems,” SID Symp. Dig. Tech. Pap. 47, 1013–1016 (2016). [CrossRef]