Abstract

Monocentric lenses have recently changed from primarily a historic curiosity to a potential solution for panoramic high-resolution imagers, where the spherical image surface is directly detected by curved image sensors or optically transferred onto multiple conventional flat focal planes. We compare imaging and waveguide-based transfer of the spherical image surface formed by the monocentric lens onto planar image sensors, showing that both approaches can make the system input aperture and resolution substantially independent of the input angle. We present aberration analysis that demonstrates that wide-field monocentric lenses can be focused by purely axial translation and describe a systematic design process to identify the best designs for two-glass symmetric monocentric lenses. Finally, we use this approach to design an F/1.7, 12 mm focal length imager with an up to 160° field of view and show that it compares favorably in size and performance to conventional wide-angle imagers.

© 2012 Optical Society of America

Errata

Igor Stamenov, Ilya P. Agurok, and Joseph E. Ford, "Optimization of two-glass monocentric lenses for compact panoramic imagers: general aberration analysis and specific designs: erratum," Appl. Opt. 52, 5348-5349 (2013)
https://www.osapublishing.org/ao/abstract.cfm?uri=ao-52-22-5348

1. Introduction: Wide-Angle Monocentric Lens Imaging

Imagers that require the combination of wide field of view, high angular resolution, and large light collection present a difficult challenge in optical system design. Geometric lens aberrations increase with aperture diameter, numerical aperture (NA), and field of view and scale linearly with focal length. This means that for a sufficiently short focal length, it is possible to find near-diffraction-limited wide-angle lens designs, including lenses mass-produced for cell phone imagers. However, obtaining high angular resolution (for a fixed sensor pixel pitch) requires a long focal length for magnification, as well as a large NA to maintain resolution and image brightness. This combination is difficult to provide over a wide angle range. Conventional lens designs for longer focal length wide-angle lenses represent a tradeoff between the competing factors of light collection, volume, and angular resolution. For example, conventional reverse-telephoto and “fisheye” lenses provide extremely limited light collection compared to their large clear aperture and overall volume [1]. However, the problem goes beyond the lens itself. Solving the lens design only leads to a secondary design constraint, in that the total resolution of such wide-angle lenses can easily exceed 100 megapixels. This is beyond the current spatial resolution and communication bandwidth of a single cost-effective sensor [2], especially for video output at 30 frames/s or more.

One early solution to wide-angle imaging was “monocentric” lenses [3], using only hemispherical or spherical optical surfaces that share a single center of curvature. This symmetry yields zero coma or astigmatism over a hemispherical image surface, and on that surface provides a field of view limited only by vignetting from the central aperture stop. The challenge of using a curved image surface has limited the practical application of this type of lens, but there has been a resurgence of interest in monocentric lens imaging. In 2009, Krishnan and Nayar proposed an omnidirectional imager using a spherical ball lens contained within a spherical detector shell [4]. In 2010 Ford and Tremblay [5] proposed using a monocentric lens as the objective in a multiscale imager system [6], where overlapping regions of the spherical image surface are relayed onto conventional image sensors, and where the mosaic of subimages can be digitally processed to form a single aggregate image. Cossairt and Nayar demonstrated a closely related configuration using a glass ball and single-element relay lenses, recording and digitally combining overlapping images from five adjacent image sensors [7]. And recently a gigapixel monocentric multiscale imager has been demonstrated that integrates a two-dimensional mosaic of subimages [8,9], using the optical layout shown in Fig. 1(a) [10].

 

Fig. 1. (a) Optical layout of a 2.4 gigapixel monocentric multiscale lens and (b) the same image field transferred by tapered fiber bundles instead of multiple relay optics.


Monocentric lenses and spherical image formation provide favorable scaling to long focal lengths and have been shown capable of a space–bandwidth product (number of resolvable spots) two orders of magnitude higher than conventional flat-field systems of the same physical volume [11]. In early monocentric lens cameras, the usable field of view was limited by the vignetting and diffraction from the central lens aperture, as well as the ability of recording media to conform to a spherical image surface. However, the system aperture stop need not be located in the monocentric lens. The detailed design of the multiscale monocentric lens [10] shows that locating the aperture stop within the secondary (relay) imagers enables uniform relative illumination and resolution over the full field. This design maintains F/2.4 light collection with near-diffraction-limited resolution over a 120° field of view. With 1.4 μm pitch sensors, this yields an aggregate resolution of 2.4 gigapixels.

Such wide and uniform fields can also be achieved via the conceptually simpler “waveguide” approach shown in Fig. 1(b). Instead of relay optics, the spherical image surface is transferred to conventional planar image sensors using one or more multimode fiber bundles [12]. Fused fiber faceplates are commercially available with high spatial resolution and light collection (2.5 μm between fiber cores, and NA of 1) and can be fabricated as straight or tapered [13]. Straight fiber bundles can project sufficiently far to allow space for packaging of CMOS image sensors, while tapered fiber bundles can provide the 3:1 demagnification used in the relay optics in Fig. 1(a). Fiber bundles introduce artifacts from multiple sampling of the image [14], which can be mitigated but not eliminated through postdetection image processing [15]. In addition, the edges between adjacent fiber bundles can introduce “seams” in the collected image, whose width depends on the accuracy of fiber bundle fabrication. However, waveguide transfer can reduce the overall physical footprint and significantly increase light collection: in the multiscale optics structure, field overlap at the center of three relay optics must be divided between three apertures, while the waveguide can transfer all the light energy from a given field angle to a single sensor.

As shown in Fig. 2, in both systems stray light can be controlled using a physical aperture stop at the center of the monocentric lens [Fig. 2(a)] or through a “virtual stop” achieved by limiting light transmission in the image transfer optics [Fig. 2(b)]. In the case of relay imaging, this is done using a physical aperture stop internal to the relay optics. In the case of fiber transfer, this can be done by restricting the NA of the fiber bundles. Straight fiber bundles with a lower index difference between the core and cladding glasses are commercially available with an NA of 0.28. Alternately, high-index-contrast bundles with a spatial taper to a smaller output face can provide a controlled NA. Such bundles have the original output NA, but conservation of étendue reduces the input NA of a tapered fiber bundle by approximately the ratio of input to output diameter [16]. For example, a 3:1 taper of input to output width with a 1.84 core and 1.48 cladding index yields approximately 0.3 input NA.
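The étendue argument above can be checked with a few lines of arithmetic. The sketch below (Python; the function name is ours, for illustration only) estimates the input NA of a tapered bundle from the core/cladding indices and the taper ratio quoted in the text.

```python
import math

def tapered_bundle_input_na(n_core, n_clad, taper_ratio):
    """Estimate the input NA of a tapered fiber bundle.

    The untapered NA follows from the core/cladding indices (capped at 1);
    conservation of etendue then reduces the input NA by approximately the
    input-to-output width ratio (taper_ratio).
    """
    na_out = min(1.0, math.sqrt(n_core**2 - n_clad**2))
    return na_out / taper_ratio

# Values from the text: 3:1 taper, 1.84 core / 1.48 cladding index
print(round(tapered_bundle_input_na(1.84, 1.48, 3.0), 2))  # 0.33
```

The high-contrast pair is NA-1 before tapering, so the 3:1 taper gives roughly 1/3, consistent with the "approximately 0.3 input NA" quoted above.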

 

Fig. 2. Monocentric lens imaging with (a) a physical aperture stop at the center of the objective lens and (b) a “virtual stop” accomplished by limiting the NA of the image transfer, which, as drawn, are fiber bundles made from nonimaging low-NA optical fibers.


The field of view of monocentric “virtual stop” imagers can be extraordinarily wide. A physical aperture at the center of the objective lens is foreshortened when projected to off-axis field angles. At 60° incidence (120° field of view), the aperture is elliptical and reduced in width by 50%, with a corresponding decrease in light transmission and diffraction-limited resolution. As Fig. 2(b) shows, however, moving the aperture stop to the light transfer enables uniform illumination and resolution over a 160° field of view, where at extreme angles the input light illuminates the back surface of the monocentric objective. The lens as drawn in Fig. 2(b) indicates that the image transfer optics perform all stray light control, and does not show nonsequential paths from surface reflections. In practice, an oversized physical aperture or other light baffles can be used to block most of the stray light, while the image transfer optics provide the final, angle-independent apodization. While such practical optomechanical packaging constraints can limit the practical field of view, the potential for performance improvement over a conventional “fisheye” lens is clear.
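The cosine foreshortening of a central physical stop can be made concrete with a short sketch (Python; the function is illustrative, not from the original):

```python
import math

def central_stop_relative_width(theta_deg):
    # A circular stop at the lens center appears foreshortened into an
    # ellipse when viewed at field angle theta; its minor width (and hence
    # light collection and the diffraction-limited cutoff in that plane)
    # scales as cos(theta).
    return math.cos(math.radians(theta_deg))

for theta in (0, 30, 60, 80):
    print(theta, round(central_stop_relative_width(theta), 2))
# At 60 deg incidence (120 deg full field) the width falls to 0.5,
# the 50% reduction quoted in the text.
```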

Despite the structural constraints, even a simple two-glass monocentric objective lens can provide high angular resolution over the spherical image surface. With waveguide image transfer, the overall system resolution is directly limited by the objective lens resolution. In multiscale imagers, geometric aberrations in the objective can be corrected by fabricating objective-specific relay optics. However, compensating for large aberrations in the primary lens tends to increase the complexity and precision of fabrication of the relay optics. Since each multiscale imager requires many sets of relay optics (221 sets in the 120° field, 2.4 gigapixel imager design), minimizing relay optic complexity and fabrication tolerance can significantly reduce system cost. So for both structures, it is useful to optimize the objective lens resolution.

T. Sutton offered the first design of the monocentric lens in 1859 [3], with a wide field of view but high f-number (about 30) due to the lack of correction of spherical and chromatic aberrations. To solve this problem, in 1942, J. G. Baker proposed a glass combination with a high-index flint glass for the outer shell and a low-index crown for the internal ball lens [3], resulting in a monocentric but not front-to-back symmetric lens structure with aberrations well corrected for a moderately high f-number of 3.5. The design of a compact and high-resolution endoscope lens with a two-glass symmetrical monocentric lens was published by Waidelich in 1965 [12]. This endoscopic lens, with a 2 mm focal length and an f-number of 1.7, achieved diffraction-limited image quality, but scaling the focal length to >1 cm (about 12 mm) for photographic applications required increasing the f-number to about 3.5. More recently, a two-glass monocentric lens was used as the projection objective in a Kodak autostereoscopic display [17,18]. In the Waidelich and Kodak designs, as in the Baker lens, the outer shell meniscus was made from a flint glass and the internal ball lens from a crown glass with a moderate difference in Abbe number. As will be shown later, this was nearly the ideal combination. The outer menisci had a higher nd, and the difference in refraction index between the outer shell and internal ball lens did not exceed 0.08. This low index difference was probably inherited from the Cooke triplet design [1]. We will show that differences in refraction indices larger than 0.2 are needed to approach diffraction-limited performance in lower f-number photographic lenses. Nevertheless, these designs demonstrated that purely monocentric lenses can achieve high-quality achromatic imaging.

Optical systems are now typically designed by computer numeric optimization in commercial software like ZEMAX and CODE V. The highly constrained monocentric lenses seem well suited to a global optimization search to identify the best glass combination and radii. However, we have found that “blind” optimization of monocentric lenses, even a simple two-glass lens, often overlooks the best solutions. This is especially true for large-NA designs, where the ray angles are steep and the optimization space has multiple deep local minima. There are also a large number of glasses to consider. The available glass catalog recently grew to 559 glasses with the publication of a number of new glasses from Hoya [19]. The more advanced optimization algorithms take significant processing time. Even for a two-glass lens it is impractical to use them to search all 312,000 potential combinations, so the best design may be overlooked. Fortunately, the symmetry of monocentric lenses permits a relatively straightforward mathematical analysis of geometrical optic aberrations, as well as providing some degree of intuitive understanding of the overall design space. Combining “old school” analysis with computer sorting of glass candidates can enable a global optimization for any specific focal length and spectral bandwidth desired.

In this paper, we provide a detailed analysis for the design of two-glass monocentric lenses. We begin with the first-order paraxial and Seidel third-order analysis of the focus of wide-field monocentric imagers, showing that despite the highly curved focal surface, axial translation of monocentric lenses can maintain focus of a planar object field from infinite to close conjugates. We will optimize these lenses operating with a larger and so more general “virtual” stop, because introducing a comparably sized physical stop will only tend to create vignetting at the field points and cut off some of the most highly aberrated rays. We continue by demonstrating the systematic optimization of these lenses by the following process.

For a specified focal length, NA, and wavelength range:

  • (1) Compute and minimize third-order Seidel spherical and longitudinal chromatism aberrations to find approximate surface radii for a valid glass combination.
  • (2) Optimize lens prescriptions via exact ray tracing of multiple ray heights for the central wavelength.
  • (3) Calculate the polychromatic mean square wavefront deformation, and generate a ranked list of all lens candidates.
  • (4) Confirm ranking order by comparing polychromatic diffraction modulation transfer function (MTF) curves.

To verify this method, we redesign the objective from our 2.4 gigapixel multiscale imager and find that the global optimization process yields the original design (and fabricated) lens, as well as additional candidates with improved internal image surface resolution. We then apply the design methodology to a new system, an ultracompact fiber-coupled imager with a 12 mm focal length and uniform resolution and light collection over a more than 120° field of view. We show that this design compares favorably to a more conventional imager using a “fisheye” wide-field lens and conclude with observations on future directions and challenges for this type of imager.

2. Theoretical Analysis of Monocentric Lenses

A. Focus of Monocentric Lenses

Photographic lenses are normally focused by moving them closer to or further from the image plane, but this appears impractical for the deep spherical image surface in a wide-field monocentric lens. For the 70 mm focal length objective of Fig. 1, a 1 mm axial translation to focus on an object at a 5 m range brings the image surface only 0.5 mm closer to the objective for an object at a 60° field angle, and 0.17 mm closer for an object at an 80° field. This seems to imply that the lens can only focus in one direction at a time and needs three-dimensional translation to do so. In the monocentric multiscale imager, the primary lens position is fixed, and the secondary imagers are used for independent focus on each region of the scene [10,20]. However, introducing an optomechanical focus mechanism for each secondary imager constrains the lens design and adds significantly to the overall system bulk and cost. More fundamentally, the monocentric waveguide imager shown in Fig. 2(b) has no secondary imagers and cannot be focused in this way, which initially appears a major disadvantage. In fact, however, axial translation of monocentric lenses maintains focus on a planar object across the full field of view.

Consider the geometry of image formation in the monocentric lens structure shown in Fig. 3. For a focal length f with refocusing from an object at infinity to a closer on-axis object at distance d, assuming d ≫ f, the image surface shift Δx is [1]

\Delta x = \frac{f^2}{d-f} \approx \frac{f^2}{d}.
For the off-axis field point B, with principal ray angle β1, the distance OB to the object is d/cos(β1), and the refocusing image point shift Δx′ (Binf → B) along the principal ray will be
\Delta x'(\beta_1) = \frac{f^2\cos(\beta_1)}{d} = \Delta x\,\cos(\beta_1), \qquad \overline{B_{\mathrm{inf}}Q} = \frac{\Delta x'(\beta_1)}{\cos(\beta_1)} = \Delta x = \overline{A_{\mathrm{inf}}A},
which means that for refocusing, the spherical image surface 5(∞) is axially translated by the segment Δx to the position 5(d). As will be shown later, for our 12 mm focal length monocentric lens this approximation works well down to the closest distance of 500 mm, and reasonably well for objects at a 100 mm range. So for a planar object at any distance above some moderate minimum, the geometry of refocusing the monocentric lens is in accordance with first-order paraxial optics.
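The refocus geometry above reduces to a couple of lines of arithmetic. The sketch below (Python; function names are ours, for illustration) evaluates Eqs. (1) and (2) for the 12 mm lens discussed later, showing that the axial translation needed is independent of field angle.

```python
import math

def image_shift(f, d):
    """Eq. (1): image-surface shift when refocusing from infinity to d."""
    return f**2 / (d - f)                 # ~ f**2/d for d >> f

def axial_refocus(f, d, beta_deg):
    """Axial translation for a field point at principal-ray angle beta.

    The object at field angle beta lies at distance d/cos(beta), so the
    shift along the principal ray is ~f^2*cos(beta)/d; projecting back
    onto the axis divides by cos(beta), recovering the on-axis value.
    """
    beta = math.radians(beta_deg)
    along_ray = f**2 * math.cos(beta) / d
    return along_ray / math.cos(beta)     # = f**2/d, independent of beta

f, d = 12.0, 500.0                        # the 12 mm lens, 500 mm object
print(round(image_shift(f, d), 4))        # 0.2951 (mm)
print(round(axial_refocus(f, d, 60), 4))  # 0.288 (mm, paraxial approx.)
```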

 

Fig. 3. First- and third-order consideration of monocentric lens refocus.


The most general analytic tool for lens aberration analysis and correction is classical third-order Seidel theory [21,22]. In Seidel theory, astigmatism and image curvature are bound with the coefficients C and D. Referring to the variables defined in Fig. 3, the coefficients C and D can be expressed as [21–23]

C = \frac{1}{2}\sum_{s=1}^{m} h_s \left(\frac{\beta_{s+1}-\beta_s}{\frac{1}{n_{s+1}}-\frac{1}{n_s}}\right)^{2} \left(\frac{\alpha_{s+1}}{n_{s+1}}-\frac{\alpha_s}{n_s}\right),
D = \frac{1}{2}\sum_{s=1}^{m} \frac{\frac{1}{n_s}-\frac{1}{n_{s+1}}}{r_s} + C,
where ri is the radius of the ith surface and ni is the preceding index of refraction. From Fig. 3 it is clear that the chief ray angles to the optical axis at each surface are identical (β1=β2=β3=β4) for any axial position of the monocentric optics relative to image surface. Therefore, coefficient C remains zero while focusing to planar object surfaces by means of axial movement of the monocentric objective lens. According to [21],
C = \frac{1}{4 n_{im}}\left(\frac{1}{R_t}-\frac{1}{R_s}\right) \quad\text{and}\quad D = \frac{1}{2 n_{im}}\,\frac{1}{R_s},
where Rt is the tangential image surface radius, Rs is the sagittal image surface radius, and nim is the image space refraction index. So if C remains zero during refocusing, then Rt will be equal to Rs and the image surface will stay spherical, maintaining the image radius Rim = Rt = Rs. Also, from Eq. (4) it is clear that the coefficient D will remain constant. Because D remains constant, Eq. (5) also shows that Rs and, hence, the radius Rim will stay unchanged as well. Third-order aberration theory couples the image curvature with astigmatism, while coma and spherical aberration need to be corrected at this surface [21,22,24]. In other words, third-order Seidel aberration theory also indicates that simple axial refocusing of monocentric lens structures preserves the image surface radius, maintaining focus for a planar object onto a spherical image surface over a wide range of object distances. In a purely monocentric geometry there is zero coma, and no additional coma will be introduced by translation [1,22,24]. Suppose that spherical aberration is corrected at infinity. Third-order spherical aberration has no term that depends on the object distance [21,22,25]. We expect only a minor change in spherical aberration during refocusing due to the small changes in the terms (α_{s+1} − α_s)² [21,22] that constitute the Seidel spherical aberration coefficient. This is confirmed by the ZEMAX simulations shown in Section 3 for an optimal f = 12 mm lens solution.

B. Design Optimization of Monocentric Lenses

Imaging optics are conventionally designed in two steps [1,21,22,24]. First, monochromatic design at the center wavelength achieves a sufficient level of aberration correction. The second step is to correct chromatic aberration, usually by splitting some key components to achieve achromatic power: a single glass material is replaced with two glasses of the same refraction index at the center wavelength, but different dispersions. However, this process cannot easily be applied to the symmetric two-glass monocentric lens shown in Fig. 2. For a given glass pair and focal length, the lens has only four prescription parameters to be optimized (two glasses and two radii), and the radii are constrained with the monocentric symmetry and desired focal length. From this perspective, monocentric lenses resemble conventional spherical doublets. An approach for global optimization of such lenses was proposed in [26], and our approach for a global search for optimal monocentric lenses, considering all glass combinations, is closely related. As in [26], we want to analyze all valid glass combinations. However, unlike aplanatic doublets [27], there is not an optimal analytical solution for monocentric lenses, so a modified process is needed. In addition, analysis in [26] is restricted to fifth-order aberration estimates [28], while we want to extend the design process to include full analytic raytracing and MTF calculation.

We define a systematic search process beginning with a specified focal length, f-number, and wavelength range. Optimization of each glass combination was done in three steps. The first step is to determine the solution (if it exists) to minimize third-order Seidel geometrical and chromatic aberrations [2123]. A monocentric lens operating in the “fiber” stop mode has only two primary third-order aberrations—spherical aberration (Seidel wave coefficient W040) and longitudinal chromatism W020, which is defocus between blue and red paraxial foci. The sum of the absolute values of the third-order coefficients provides a good approach for a first-pass merit (cost) function, and an analytical solution for third-order coefficients allows this cost function to be quickly calculated.

The monocentric lens shown in Fig. 4 is defined with six variables, the two radii r1 and r2, and the index and Abbe number for each of two glasses: the outer glass n2 and v2, and the inner glass n3 and v3. Ray tracing of any collimated ray can use the Abbe invariant:

r_i = \frac{h_i (n_{i+1}-n_i)}{n_{i+1}\alpha_{i+1}-n_i\alpha_i},
or
\alpha_{i+1} = \frac{n_i}{n_{i+1}}\,\alpha_i + \frac{h_i (n_{i+1}-n_i)}{n_{i+1}\,r_i},
where αi are the angles between the marginal rays and the optical axis. The ray tracing proceeds surface by surface, and at each step for the input ray angle αi and ray height hi the output angle αi+1 can be calculated by Eq. (7). The ray height at the next surface is
h_{i+1} = h_i - \alpha_{i+1} d_i,
where di is the thickness between surfaces i+1 and i. For the monocentric lens d_1 = r_1 − r_2, d_2 = 2r_2, and d_3 = r_1 − r_2. The ray trace of the marginal input ray having α_1 = 0 gives
\frac{1}{f} = \frac{2}{r_1}\left(1-\frac{1}{n_2}\right) + \frac{2}{r_2}\left(\frac{1}{n_2}-\frac{1}{n_3}\right).
For a given glass combination and focal length f, Eq. (9) constrains the radius r2 to the radius r1 for all subsequent calculations.
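As a concrete check of Eqs. (7)–(9), the short sketch below (Python; function names are ours) solves Eq. (9) for r2 and then traces a marginal paraxial ray through the four surfaces, confirming that the resulting lens has the requested focal length. The indices used are illustrative, not a catalog glass pair.

```python
def r2_from_r1(f, r1, n2, n3):
    # Eq. (9) solved for r2, given focal length f, outer radius r1,
    # shell index n2, and core index n3.
    return (1/n2 - 1/n3) / (1/(2*f) - (1 - 1/n2)/r1)

def paraxial_focal_length(r1, r2, n2, n3):
    # Marginal-ray trace with Eqs. (7)-(8): alpha_1 = 0, unit input height.
    n = [1.0, n2, n3, n2, 1.0]           # index before/after each surface
    r = [r1, r2, -r2, -r1]               # monocentric surface radii
    d = [r1 - r2, 2*r2, r1 - r2]         # thicknesses between surfaces
    h, a = 1.0, 0.0                      # ray height, ray angle
    for i in range(4):
        a = n[i]/n[i+1] * a + h * (n[i+1] - n[i]) / (n[i+1] * r[i])
        if i < 3:
            h -= a * d[i]
    return 1.0 / a                       # f = h1 / alpha_out for h1 = 1

# Illustrative indices: n2 = 2.0 shell, n3 = 1.9 core, f = 12 mm, r1 = 9 mm
r2 = r2_from_r1(12.0, 9.0, 2.0, 1.9)
print(round(paraxial_focal_length(9.0, r2, 2.0, 1.9), 6))  # 12.0
```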

 

Fig. 4. Third-order aberration theory applied to monocentric lens design.


The Seidel spherical aberration coefficient B, according to [21–23], is

B = \frac{1}{2}\sum_{s=1}^{4} h_s \left(\frac{\alpha_{s+1}-\alpha_s}{\frac{1}{n_{s+1}}-\frac{1}{n_s}}\right)^{2} \left(\frac{\alpha_{s+1}}{n_{s+1}}-\frac{\alpha_s}{n_s}\right).
Equivalently, the spherical wave aberration W040 for the marginal ray is
W_{040} = \frac{1}{4}\,B\rho^4\Big|_{\rho=1} = \frac{1}{8}\sum_{s=1}^{4} h_s \left(\frac{\alpha_{s+1}-\alpha_s}{\frac{1}{n_{s+1}}-\frac{1}{n_s}}\right)^{2} \left(\frac{\alpha_{s+1}}{n_{s+1}}-\frac{\alpha_s}{n_s}\right),
where a clockwise positive angle convention is used.

The starting position for ray tracing is h1=f×NA and α1=0. Consequently, applying the Abbe invariant for each surface, we can substitute ray angles and heights with the system constructional parameters. Thus, from the Abbe invariant for the first surface we get

r_1 = \frac{h_1(n_2-1)}{n_2\,\alpha_2} \quad\text{or}\quad \alpha_2 = \frac{h_1(n_2-1)}{n_2\,r_1}.
Thus, α2 can be determined from the input ray height and prescription parameters of the first surface. The value of h2 is found from angle α2 [see Eq. (8)], and so on. Using this iterative process, and the relation between r1 and r2 from Eq. (9), we get
W_{040} = h_1^4\left(\frac{n_2^2-3n_2n_3+n_3^2}{32 f^3 (n_2-n_3)^2} - \frac{(n_2-1)(n_2-2n_3)(n_3-1)}{4 n_2^2 (n_2-n_3)^2 r_1^3} + \frac{(n_2-1)^2(n_2^2+n_2n_3+n_3^2)}{8 f n_2^2 (n_2-n_3)^2 r_1^2} - \frac{(n_2-1)(n_2^2+n_2n_3+n_3^2)}{16 f^2 n_2 (n_2-n_3)^2 r_1}\right).
The defocus coefficient W020 between images in blue and red light is equal to L0/2 [22,24], where
L_0 = 2W_{020} = h_1^2\left(\frac{(n_3-1)\left[2f(1-n_2)+n_2 r_1\right]}{f (n_2-n_3)\, r_1\, n_3 v_3} - \frac{(n_2-1)\left[2f(1-n_3)+n_3 r_1\right]}{f (n_2-n_3)\, r_1\, n_2 v_2}\right).
Eq. (14) is sufficiently accurate for the visible (photographic) spectral range, where the dispersion is approximately linear. Design for extended visible waveband (400–700 nm) requires calculations in two separate subbands with custom-defined Abbe numbers to compensate for the increased nonlinearity of the glass dispersion curve. We define E(r1)=|W040|+|W020| as a merit function for third-order aberrations, which is continuous-valued and has a single global minimum identifying the approximate (near optimum) radius r1 for each valid two-glass combination. Afterward, radius r2 is calculated from Eq. (9).
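To illustrate the first optimization step, the sketch below evaluates a third-order merit in the spirit of E(r1). Rather than relying on the closed forms of Eqs. (13) and (14), W040 is computed numerically from the Seidel sum of Eq. (11), and the chromatic defocus is estimated by retracing with blue/red indices derived from the Abbe numbers; the W020 normalization used here (dz·NA²/2) is our assumption, not Eq. (14) itself. Function names and the glass constants are illustrative.

```python
def trace_marginal(r1, r2, n2, n3, h1):
    # Paraxial marginal ray (alpha_1 = 0) through the four monocentric
    # surfaces via Eqs. (7)-(8); returns heights, angles, and indices.
    n = [1.0, n2, n3, n2, 1.0]
    r = [r1, r2, -r2, -r1]
    d = [r1 - r2, 2*r2, r1 - r2]
    h, a = [h1], [0.0]
    for i in range(4):
        a.append(n[i]/n[i+1]*a[i] + h[i]*(n[i+1] - n[i])/(n[i+1]*r[i]))
        if i < 3:
            h.append(h[i] - a[i+1]*d[i])
    return h, a, n

def seidel_w040(r1, r2, n2, n3, h1):
    # Eq. (11): W040 = (1/8) * sum over surfaces of
    # h_s * [(a_{s+1}-a_s)/(1/n_{s+1}-1/n_s)]^2 * (a_{s+1}/n_{s+1}-a_s/n_s)
    h, a, n = trace_marginal(r1, r2, n2, n3, h1)
    w = 0.0
    for s in range(4):
        dninv = 1.0/n[s+1] - 1.0/n[s]
        if dninv == 0.0:
            continue                      # no index step, no contribution
        w += (h[s] * ((a[s+1] - a[s]) / dninv)**2
                   * (a[s+1]/n[s+1] - a[s]/n[s]))
    return w / 8.0

def chromatic_w020(r1, r2, n2, v2, n3, v3, h1, f):
    # Longitudinal chromatic focal shift between the C and F lines,
    # with n_F/n_C estimated from the Abbe numbers, converted to a
    # defocus coefficient via the assumed normalization W020 ~ dz*NA^2/2.
    def back_focus(m2, m3):
        h, a, _ = trace_marginal(r1, r2, m2, m3, h1)
        return h[3] / a[4]
    dz = (back_focus(n2 - (n2-1)/(2*v2), n3 - (n3-1)/(2*v3))
          - back_focus(n2 + (n2-1)/(2*v2), n3 + (n3-1)/(2*v3)))
    return 0.5 * dz * (h1/f)**2

def third_order_merit(r1, f, n2, v2, n3, v3, na):
    # E(r1) = |W040| + |W020|, with r2 slaved to r1 through Eq. (9).
    r2 = (1/n2 - 1/n3) / (1/(2*f) - (1 - 1/n2)/r1)
    h1 = f * na
    return (abs(seidel_w040(r1, r2, n2, n3, h1))
            + abs(chromatic_w020(r1, r2, n2, v2, n3, v3, h1, f)))

# Illustrative indices/Abbe numbers (not a real catalog pair):
print(third_order_merit(9.0, 12.0, 2.0, 40.0, 1.9, 50.0, 0.3))
```

Scanning `third_order_merit` over r1 for each glass pair reproduces the single-minimum behavior described above, and the minimizing r1 serves as the starting point for the exact ray-trace optimization of the next step.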

The result is an algebraic expression for the third-order aberrations of the solution—if any—for a given glass combination. In our examples, we performed this calculation for each of the 198,000 combinations of the 446 glasses that were available as of April 2012 in the combined Schott, Ohara, Hikari, and Hoya glass catalogs [2932]. This yields a list of qualified candidates (those forming an image on or outside of the outer glass element), ranked by third-order aberrations. However, this ranking is insufficiently accurate for a fully optimized lens.

Because of high NA, monocentric systems tend to have strong fifth- and seventh-order aberrations. That makes third-order analysis only a first approximation toward a good design. Fortunately, the two-glass monocentric lens system has an exact analytical ray trace solution in compact form, and the more accurate values of the lens prescription parameters can be found from a fast exact ray tracing of several ray heights.

The variables for ray tracing of rays with arbitrary input height h are shown in Fig. 5, where ϕi are the angles between the ray and the surface normal. We can write

\sin(\phi_1) = \frac{h}{r_1}.
From Snell’s law, we have
\sin(\phi'_1) = \frac{h}{r_1 n_2}.
Applying the sine law for the triangle ABO yields
\sin(\phi_2) = \frac{r_1}{r_2}\sin(\phi'_1) = \frac{r_1}{r_2}\,\frac{h}{r_1 n_2} = \frac{h}{r_2 n_2}.
Again from Snell’s law,
\sin(\phi'_2) = \frac{n_2}{n_3}\sin(\phi_2) = \frac{n_2}{n_3}\,\frac{h}{r_2 n_2} = \frac{h}{r_2 n_3}.
Triangle OBC has equal sides OB and OC, so ϕ3 = ϕ′2 and
\sin(\phi'_3) = \frac{n_3}{n_2}\sin(\phi_3) = \frac{n_3}{n_2}\,\frac{h}{r_2 n_3} = \frac{h}{r_2 n_2}.
From triangle OCD, we find
\sin(\phi_4) = \frac{r_2}{r_1}\sin(\phi'_3) = \frac{h}{r_1 n_2}.
Then, finally,
\sin(\phi'_4) = n_2\sin(\phi_4) = \frac{h}{r_1}.
One interesting result is that for the monocentric lens there is an invariant in the form
\phi'_4 \equiv \phi_1.
Next, the segment OE=S can be found by applying the sine theorem to the triangle OED:
S = \frac{r_1 \sin(\phi'_4)}{\sin\left[180^\circ - (180^\circ - \phi'_4) - (180^\circ - \phi_1 - \phi_{22} - \phi_{33} - \phi_{44})\right]}
or
S = \frac{r_1 \sin(\phi'_4)}{\sin\left(2\phi_1 + \phi_{22} + \phi_{33} + \phi_{44} - 180^\circ\right)}.
From Eq. (15),
\phi_1 = \arcsin\left(\frac{h}{r_1}\right).
From the triangles OAB, OBC, and OCD, we have
\phi_{22} = \phi_2 - \phi'_1 = \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right),
\phi_{33} = 180^\circ - 2\phi'_2 = 180^\circ - 2\arcsin\left(\frac{h}{r_2 n_3}\right),
\phi_{44} = \phi'_3 - \phi_4 = \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right).
Finally,
S = \frac{h}{\sin\left\{2\left[\arcsin\left(\frac{h}{r_1}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right) + \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_2 n_3}\right)\right]\right\}}.
From Eq. (29), the longitudinal aberration for the ray with input height hi is given by
\Delta S(h_i) = S(h_i) - f.
Radius r2 is bound to radius r1 with Eq. (9). With a given focal length and combination of refractive indices, the longitudinal aberration ΔS is actually a function of a single variable r1. Finally, for the more accurate monochromatic optimization of the radius r1, we obtain a more accurate cost function Q:
Q = \sum_{i=1}^{3}\left|\Delta S(h_i,\lambda)\right| + \sum_{j=1}^{3}\sum_{k\neq j}\left|\Delta S(h_j,\lambda) - \Delta S(h_k,\lambda)\right|,
where the first term of Eq. (31) minimizes the longitudinal aberrations and the second term minimizes the aberration derivatives. In Eq. (31) for visible light operation, λ is the nd line located at the center of the photographic wave band, and the three input ray heights are h_1 = f×NA, h_2 = 0.7h_1, and h_3 = 0.4h_1.
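Eqs. (29)–(31) translate directly into code. The sketch below (Python; function names and the glass constants are ours, for illustration) evaluates the exact intercept S(h), the longitudinal aberration of Eq. (30), and the cost Q of Eq. (31).

```python
import math

def r2_from_r1(f, r1, n2, n3):
    # Eq. (9) solved for r2.
    return (1/n2 - 1/n3) / (1/(2*f) - (1 - 1/n2)/r1)

def back_intercept(h, r1, r2, n2, n3):
    # Eq. (29): closed-form axial intercept S for input ray height h.
    t = (math.asin(h / r1) - math.asin(h / (r1 * n2))
         + math.asin(h / (r2 * n2)) - math.asin(h / (r2 * n3)))
    return h / math.sin(2 * t)

def q_merit(r1, r2, n2, n3, f, na):
    # Eq. (31): |dS| at the three ray heights plus pairwise differences
    # (k != j) that penalize the slope of the aberration curve.
    heights = [f * na, 0.7 * f * na, 0.4 * f * na]
    ds = [back_intercept(h, r1, r2, n2, n3) - f for h in heights]  # Eq. (30)
    return (sum(abs(x) for x in ds)
            + sum(abs(ds[j] - ds[k])
                  for j in range(3) for k in range(3) if k != j))

# Illustrative values (not a catalog glass pair): f = 12 mm, NA = 0.2
r2 = r2_from_r1(12.0, 9.0, 2.0, 1.9)
print(back_intercept(0.001, 9.0, r2, 2.0, 1.9))  # ~12.0 (paraxial limit)
print(q_merit(9.0, r2, 2.0, 1.9, 12.0, 0.2))
```

Scanning `q_merit` over r1 (with r2 slaved through Eq. (9)) reproduces the multi-extremum behavior shown in Fig. 6, which is why the third-order solution is needed as a starting point.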

 

Fig. 5. Monocentric lens real ray trace variables.


With such steep ray angles, Q is a strongly varying function, which is why a fast automated optimization can overlook the optimal radius values for a given glass pair. Figure 6 shows the dependence of Q on radius r1 for a representative case, one of the best glass pairs (S-LAH79 and S-LAH59) for the 12 mm focal length lens described in Section 3. The monochromatic image quality criterion Q has several minima over the possible range for the r1 radius. The preliminary solution for the r1 radius for this glass pair obtained from the third-order aberration minimization was 8.92 mm, close to the global minimum solution for Q found at 9.05 mm. This shows how the first optimization step provided a good starting point for the r1 radius, avoiding the time-consuming investigation of low-quality solutions in the multi-extremum problem illustrated by Fig. 6.

 

Fig. 6. Example of dependence of criterion Q on the radius r1.


Optimization with the criterion in Eq. (31) gives the solution with minimum geometrical aberrations, but it is not sufficient for reliable sorting of monocentric lens solutions. The system polychromatic mean square wavefront deformation is better correlated with the Strehl ratio and other diffraction image quality criteria [21]. So, in the third step, the wavefront deformation is calculated and expanded into Zernike polynomials. The polychromatic mean square wavefront deformation is then used as the criterion for ranking the list of monocentric lens solutions by quality. In the monocentric lens geometry, the aperture stop is located at the center of the lens, where the entrance and exit pupils coincide as well. This is shown in Fig. 7.

 

Fig. 7. Image formation in the monocentric lens.


For an arbitrary ray, the lateral aberrations ΔY are bound to the wavefront deformation as

\Delta Y = \frac{\partial W}{\partial \rho}\,\frac{\lambda}{A},
where λ is the wavelength, W is the wavefront deformation expressed in wavelengths, ρ is the reduced ray pupil coordinate that varies from zero at the pupil center to unity at the edge; A is defined as the back NA, and ΔY as the lateral aberration in mm [21,33]. From Fig. 7 we have
\Delta S(\rho) = \frac{\Delta Y}{A\rho} = \frac{\partial W}{\partial \rho}\,\frac{\lambda}{A^2\rho},
where Aρ is the direction cosine of the angle q, i.e., the coordinate of the ray in image space. The expansion of the wavefront deformation into fringe Zernike polynomials up to seventh order is given in Table 9.1 of [21]. From Eq. (33),
\frac{\Delta S(\rho,\lambda_i)\,A^2}{\lambda_i} = 4C_{20} + C_{40}(24\rho^2-12) + C_{60}(120\rho^4-120\rho^2+24) + C_{80}(560\rho^6-840\rho^4+360\rho^2-40).
The values for ΔS(ρ,λi) are calculated with fast ray tracing [Eq. (30)] for nine rays with reduced coordinate heights ρj=1, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.6, 0.5 and for the three wavelengths 470, 550, and 650 nm. Then coefficients Cn0(λi) are calculated from the least square criterion [34]:
\sum_{j=1}^{9}\left[\frac{\Delta S(\rho_j,\lambda_i)\,A^2}{\lambda_i} - 4C_{20}(\lambda_i) - C_{40}(\lambda_i)(24\rho_j^2-12) - C_{60}(\lambda_i)(120\rho_j^4-120\rho_j^2+24) - C_{80}(\lambda_i)(560\rho_j^6-840\rho_j^4+360\rho_j^2-40)\right]^2 = \min.
We found that the optimal geometrical aberration solution from step two does not have complete correlation with the diffraction quality criteria and has a small general defocus. In order to prevent general defocus of the image surface, we introduce a small shift dS of the back focal distance that makes the new coefficient C20new(λ2=550nm) equal to zero:
dS = \frac{4C_{20}(\lambda_2)\,\lambda_2}{A^2}.
This means that the system will have a slightly adjusted focus fnew=f+dS, and the only difference from before will be in coefficients C20new:
C_{20}^{new}(\lambda_i) = C_{20}(\lambda_i) - \frac{dS\,A^2}{4\lambda_i}.
Finally, according to [21] [Chapter 9.1, Eq. (23), and Chapter 9.2, Eq. (13)], the system polychromatic mean square wavefront deformation (ΔΦ)2 is
(\Delta\Phi)^2 = \overline{\Phi^2} - \left(\overline{\Phi}\right)^2 = \frac{1}{2}\sum_{i=1}^{3}\left\{\frac{\left[C_{20}^{new}(\lambda_i)\right]^2}{3} + \frac{\left[C_{40}(\lambda_i)\right]^2}{5} + \frac{\left[C_{60}(\lambda_i)\right]^2}{7} + \frac{\left[C_{80}(\lambda_i)\right]^2}{9}\right\},
where Φ̄² is the average value of the squared wave aberration and (Φ̄)² is the squared average wave aberration. In our examples, the top 50 solutions for different glass combinations were sorted into the ranked list by polychromatic mean square wavefront deformation; each was then imported into ZEMAX optical design software and quickly optimized for the best MTF performance at 200 line pairs per millimeter (lp/mm). This frequency was chosen because the smallest fiber bundle receiver 24AS available from Schott has a 2.5 μm pitch [13]. This close to the optimal design, however, the MTF performance is well behaved, and a similar ranking is found over a wide range of MTF spatial frequencies.
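The third step above is a small least-squares problem. The sketch below (Python with NumPy; function names are ours) implements Eqs. (34)–(38): it fits the C coefficients to longitudinal aberration samples, applies the focus shift that zeroes C20 at the reference wavelength, and forms the polychromatic mean square deformation. Since it needs ΔS data from a real lens to be meaningful, the demonstration uses synthetic data generated from assumed coefficients, which the fit then recovers.

```python
import numpy as np

def radial_basis(rho):
    # Radial derivative terms of the fringe Zernike defocus/spherical
    # polynomials, as they appear on the right side of Eq. (34).
    rho = np.asarray(rho, dtype=float)
    return np.column_stack([
        4.0 * np.ones_like(rho),
        24*rho**2 - 12,
        120*rho**4 - 120*rho**2 + 24,
        560*rho**6 - 840*rho**4 + 360*rho**2 - 40,
    ])

def fit_c_coeffs(rho, ds, na, lam):
    # Eq. (35): least-squares recovery of [C20, C40, C60, C80] at one
    # wavelength from longitudinal aberrations ds sampled at heights rho.
    y = np.asarray(ds) * na**2 / lam
    coeffs, *_ = np.linalg.lstsq(radial_basis(rho), y, rcond=None)
    return coeffs

def refocus_and_variance(C, lams, na, ref=1):
    # Eqs. (36)-(38): shift focus so C20 vanishes at the reference
    # wavelength, then form the polychromatic mean square deformation.
    C = np.array(C, dtype=float)                   # rows: wavelengths
    dS = 4 * C[ref, 0] * lams[ref] / na**2         # Eq. (36)
    C[:, 0] -= dS * na**2 / (4 * np.asarray(lams)) # Eq. (37)
    weights = np.array([1/3, 1/5, 1/7, 1/9])
    return 0.5 * np.sum(C**2 * weights), dS        # Eq. (38)

# Synthetic demonstration: generate Eq. (34) data from known coefficients
rho = np.array([1, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.6, 0.5])
true_c = np.array([0.10, 0.05, 0.01, 0.002])       # assumed, for illustration
na, lam = 0.3, 550e-6                              # NA; wavelength in mm
ds = radial_basis(rho) @ true_c * lam / na**2
print(fit_c_coeffs(rho, ds, na, lam))              # recovers true_c
```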

3. Specific Monocentric Cases

A. AWARE2 Monocentric Lens Analysis

The goal of the optimization process is to find the best possible monocentric lens candidates for fabrication. These candidates are then subject to other material constraints involved in the final selection of a lens design, including mechanical aspects such as differential thermal expansion and environmental robustness, as well as practical aspects like availability and cost. The process described above appears to provide a comprehensive list of candidate designs. However, the best test of a lens design process is to compare the results to those generated by the normal process of software-based lens design. To do this, we used the constraints of a monocentric objective that was designed in the DARPA AWARE program—specifically, the AWARE-2 objective lens [20], which was designed by a conventional software optimization process, then fabricated, tested, and integrated into the AWARE2 imager. The lens has a 70 mm focal length and image space f-number of 3.5, using a fused silica core and an S-NBH8 glass outer shell. The optical prescription is shown in Table 1, and the layout in Fig. 8(a).

Table 1. Optical Prescription of the Fabricated AWARE2 Lens

Fig. 8. AWARE2 lens and global optimum solution.

The global optimization method identified this candidate lens system, as well as multiple alternative designs (glass combinations) that provide a similar physical volume and improved MTF. The optical prescription of the top-ranked solution is shown in Table 2, and the lens layout is shown in Fig. 8(b).


Table 2. Optical Prescription of the Top Design Solution for the AWARE2 Lens

The new candidate appears physically very similar to the fabricated lens. The MTF and ray aberrations for the manufactured prototype and the top design solution are compared in Fig. 9: the new candidate lens is significantly closer to diffraction-limited resolution and has lower chromatism and polychromatic mean square wavefront deformation than the actual fabricated lens. It is important to recognize that the resolution of the AWARE-2 imager system includes the microcamera relay optics. The relay optics corrected for aberrations in the primary, as well as providing flattening of the relayed image field onto the planar image sensors. In fact, the overall AWARE-2 system optics design [10] was diffraction limited. However, conducting the systematic design process on a relatively long focal length system, where geometrical aberrations influence the resolution, served as a successful test of the design methodology.

Fig. 9. MTF and ray aberration performance comparison of the (a) fabricated AWARE2 prototype and (b) new AWARE2 design candidate.

B. SCENICC f-Number 1.71, 12 mm Focal Length Lenses

Our specific goal was to design an imager with a field of view of at least 120° and resolution and sensitivity comparable to the human eye (1 arc min), resulting in about 100 megapixels total resolution. Assuming we use the waveguide configuration of Fig. 1(b) with 2.5 μm pitch, NA=1 fiber bundles, we defined the goal of a 12 mm focal length lens with diffraction-limited operation in the photographic spectral band (0.47–0.65 μm) and an NA of 0.29 (f-number 1.71). We designed the monocentric lens assuming the fiber stop operation mode [Fig. 2(b)], as that is the more demanding design: the physical aperture stop and vignetting of the field beams will increase diffraction at wider field angles but can only reduce geometrical optic aberrations.
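These specifications are mutually consistent, as a quick sanity check shows. The sketch below (all numbers taken from the text; the pixel count assumes one fiber per 2.5 μm pitch over a spherical image cap of radius equal to the focal length) reproduces the f-number, the per-pixel angular sampling near 1 arc min, and a total pixel count of order 100 megapixels:

```python
import math

f_mm, pitch_mm, NA = 12.0, 2.5e-3, 0.29

f_number = 1 / (2 * NA)                              # ≈ 1.72
pixel_arcmin = math.degrees(pitch_mm / f_mm) * 60    # ≈ 0.72 arc min per pixel

def cap_pixels(full_fov_deg):
    """Pixels on the spherical image surface (radius f) for a circular field."""
    half = math.radians(full_fov_deg / 2)
    area = 2 * math.pi * f_mm**2 * (1 - math.cos(half))  # spherical cap area
    return area / pitch_mm**2

print(round(cap_pixels(120) / 1e6))   # ≈ 72 megapixels over 120°
print(round(cap_pixels(160) / 1e6))   # ≈ 120 megapixels over 160°
```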

The design process began with an initial evaluation of 198,000 glass pair systems, of which some 56,000 candidates passed the initial evaluation and were optimized using exact ray tracing to generate the final ranked list. The entire process took only 15 min using a single-threaded MATLAB optimization code running on a 2.2 GHz i7 Intel processor. Part of this list is shown in Table 3. Because different optical glass manufacturers produce similar glasses, the solutions have been combined into families, and the table shows the first seven of these families. We show the radii for a primary glass and list several substitution glasses in parentheses. The designs with the substitution glasses result in small changes in radii but substantially the same performance. The table shows the computed polychromatic mean square wavefront deformation, the criterion for the analytical global search, and the value for the MTF at 200 lp/mm (Nyquist sampling) found following ZEMAX optimization of the same candidate glasses. If the design process works, we would expect these two metrics to be strongly correlated. Figure 10 illustrates this by showing the correlation for representative samples of the top 200 solutions, and in fact we find an identical ranking for all 200 of the top candidates.
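The structure of this two-stage search can be sketched as a short skeleton. Here `prefilter` and `evaluate` are placeholders standing in for the cheap analytic third-order screening and the exact-ray-trace wavefront merit described above; the glass names and dummy merit are purely illustrative:

```python
import heapq
import itertools

def global_search(glasses, prefilter, evaluate, keep=200):
    """Exhaustive two-glass search: a cheap analytic prefilter rejects most
    pairs; surviving candidates are scored and the best `keep` returned,
    sorted by ascending merit (lower = better wavefront quality)."""
    heap = []
    for g1, g2 in itertools.product(glasses, repeat=2):
        if not prefilter(g1, g2):
            continue                              # analytic rejection (fast)
        heapq.heappush(heap, (evaluate(g1, g2), g1, g2))  # exact evaluation (slow)
    return [heapq.heappop(heap) for _ in range(min(keep, len(heap)))]

# Toy run with three fictitious glasses and a dummy merit function.
glasses = ["G1", "G2", "G3"]
top = global_search(glasses,
                    prefilter=lambda a, b: True,
                    evaluate=lambda a, b: abs(int(a[1]) - int(b[1])),
                    keep=5)
# `top` holds the five lowest-merit (merit, glass1, glass2) tuples, ascending.
```

The point of the two stages is economy: the prefilter is evaluated 198,000 times, while the expensive exact ray trace runs only on the ~56,000 survivors.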

Table 3. Top Solutions for an f/1.7, f = 12 mm Monocentric Lens

Fig. 10. Correlation between polychromatic mean square wavefront deformation and MTF at 200 lp/mm for candidates.

The best-performing monocentric lens solution (family number 1) demonstrates near-diffraction-limited resolution over the photographic visible operational waveband (470–650 nm). It uses S-LAH79 for the outer shell glass and K-LASFN9 for the inner glass. To provide a central aperture stop, it is necessary to fabricate the center lens as two hemispherical elements. Because optical glass K-LASFN9 has a high refractive index of 1.81, the interface between the two hemispherical elements can cause reflections at large incidence angles unless the interface is index matched, and the index of optical cements is limited. For example, the Norland UV-cure epoxy NOA164 has an index of 1.64. This results in a critical angle of 65° and a maximum achievable field of view of ±55°. For this glass system, it is preferable to fabricate the center lens as a single spherical element and operate the system in the “virtual iris” mode, where the system can provide a maximum field of view of ±78.5°. The optical layout, MTF, and ray fan diagrams of the top solution operating in “virtual” stop mode are shown in Fig. 11.
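The ±55° figure can be reproduced from Snell's law, under the assumption (ours, for illustration) that the limiting ray is the marginal ray of the f/1.7 cone where it crosses the cemented midplane:

```python
import math

n_ball = 1.81     # K-LASFN9 center glass index
n_cement = 1.64   # NOA164 UV-cure epoxy index (from the text)
NA = 0.29         # image-space numerical aperture

# Critical angle at the glass/cement interface, ≈ 65° as quoted above.
theta_c = math.degrees(math.asin(n_cement / n_ball))

# Marginal-ray half angle of the focused cone inside the glass, ≈ 9°.
cone_half = math.degrees(math.asin(NA / n_ball))

# Field angle at which the marginal ray reaches the critical angle, ≈ 55-56°,
# consistent with the ±55° maximum field of view quoted in the text.
max_half_fov = theta_c - cone_half
```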

Fig. 11. Highest ranked design solution (high-index center glass).

Table 4 shows the detailed optical prescription of the monocentric lens example. The lens operation is shown in three configurations. The distance to the object plane is changed from infinity to 1 m and then to 0.5 m. The back focal distance (thickness 5 in Table 4) is changed as well. The back focal distance for the object at infinity is 2.92088 mm; for the 1 m object distance, 3.06156 mm; and for the 0.5 m object distance, 3.20050 mm.
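The back focal distances in Table 4 are close to what a Newtonian thin-lens estimate predicts, namely an image shift of roughly $f^2/x$ for an object at distance $x$ from the front focal point. The sketch below applies this simplification (treating the quoted object distance as $x$, ignoring the ~12 mm offset to the lens), so the agreement is approximate rather than exact:

```python
f = 12.0            # mm, focal length
bfl_inf = 2.92088   # mm, back focal distance for an object at infinity (Table 4)

def refocused_bfl(obj_dist_mm):
    """Newtonian estimate of the refocused back focal distance:
    the image shifts by ~ f^2 / x for an object at distance x."""
    return bfl_inf + f**2 / obj_dist_mm

print(refocused_bfl(1000))  # ≈ 3.065 mm (Table 4: 3.06156 mm)
print(refocused_bfl(500))   # ≈ 3.209 mm (Table 4: 3.20050 mm)
```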


Table 4. Optical Prescription for Refocusing of the Monocentric Lens, Showing Multiple Configurations for Three Object Distances

The MTF for two extreme object positions is shown in Fig. 12. After refocusing at a closer object distance, the image surface loses concentricity with the lens but still retains the original radius, as was shown in Section 2.A. The loss of concentricity enables conjugation between planar objects at a finite distance and the spherical image surface, and so implies the ability to focus on points anywhere in the object space. A concentric system designed at a specific finite object radius can be conjugated only with a specific concentric spherical image surface, and focusing a concentric lens system on spherical objects at different distances would require a change in the image surface radius. At closer object distances, angle α1 (Fig. 3) increases from zero to some small finite value, which changes the spherical aberration [Eq. (10)] and moves the MTF slightly away from its optimum. As predicted by third-order aberration analysis, we maintain image quality close to the original infinite conjugate design during the refocusing operation, over a wide range of field angles. ZEMAX design files modeling this lens are available for download [35]. Note that adequate simulation of the “virtual stop” operational mode in ZEMAX requires rotation of the aperture stop at off-axis field angles, as well as rotation of the image surface by the same angle around its center of curvature. In this file, a distinct configuration has been defined for each field angle.

Fig. 12. MTF performance of monocentric lens (a) focused at infinity (design) and (b) refocused at a flat object at 0.5 m.

The members of the third family of Table 3 use a central ball glass with a significantly lower refractive index, 1.69, which can be index matched with standard UV-cure epoxies. This enables the lens to be fabricated as two halves and assembled with a physical aperture stop at the center. While members of this family have slightly lower image quality, they can fully operate over a ±65° field of view in both “virtual stop” and “aperture stop” modes. The optical layout, MTF, and ray fan diagrams of the top member of the third family operating in the “aperture stop” mode for an object located at infinity are shown in Fig. 13(a). The ray fan graphs are shown for the axial and 60° field points at the image surface. The scale of the ray fans is ±10 μm. The design has been modified to include a 10 μm thick layer of NOA164 glue between the two glasses and in the central plane. Simplified ZEMAX files of this lens without optical cement are also available [36].

Fig. 13. Top member of the third family (lower center glass index) operating in (a) physical aperture stop mode, including a 73° field angle to illustrate the effect of aperture vignetting, and (b) “virtual” aperture stop mode, with uniform response up to 80°.

The optical layout and MTF of the top member of the third family operating in the “virtual” (fiber) stop mode are shown in Fig. 13(b). The “virtual stop” mode has uniformly high image quality over the entire field. The “aperture stop” mode suffers from a drop in performance at the edge of the field due to pupil vignetting and aberrations in the optical cement layer. This design is useful, however, as operation in the “aperture stop” mode simplifies the requirements on image transfer and detection. In this case the input aperture of the fiber bundle can exceed the back aperture of the optics, as the physical aperture stop provides all the stray light filtering needed, and the image transfer can be done with relay optics or with standard Schott fiber bundles with NA=1 and 2.5 μm pitch.

C. Comparison with Conventional Wide-Angle Lenses

The architecture of the monocentric lens is intrinsically compact: the principal (central) rays of all field angles are orthogonal to the front (powered) optical surface and are directly focused to an image surface that is always substantially perpendicular to the incident light. Conventional wide-field-of-view imagers require a lens that conveys wide-field input to a contiguous planar image surface. Extreme wide-angle “fisheye” lenses use a two-stage architecture in the more general class of reverse-telephoto lenses [1,37], where the back focal distance is greater than the focal length. The front optic is a wide-aperture negative lens to ensure acceptance of at least a fraction of light incident over a wide angle range. The negative power reduces the divergence of the principal rays of the input beams, so the following focusing positive component operates with a significantly reduced field of view, although it must provide sufficient optical power to compensate for the negative first element. This reverse-telephoto architecture results in a physically bulky optic. Figure 14 shows two conventional wide-field lenses on the same scale as the f = 12 mm design example above. At the top of the figure is a 13-element “fisheye” lens taken directly from the “Zebase” catalog of standard lens designs (F_004), then scaled from 31 mm to the 12 mm focal length. The smaller lens at the center of Fig. 14 is based on the prescription for a 9-element wide-field lens specifically designed to minimize volume [38]. This design was intended for a shorter focal length, and the aberrations are not well corrected when scaled to 12 mm. When this design was optimized, even allowing for aspheric surfaces, the components tended toward the same overall physical shape and volume as the lens above.
Both of these lenses are substantially larger than the monocentric lens design, even including the volume of the fiber bundles, and they collect less light, even accounting for coupling and transmission losses in the fiber bundles.

Fig. 14. Comparison of two conventional wide-field lenses with a monocentric-waveguide lens. All have a 12 mm focal length, 120° field of view, and similar light collection, but the monocentric lens provides higher resolution in a compact volume.

4. Conclusion

This paper describes a general perspective on monocentric-lens-based imagers, including discussions of stray light control, focus, and a method for conducting a global search of optimal design solutions for two-glass symmetric monocentric lenses. We proved that monocentric lenses can be focused in the conventional way, by axially translating the lens relative to a fixed-radius spherical image surface, indicating the feasibility of panoramic lenses using fiber bundle image transfer. We checked our design approach by comparing its result against a previously designed 70 mm focal length lens and concluded that it is an effective approach to identifying top candidate solutions for specific applications. Practical constraints will determine the final selection. For the “virtual iris” stray light filtering, a spherical high-index central glass solution can be used. With a conventional aperture stop, where the central ball lens consists of two hemispherical elements and a physical stop at the center, reflections from the internal surface can limit the extreme field angle, favoring solutions with a lower index center lens.

Looking further at a specific design example, we identified high- and low-index center glass solutions for f/1.7 12 mm focal length monocentric lenses with (at least) a 120° field of view and showed that this lens compares favorably to conventional fisheye and projection lens solutions to panoramic imaging. It is reasonable to ask whether the specific designs used for the comparison were optimal, and in fact there is clearly room for improvement in these specific designs. However, the standard lens categorization shown in Fig. 15 indicates that monocentric lenses can enter a domain of light collection and field of view that is not otherwise addressable [39].

Fig. 15. Systematic diagram of photographic lens setup families, including monocentric multiscale and waveguide imagers. Figure adapted from [39].

We conclude that monocentric lenses offer a significant potential for panoramic imaging and deserve further investigation. Future topics of research and development are the systematic design of more complex monocentric lens structures, image processing techniques, and practical technologies for image transfer from the spherical image surface to planar image sensors.

This research was supported by the DARPA SCENICC program under contract W911NF-11-C-0210 and by the DARPA AWARE program under contract HR0011-10C-0073.

References

1. W. Smith, Modern Lens Design, 2nd ed. (McGraw Hill, 2005).

2. T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011). [CrossRef]  

3. R. Kingslake, A History of the Photographic Lens (Academic, 1989), pp. 49–67.

4. G. Krishnan and S. K. Nayar, “Towards a true spherical camera,” Proc. SPIE 7240, 724002 (2009). [CrossRef]  

5. J. E. Ford and E. Tremblay, “Extreme form factor imagers,” in Imaging Systems, OSA Technical Digest (CD) (2010), paper IMC2.

6. D. J. Brady and N. Hagen, “Multiscale lens design,” Opt. Express 17, 10659–10674 (2009). [CrossRef]  

7. O. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel computational imaging,” in IEEE International Conference on Computational Photography (IEEE, 2011), pp. 1–8.

8. H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

9. D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486, 386–389 (2012). [CrossRef]  

10. E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691–4702 (2012). [CrossRef]  

11. P. Milojkovic and J. Mait, “Space-bandwidth scaling for wide field-of-view imaging,” Appl. Opt. 51, A36–A47 (2012). [CrossRef]  

12. J. A. Waidelich Jr., “Spherical lens imaging device,” U.S. Patent 3,166,623 (19 January 1965).

13. J. J. Hancock, The design, fabrication, and calibration of a fiber filter spectrometer, Ph.D. Thesis (University of Arizona, 2012); see also product datasheets posted on Schott Fiber Optics website, “Schott Fiber Optic Faceplates” (faceplates_us_march_2011.pdf) and “Schott Fused Imaging Fiber Tapers” (Tapers-US-October_2011.pdf).

14. R. Drougard, “Optical transfer properties of fiber bundles,” J. Opt. Soc. Am. 54, 907–914 (1964). [CrossRef]  

15. J.-H. Han, J. Lee, and J. U. Kang, “Pixelation effect removal from fiber bundle probe based optical coherence tomography imaging,” Opt. Express 18, 7427–7439 (2010). [CrossRef]  

16. Y. F. Li and J. W. Y. Lit, “Transmission properties of a multimode optical-fiber taper,” J. Opt. Soc. Am. A 2, 462–468 (1985). [CrossRef]  

17. J. M. Cobb, D. Kessler, and J. Agostinelli, “Optical design of a monocentric autostereoscopic immersive display,” Proc. SPIE 4832, 80–90 (2002). [CrossRef]  

18. J. M. Cobb, D. Kessler, and J. E. Roddy, “Autostereoscopic optical apparatus,” U.S. Patent 6,871,956 (29 March 2005).

19. Hoya glass catalog, http://www.hoya-opticalworld.com/english/ (20 June 2012).

20. D. Marks, E. Tremblay, J. Ford, and D. Brady, “Multicamera aperture scale in monocentric gigapixel cameras,” Appl. Opt. 50, 5824–5833 (2011). [CrossRef]  

21. M. Born and E. Wolf, Principles of Optics, 7th expanded ed. (Cambridge University, 1999).

22. V. Churilovskiy, The Theory of Chromatism and Third Order Aberrations (Mashinostroenie, 1968).

23. J. Sasian, “Theory of sixth-order wave aberrations,” Appl. Opt. 49, D69–D95 (2010). [CrossRef]  

24. R. Kingslake and R. B. Johnson, Lens Design Fundamentals, 2nd ed. (SPIE, 2010).

25. G. G. Slyusarev, Aberrations and Optical Design Theory, 2nd ed. (Adam Hilger, 1984).

26. J. L. Rayces and M. Rosete-Aguilar, “Selection of glasses for achromatic doublets with reduced secondary spectrum. I. Tolerances conditions for secondary spectrum, spherochromatism, and fifth-order spherical aberrations,” Appl. Opt. 40, 5663–5676 (2001). [CrossRef]  

27. I. Gardner, “Application of algebraic aberration equations to optical design,” Scientific Papers of the Bureau of Standards No. 550 (1927).

28. H. A. Buchdahl, Optical Aberrations Coefficients (Dover, 1968).

29. Schott glass catalog, http://www.us.schott.com/advanced_optics/english/download/schott_optical_glass_catalogue_excel_june_2012.xls.

30. Ohara glass catalog, http://www.oharacorp.com/xls/glass-data-2012.xls.

31. Sumita glass catalog, http://www.sumita-opt.co.jp/ja/goods/data/glassdata.xls.

32. Hoya glass catalog, http://www.hoyaoptics.com/pdf/MasterOpticalGlass.xls.

33. M. M. Rusinov, Handbook of Computational Optics(Mashinostroenie, 1984), Chap. 23.

34. I. Agurok, “Method of ‘truss’ approximation in wavefront testing,” Proc. SPIE 3782, 337–348 (1999). [CrossRef]  

35. http://psilab.ucsd.edu/VirtualStopFamily1.zip.

36. http://psilab.ucsd.edu/PhysicalApertureStopFamily3.zip.

37. J. Kulmer and M. Bauer, “Fish-eye lens designs and their relative performance,” Proc. SPIE 4093, 360–369 (2000). [CrossRef]  

38. M. Horimoto, “Fish eye lens system,” U.S. Patent 4,412,726 (1 November 1983).

39. H. Gross, F. Blechinger, and B. Achtner, Survey of Optical Instruments, Vol. 4 of Handbook of Optical Systems (Wiley, 2008).


Mitani, K.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Nayar, S. K.

G. Krishnan and S. K. Nayar, “Towards a true spherical camera,” Proc. SPIE 7240, 724002 (2009).
[CrossRef]

O. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel computational imaging,” in IEEE International Conference on Computational Photography (IEEE, 2011), pp. 1–8.

Nojiri, Y.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Rayces, J. L.

Roddy, J. E.

J. M. Cobb, D. Kessler, and J. E. Roddy, “Autostereoscopic optical apparatus,” U.S. Patent 6,871,956 (29March2005).

Rosete-Aguilar, M.

Rusinov, M. M.

M. M. Rusinov, Handbook of Computational Optics(Mashinostroenie, 1984), Chap. 23.

Sasian, J.

Schott,

Schott glass catalog, http://www.us.schott.com/advanced_optics/english/download/schott_optical_glass_catalogue_excel_june_2012.xls .Schott.

Shaw, J.

H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

Slyusarev, G. G.

G. G. Slyusarev, Aberrations and Optical Design Theory, 2nd ed. (Adam Hilger, 1984).

Smith, W.

W. Smith, Modern Lens Design, 2nd ed. (McGraw Hill, 2005).

Son, H.

H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

Stack, R.

H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

Stack, R. A.

D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486, 386–389 (2012).
[CrossRef]

Tremblay, E.

D. Marks, E. Tremblay, J. Ford, and D. Brady, “Multicamera aperture scale in monocentric gigapixel cameras,” Appl. Opt. 50, 5824–5833 (2011).
[CrossRef]

J. E. Ford, and E. Tremblay, “Extreme form factor imagers,” in Imaging Systems, OSA Technical Digest (CD) (2010), paper IMC2.

Tremblay, E. J.

E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, “Design and scaling of monocentric multiscale imagers,” Appl. Opt. 51, 4691–4702 (2012).
[CrossRef]

H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

Vera, E. M.

D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486, 386–389 (2012).
[CrossRef]

Waidelich, J. A.

J. A. Waidelich, “Spherical lens imaging device,” U.S. patent 3,166,623 (19January1965).

Wolf, E.

M. Born and E. Wolf, Principles of Optics, 7th expanded ed. (Cambridge University, 1999).

Yamashita, T.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Yanagi, T.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Yoshida, T.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Appl. Opt.

J. Opt. Soc. Am.

J. Opt. Soc. Am. A

Nature

D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486, 386–389 (2012).
[CrossRef]

Opt. Express

Proc. SPIE

I. Agurok, “Method of ‘truss’ approximation in wavefront testing,” Proc. SPIE 3782, 337–348 (1999).
[CrossRef]

J. Kulmer and M. Bauer, “Fish-eye lens designs and their relative performance,” Proc. SPIE 4093, 360–369 (2000).
[CrossRef]

G. Krishnan and S. K. Nayar, “Towards a true spherical camera,” Proc. SPIE 7240, 724002 (2009).
[CrossRef]

J. M. Cobb, D. Kessler, and J. Agostinelli, “Optical design of a monocentric autostereoscopic immersive display,” Proc. SPIE 4832, 80–90 (2002).
[CrossRef]

SMPTE J.

T. Yamashita, R. Funatsu, T. Yanagi, K. Mitani, Y. Nojiri, and T. Yoshida, “A camera system using three 33-megapixel CMOS image sensors for UHDTV2,” SMPTE J. 120, 24–31 (2011).
[CrossRef]

Other

R. Kingslake, A History of the Photographic Lens (Academic, 1989), pp. 49–67.

J. A. Waidelich, “Spherical lens imaging device,” U.S. patent 3,166,623 (19January1965).

J. J. Hancock, The design, fabrication, and calibration of a fiber filter spectrometer, Ph.D. Thesis (University of Arizona, 2012); see also product datasheets posted on Schott Fiber Optics website, “Schott Fiber Optic Faceplates” (faceplates_us_march_2011.pdf) and “Schott Fused Imaging Fiber Tapers” (Tapers-US-October_2011.pdf).

J. E. Ford, and E. Tremblay, “Extreme form factor imagers,” in Imaging Systems, OSA Technical Digest (CD) (2010), paper IMC2.

O. Cossairt, D. Miau, and S. K. Nayar, “Gigapixel computational imaging,” in IEEE International Conference on Computational Photography (IEEE, 2011), pp. 1–8.

H. Son, D. L. Marks, E. J. Tremblay, J. Ford, J. Hahn, R. Stack, A. Johnson, P. McLaughlin, J. Shaw, J. Kim, and D. J. Brady, “A Multiscale, wide field, gigapixel camera,” in Imaging Systems Applications, OSA Technical Digest (2011), paper JTuE2.

J. M. Cobb, D. Kessler, and J. E. Roddy, “Autostereoscopic optical apparatus,” U.S. Patent 6,871,956 (29March2005).

Hoya glass catalog, http://www.hoya-opticalworld.com/english/ (20June2012).

I. Gardner, “Application of algebraic aberration equations to optical design,” Scientific Papers of the Bureau of Standards No. 550 (1927).

H. A. Buchdahl, Optical Aberrations Coefficients (Dover, 1968).

Schott glass catalog, http://www.us.schott.com/advanced_optics/english/download/schott_optical_glass_catalogue_excel_june_2012.xls .Schott.

Ohara glass catalog, http://www.oharacorp.com/xls/glass-data-2012.xls .

Sumita glass catalog, http://www.sumita-opt.co.jp/ja/goods/data/glassdata.xls .

Hoya glass catalog, http://www.hoyaoptics.com/pdf/MasterOpticalGlass.xls .

M. M. Rusinov, Handbook of Computational Optics(Mashinostroenie, 1984), Chap. 23.

M. Horimoto, “Fish eye lens system,” U.S. Patent 4,412,726(1November1983).

H. Gross, F. Blechinger, and B. Achtner, Survey of Optical Instruments, Vol. 4 of Handbook of Optical Systems (Wiley, 2008).

M. Born and E. Wolf, Principles of Optics, 7th expanded ed. (Cambridge University, 1999).

V. Churilovskiy, The Theory of Chromatism and Third Order Aberrations (Mashinostroenie, 1968).

R. Kingslake and R. B. Johnson, Lens Design Fundamentals, 2nd ed. (SPIE, 2010).

G. G. Slyusarev, Aberrations and Optical Design Theory, 2nd ed. (Adam Hilger, 1984).

http://psilab.ucsd.edu/VirtualStopFamily1.zip .

http://psilab.ucsd.edu/PhysicalApertureStopFamily3.zip .

W. Smith, Modern Lens Design, 2nd ed. (McGraw Hill, 2005).



Figures (15)

Fig. 1. (a) Optical layout of a 2.4 gigapixel monocentric multiscale lens and (b) the same image field transferred by tapered fiber bundles instead of multiple relay optics.

Fig. 2. Monocentric lens imaging with (a) a physical aperture stop at the center of the objective lens and (b) a “virtual stop” accomplished by limiting the NA of the image transfer, which, as drawn, are fiber bundles made from nonimaging low-NA optical fibers.

Fig. 3. First- and third-order consideration of monocentric lens refocus.

Fig. 4. Third-order aberration theory applied to monocentric lens design.

Fig. 5. Monocentric lens real ray trace variables.

Fig. 6. Example of the dependence of criterion Q on the radius r1.

Fig. 7. Image formation in the monocentric lens.

Fig. 8. AWARE2 lens and global optimum solution.

Fig. 9. MTF and ray aberration performance comparison of (a) the fabricated AWARE2 prototype and (b) the new AWARE2 design candidate.

Fig. 10. Correlation between polychromatic mean square wavefront deformation and MTF at 200 lp/mm for the design candidates.

Fig. 11. Highest-ranked design solution (high-index center glass).

Fig. 12. MTF performance of the monocentric lens (a) focused at infinity (design) and (b) refocused on a flat object at 0.5 m.

Fig. 13. Top member of the third family (lower center glass index) operating with (a) the physical aperture stop, including a 73° field angle to illustrate the effect of aperture vignetting, and (b) the “virtual” aperture stop, with uniform response up to 80°.

Fig. 14. Comparison of two conventional wide-field lenses with a monocentric-waveguide lens. All have a 12 mm focal length, 120° field of view, and similar light collection, but the monocentric lens provides higher resolution in a compact volume.

Fig. 15. Systematic diagram of photographic lens setup families, including monocentric multiscale and waveguide imagers. Figure adapted from [39].

Tables (4)

Table 1. Optical Prescription of the Fabricated AWARE2 Lens

Table 2. Optical Prescription of the Top Design Solution for the AWARE2 Lens

Table 3. Top Solutions for an F/1.7, f = 12 mm Monocentric Lens

Table 4. Optical Prescription for Refocusing of the Monocentric Lens, Showing Multiple Configurations for Three Object Distances

Equations (38)


\Delta x = \frac{f^2}{d - f} \approx \frac{f^2}{d}.
\Delta x(\beta_1) = \frac{f^2}{\cos(\beta_1)\,d} = \frac{\Delta x}{\cos(\beta_1)}, \qquad \overline{B_\infty Q} = \Delta x(\beta_1)\cos(\beta_1) = \Delta x = \overline{A_\infty A},
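As a numeric illustration of the first-order refocus shift, a minimal sketch assuming the reconstructed relation Δx = f²/(d − f), applied to the document's f = 12 mm design refocused on an object at 0.5 m (the case shown in Fig. 12):

```python
def refocus_translation_mm(f_mm, d_mm):
    """Axial translation needed to refocus from infinity to object distance d,
    per the first-order relation Delta_x = f^2 / (d - f)."""
    return f_mm**2 / (d_mm - f_mm)

# f = 12 mm lens, object at 0.5 m = 500 mm:
dx = refocus_translation_mm(12.0, 500.0)  # ~0.295 mm axial translation
```

The thin-lens approximation Δx ≈ f²/d gives 0.288 mm for the same case, so the exact and approximate forms differ by only a few micrometers here.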
C = \frac{1}{2}\sum_{s=1}^{m} h_s \left(\frac{\beta_{s+1} - \beta_s}{\frac{1}{n_{s+1}} - \frac{1}{n_s}}\right)^2 \left(\frac{\alpha_{s+1}}{n_{s+1}} - \frac{\alpha_s}{n_s}\right),
D = \frac{1}{2}\sum_{s=1}^{m} \frac{\frac{1}{n_s} - \frac{1}{n_{s+1}}}{r_s} + C,
C = \frac{1}{4 n_{im}}\left(\frac{1}{R_t} - \frac{1}{R_s}\right) \quad \text{and} \quad D = \frac{1}{2 n_{im}}\,\frac{1}{R_s},
r_i = \frac{h_i (n_{i+1} - n_i)}{n_{i+1}\alpha_{i+1} - n_i\alpha_i},
\alpha_{i+1} = \frac{n_i}{n_{i+1}}\,\alpha_i + \frac{h_i (n_{i+1} - n_i)}{n_{i+1} r_i},
h_{i+1} = h_i - \alpha_{i+1} d_i,
\frac{1}{f} = \frac{2}{r_1}\left(1 - \frac{1}{n_2}\right) + \frac{2}{r_2}\left(\frac{1}{n_2} - \frac{1}{n_3}\right).
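The paraxial refraction/transfer recursion and the closed-form power expression above can be cross-checked numerically. A minimal sketch for the symmetric two-glass monocentric lens (the example radii and indices are illustrative placeholders, not a design from the paper):

```python
def monocentric_focal_length(r1, r2, n2, n3):
    """Closed-form focal length: 1/f = (2/r1)(1 - 1/n2) + (2/r2)(1/n2 - 1/n3)."""
    return 1.0 / (2.0 / r1 * (1 - 1 / n2) + 2.0 / r2 * (1 / n2 - 1 / n3))

def paraxial_trace_focal_length(r1, r2, n2, n3):
    """Trace a paraxial ray from infinity through the four concentric surfaces
    using the recursion alpha' = (n/n')alpha + h(n' - n)/(n'r), h' = h - alpha'd."""
    radii = [r1, r2, -r2, -r1]          # symmetric monocentric prescription
    indices = [1.0, n2, n3, n2, 1.0]    # air / outer glass / core / outer glass / air
    gaps = [r1 - r2, 2 * r2, r1 - r2]   # vertex-to-vertex thicknesses
    h, alpha = 1.0, 0.0                 # unit ray height, alpha_1 = 0 (infinity)
    for i in range(4):
        n, n_next = indices[i], indices[i + 1]
        alpha = (n / n_next) * alpha + h * (n_next - n) / (n_next * radii[i])
        if i < 3:
            h -= alpha * gaps[i]        # transfer to the next surface
    return 1.0 / alpha                  # f = h_1 / alpha_final with h_1 = 1

# Both agree: for r1=2, r2=1, n2=1.5, n3=1.8 each returns f = 1.8.
```

The agreement of the surface-by-surface trace with the closed form confirms that the monocentric geometry reduces the four-surface system to the simple two-term power sum.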
B = \frac{1}{2}\sum_{s=1}^{4} h_s \left(\frac{\alpha_{s+1} - \alpha_s}{\frac{1}{n_{s+1}} - \frac{1}{n_s}}\right)^2 \left(\frac{\alpha_{s+1}}{n_{s+1}} - \frac{\alpha_s}{n_s}\right).
W_{040} = \frac{1}{4} B \rho^4 \Big|_{\rho=1} = \frac{1}{8}\sum_{s=1}^{4} h_s \left(\frac{\alpha_{s+1} - \alpha_s}{\frac{1}{n_{s+1}} - \frac{1}{n_s}}\right)^2 \left(\frac{\alpha_{s+1}}{n_{s+1}} - \frac{\alpha_s}{n_s}\right),
r_1 = \frac{h_1 (n_2 - 1)}{n_2 \alpha_2} \quad \text{or} \quad \alpha_2 = \frac{h_1 (n_2 - 1)}{n_2 r_1}.
W_{040} = h_1^4\left(\frac{n_2^2 - 3 n_2 n_3 + n_3^2}{32 f^3 (n_2 - n_3)^2} - \frac{(n_2 - 1)(n_2 - 2 n_3)(n_3 - 1)}{4 n_2^2 (n_2 - n_3)^2 r_1^3} + \frac{(n_2 - 1)^2 (n_2^2 + n_2 n_3 + n_3^2)}{8 f n_2^2 (n_2 - n_3)^2 r_1^2} - \frac{(n_2 - 1)(n_2^2 + n_2 n_3 + n_3^2)}{16 f^2 n_2 (n_2 - n_3)^2 r_1}\right).
L_0 = 2 W_{020} = h_1^2\left(\frac{(n_3 - 1)\left(2 f (1 - n_2) + n_2 r_1\right)}{f (n_2 - n_3) r_1 n_3 v_3} - \frac{(n_2 - 1)\left(2 f (1 - n_3) + n_3 r_1\right)}{f (n_2 - n_3) r_1 n_2 v_2}\right).
\sin(\phi_1) = \frac{h}{r_1}.
\sin(\phi_1') = \frac{h}{r_1 n_2}.
\sin(\phi_2) = \frac{r_1}{r_2}\sin(\phi_1') = \frac{r_1}{r_2}\,\frac{h}{r_1 n_2} = \frac{h}{r_2 n_2}.
\sin(\phi_2') = \frac{n_2}{n_3}\sin(\phi_2) = \frac{n_2}{n_3}\,\frac{h}{r_2 n_2} = \frac{h}{r_2 n_3}.
\sin(\phi_3') = \frac{n_3}{n_2}\sin(\phi_3) = \frac{n_3}{n_2}\,\frac{h}{r_2 n_3} = \frac{h}{r_2 n_2}.
\sin(\phi_4) = \frac{r_2}{r_1}\sin(\phi_3') = \frac{h}{r_1 n_2}.
\sin(\phi_4') = n_2 \sin(\phi_4) = \frac{h}{r_1}.
\phi_4' \equiv \phi_1.
S = \frac{r_1 \sin(\phi_4')}{\sin\left[180^\circ - (180^\circ - \phi_4') - (180^\circ - \phi_1 - \phi_{22} - \phi_{33} - \phi_{44})\right]}
S = \frac{r_1 \sin(\phi_4')}{\sin\left(-180^\circ + 2\phi_1 + \phi_{22} + \phi_{33} + \phi_{44}\right)}.
\phi_1 = \arcsin\left(\frac{h}{r_1}\right).
\phi_{22} = \phi_2 - \phi_1' = \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right),
\phi_{33} = 180^\circ - 2\phi_2' = 180^\circ - 2\arcsin\left(\frac{h}{r_2 n_3}\right),
\phi_{44} = \phi_3' - \phi_4 = \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right).
S = \frac{h}{\sin\left\{2\left[\arcsin\left(\frac{h}{r_1}\right) - \arcsin\left(\frac{h}{r_1 n_2}\right) + \arcsin\left(\frac{h}{r_2 n_2}\right) - \arcsin\left(\frac{h}{r_2 n_3}\right)\right]\right\}}.
\Delta S(h_i) = S(h_i) - f.
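The closed-form ray intercept S(h) above can be evaluated directly. A minimal sketch (radii and indices are illustrative placeholders), which also checks the paraxial limit S(h) → f as h → 0:

```python
import math

def intercept_S(h, r1, r2, n2, n3):
    """Axial intercept of a real meridional ray at input height h for a
    symmetric two-glass monocentric lens, from the closed-form trace."""
    arg = 2 * (math.asin(h / r1) - math.asin(h / (r1 * n2))
               + math.asin(h / (r2 * n2)) - math.asin(h / (r2 * n3)))
    return h / math.sin(arg)

def delta_S(h, f, r1, r2, n2, n3):
    """Longitudinal ray aberration Delta_S(h) = S(h) - f."""
    return intercept_S(h, r1, r2, n2, n3) - f

# Paraxial limit: for r1=2, r2=1, n2=1.5, n3=1.8 (f = 1.8),
# intercept_S(1e-6, ...) approaches 1.8, so delta_S vanishes as h -> 0.
```

Evaluating delta_S at a few marginal heights h_i is exactly what the merit criterion Q below sums over, so this routine is the inner loop of the global design search.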
Q = \sum_{i=1}^{3} \left|\Delta S(h_i, \lambda)\right| + \sum_{j=1}^{3} \sum_{k \neq j} \left|\Delta S(h_j, \lambda) - \Delta S(h_k, \lambda)\right|,
\Delta Y = \frac{\partial W}{\partial \rho}\,\frac{\lambda}{A},
\Delta S(\rho) = \frac{\Delta Y}{A \rho} = \frac{\partial W}{\partial \rho}\,\frac{\lambda}{A^2 \rho},
\Delta S(\rho, \lambda_i)\,\frac{A^2}{\lambda_i} = 4 C_{20} + C_{40}(24\rho^2 - 12) + C_{60}(120\rho^4 - 120\rho^2 + 24) + C_{80}(560\rho^6 - 840\rho^4 + 360\rho^2 - 40).
\sum_{j=1}^{9}\left[\Delta S(\rho_j, \lambda_i)\,\frac{A^2}{\lambda_i} - 4 C_{20}(\lambda_i) - C_{40}(\lambda_i)(24\rho_j^2 - 12) - C_{60}(\lambda_i)(120\rho_j^4 - 120\rho_j^2 + 24) - C_{80}(\lambda_i)(560\rho_j^6 - 840\rho_j^4 + 360\rho_j^2 - 40)\right]^2 = \min.
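The least-squares fit above expands the scaled longitudinal aberration over nine pupil heights in the radial-derivative basis 4, 24ρ² − 12, 120ρ⁴ − 120ρ² + 24, 560ρ⁶ − 840ρ⁴ + 360ρ² − 40. A sketch of this fit using NumPy; the coefficient values are hypothetical, chosen only to show that the fit recovers them:

```python
import numpy as np

def basis(rho):
    """Radial-derivative terms of the expansion, evaluated at pupil heights rho."""
    return np.column_stack([
        4.0 * np.ones_like(rho),
        24 * rho**2 - 12,
        120 * rho**4 - 120 * rho**2 + 24,
        560 * rho**6 - 840 * rho**4 + 360 * rho**2 - 40,
    ])

rho = np.linspace(0.1, 1.0, 9)                 # nine sampled pupil heights
true_c = np.array([0.3, -0.05, 0.01, -0.002])  # hypothetical C20, C40, C60, C80
data = basis(rho) @ true_c                     # synthetic Delta_S * A^2 / lambda samples
fit_c, *_ = np.linalg.lstsq(basis(rho), data, rcond=None)
# fit_c recovers true_c to machine precision for this noiseless example
```

With real ray-trace data the residual of this fit is nonzero, and repeating it at each wavelength λ_i yields the per-wavelength coefficients used in the wavefront-variance ranking below the defocus removal step.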
dS = \frac{4 C_{20}(\lambda_2)\,\lambda_2}{A^2}.
C_{20}^{new}(\lambda_i) = C_{20}(\lambda_i) - \frac{dS\,A^2}{4 \lambda_i}.
(\Delta\Phi)^2 = \overline{\Phi^2} - \left(\overline{\Phi}\right)^2 = \frac{1}{2}\sum_{i=1}^{3}\left\{\frac{\left[C_{20}^{new}(\lambda_i)\right]^2}{3} + \frac{\left[C_{40}(\lambda_i)\right]^2}{5} + \frac{\left[C_{60}(\lambda_i)\right]^2}{7} + \frac{\left[C_{80}(\lambda_i)\right]^2}{9}\right\},
