Abstract

Diffractive optical elements (DOEs) show great promise for imaging optics that are thinner and more lightweight than conventional refractive lenses while preserving their light efficiency. Unfortunately, severe spectral dispersion currently limits the use of DOEs in consumer-level lens design. In this article, we jointly design lightweight diffractive-refractive optics and post-processing algorithms to enable imaging under white light illumination. Using the Fresnel lens as a general platform, we show three phase-plate designs, including a super-thin stacked plate design, a diffractive-refractive-hybrid lens, and a phase coded-aperture lens. Combined with a cross-channel deconvolution algorithm, both spherical and chromatic aberrations are corrected. Experimental results indicate that, using our computational imaging approach, diffractive-refractive optics is a viable candidate for building light-efficient and thin optics for white light imaging.

© 2015 Optical Society of America

1. Introduction

Modern photography demands an efficient light acquisition procedure, in which high-quality lens design is key. Over the past decades, the transition from film-based to digital photography and the steadily increasing processing power of capture devices have enabled many computational imaging applications. This computational imaging technology has, in turn, created a demand for new and flexible lens systems [1]. However, designing an excellent digital imaging device requires the consideration of multiple factors, such as light acquisition efficiency, material cost, size, design flexibility, and computational complexity. In modern photography, a common approach to handling the imperfections of optical systems is to remove the remaining optical aberrations in a post-capture deconvolution step.

1.1. Related work

Optical aberration and lens design space

Optical systems suffer from both monochromatic and chromatic aberrations [2], which cause unwanted blurring of the image. Conventional lens design attempts to minimize various aberrations by designing increasingly complex lens structures that consist of a large number of elements [3]. This involves designing aspherical surfaces as well as finding materials with better optical properties. Lens design is therefore a compromise among various optical evaluation criteria [4]. From another perspective, computational imaging with a simple lens [5] offers the possibility of using post-processing computation to simplify modern lens design. We envision this combination to be one of the major future directions in the photography industry, with different criteria, e.g., design footprint, light efficiency, and computational efficiency, traded off against each other.

Diffractive optical elements imaging

Diffractive optical elements (DOEs) operate by means of interference and diffraction to produce arbitrary distributions of light [6, 7]. DOEs have been investigated intensively in the optics community, for applications such as extending the depth of field in microscopes [8], or realizing an optimal phase-only filter attached to a lens to obtain a desired optical transfer function [9]. DOEs have three main advantages over conventional refractive lenses: first, they can be fabricated on a thin sheet; second, a single DOE can perform many optical operations simultaneously, making it a promising light modulation platform; third, because of their relatively flat structure, diffractive optics reduce distortions and therefore exhibit much more uniform PSF distributions across the image plane. The last point is particularly beneficial for a post-capture deconvolution step.

The main drawback of DOEs is their tendency to produce severe chromatic aberration in broadband light. Only a limited amount of work considers applying DOEs to consumer-level cameras [10, 11]. One application uses multi-layer DOEs to correct chromatic aberrations in conventional refractive lens systems [12], e.g., in Canon long-focus lenses.

However, diffractive phase modulation is promising in computational imaging [13, 14]. Two related concepts have been investigated: lensless computational sensors [15, 16]; and Fresnel lens imaging with post-processing [17]. The former proposes a novel imaging architecture that integrates the designed phase grating or functional micro-lens array on a CMOS, providing a thin structure with medium image resolution. The latter heuristically investigates the utility of a single Fresnel lens coupled with a gradient transfer post-processing to motivate inexpensive computer vision devices.

Deconvolution for aberration correction

From a computational perspective, one can reconstruct the sharp image through a deconvolution step. The core principle is to accurately model the image formation and to apply statistical priors to deal with optical aberrations [18–21].

With respect to chromatic aberration correction, the method in [22] measures chromatic aberrations at edges through color differences and compensates them locally. Although full PSFs are estimated, they are only used to remove chromatic aberrations, where a rough knowledge of the PSF is sufficient. Cross-channel optimization was investigated by [5]. Compared to the previous methods, this method exploits the correlation between gradients in different channels, which provides better localization and improves the image quality.

1.2. Motivation and contribution

Our work systematically considers the trade-offs in a design space spanning optical design to deconvolution design. Instead of relying purely on refractive optics, we seek an alternative, diffractive phase modulation [23], to benefit modern lens design. On the one hand, we can obtain thinner structures, lower material cost, and better off-axis imaging performance than conventional refractive lenses; on the other hand, we can achieve better chromatic performance than a single DOE. We focus on diffractive-refractive optics designed jointly with computation to form a complete pipeline, as shown in Fig. 1. Our method captures a blurry intermediate image through a super-thin phase-modulated DOE, and then recovers the sharp image with deconvolution algorithms using cross-channel regularization. Consequently, compact and lightweight broadband cameras can be designed. Our results indicate that the algorithm corrects the residual aberrations while preserving detail. We believe this is of considerable significance for future research and industrial designs.

 

Fig. 1 Compared to the imaging pipeline of conventional imaging with well-corrected complex refractive lenses (top), our diffractive-refractive computational imaging captures a blurry intermediate image through a super-thin phase-modulated DOE, and then recovers the sharp image with deconvolution algorithms using cross-channel regularizations. An exemplary blurry image (bottom left) and its corresponding recovered image (bottom right) are shown. Insets of the images indicate that our algorithm corrects the residual aberrations and preserves the details.


The main technical contributions are threefold. First, we propose a white light computational photography method using diffractive-refractive optics to design lenses with thin structures at a controllable cost. Second, we analyze how diffractive phase modulation offers considerable benefits for broadband imaging by allowing a flexible compromise between diverse efficiencies in different application scenarios, covering a super-thin diffractive optical lens, a diffractive-refractive-hybrid lens, and a phase coded-aperture lens. Third, we employ a fast yet robust deconvolution to correct the chromatic aberrations introduced by the diffractive phase modulation.

2. Diffractive-refractive optics imaging

The goal is to provide alternative options to the conventionally complex lens designs in modern photography by implementing diffractive phase modulation on a thin substrate.

2.1. Optical characteristics of Fresnel lenses

Before building the imaging model, two optical characteristics of Fresnel zone plates (FZPs), multi-order diffraction and wavelength sensitivity, need to be addressed.

Multi-order diffraction

Multi-order diffraction is the main reason for the low light efficiency of amplitude-type DOEs. Because the incident light is distributed over a series of diffraction orders, as shown in Fig. 2, the diffraction efficiency is usually very low. Ideally, if a continuous profile could be fabricated in a phase-only DOE, the diffraction efficiency would be 100%. In practice, continuous profiles are difficult to fabricate and are instead approximated by multi-level binary optical elements. We use multi-level phase zone plates (PZPs) in our implementation for high light efficiency. For an N-level DOE, the diffraction efficiency [24] is given by

\eta_m^N = \left\{ \frac{\sin\!\left[\pi\!\left(\frac{(n-1)d}{\lambda}-m\right)\right]}{\pi\!\left(\frac{(n-1)d}{\lambda}-m\right)} \right\}^2 \left\{ \frac{\sin\!\left[\frac{\pi (n-1)d}{\lambda N}\right]}{\frac{\pi (n-1)d}{\lambda N}} \right\}^2, \quad (1)
where λ is the designed principal wavelength, m is the diffraction order, d is the total height of the micro-structures on the substrate, and n is the refractive index at the design wavelength. With as many levels as one can fabricate, a discrete PZP can approximate a continuous Fresnel lens with sufficiently high diffraction efficiency. A realistic DOE design has 16 levels, in which case the diffraction efficiency is 99%; this number drops to 95% with 8 levels.
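As a numerical sanity check of Eq. (1), the efficiency of an N-level PZP at the design wavelength can be evaluated directly. This is a minimal sketch; the refractive index n = 1.5 and the full-wave design depth d = λ/(n − 1) are illustrative assumptions, not values from the text:

```python
import math

def diffraction_efficiency(n_levels, m=1, n_index=1.5, lam=550e-9, d=None):
    """Evaluate Eq. (1): detuning sinc factor times N-level quantization factor."""
    if d is None:
        d = lam / (n_index - 1.0)  # assumed design depth: one full wave of phase delay
    def sinc(x):  # sin(pi x) / (pi x)
        return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)
    phase_waves = (n_index - 1.0) * d / lam
    return sinc(phase_waves - m) ** 2 * sinc(phase_waves / n_levels) ** 2

print(round(diffraction_efficiency(16), 3))  # ~0.99 for a 16-level profile
print(round(diffraction_efficiency(8), 3))   # ~0.95 for an 8-level profile
```

These values reproduce the 99% / 95% figures quoted above for 16- and 8-level designs.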

 

Fig. 2 A binary amplitude Fresnel zone plate (top-left) diffracts light into multiple orders (0th, ±1st, ±3rd,…), resulting in a low diffraction efficiency. With a continuous phase-only diffractive lens (top-right), the fraction of light falling into the +1st order could theoretically reach 100%. In practice, multi-level microstructures (bottom) are comparatively easy to fabricate and approximate the continuous profile very well.


Wavelength-dependent imaging

PZPs are wavelength-dependent DOEs. When applied to broadband imaging, especially under visible white light illumination, a PZP can be focused for the green wavelength on the image plane, but red wavelengths then focus in front of the image plane while blue wavelengths focus behind it. This difference in focal length across wavelengths is the main drawback of any DOE. The equivalent Abbe number of a PZP is −3.45 [7]. It differs significantly from refractive lenses (which usually have a large positive Abbe number) in both the chromatic distribution and its magnitude, leading to a strong negative chromatic aberration.
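The focal shift described above can be sketched numerically: a zone plate's focal length scales as f(λ) = f0·λ0/λ, and the equivalent Abbe number of any thin diffractive lens follows from the standard Fraunhofer d, F, and C lines. The f0 = 50mm value is one of our design focal lengths; the rest is textbook diffractive optics, not results from this article:

```python
# Equivalent Abbe number of a thin diffractive lens: V = lam_d / (lam_F - lam_C).
lam_d, lam_F, lam_C = 587.6e-9, 486.1e-9, 656.3e-9
V_diffractive = lam_d / (lam_F - lam_C)
print(round(V_diffractive, 2))  # -3.45: strongly dispersive, opposite in sign to glass

# Focal length of a zone plate designed for f0 = 50 mm at 550 nm:
f0, lam0 = 50e-3, 550e-9
for lam in (450e-9, 550e-9, 650e-9):         # blue, green, red
    print(round(f0 * lam0 / lam * 1e3, 1))   # 61.1, 50.0, 42.3 (mm)
```

Consistent with the text, red light (shorter focal length) comes to focus in front of the green image plane, and blue light behind it.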

2.2. Imaging pipeline

As indicated in Fig. 1, the pipeline of our proposed computational imaging comprises an optical design part and a subsequent optimization procedure. The optical design aims at pre-defining a Point Spread Function (PSF) with the expected energy distribution in the spatial and spectral domains. The PSF can be obtained from scalar diffraction theory [23] as the Fourier transform of the complex transmission function of a lens. The resulting PZPs are fabricated by state-of-the-art photolithography techniques. Our optimization is designed jointly with the optical design; it corrects the residual chromatic aberrations introduced by the use of DOEs. We will demonstrate that combining the optical design and algorithm design in this way yields image quality comparable to conventional refractive optics.

3. Optical designs

We carry out a study of three designs in the PZP form to make reasonable compromises among the diverse dimensions of the design space mentioned previously: super-thin diffractive optics, a diffractive-refractive-hybrid lens, and a phase-coded-aperture lens. We fix the two focal lengths at 50mm and 25mm for each design, with the same aperture diameter of 12.7mm for all designs. The design parameters and comparisons are illustrated in Fig. 3.

 

Fig. 3 Schematic comparisons of a super-thin diffractive optics (top), a diffractive-refractive-hybrid lens (middle), and a simple refractive lens (bottom) for f1 = 50mm (left) and f2 = 25mm (right). The R,G,B colors represent the focal powers for different wavelengths. “+” indicates the focal length for red is longer than that of blue, and “−” for the other way round.


Super-thin diffractive optics

First, we use a single PZP or stacked multiple PZPs as a lens, whose maximum thickness can be as little as 1mm. We make the following design comparison to illustrate the benefits of the super-thin DOE (see the top row in Fig. 3): while maintaining the same optical power, the phase-plate design is much thinner and more lightweight than the refractive lens. The reduction of thickness and volume becomes crucial when designing a lens with a larger aperture but a shorter focal length.

Although a multi-level PZP achieves high light efficiency and high material efficiency, its aberration performance is generally not satisfactory for white light imaging, especially with respect to the (negative) chromatic aberration, but also some residual spherical aberration. These aberrations increase the computational complexity of the reconstruction step and limit its efficacy.

Diffractive-refractive-hybrid lens

Instead of shifting all chromatic aberrations to the computation side, we design a diffractive-refractive-hybrid lens. Specifically, a simple lens is attached to the above super-thin DOE design, therefore, the huge negative chromatic gap is partially compensated for by the positive chromatic aberration of the refractive lens (see the middle row in Fig. 3).

Note that the idea is not to design an achromatic doublet, such as the ones in [12,25]. Instead, we seek a good compromise between obtaining a better PSF distribution for post-processing and maintaining a thin structure. The attached simple lens can be a cheap off-the-shelf lens; our designed PZP still provides the majority of the focusing power. While maintaining the same optical power and keeping a compact structure, one achieves an optical performance between that of a pure phase plate and that of a simple lens. Moreover, compared with using a simple lens only, the hybrid design helps weaken the negative influence of other aberrations.

Phase-coded-aperture lens

Apart from its super-thin structure, a DOE has the advantage of design freedom: one can realize the desired functions on a single DOE, rather than applying a physical coded aperture or multiple-capture techniques to a refractive lens. By modulating the energy distribution of a PSF, and thereby discretizing the information carried by the light rays, one gains additional benefits for computational processing [19]. Conventionally, a designed binary mask is placed at the aperture plane to selectively block light rays. In contrast, we trade blocking light rays for encoding the phase distribution of a PZP. Through interference and diffraction at the sub-structures of the PZP, the expected PSF shaping is obtained, and during this procedure the total light energy is almost preserved.

According to the above analysis, we designed two 16-level DOEs, as shown in Fig. 4. Referring to Eq. (1), a 16-level profile already provides satisfactory light efficiency in common lighting environments. All DOEs are designed for the principal wavelength 550nm; therefore, the red and blue channels suffer from severe blurring with considerable noise. Note that in the figure, the gray scales indicate the discrete phase distributions in the central area. By making use of an anti-symmetric phase distribution (see Fig. 4 on the right), we divide the 2D circular plane into 6 sectors, each of which has a point-symmetric counterpart with the phase shifted by π. The original Gaussian-like PSF can then be encoded into a three-spike star PSF.
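A toy version of such an anti-symmetric sector coding can be written down directly. This is a hedged sketch: the quadratic base profile and the specific quantization are illustrative stand-ins, not our fabricated design:

```python
import numpy as np

def coded_phase(x, y, levels=16):
    """Toy 16-level phase at aperture point (x, y): 6 sectors of 60 degrees,
    with every other sector shifted by pi, so that point-symmetric sectors
    (k and k+3) always differ by a pi phase shift."""
    r = np.hypot(x, y)
    if r > 1.0:
        return 0.0                               # outside the circular aperture
    theta = np.mod(np.arctan2(y, x), 2 * np.pi)
    sector = int(theta // (np.pi / 3)) % 6
    p = np.mod(50.0 * r**2, 2 * np.pi)           # toy Fresnel-like base profile
    if sector % 2 == 1:
        p = np.mod(p + np.pi, 2 * np.pi)         # anti-symmetric counterpart
    return np.floor(p / (2 * np.pi) * levels) * (2 * np.pi / levels)

p1 = coded_phase(0.5, 0.1)    # a point in sector 0
p2 = coded_phase(-0.5, -0.1)  # its point-symmetric counterpart in sector 3
print(np.isclose(np.mod(p2 - p1, 2 * np.pi), np.pi))  # True: pi offset survives quantization
```

Note that π corresponds to exactly 8 of the 16 quantization levels, so the anti-symmetry is preserved after discretization.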

 

Fig. 4 Left: Microscopic images of our PZP and phase-coded-aperture lens on a Nikon Eclipse L200N 5X microscope. Right: Captured real-color PSFs are shown with their individual RGB components. Note that the axes in the plots denote the pixel domain, and the colorbars indicate the relative intensity.


4. Image reconstruction

In this section, we analyze the post-capture computation to recover the images captured by the presented diffractive-refractive optical designs.

4.1. Imaging model

The defocus effect can be modeled as the intrinsic sharp image convolved with a blur kernel. In traditional refractive optics imaging, the blurred observation model in vector-matrix form is formulated as

b = Ki + n, \quad (2)
where b ∈ ℝ^m, K ∈ ℝ^{m×m}, i ∈ ℝ^m, and n ∈ ℝ^m are the captured blurry image, the convolution matrix, the sharp latent image, and the noise in the capture, respectively. The matrix K can be derived from the blur kernel in a given configuration.

With respect to diffractive phase modulation, the above imaging model becomes wavelength dependent. The blur kernels P_c for the RGB color channels c ∈ {1,2,3} are obtained as integrals of the PSFs over the respective spectral ranges λ_c:

P_c = \int_{\lambda_c} P(\lambda) \, d\lambda, \quad (3)

where λ is the wavelength of the incident light. Thus, in general, the PSF depends on the spectral content of the scene. A pre-calibrated PSF can therefore only reduce aberrations to a certain level, but never completely remove them.
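The per-channel model can be sketched with a toy simulation in which the green kernel is narrow and the red/blue kernels (the wavelength-integrated P_c) are wide. The Gaussian kernels and their widths are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
H = W = 64
yy, xx = np.mgrid[-H//2:H//2, -W//2:W//2]

def gaussian_psf(sigma):
    p = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return p / p.sum()

def blur_channel(img, psf, noise_sigma=0.01):
    """One channel of the model b_c = K_c i + n, via circular FFT convolution."""
    B = np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(B)) + noise_sigma * rng.standard_normal(img.shape)

img = np.zeros((H, W)); img[24:40, 24:40] = 1.0        # toy latent image
sigmas = {"R": 4.0, "G": 0.8, "B": 5.0}                 # hypothetical kernel widths
b = {c: blur_channel(img, gaussian_psf(s)) for c, s in sigmas.items()}

# The nearly focused green channel keeps much stronger edges than red/blue:
edge = {c: np.abs(np.diff(v, axis=1)).max() for c, v in b.items()}
print(edge["G"] > edge["R"] and edge["G"] > edge["B"])  # True
```

This asymmetry between one sharp channel and two heavily blurred ones is exactly the structure the cross-channel prior in the next section exploits.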

4.2. Optimization method

We next describe the image reconstruction as an optimization solving the inverse problem of Eq. (2). Our approach closely follows the recent work by Heide et al. [26], with improvements to exploit the specific structure of chromatic aberrations in DOE optics, as well as improvements to computational efficiency.

Heide et al.’s method achieves state-of-the-art deblurring results by using Bayesian optimization that exploits patch-based correlation and cross-channel gradient correlation in the unknown latent image. The method can be cast as the following optimization problem:

i_c = \arg\min_{i_c} \; \frac{\mu_c}{2} \left\| b_c - K_c i_c \right\|_2^2 + \Gamma(i_c), \quad (4)
where the first term is a standard least-squares data-fitting term, μ_c is its weight for channel c, and the second term Γ(i_c) is a collection of priors that regularize the reconstruction. The regularization terms are taken from [26] and are briefly summarized in the following.

Cross-channel prior

Diffractive optics tend to produce good image quality for a single design wavelength. In white light applications, chromatic aberration limits the overall image quality, but one channel of the RGB image can usually be focused quite well and is significantly sharper than the other channels. Heide et al. [5] proposed a cross-channel prior that we modify to take advantage of this fact. The underlying idea is that, in real-world images, edges in different color channels appear at the same locations. Furthermore, normalizing by the channel intensity allows luma gradients but discourages chroma gradients. For a pair of channels l, m the cross-channel prior is

\nabla i_l \cdot i_m \approx \nabla i_m \cdot i_l \;\; \Leftrightarrow \;\; \nabla i_l / i_l \approx \nabla i_m / i_m.

Therefore, we have the regularizer term rewritten as

\Gamma(i_c) = \sum_{m \neq l} \left\| D i_m \odot i_l - D i_l \odot i_m \right\|_1, \quad (5)
where i_l and i_m stand for the channel images convolved with their respective blur kernels, ⊙ denotes element-wise multiplication, and the l1 norm is used as before. The convolution matrices D ∈ {D_x, D_y} implement the first derivatives in the x and y directions of the image.
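The regularizer above can be evaluated with plain finite differences. A minimal sketch, in which the forward-difference operators and the single channel pair (l, m) are illustrative choices:

```python
import numpy as np

def dx(i): return np.diff(i, axis=1, append=i[:, -1:])   # forward difference in x
def dy(i): return np.diff(i, axis=0, append=i[-1:, :])   # forward difference in y

def cross_channel_penalty(i_l, i_m):
    """l1 norm of D i_m * i_l - D i_l * i_m for both derivative directions."""
    return (np.abs(dx(i_m) * i_l - dx(i_l) * i_m).sum()
            + np.abs(dy(i_m) * i_l - dy(i_l) * i_m).sum())

# Channels that are scaled copies of each other (pure luma gradients) cost nothing:
base = np.outer(np.linspace(0.1, 1.0, 32), np.linspace(0.1, 1.0, 32))
print(np.isclose(cross_channel_penalty(base, 0.5 * base), 0.0))  # True

# An edge present in one channel but not the other is penalized:
shifted = base.copy(); shifted[:, 16:] += 0.5
print(cross_channel_penalty(base, shifted) > 1.0)                # True
```

This reflects the intent stated above: luma gradients are free, while mismatched (chroma) gradients, the signature of chromatic aberration, are penalized.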

Patch-based prior

A DOE suffers from more strongly aberrated blur kernels than those measured for a simple refractive lens, so our reconstructions may be sensitive to noise and extreme chromatic aberrations. We use a slight modification of the patch-based BM3D prior [27], described in the context of the optimization framework in [26]. To exploit even more cross-channel information, we perform multiple BM3D passes with block-matching and 3D thresholding in different color spaces, e.g., the OPP and YUV color spaces in our work. Using different color spaces has the effect that distances between colors are weighted differently according to the color space itself. This affects both the similarity-based matching and the 3D thresholding of the BM3D method, which gives us significant improvements.

4.3. Analysis on efficient optimization

Inspecting Eq. (4) again, the careful reader will notice that we include only the blur matrix K_c in the data term, without the demosaicking mask proposed in [26]. This has the benefit that we can solve Eq. (4) efficiently in the Fourier domain, whereas a spatial-domain solver (e.g., based on conjugate gradients) would add significant computational overhead. In general, results can be dramatically improved by jointly solving demosaicking and deblurring. In our specific case, however, the PSFs may extend over more than a hundred pixels; sequential demosaicking barely changes the results, but drastically increases the speed. We also found that only a small number of iterations of the outer loop are needed to converge to good image quality. Despite these performance shortcuts, our method is effective at recovering good-quality images from very challenging input data: the PSFs in the two blurry channels are very large, and classical methods fail here. For the data with the largest PSFs, corresponding to a design with a small F-number, we employ a multi-scale deconvolution scheme [21] to further speed up the processing.
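The Fourier-domain benefit can be illustrated with the closed-form minimizer of a Tikhonov-regularized version of the data term in Eq. (4). This is a simplified stand-in for the full solver of [26]; the Gaussian PSF and the regularization weight are arbitrary toy choices:

```python
import numpy as np

def fourier_data_solve(b, psf, reg=1e-3):
    """Closed-form minimizer of ||b - K i||^2 + reg * ||i||^2, computed in the
    Fourier domain: one element-wise solve per channel instead of an iterative
    spatial-domain method such as conjugate gradients."""
    K = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.conj(K) * np.fft.fft2(b) / (np.abs(K)**2 + reg)))

# Toy round trip: blur a box image with a Gaussian PSF, then solve.
H = W = 64
img = np.zeros((H, W)); img[20:44, 28:36] = 1.0
yy, xx = np.mgrid[-H//2:H//2, -W//2:W//2]
psf = np.exp(-(xx**2 + yy**2) / (2.0 * 3.0**2)); psf /= psf.sum()
b = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))

rec = fourier_data_solve(b, psf)
print(np.linalg.norm(rec - img) < np.linalg.norm(b - img))  # True: residual shrinks
```

In the full method, this cheap frequency-domain solve is embedded in the outer loop that alternates with the cross-channel and BM3D prior updates.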

5. Results

In this section, we show a number of experimental results obtained with the above designs. Some of the figures shown are cropped regions that highlight the details.

5.1. Prototypes

For the fabrication of DOEs, either continuous or discrete surface profiles can be chosen, depending on the manufacturing method adopted [28,29]. We build the following prototypes by repeatedly performing photolithography and reactive ion etching. The feature size of the fabrication is 1μm, which is the minimum size on the edges in all our designs. To simplify the fabrication, we discretize the profile into 16 levels. The diameter of all the lenses is 8.0mm. We use off-the-shelf Thorlabs lens tubes and plano-convex lenses to build all the prototypes, and a commercial scientific camera (PointGrey GS3-U3-50S5C) for all the experiments. The pixel pitch of the sensor is 3.45μm; it has a full well capacity of 6402 e− and outputs 14-bit images with a maximum SNR of 38dB. We measured the noise level under the same illumination and exposure settings as in Fig. 5: with no gain added in the sensor settings and an exposure time of 100ms, the characterized sensor noise has a variance of 8.233 × 10−5. All results in the experiments are based on a single exposure image.

 

Fig. 5 Top of (b–f): The same scene captured by our single PZP (f = 100mm), stacked PZP (f = 50mm), hybrid coded-aperture PZP and refractive lens (f = 50mm), hybrid PZP and refractive lens (f = 50mm), and a simple lens (f = 50mm) respectively. Middle of (b–f): Corresponding deconvolved images for each lens. Bottom of (b–f): R, G and B components of the PSFs for each setup. The scene is projected onto a board by a projector. Note that we only show the same cropped area for cross comparison here. The ground truth image captured by a Canon DSLR is presented for reference (a-2), with the PSNR plot provided (a-1) corresponding to the designs (c–f).


Super-thin DOE optics

We prioritize the super-thin and lightweight design goals, and therefore tolerate a certain amount of aberrations in hardware. A single 16-level PZP (f = 100mm, D = 8mm) can be used exclusively as an imaging lens with a thickness of only 0.5mm. Moreover, two PZPs can be stacked to make an f = 50mm doublet lens with a thickness of only 1mm. Figures 5(b) and 5(c) present the results of the single and stacked DOEs. We see that there is some light efficiency loss due to fabrication errors in our prototype, reflected by a slight drop in image contrast. However, the deconvolved results are still useful in scenarios where high-quality imaging is not strictly required, but lens volume and thickness matter significantly.

Diffractive-refractive-hybrid lens

For the prototype of the hybrid design, one 16-level PZP (f = 100mm, D = 8mm) is attached to the back side of a plano-convex lens (f = 100mm) to make a group lens with f = 50mm and an optics thickness of less than 2.6mm. Figures 5(e) and 5(f) present the results of the diffractive-refractive-hybrid lens and the conventional refractive lens. We see that the recovered image is comparable to the one recovered from a simple lens, and even slightly better in off-axis regions. Compared to pure DOE designs, the large negative chromatic gap has been mitigated to a level that the cross-channel deconvolution can handle. Compared to a single refractive lens, the off-axis spherical aberration behavior is better, with the thickness slightly reduced.

Phase-coded-aperture lens

We fabricate a 6-sector phase plate with anti-symmetric structures. As with the above prototypes, we attach a refractive lens for experimental convenience, yielding an effective f = 50mm with aperture size D = 8mm. Figure 5(d) presents the result of the phase-coded-aperture lens. We expect that more latent information is preserved due to the shaping of the PSFs, especially in the red and blue channels, which benefits the deconvolution procedure. Note that we implement only one possible coded-aperture design here.

In Fig. 5, we compare the performance of the three designs for the same house scene, which was projected onto a board. We show a ground truth image captured by a Canon DSLR in the bottom left for reference. The RGB components of the corresponding PSFs are shown in the bottom row to demonstrate the individual characteristics of our designs. We use PSNR to present the benefits and downsides of our designs intuitively. Despite some color reproduction artifacts, our designs generate detailed, sharp images that are visually comparable to the refractive simple lens results. The hybrid designs (c) and (d) indeed help to improve the image quality. These comparisons support our intended trade-off between image quality and lightweight design.

5.2. Experimental results

We then present several pairs of selected deconvolved results of natural scenes to validate the applicability of the cross-channel optimization; see Fig. 6, Fig. 7, Fig. 8, and Fig. 9 for details. In each result, we show the captured blurry image on the left and the deconvolved sharp image on the right. Insets of the cropped images show that our algorithm recovers the spatial details well.

 

Fig. 6 Top: Images captured and deconvolved for a text scene with a single PZP (f = 100mm). The teapot and white text portions show the ability of our algorithm to remove most of the chromatic aberrations. Bottom: Captured and deconvolved images for a flower and book scene with stacked PZPs (f = 50mm). The yellow and green objects are well recovered in the same scene. Note that both scenes are at the same distance from the lenses; the fields of view differ because of the focal lengths.


 

Fig. 7 Images captured (a) and deconvolved (b) for a box scene with our diffractive-refractive-hybrid lens (f = 50mm). By introducing a refractive lens, chromatic aberrations are smaller than in the single PZP and stacked PZP cases in Fig. 6. The inverse problem becomes better conditioned and sharper images can be recovered.


 

Fig. 8 Images captured (a) and deconvolved (b) for a ball scene with our phase-coded-aperture hybrid lens (f = 50mm). By coding the intensity distribution of the PSF, high frequency features as well as colors can be preserved better. Note that the net around the ball is recovered which otherwise would hardly be seen in the blurry image.


 

Fig. 9 Images captured (top) and deconvolved (bottom) for the ISO12233 resolution chart with our three designs: (from left to right) hybrid PZP and lens, hybrid phase-coded-aperture PZP and lens, and stacked PZPs. The focal lengths are f = 50mm for each design.


We use non-blind deconvolution, where the spatially variant PSFs are calibrated by deconvolving a series of random patterns or through optical simulations. The PSF estimation procedure is described in detail in [5]. All experiments are performed on a single core of an Intel i7 CPU at 2.40 GHz with 8GB RAM. The main computational cost is the cross-channel optimization that deblurs the image and removes chromatic aberrations. Typically, an image of around 2 megapixels with a moderate blur kernel of around 100 pixels needs less than 1 minute for an acceptable recovery with the cross-channel prior, and 3 more minutes with one BM3D pass. As mentioned previously, we apply the two priors jointly in the deconvolution procedure.

Spatial behavior of the PSF

A DOE exhibits lower geometric distortion than a simple lens, as illustrated in Fig. 10. The monochromatic blur is therefore closer to the spatially invariant assumption, which matters because PSF estimation over a large field of view is impractical. This is a main advantage of introducing diffractive-refractive optics into the deconvolution problem. Note that the relatively lower contrast of the latter two lenses is partially attributable to the light efficiency issues of our fabricated prototypes.

 

Fig. 10 Off-axis performance comparison for a simple refractive lens (left), our hybrid PZP lens (center), and stacked PZPs (right). The first row presents the captured images of a black-white checkerboard, while the close-ups illustrate the on-axis and off-axis patches in their green-channel, red-channel, and blue-channel visualizations. From the comparison of patch A and patch B, we observe that our designs show better spatial uniformity, despite the chromatic aberration. All focal lengths are f = 50mm and the full field of view is around 30°.


6. Discussion

Acquisition efficiency

We jointly consider three aspects regarding the efficiency requirements of consumer-level capture devices: hardware structure (reflected by optical thickness and material volume cost), light transport (reflected by the NA number), and pre-conditioning for post-processing (evaluated by the PSF distribution).

The introduced hybrid design helps to maintain a reasonably compromised effective F-number, in our case around F6. With respect to hardware structure, all DOEs we fabricated have a thickness of 0.5mm. The fabrication technique may appear costly at the outset for a research lab; however, once a mask has been fabricated, mass manufacturing can proceed smoothly. The substrate wafer is made from normal glass, so diffractive-refractive optics is more material efficient than modern refractive lenses, which usually require expensive glasses.

Image reconstruction robustness

The added cross-channel prior does not by itself guarantee convergence to a global optimum for all channels. In most cases, however, choosing an appropriate solver and appropriate parameters still gives converged results.

Our design exhibits a robust reconstruction under a certain amount of depth variation, as illustrated in Fig. 11. For the outdoor scene with around 2m – 8m depth variation, we see that most chromatic aberrations have been removed, with the exception of small artifacts in the red component at the background depth. We regard this as a limitation in our experiment due to insufficient PSF estimation. Further, similar to the DOF-extension potential in [30], DOEs focus different wavelengths at different depths. Therefore, by carefully selecting the weights of the cross-channel term for each spatial patch, one can recover an image with extended DOF.

 

Fig. 11 Images captured (left) and deconvolved (right) for an outdoor scene with our hybrid PZP lens (f = 50mm). The far and near objects vary from around 8m (house) to around 2m (tree) from the lens.


Limitations and future work

Several essential limitations cannot be ignored when applying diffractive phase modulation. First, the number of levels of a DOE that can be manufactured in practice depends on its aperture size and required focal power. Generally, phase modulation designs with discrete profiles have lower diffraction efficiency than conventional refractive lenses, especially when operated in harmonic diffraction orders [31]. Fortunately, the ease of building larger apertures without increasing thickness can compensate for the efficiency loss. The photolithography process used for our prototypes limits the feature size in the DOEs to 1 μm. This directly affects the quality of the DOE, as well as the ability to scale the optical system down to smaller cameras and sensors. Moving to an e-beam lithography process would alleviate this limitation.
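The efficiency penalty of level quantization can be made concrete with the standard scalar-theory estimate (cf. Swanson [24]): an N-level staircase approximation of an ideal blazed profile diffracts a fraction sinc²(1/N) of the light into the design order. The values below are the textbook ones, not measurements of our prototypes.

```python
import numpy as np

def multilevel_efficiency(n_levels):
    """First-order diffraction efficiency of an N-level quantized phase
    profile under scalar theory: eta = [sin(pi/N) / (pi/N)]^2."""
    x = np.pi / n_levels
    return (np.sin(x) / x) ** 2

# Efficiency rises quickly with the number of phase levels:
# 2 levels ~41%, 4 levels ~81%, 8 levels ~95%, 16 levels ~99%.
for n in (2, 4, 8, 16):
    print(f"N={n:2d}: eta = {multilevel_efficiency(n):.3f}")
```

This is why even a modest increase in the number of levels (subject to the lithographic feature-size limit) recovers most of the light lost to quantization.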

The chromatic performance is currently not comparable to that of a carefully designed compound lens. Although the cross-channel prior removes most chromatic aberrations resulting from the phase modulation, color recovery does not entirely match the ground truth, owing to a possible mismatch between the color spectrum of the captured data and the estimated PSFs. In the deconvolution results above, we still observe a fair amount of artifacts, including color ringing at edges and non-uniformity in regions with little high-frequency content. A color calibration step could be added in post-processing, though it is essentially scene-dependent.

A general design framework to guide users in trading off efficiency from multiple perspectives remains to be investigated. Numerical optimization can be introduced into the design procedure as well as into post-processing. We envision that a general synthesis framework would increase the efficiency of designing different patterns.

The current design is not intended to replace high-end DSLR lenses, but to provide alternative options for camera designs with special form-factor requirements. Considering the imaging performance of Fresnel lens designs, we regard one promising application to be the capture lens of a modern mobile device or portable virtual reality device. In this market, thickness and volume outweigh the other factors; for current high-end mobile devices, even a 0.5 mm reduction in thickness can be a breakthrough.

7. Conclusion

In this paper, we propose a novel computational camera that bridges diffractive-refractive optics and computational algorithms to build super-thin, compact, and lightweight imaging systems. Three prototype designs, using a combination of custom nano-fabricated DOEs and off-the-shelf products, are implemented to validate our concept. The hybrid designs exhibit smaller chromatic aberration than pure DOEs, while offering a thinner and lighter structure and better off-axis performance than a refractive lens. We employ a fast image deconvolution with a cross-channel BM3D prior to tackle chromatic aberration and noise jointly.

By combining diffractive-refractive optics and computational algorithms, modern imaging devices can be designed toward larger apertures, thinner structures, and more flexible designs while maintaining visually pleasing imaging performance. Overall, our approach provides an alternative technical solution for designing and fabricating fashionably thin lenses, and we envision this concept as a trend for future commercial cameras.

Acknowledgments

This work was in part supported by King Abdullah University of Science and Technology (KAUST) baseline funding, the KAUST Advanced Nanofabrication Imaging and Characterization Core Lab, as well as a 4 Year Fellowship from the University of British Columbia (UBC).

References and links

1. S. W. Hasinoff and K. N. Kutulakos, “Light-efficient photography,” IEEE Trans. Pattern Analysis and Machine Intelligence 33, 2203–2214 (2011). [CrossRef]  

2. G. R. Fowles, Introduction to Modern Optics (Courier Dover Publications, 2012).

3. G. G. Sliusarev, Aberration and Optical Design Theory (Adam Hilger, 1984).

4. D. Malacara-Hernández and Z. Malacara-Hernández, Handbook of Optical Design (CRC, 2013).

5. F. Heide, M. Rouf, M. B. Hullin, B. Labitzke, W. Heidrich, and A. Kolb, “High-quality computational imaging through simple lenses,” ACM Trans. Graphics 32, 149 (2013). [CrossRef]  

6. S. Bernet, W. Harm, and M. Ritsch-Marte, “Demonstration of focus-tunable diffractive moiré-lenses,” Opt. Express 21, 6955–6966 (2013). [CrossRef]   [PubMed]  

7. D. C. O’Shea, T. J. Suleski, A. D. Kathman, and D. W. Prather, Diffractive Optics: Design, Fabrication, and Test, vol. 62 (SPIE, 2004).

8. S. Tucker, W. T. Cathey, E. Dowski, et al., “Extended depth of field and aberration control for inexpensive digital microscope systems,” Opt. Express 4, 467–474 (1999). [CrossRef]   [PubMed]  

9. D. Elkind, Z. Zalevsky, U. Levy, and D. Mendlovic, “Optical transfer function shaping and depth of focus by using a phase only filter,” Appl. Opt. 42, 1925–1931 (2003). [CrossRef]   [PubMed]  

10. P. Trouve, F. Champagnat, G. L. Besnerais, G. Druart, and J. Idier, “Design of a chromatic 3d camera with an end-to-end performance model approach,” in “Computer Vision and Pattern Recognition Workshops (CVPRW), 2013 IEEE Conference on,” (IEEE, 2013), pp. 953–960.

11. Y. Ogura, N. Shirai, J. Tanida, and Y. Ichioka, “Wavelength-multiplexing diffractive phase elements: design, fabrication, and performance evaluation,” J. Opt. Soc. Am. A 18, 1082–1092 (2001). [CrossRef]  

12. T. Nakai and H. Ogawa, “Research on multi-layer diffractive optical elements and their application to camera lenses,” in “Diffractive Optics and Micro-Optics,” (Optical Society of America, 2002), p. DMA2. [CrossRef]  

13. V. A. Soifer, V. Kotlar, and L. Doskolovich, Iterative Methods for Diffractive Optical Elements Computation (CRC, 2003).

14. C. J. Schuler, M. Hirsch, S. Harmeling, and B. Scholkopf, “Non-stationary correction of optical aberrations,” in “Computer Vision (ICCV), 2011 IEEE International Conference on,” (IEEE, 2011), pp. 659–666.

15. P. R. Gill and D. G. Stork, “Lensless ultra-miniature imagers using odd-symmetry spiral phase gratings,” in “Computational Optical Sensing and Imaging,” (Optical Society of America, 2013), pp. CW4C–3.

16. Y. Kitamura, R. Shogenji, K. Yamada, S. Miyatake, M. Miyamoto, T. Morimoto, Y. Masaki, N. Kondou, D. Miyazaki, and J. Tanida et al., “Reconstruction of a high-resolution image on a compound-eye image-capturing system,” Appl. Opt. 43, 1719–1727 (2004). [CrossRef]   [PubMed]  

17. A. Nikonorov, R. Skidanov, V. Fursov, M. Petrov, S. Bibikov, and Y. Yuzifovich, “Fresnel lens imaging with post-capture image processing,” in “Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops,” (2015), pp. 33–41.

18. Q. Shan, J. Jia, and A. Agarwala, “High-quality motion deblurring from a single image,” ACM Trans. Graphics 27, 73 (2008). [CrossRef]  

19. A. Levin, R. Fergus, F. Durand, and W. T. Freeman, “Image and depth from a conventional camera with a coded aperture,” ACM Trans. Graphics 26, 70 (2007). [CrossRef]  

20. N. Joshi, C. L. Zitnick, R. Szeliski, and D. Kriegman, “Image deblurring and denoising using color priors,” in “Computer Vision and Pattern Recognition, 2009,” (IEEE, 2009), pp. 1550–1557. [CrossRef]  

21. L. Yuan, J. Sun, L. Quan, and H.-Y. Shum, “Progressive inter-scale and intra-scale non-blind image deconvolution,” ACM Trans. Graphics 27, 74 (2008).

22. S.-W. Chung, B.-K. Kim, and W.-J. Song, “Detecting and eliminating chromatic aberration in digital images,” in “Image Processing (ICIP), 2009 16th IEEE International Conference on,” (IEEE, 2009), pp. 3905–3908.

23. J. Goodman, Introduction to Fourier Optics (McGraw-Hill, 2008).

24. G. J. Swanson, “Binary optics technology: theoretical limits on the diffraction efficiency of multilevel diffractive optical elements,” Tech. rep., DTIC Document (1991).

25. M. M. Meyers, “Hybrid refractive/diffractive achromatic camera lens,” U.S. Patent 5,715,091 (August 6, 1998).

26. F. Heide, M. Steinberger, Y.-T. Tsai, M. Rouf, D. Pajak, D. Reddy, O. Gallo, J. Liu, W. Heidrich, and K. Egiazarian, “Flexisp: A flexible camera image processing framework,” ACM Trans. Graphics 33, 231 (2014). [CrossRef]  

27. K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Image denoising by sparse 3-d transform-domain collaborative filtering,” IEEE Trans. Image Processing 16, 2080–2095 (2007). [CrossRef]  

28. W. Duoshu, C. Luo, Y. Xiong, T. Chen, H. Liu, and J. Wang, “Fabrication technology of the centrosymmetric continuous relief diffractive optical elements,” Physics Procedia 18, 95–99 (2011). [CrossRef]  

29. J. Zhou, L. Li, N. Naples, T. Sun, and Y. Y. Allen, “Fabrication of continuous diffractive optical elements using a fast tool servo diamond turning process,” J. Micromech. Microeng. 23, 075010 (2013). [CrossRef]  

30. O. Cossairt and S. Nayar, “Spectral focal sweep: Extended depth of field from chromatic aberrations,” in “Computational Photography (ICCP), 2010 IEEE International Conference on,” (IEEE, 2010), pp. 1–8.

31. D. W. Sweeney and G. E. Sommargren, “Harmonic diffractive lenses,” Appl. Opt. 34, 2469–2475 (1995). [CrossRef]   [PubMed]  


Zalevsky, Z.

Zhou, J.

J. Zhou, L. Li, N. Naples, T. Sun, and Y. Y. Allen, “Fabrication of continuous diffractive optical elements using a fast tool servo diamond turning process,” J. Micromech. Microeng. 23, 075010 (2013).
[Crossref]

Zitnick, C. L.

N. Joshi, C. L. Zitnick, R. Szeliski, and D. Kriegman, “Image deblurring and denoising using color priors,” in “Computer Vision and Pattern Recognition, 2009,” (IEEE, 2009), pp. 1550–1557.
[Crossref]

ACM Trans. Graphics (5)

F. Heide, M. Rouf, M. B. Hullin, B. Labitzke, W. Heidrich, and A. Kolb, “High-quality computational imaging through simple lenses,” ACM Trans. Graphics 32, 149 (2013).

Q. Shan, J. Jia, and A. Agarwala, “High-quality motion deblurring from a single image,” ACM Trans. Graphics 27, 73 (2008).

A. Levin, R. Fergus, F. Durand, and W. T. Freeman, “Image and depth from a conventional camera with a coded aperture,” ACM Trans. Graphics 26, 70 (2007).

L. Yuan, J. Sun, L. Quan, and H.-Y. Shum, “Progressive inter-scale and intra-scale non-blind image deconvolution,” ACM Trans. Graphics 27, 74 (2008).

F. Heide, M. Steinberger, Y.-T. Tsai, M. Rouf, D. Pajak, D. Reddy, O. Gallo, J. Liu, W. Heidrich, and K. Egiazarian, “FlexISP: a flexible camera image processing framework,” ACM Trans. Graphics 33, 231 (2014).

IEEE Trans. Image Processing (1)

K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Image denoising by sparse 3-D transform-domain collaborative filtering,” IEEE Trans. Image Processing 16, 2080–2095 (2007).

IEEE Trans. Pattern Analysis and Machine Intelligence (1)

S. W. Hasinoff and K. N. Kutulakos, “Light-efficient photography,” IEEE Trans. Pattern Analysis and Machine Intelligence 33, 2203–2214 (2011).

J. Micromech. Microeng. (1)

J. Zhou, L. Li, N. Naples, T. Sun, and Y. Y. Allen, “Fabrication of continuous diffractive optical elements using a fast tool servo diamond turning process,” J. Micromech. Microeng. 23, 075010 (2013).

Physics Procedia (1)

W. Duoshu, C. Luo, Y. Xiong, T. Chen, H. Liu, and J. Wang, “Fabrication technology of the centrosymmetric continuous relief diffractive optical elements,” Physics Procedia 18, 95–99 (2011).

Other (16)

O. Cossairt and S. Nayar, “Spectral focal sweep: extended depth of field from chromatic aberrations,” in Computational Photography (ICCP), 2010 IEEE International Conference on (IEEE, 2010), pp. 1–8.

S.-W. Chung, B.-K. Kim, and W.-J. Song, “Detecting and eliminating chromatic aberration in digital images,” in Image Processing (ICIP), 2009 16th IEEE International Conference on (IEEE, 2009), pp. 3905–3908.

J. Goodman, Introduction to Fourier Optics (McGraw-Hill, 2008).

G. J. Swanson, “Binary optics technology: theoretical limits on the diffraction efficiency of multilevel diffractive optical elements,” Tech. Rep., DTIC Document (1991).

M. M. Meyers, “Hybrid refractive/diffractive achromatic camera lens,” US Patent 5,715,091 (August 6, 1998).

D. C. O’Shea, T. J. Suleski, A. D. Kathman, and D. W. Prather, Diffractive Optics: Design, Fabrication, and Test, vol. 62 (SPIE, 2004).

P. Trouve, F. Champagnat, G. L. Besnerais, G. Druart, and J. Idier, “Design of a chromatic 3D camera with an end-to-end performance model approach,” in Computer Vision and Pattern Recognition Workshops (CVPRW), 2013 IEEE Conference on (IEEE, 2013), pp. 953–960.

G. R. Fowles, Introduction to Modern Optics (Courier Dover Publications, 2012).

G. G. Sliusarev, Aberration and Optical Design Theory (Adam Hilger, Bristol, England, 1984).

D. Malacara-Hernández and Z. Malacara-Hernández, Handbook of Optical Design (CRC, 2013).

T. Nakai and H. Ogawa, “Research on multi-layer diffractive optical elements and their application to camera lenses,” in Diffractive Optics and Micro-Optics (Optical Society of America, 2002), p. DMA2.

V. A. Soifer, V. Kotlar, and L. Doskolovich, Iterative Methods for Diffractive Optical Elements Computation (CRC, 2003).

C. J. Schuler, M. Hirsch, S. Harmeling, and B. Scholkopf, “Non-stationary correction of optical aberrations,” in Computer Vision (ICCV), 2011 IEEE International Conference on (IEEE, 2011), pp. 659–666.

P. R. Gill and D. G. Stork, “Lensless ultra-miniature imagers using odd-symmetry spiral phase gratings,” in Computational Optical Sensing and Imaging (Optical Society of America, 2013), paper CW4C.3.

A. Nikonorov, R. Skidanov, V. Fursov, M. Petrov, S. Bibikov, and Y. Yuzifovich, “Fresnel lens imaging with post-capture image processing,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops (2015), pp. 33–41.

N. Joshi, C. L. Zitnick, R. Szeliski, and D. Kriegman, “Image deblurring and denoising using color priors,” in Computer Vision and Pattern Recognition, 2009 (IEEE, 2009), pp. 1550–1557.



Figures (11)

Fig. 1 Compared to the imaging pipeline of conventional imaging with well-corrected complex refractive lenses (top), our diffractive-refractive computational imaging captures a blurred intermediate image through a super-thin phase-modulated DOE, and then recovers the sharp image with deconvolution algorithms using cross-channel regularization. An example blurred image (bottom left) and its corresponding recovered image (bottom right) are shown. Insets of the images indicate that our algorithm corrects the residual aberrations and preserves the details.
Fig. 2 A binary amplitude Fresnel zone plate (top left) diffracts light into multiple orders (0th, ±1st, ±3rd, …), resulting in low diffraction efficiency. With a continuous phase-only diffractive lens (top right), the fraction of light diffracted into the +1st order can theoretically reach 100%. In practice, multi-level microstructures (bottom) are easier to fabricate and approximate the continuous profile very well.
Fig. 3 Schematic comparison of a super-thin diffractive optic (top), a diffractive-refractive hybrid lens (middle), and a simple refractive lens (bottom) for f1 = 50mm (left) and f2 = 25mm (right). The R, G, B colors represent the focal powers for different wavelengths. “+” indicates that the focal length for red is longer than that for blue; “−” indicates the opposite.
Fig. 4 Left: Microscopic images of our PZP and phase-coded-aperture lens under a Nikon Eclipse L200N microscope at 5× magnification. Right: Captured real-color PSFs are shown with their individual RGB components. Note that the axes in the plots denote pixel coordinates, and the colorbars indicate the relative intensity.
Fig. 5 Top of (b–f): The same scene captured by our single PZP (f = 100mm), stacked PZPs (f = 50mm), hybrid coded-aperture PZP and refractive lens (f = 50mm), hybrid PZP and refractive lens (f = 50mm), and a simple lens (f = 50mm), respectively. Middle of (b–f): Corresponding deconvolved images for each lens. Bottom of (b–f): R, G and B components of the PSFs for each setup. The scene is projected onto a board by a projector. Note that only the same cropped area is shown for cross comparison. The ground truth image captured by a Canon DSLR is presented for reference (a-2), along with a PSNR plot (a-1) for the designs in (c–f).
Fig. 6 Top: Images captured and deconvolved for a text scene with a single PZP (f = 100mm). The teapot and white text portions show the ability of our algorithm to remove most of the chromatic aberrations. Bottom: Captured and deconvolved images for a flower-and-book scene with stacked PZPs (f = 50mm). The yellow and green objects are well recovered in the same scene. Note that both scenes are at the same distance from the lenses; the fields of view differ because of the focal lengths.
Fig. 7 Images captured (a) and deconvolved (b) for a box scene with our diffractive-refractive hybrid lens (f = 50mm). By introducing a refractive lens, chromatic aberrations are smaller than in the single-PZP and stacked-PZP cases in Fig. 6. The inverse problem becomes better conditioned, and sharper images can be recovered.
Fig. 8 Images captured (a) and deconvolved (b) for a ball scene with our phase-coded-aperture hybrid lens (f = 50mm). By coding the intensity distribution of the PSF, high-frequency features as well as colors are better preserved. Note that the net around the ball, which would otherwise hardly be visible in the blurred image, is recovered.
Fig. 9 Images captured (top) and deconvolved (bottom) for the ISO12233 resolution chart with our three designs: (from left to right) hybrid PZP and lens, hybrid phase-coded-aperture PZP and lens, and stacked PZPs. The focal length is f = 50mm for each design.
Fig. 10 Off-axis performance comparison for a simple refractive lens (left), our hybrid PZP lens (center), and stacked PZPs (right). The first row presents the captured images of a black-and-white checkerboard, while the close-ups show the on-axis and off-axis patches visualized in the green, red, and blue channels. Comparing patch A and patch B, we observe that our designs show better spatial uniformity, despite the chromatic aberration. All focal lengths are f = 50mm, and the full field of view is around 30°.
Fig. 11 Images captured (left) and deconvolved (right) for an outdoor scene with our hybrid PZP lens (f = 50mm). Object distances range from around 2m (tree) to around 8m (house).

Equations (6)


$$\eta_m^N = \left\{ \frac{\sin\!\left[\pi\left(\frac{(n-1)d}{\lambda} - m\right)\right]}{\pi\left(\frac{(n-1)d}{\lambda} - m\right)} \right\}^2 \left\{ \frac{\sin\!\left[\pi\,\frac{(n-1)d}{N\lambda}\right]}{\pi\,\frac{(n-1)d}{N\lambda}} \right\}^2,$$
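The efficiency expression above can be sanity-checked numerically. A minimal sketch (our assumptions: `np.sinc` matches the sin(πx)/(πx) bracketed terms, the relief depth d is chosen for a full 2π phase at a 550 nm design wavelength, and the function name `doe_efficiency` is ours, not the paper's):

```python
import numpy as np

def doe_efficiency(lam, lam0=550e-9, n=1.5, N=16, m=1):
    """Diffraction efficiency of an N-level DOE at wavelength lam, order m,
    per the sinc^2 model above; d gives a full 2*pi phase at lam0."""
    d = lam0 / (n - 1.0)              # relief depth for 2*pi phase at design wavelength
    alpha = (n - 1.0) * d / lam       # phase delay in units of 2*pi at wavelength lam
    # np.sinc(x) = sin(pi*x)/(pi*x), matching the two bracketed factors
    return np.sinc(alpha - m) ** 2 * np.sinc(alpha / N) ** 2
```

At the design wavelength this reduces to sinc²(1/N), the familiar N-level limit (about 81% for N = 4, about 99% for N = 16), and efficiency drops as λ moves away from the design wavelength — the spectral dispersion the post-processing must compensate.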
$$b = Ki + n,$$
$$P_c = \int_{\lambda} c(\lambda)\, P(\lambda)\, d\lambda,$$
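Eq. (3) folds the wavelength-dependent PSF into a single PSF per color channel. A discretized sketch of that integral as a Riemann sum (the Gaussian spectral-PSF model and channel response curves here are made-up placeholders, not the paper's calibrated data):

```python
import numpy as np

wavelengths = np.linspace(400e-9, 700e-9, 31)   # sample lambda on a grid
dlam = wavelengths[1] - wavelengths[0]

def psf_at(lam, size=21):
    """Placeholder spectral PSF: a Gaussian whose width grows with |lam - 550 nm|."""
    sigma = 1.0 + 5e7 * abs(lam - 550e-9)
    y, x = np.mgrid[-(size // 2):size // 2 + 1, -(size // 2):size // 2 + 1]
    p = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return p / p.sum()

def channel_response(lam, center):
    """Placeholder sensor response c(lambda): a Gaussian around the channel center."""
    return np.exp(-((lam - center) / 40e-9) ** 2)

# P_c ~ sum_k c(lambda_k) P(lambda_k) * dlambda, then normalized to unit energy
P_green = sum(channel_response(l, 550e-9) * psf_at(l) for l in wavelengths) * dlam
P_green /= P_green.sum()
```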
$$i_c = \arg\min_{i_c}\ \frac{\mu_c}{2} \left\| b_c - K_c i_c \right\|_2^2 + \Gamma(i_c),$$
$$\nabla i_m \cdot i_l \approx \nabla i_l \cdot i_m \quad\Longleftrightarrow\quad \frac{\nabla i_m}{i_m} \approx \frac{\nabla i_l}{i_l},$$
$$\Gamma(i_c) = \sum_{m \neq l} \left\| D i_m \odot i_l - D i_l \odot i_m \right\|_1,$$
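The data term and cross-channel prior of Eqs. (4)–(6) can be sketched as a toy gradient-descent solver — a stand-in for the paper's actual optimization, restricted to a single horizontal-difference operator D and a Charbonnier-smoothed ℓ1 penalty; the weights, step size, and smoothing eps are our assumptions:

```python
import numpy as np

def conv(img, psf):
    """Circular convolution via FFT: the action of K in b = K i + n."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf, s=img.shape)))

def conv_T(img, psf):
    """Adjoint K^T: correlation with the PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(psf, s=img.shape))))

def dx(u):    # forward horizontal difference (periodic): one instance of D
    return np.roll(u, -1, axis=1) - u

def dx_T(u):  # adjoint of dx
    return np.roll(u, 1, axis=1) - u

def deconv_cross(b, psf, ref, mu=1.0, beta=0.05, eps=0.1, step=0.05, iters=200):
    """Minimize mu/2 ||K i - b||^2 + beta * sum sqrt((dx(i)*ref - dx(ref)*i)^2 + eps^2)."""
    i = b.copy()
    dref = dx(ref)
    for _ in range(iters):
        e = dx(i) * ref - dref * i                 # cross-channel residual, Eq. (5)
        w = e / np.sqrt(e ** 2 + eps ** 2)         # derivative of the smoothed l1 penalty
        g = mu * conv_T(conv(i, psf) - b, psf)     # data-term gradient
        g += beta * (dx_T(w * ref) - w * dref)     # cross-term gradient
        i -= step * g
    return i

def objective(i, b, psf, ref, mu=1.0, beta=0.05, eps=0.1):
    e = dx(i) * ref - dx(ref) * i
    return 0.5 * mu * np.sum((conv(i, psf) - b) ** 2) + beta * np.sum(np.sqrt(e ** 2 + eps ** 2))
```

In the full method this would run per channel, with the reference image taken from an already-deconvolved channel, and the cross term summed over both derivative directions and all channel pairs.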
