Abstract

This paper introduces a power-balanced hybrid optical imaging system: a diffractive computational camera in which image formation is performed jointly by a refractive lens and a multilevel phase mask (MPM). The system provides a long focal depth with low chromatic aberrations thanks to the MPM and a high light-energy concentration due to the refractive lens. We introduce the concept of optical power balance between the lens and the MPM, which controls the contribution of each element to the modulation of the incoming light. Additional features of our MPM design are the inclusion of the quantization of the MPM’s shape on the number of levels and the Fresnel order (thickness) using a smoothing function. To optimize the optical power balance as well as the MPM, we built a fully differentiable image formation model for joint optimization of the optical and imaging parameters of the proposed camera using neural network techniques. We also optimized a single, depth-invariant Wiener-like optical transfer function (OTF) to reconstruct a sharp image. We numerically and experimentally compare the designed system with its counterparts, lensless and lens-only optical systems, for the visible wavelength interval 400–700 nm and the depth-of-field range 0.5 m–$\infty$ (numerical) and 0.5–2 m (experimental). We believe the attained results demonstrate that the proposed system equipped with the optimal OTF outperforms its counterparts, even when they are used with optimized OTFs, in terms of reconstruction quality at off-focus distances. The simulation results also reveal that optimizing the optical power balance, the Fresnel order, and the number of levels is essential for system performance, attaining an improvement of up to 5 dB in PSNR with the optimized OTF compared with the counterpart lensless setup.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. INTRODUCTION

Computational imaging with encoding flat diffractive optical elements (DOEs), such as binary/multilevel phase elements [1,2] and meta-optical elements [3], is a multidisciplinary research field at the intersection of optics, mathematics, and digital image processing. Interest in this field is growing, driven by the design of DOEs for applications such as computational photography [4,5], augmented reality [6], spectral imaging [7], and microscopy [8], among others that are leading the need for highly miniaturized optical systems [9,10]. The design of imaging setups for the aforementioned applications involves the optimization of optical elements such as amplitude masks [11–14], refractive lenses [15], DOEs [2], diffusers [9], and various types of phase masks/plates [16]. In this work, the elements of interest to be jointly designed are a refractive lens and a DOE, with the goal of improving the depth of field (DoF) and reducing the chromatic aberrations of the system, a problem also known as achromatic extended depth-of-field (EDoF) imaging.

Two basic approaches are exploited to effectively design and develop all-in-focus optical imaging. The first one deals with the design of optical systems that are highly insensitive to the distance between the object and camera. The second, by contrast, designs optical systems that are highly sensitive to variations of this distance. In that case, the depth map of the scene is reconstructed and used for all-in-focus imaging [17], or to control the physical parameters of the system using tunable and programmable optical devices [18–20]. In this work, we follow the mainstream of the first approach to develop photography cameras and the corresponding reconstruction algorithm for sharp and high-quality imaging with achromatic EDoF.

Here, we propose an optical power-balanced hybrid setup that is composed of a refractive lens and a multilevel phase mask (MPM) as a DOE to extend the DoF capabilities of a camera. The MPM design includes the number of levels, which controls the quantization of the phase profile, and the Fresnel order, which tunes the thickness (a manufacturing aspect); both are unique features that are flexible and can be optimized. This system introduces the concept of an optical power balance between the lens and MPM, which controls the contribution of each element to the modulation of the incoming light. We mathematically motivate the use of our optical power-balanced hybrid system by providing analytical remarks showing that it allows better estimation of an image in terms of accuracy and visual perception, with less chromatic aberration over a wide DoF, compared to setups without power-balanced optics that employ either a lens alone or are lensless (DOE only).

To optimize the optical power balance and the MPM, we build a fully differentiable image formation model for joint optimization of the optical and imaging parameters of the designed computational camera using neural networks (NNs). We follow this NN approach using the software library PyTorch [21] to obtain a joint optimization framework that goes from the input data through the mathematical image formation model all the way to complete image recovery, a strategy that recent works have demonstrated to be suitable for designing DOEs [25]. In particular, for the number of levels and the Fresnel order, we introduce a smoothing function because both parameters are modeled as piecewise continuous operations. Our optimization framework, in contrast to previous works that have pursued narrow point spread functions (PSFs) along the depth to facilitate inverse imaging, designs a single Wiener-like optical transfer function (OTF) to estimate a scene from blurred observations invariantly to depth. Moreover, a BM3DSHARP algorithm [22] is included for sparsity modeling as a prior for the images to be reconstructed. Numerical results reveal that the optimization of the optical power balance, the Fresnel order, and the number of levels is essential for system performance, attaining an improvement of up to 5 dB in PSNR using the optimized OTF compared with the counterpart lensless setup. We also point out that, due to the new variables included in the MPM design (number of levels, thickness, and power balance), the number of possible combinations to compare the proposed system with the state-of-the-art is huge. To avoid building several MPMs to physically analyze the performance of our camera, we build a setup that exploits the phase capabilities of a spatial light modulator (SLM) to investigate the performance of the proposed MPM design.

The contribution of this work can be summarized as follows:

  • The power-balanced hybrid optical system with optimized power balance between the refractive and diffractive elements is presented and studied;
  • The Fresnel order and the number of levels are included in the optimization pipeline of the MPM via a smoothing function;
  • A novel single Wiener-like OTF is proposed for inverse imaging from blurred observations invariant to depth;
  • The optimal parameters of the system and the MPM are produced in an end-to-end framework solving the multi-objective optimization problem with PSNR as the criterion function; and
  • The performance of the proposed optical setup is demonstrated by numerical simulation and by experimental tests with an SLM implementation of the MPM.

We believe that the proposed framework for end-to-end optimization provides steps forward for low-level features of the optics (thickness and number of levels) and high-level contributions with the designed single Wiener-like OTF for inverse imaging. The insights provided by the optical power-balance concept also lead to a better understanding of how optics can be designed to improve imaging quality by proper light modulation.

2. RELATED WORK

In Table 1, we provide the relevant references to the achromatic EDoF problem of diffractive optics imaging. In this table, we present the mathematical models for phase modulation design as well as the manufacturing aspects subject to optimization. Our target is to build an optical system equipped with computational inverse imaging. As a result, the optics with MPM for direct focusing/imaging on the sensor are out of our consideration.


Table 1. Comparison of Different Phase Modulation Setups for EDoF Enabling the State-of-the-Art$^{a,b}$

Because we primarily focus on the MPM design, it is natural to classify them accordingly; i.e., according to the basic ideas of the approach and the design methodology. We distinguish the following three basic groups of optical models that appear in design problems: free-shape phase (often absolute phase); lensless optics; and hybrid optics conceived as a composition of the lens and MPM. In the first group, the object of the design is a modulation phase (often an absolute phase) that enables a desirable manipulation of wavefronts. Examples are given in rows 1 and 2 of Table 1.

In the first row, a simple model of the quadratic and cubic phases is used. The quadratic component is fixed by the parameters of the corresponding focusing lens, and the variables of the design are the parameters of the cubic component. The main disadvantage of this technique is that, for extreme depth values, the cubic component produces strong aberrations that reduce the reconstruction quality. In the second row of the table, this cubic component is replaced by an arbitrary phase function $\varphi (x,y)$, which is a free-shape phase design. The alternative second group of design optical models is lensless, with various phase plates/masks (flat and non-flat) instead of the conventional refractive lens. In Table 1, examples of this group of optical elements can be seen in rows 4, 5, 6, 7, and 8. However, as with any phase mask employed to extend the DoF of an imaging system, they suffer from strong chromatic aberrations, limiting the achievable reconstruction quality.

The so-called hybrids form the third group of optical models; an example is presented in row 3 of Table 1. In that approach, a binary phase mask composed of concentric rings enables the fabrication of thin optical elements that encode the incoming light. However, the fact that only two levels are employed in the phase profile of the mask limits the achievable DoF of the system. In the hybrids, the lens is combined with a phase mask/plate that is usually flat. Thus, the phase design is restricted by the structural parameters of the mask/plate, which differentiates these designs from those in group 1, where the free-shape phase can be arbitrary. The last row of the table addresses the optical setup that is the topic of this paper: a hybrid with a DOE in the form of a special MPM, with optimized optical power balance between the lens and the MPM.

The introduced classification of the optical elements is not one-to-one, as illustrated by the prominent wavefront coding (WFC) proposed by Dowski and Cathey in 1995: row 1 in Table 1 can also be treated as a hybrid of the lens with the cubic phase mask. However, the existence of the lens is less important for the methodology because the design is focused on the optimization of the phase $\varphi (x,y)$, which can be arbitrary and is not restricted to the cubic. The fact that the lens is used becomes essential at the implementation stage, when the cubic phase can be engraved on the lens surface or realized as an additional cubic phase mask element.

Broadband imaging with a phase mask is a promising technique for achromatic EDoF imaging. One of the challenges in broadband imaging with a phase mask is the strong dispersion that causes significant color aberration. Nevertheless, a flow of publications demonstrates significant progress in this field of research. As shown in the last column of Table 1, the optimization and design methods can be divided into three different frameworks: analytical, PSF engineering (fitting), and end-to-end optimization. Recently, the superiority of end-to-end optimization using convolutional neural networks for image processing and optics design has been demonstrated in a number of publications [2,4,28,33,46–48]. In terms of the introduced classification, these works mostly belong to the first group of phase mask models with a free-shape phase design, despite all the differences in implementation that may concern lensless or hybrid structures.

3. END-TO-END OPTIMIZATION OF OPTICAL POWER-BALANCED HYBRID OPTICS

Our proposed optical setup is shown in Fig. 1, where the object, aperture, and sensor are 2D flat, ${d_1}$ is the distance between the object and the aperture, ${d_2}$ is the distance from the aperture to the sensor (${d_2} \ll {d_1}$), and ${f_{{\lambda _0}}}$ is the focal distance of the optics. In what follows, we use the coordinates $(\xi ,\eta)$, $(x,y)$, and $(u,v)$ for the object, aperture, and sensor planes, respectively.


Fig. 1. A light wave with a given wavelength and a curvature defined by a point source at distance ${d_1}$ propagates to the aperture plane containing the MPM (refractive index $n$) to be designed. The MPM modulates the phase of the incident wavefront. The resulting wavefront propagates through the lens to the sensor, at distance ${d_2}$, via the Fresnel propagation model. The intensities of the sensor-incident wavefront define the PSFs.


A. Image Formation Model

1. PSF-Based RGB Imaging

From Fourier wave optics theory, the response of an optical system to an input wavefront is modeled as a convolution of the system’s PSF and the true object image. If we assume that both a lens and an MPM are present in the aperture, then the generalized pupil function of the system shown in Fig. 1 is of the form [49]

$${{\cal P}_\lambda}(x,y) = {{\cal P}_A}(x,y){e^{\frac{{j\pi}}{\lambda}\left({\frac{1}{{{d_1}}} + \frac{1}{{{d_2}}} - \frac{1}{{{f_\lambda}}}} \right)\left({{x^2} + {y^2}} \right) + j{\varphi _{{\lambda _0},\lambda}}(x,y)}}.$$

In Eq. (1), ${f_\lambda}$ is the lens focal distance for the wavelength $\lambda$, ${{\cal P}_A}(x,y)$ represents the aperture of the optics, and ${\varphi _{{\lambda _0},\lambda}}(x,y)$ models the phase delay enabled by the MPM for the wavelength $\lambda$, where ${\lambda _0}$ is the design wavelength of the MPM. In this formula, the phase $\frac{{j\pi}}{\lambda}({\frac{1}{{{d_1}}} + \frac{1}{{{d_2}}}})({{x^2} + {y^2}})$ appears due to propagation of the coherent wavefront from the object to the aperture (distance ${d_1}$) and from the aperture to the sensor plane (distance ${d_2}$), and $\frac{{- j\pi}}{{\lambda {f_\lambda}}}({{x^2} + {y^2}})$ is the quadratic phase delay due to the lens. For the lensless system,

$${{\cal P}_\lambda}(x,y) = {{\cal P}_A}(x,y){e^{\frac{{j\pi}}{\lambda}\left({\frac{1}{{{d_1}}} + \frac{1}{{{d_2}}}} \right)\left({{x^2} + {y^2}} \right) + j{\varphi _{{\lambda _0},\lambda}}(x,y)}},$$
and for the lens system without MPM, ${\varphi _{{\lambda _0},\lambda}}(x,y) \equiv 0$ in Eq. (1).

In the hybrid system, which is the topic of this paper, the optical power $1/{f_\lambda}$ of the lens is shared between the lens and the MPM, and the generalized pupil function takes the form

$$\!\!\!{{\cal P}_\lambda}(x,y) = {{\cal P}_A}(x,y){e^{\frac{{j\pi}}{\lambda}\left({\frac{1}{{{d_1}}} + \frac{1}{{{d_2}}} - \frac{{1 - \alpha}}{{{f_\lambda}}}} \right)\left({{x^2} + {y^2}} \right) + j{\varphi _{{\lambda _0},\lambda ,\alpha}}(x,y)}},\!$$
where the parameter $\alpha \in [0, 1]$. Observe that $\alpha = 0$ or $\alpha = 1$ corresponds to a system without balancing of the light modulation between the MPM and the lens. In addition, the index $\alpha$ in ${\varphi _{{\lambda _0},\lambda ,\alpha}}(x,y)$ indicates that the magnitude of the quadratic component of the absolute phase used in the MPM design is defined by this parameter value, supporting a proper sharing of the optical power. We point out that this parametrization of the optics phase profile is simple and powerful because it allows switching between two different systems (hybrid and MPM only) by setting the value of $\alpha$. To the best of our knowledge, this is the first time that the optimization of the optics phase profile of an imaging system includes a parameter with the capability to determine the appropriate setup for the achromatic EDoF task.
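As an illustration, Eq. (3) can be evaluated directly on a sampled aperture grid. The following NumPy sketch (function and argument names are ours; the paper's actual implementation is in PyTorch) builds the generalized pupil for given distances, wavelength, power balance $\alpha$, and MPM phase:

```python
import numpy as np

def hybrid_pupil(x, y, wavelength, d1, d2, f, alpha, phi_mpm, aperture):
    """Generalized pupil of the power-balanced hybrid system, Eq. (3).

    x, y      : aperture-plane coordinate grids [m]
    wavelength: lambda [m]
    d1, d2    : object-aperture and aperture-sensor distances [m]
    f         : lens focal distance for this wavelength [m]
    alpha     : optical power balance in [0, 1]
    phi_mpm   : MPM phase delay phi_{lambda0,lambda,alpha}(x, y) [rad]
    aperture  : aperture function P_A(x, y)
    """
    quad = (np.pi / wavelength) * (1/d1 + 1/d2 - (1 - alpha)/f) * (x**2 + y**2)
    return aperture * np.exp(1j * (quad + phi_mpm))
```

Setting `alpha=0` with `phi_mpm=0` recovers the lens pupil of Eq. (1) with $\varphi \equiv 0$, while `alpha=1` removes the lens term, i.e., the lensless case of Eq. (2).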

The PSF of the coherent monochromatic optical system for the wavelength $\lambda$ is calculated by [49]

$${\rm{PSF}}_\lambda ^{{\rm{coh}}}(u,v) = {{\cal F}_{{{\cal P}_\lambda}}}\left({\frac{u}{{{d_2}\lambda}},\frac{v}{{{d_2}\lambda}}} \right),$$
where ${{\cal F}_{{{\cal P}_\lambda}}}$ is the Fourier transform of ${{\cal P}_\lambda}(x,y)$. Then, PSF for the corresponding incoherent imaging, which is a topic of this paper, is a squared absolute value of ${\rm{PSF}}_\lambda ^{{\rm{coh}}}(u,v)$. After normalization, this PSF function takes the form
$${{\rm{PSF}}_\lambda}(u,v) = \frac{{{{\left| {{\rm{PSF}}_\lambda ^{{\rm{coh}}}(u,v)} \right|}^2}}}{{\int \int_{- \infty}^\infty {{\left| {{\rm{PSF}}_\lambda ^{{\rm{coh}}}(u,v)} \right|}^2}{\rm{d}}u{\rm{d}}v}}.$$
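Equations (4) and (5) map the pupil to an incoherent PSF through a Fourier transform, a squared magnitude, and an energy normalization. A minimal NumPy sketch (our own naming; the discrete sum stands in for the integral in Eq. (5)):

```python
import numpy as np

def incoherent_psf(pupil):
    """Normalized incoherent PSF, Eqs. (4)-(5).

    The coherent PSF is the Fourier transform of the generalized pupil;
    the incoherent PSF is its squared magnitude, normalized here so that
    the discrete PSF sums to one (discrete analogue of Eq. (5)).
    """
    coh = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))  # PSF^coh, Eq. (4)
    psf = np.abs(coh) ** 2                                       # incoherent intensity
    return psf / psf.sum()                                       # Eq. (5)
```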

We calculate the PSF for RGB color imaging, assuming that the incoherent radiation is broadband and that the intensity registered by an RGB sensor per $c$-band channel is an integration of the monochromatic intensity over the wavelength range $\Lambda$ with the weights ${T_c}(\lambda)$ defined by the color filter array (CFA) and the spectral response of the sensor. Normalizing these sensitivities on $\lambda$ [i.e., $\int_\Lambda {T_c}(\lambda){\rm{d}}\lambda = 1$], we obtain the RGB channel PSFs as

$$\begin{split}&{{\rm{PSF}}_c}(u,v) \\&\quad= \frac{{\int_\Lambda {{\rm{PSF}}_\lambda}(u,v){T_c}(\lambda){\rm{d}}\lambda}}{{\int \int_{- \infty}^\infty \int_\Lambda {{\rm{PSF}}_\lambda}(u,v){T_c}(\lambda){\rm{d}}\lambda {\rm{d}}u{\rm{d}}v}},\quad c \in \{r,g,b\} ,\end{split}$$
where the monochromatic ${{\rm{PSF}}_\lambda}$ is averaged over $\lambda$ with the weights ${T_c}(\lambda)$.

Contrary to conventional approaches for PSF-based RGB imaging, which use Eq. (5) with three fixed wavelengths $\lambda$ (often 450, 550, and 650 nm) [4], we take into consideration the spectral properties of the sensor and in this way obtain more accurate modeling of the image formation [43]. Additionally, Eq. (6) significantly differs from the current literature because of the normalization term that we introduce, which keeps the contribution of each RGB channel’s PSF the same. Thus, the OTF for Eq. (6) is calculated as the Fourier transform of ${{\rm{PSF}}_c}(u,v)$:

$$\!\!\!{{\rm{OTF}}_c}({f_x},{f_y}) = \iint_{- \infty}^\infty {{\rm{PSF}}_c}(u,v){e^{- j2\pi ({f_x}u + {f_y}v)}}{\rm{d}}u{\rm{d}}v,\!$$
where $({f_x},{f_y})$ are the Fourier frequency variables.
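Equations (6) and (7) can be sketched as a weighted spectral average followed by a Fourier transform. In the sketch below (our naming; the wavelength integral is replaced by a discrete sum over the sampled spectrum), `weights` plays the role of the normalized sensitivities ${T_c}(\lambda)$:

```python
import numpy as np

def rgb_psf(psf_by_wavelength, weights):
    """Channel PSF, Eq. (6): weighted average of monochromatic PSFs
    over the sampled wavelengths, renormalized to unit energy.

    psf_by_wavelength : array (L, H, W) of monochromatic PSFs
    weights           : array (L,) of sensitivities T_c(lambda),
                        assumed normalized over the spectrum
    """
    psf_c = np.tensordot(weights, psf_by_wavelength, axes=1)
    return psf_c / psf_c.sum()

def otf(psf_c):
    """Channel OTF, Eq. (7): Fourier transform of the channel PSF."""
    return np.fft.fft2(np.fft.ifftshift(psf_c))
```

Because each channel PSF has unit energy, the corresponding OTF has a unit DC response, which is how the normalization keeps the contribution of each channel the same.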

2. From PSF to Image

Let us introduce the ${\rm{PSFs}}$ for the defocus scenarios with the notation ${{\rm{PSF}}_{c,\delta}}(x,y)$, where $\delta$ is a defocus offset in ${d_1}$, such that ${d_1} = d_1^0 + \delta$, with $d_1^0$ being the in-focus distance between the aperture and the object. Introduce a set ${\cal D}$ of defocus values $\delta \in {\cal D}$ defining the area of the desirable EDoF. The corresponding optical transfer functions are denoted ${{\rm{OTF}}_{c,\delta}}({f_x},{f_y})$; their definition corresponds to Eq. (7) with ${{\rm{PSF}}_c}$ replaced by ${{\rm{PSF}}_{c,\delta}}$. Thus, let $I_{c,\delta}^s(u,v)$ and $I_c^o(u,v)$ be the wavefront intensities at the sensor (registered focused/misfocused images) and the intensity of the object (true image), respectively. Then, $I_{c,\delta}^s(u,v)$ are obtained by convolving the true object image $I_c^o(u,v)$ with ${{\rm{PSF}}_{c,\delta}}(u,v)$, forming the set of misfocused (blurred) color images

$$I_{c,\delta}^s(x,y) = {{\rm{PSF}}_{c,\delta}}(x,y)\circledast I_c^o(x,y),$$
where $\circledast$ stands for convolution. In the Fourier domain, we have
$$I_{c,\delta}^s({f_x},{f_y}) = {{\rm{OTF}}_{c,\delta}}({f_x},{f_y}) \cdot I_c^o({f_x},{f_y}).$$

The superscripts $(o,s)$ stand for object and sensor, respectively.
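Equations (8) and (9) amount to a convolution implemented as a pointwise product in the Fourier domain. A minimal sketch (our naming; circular convolution via the FFT is used as the discrete stand-in):

```python
import numpy as np

def blur(image, psf):
    """Sensor image, Eqs. (8)-(9): convolution of the true image with a
    channel/defocus PSF, computed as a product of Fourier transforms."""
    otf = np.fft.fft2(np.fft.ifftshift(psf))           # OTF_{c,delta}, Eq. (7)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * otf))  # Eq. (9) inverted
```

For a centered delta PSF, the OTF is all-pass and the output equals the input, which is a convenient sanity check for the FFT centering conventions.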

3. EDoF Image Reconstruction

For image reconstruction from the blurred data $\{I_{c,\delta}^{s,k}({f_x},{f_y})\}$, we use a linear filter with the transfer function ${H_c}$, which is the same for any defocus $\delta \in {\cal D}$. Let us formulate the design of the inverse imaging transfer function ${H_c}$ as an optimization problem:

$${\hat H_c} \in {\rm{argmi}}{{\rm{n}}_{{H_c}}}\;\underbrace {\frac{1}{{{\sigma ^2}}}\sum\limits_{\delta ,k,c} {\omega _\delta}||I_c^{o,k} - {H_c} \cdot I_{c,\delta}^{s,k}||_2^2 + \frac{1}{\gamma}\sum\limits_c ||{H_c}||_2^2}_J,$$
where $k \in K$ indexes the different images, $I_c^{o,k}$ and $I_{c,\delta}^{s,k}$ are the sets of true and observed blurred images (Fourier transformed), $c$ stands for color, ${\sigma ^2}$ stands for the variance of the Gaussian noise, and $\gamma$ is a Tikhonov regularization parameter. The weights ${\omega _\delta}$, $0 \le {\omega _\delta} \le 1$, are the residual weights in the criterion $J$ in Eq. (10). We calculate these weights as the exponential function ${\omega _\delta} = {\exp}(-\mu\cdot |\delta |)$ with the parameter $\mu \gt 0$. The norm $|| \cdot ||_2^2$ is the Euclidean norm defined in the Fourier domain for complex-valued variables.

Thus, we aim to find ${H_c}$ such that the estimates ${H_c} \cdot I_{c,\delta}^{s,k}$ are close to the Fourier transforms of the corresponding true images $I_c^{o,k}$. The second summand serves as a regularizer for ${H_c}$. Due to Eq. (9), the minimization over ${H_c}$ is straightforward, leading to

$${\hat H_c}({f_x},{f_y}) = \frac{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}{\rm{OTF}}_{c,\delta}^ * ({f_x},{f_y})}}{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}|{{\rm{OTF}}_{c,\delta}}({f_x},{f_y}{{)|}^2} + \frac{{{\rm{reg}}}}{{\sum\limits_k |I_c^{o,k}({f_x},{f_y}{{)|}^2}}}}},$$
where the regularization parameter reg replaces the ratio $\frac{{{\sigma ^2}}}{\gamma}$. The details of the derivation of Eq. (11) are deferred to Appendix A. In our experiments, we compare inverse imaging for two versions of this Wiener filter. In the first one, the transfer function ${\hat H_c}$ is defined as above in Eq. (11). In the second one, we assume that the sum $\sum\nolimits_k |I_c^{o,k}({f_x},{f_y}{)|^2}$ is nearly invariant, and the transfer function takes the form
$${\hat H_c}({f_x},{f_y}) = \frac{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}{\rm{OTF}}_{c,\delta}^ * ({f_x},{f_y})}}{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}|{{\rm{OTF}}_{c,\delta}}({f_x},{f_y}{{)|}^2} + {\rm{reg}}}}.$$

To distinguish between the two corresponding inverse imaging procedures, we refer to the inverse imaging OTF ${\hat H_c}$ (Wiener filter) defined by Eq. (11) as having varying regularization, and to that defined by Eq. (12) as having invariant regularization. Sometimes, for simplicity, we call these two procedures the varying and invariant Wiener filters.

Therefore, the reconstructed images are calculated as

$$\hat I_c^{o,k}(x,y) = {{\cal F}^{- 1}}\{{\hat H_c} \cdot I_{c,\delta}^{s,k}\} ,$$
where ${{\cal F}^{- 1}}$ denotes the inverse Fourier transform. In the exponential weight ${\omega _\delta} = {\exp}(-\mu\cdot |\delta |)$, $\mu \gt 0$ is a parameter that is optimized. The derived OTFs in Eqs. (11) and (12) are optimal in making the estimates in Eq. (13) efficient for all $\delta \in {\cal D}$; in this way, we target EDoF imaging. Note that ${H_c}$ in Eq. (12) with invariant regularization was proposed in [45].
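The depth-invariant Wiener filtering of Eqs. (12) and (13) can be sketched as follows (our naming; the varying-regularization filter of Eq. (11) is obtained by replacing the scalar `reg` with its frequency-dependent counterpart):

```python
import numpy as np

def wiener_invariant(otfs, weights, reg):
    """Inverse-imaging OTF with invariant regularization, Eq. (12).

    otfs    : array (D, H, W) of OTF_{c,delta} over the defocus set D
    weights : array (D,) of omega_delta = exp(-mu * |delta|)
    reg     : scalar regularization parameter
    """
    num = np.tensordot(weights, np.conj(otfs), axes=1)       # weighted OTF*
    den = np.tensordot(weights, np.abs(otfs) ** 2, axes=1) + reg
    return num / den

def reconstruct(blurred, h):
    """Eq. (13): apply the Wiener-like OTF, return to the image domain."""
    return np.real(np.fft.ifft2(h * np.fft.fft2(blurred)))
```

Note that a single filter `h` per channel is applied regardless of the (unknown) defocus of the observation, which is exactly the depth-invariance exploited for EDoF.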

B. MPM Modeling and Design Parameters

In our design of the MPM, we follow the methodology proposed in [43]. The details of this design can also be seen in Supplement 1. The following parameters characterize the free-shape piecewise invariant MPM: $h$, the thickness of the varying part of the mask, and $N$, the number of levels, which may be of different heights.

1. Absolute Phase Model

The proposed absolute phase ${\varphi _{{\lambda _0},\alpha}}$ for our MPM takes the form

$$\begin{split}{\varphi _{{\lambda _0},\alpha}}(x,y) &= \frac{{- \pi \alpha}}{{{\lambda _0}{f_{{\lambda _0}}}}}({x^2} + {y^2}) + \beta ({x^3} + {y^3}) \\&\quad+ \sum\limits_{r = 1,r \ne 4}^R {\rho _r}{P_r}(x,y),\end{split}$$
where the first term is the quadratic phase, $\alpha$ is the optical power balance between the lens and the phase mask for the wavelength ${\lambda _0}$, and ${f_{{\lambda _0}}}$ is the focal distance. The cubic phase of magnitude $\beta$ is a typical component for EDoF, and the third group of terms is a parametric approximation of the free-shape MPM using the Zernike polynomials ${P_r}(x,y)$ with coefficients ${\rho _r}$ to be estimated. We exclude the fourth Zernike polynomial (the defocus term, i.e., quadratic) because it is accounted for in the first term.

The proposed design is a combination of symmetric (quadratic $+$ Zernike polynomials) and nonsymmetric (cubic) phase terms. Observe that our parametrization of the optics phase profile significantly differs from works such as [4] that also consider Zernike polynomials, because the optimized coefficient $\frac{{- \pi \alpha}}{{{\lambda _0}{f_{{\lambda _0}}}}}$ also controls, through $\alpha$, the contribution of the refractive lens to the modulation of the light in our optical system. Additionally, since $\alpha$ is the power-balance variable, it deeply affects the physical setup because of its capability to switch between hybrid and lensless systems.
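For illustration, Eq. (14) can be evaluated as below (our naming; the Zernike polynomials are passed in as callables, since their explicit construction is outside this sketch):

```python
import numpy as np

def absolute_phase(x, y, alpha, lam0, f0, beta, zernike_terms):
    """Absolute MPM phase, Eq. (14): a quadratic term carrying the power
    balance alpha, a cubic EDoF term of magnitude beta, and a free-shape
    part expanded in Zernike polynomials.

    zernike_terms : list of (rho_r, P_r) pairs, where P_r(x, y) evaluates
    the r-th Zernike polynomial (the defocus term r = 4 is excluded).
    """
    phi = -np.pi * alpha / (lam0 * f0) * (x**2 + y**2)  # quadratic (shared power)
    phi = phi + beta * (x**3 + y**3)                    # cubic EDoF component
    for rho, P in zernike_terms:                        # free-shape Zernike part
        phi = phi + rho * P(x, y)
    return phi
```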

2. Fresnel Order (Thickness)

In radians, the mask thickness is defined as $Q = 2\pi {m_Q}$, where ${m_Q}$ is called the “Fresnel order” of the mask and, in general, is not necessarily an integer. Then the phase profile of the MPM considering the thickness is calculated as

$${\hat \varphi _{{\lambda _0},\alpha}}(x,y) = {\rm{mod}}({\varphi _{{\lambda _0},\alpha}}(x,y) + Q/2,Q) - Q/2.$$

The operation in Eq. (15) returns ${\hat \varphi _{{\lambda _0},\alpha}}(x,y)$ taking values in the interval $[- Q/2$, $Q/2)$. For ${m_Q} = 1$, this restriction to the interval $[- \pi$, $\pi)$ corresponds to the standard phase-wrapping operation.
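Equation (15) is a generalized wrapping to the interval $[-Q/2, Q/2)$; a one-line NumPy sketch (our naming):

```python
import numpy as np

def wrap_phase(phi, m_q):
    """Eq. (15): restrict the absolute phase to [-Q/2, Q/2), Q = 2*pi*m_Q.
    m_Q = 1 reduces to ordinary phase wrapping to [-pi, pi)."""
    q = 2 * np.pi * m_q
    return np.mod(phi + q / 2, q) - q / 2
```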

3. Number of Levels

The mask is defined on the 2D grid $(X,Y)$ with the computational sampling period (pixel) ${\Delta _{{\rm{comp}}}}$. We obtain a piecewise invariant surface for the MPM after a nonlinear transformation of the absolute phase. The uniform discretization of the wrapped phase profile ${\hat \varphi _{{\lambda _0},\alpha}}(x,y)$ to $N$ levels is performed as

$${\theta _{{\lambda _0},\alpha}}(x,y) = \lfloor{\hat \varphi _{{\lambda _0},\alpha}}(x,y)/\Delta \rfloor\cdot \Delta ,\quad \Delta = Q/N,$$
where $\lfloor w \rfloor$ denotes the integer part of $w$ and $\Delta = Q/N$ is the quantization step, so that the wrapped range carries $N$ levels. The values of ${\theta _{{\lambda _0},\alpha}}(x,y)$ are restricted to the interval [$- Q/2$, $Q/2$), where $Q$ is an upper bound for the thickness phase ${\theta _{{\lambda _0},\alpha}}(x,y)$.
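Assuming a uniform quantization step of $Q/N$, so that the wrapped range carries $N$ levels, the discretization can be sketched as follows (our naming; in the actual optimization, the floor function is replaced by the smoothing approximation of Appendix B to keep the pipeline differentiable):

```python
import numpy as np

def quantize_phase(phi_wrapped, m_q, n_levels):
    """Quantize the wrapped phase to N levels over [-Q/2, Q/2),
    with a uniform quantization step of Q/N (Eq. (16))."""
    step = 2 * np.pi * m_q / n_levels   # Delta = Q / N
    return np.floor(phi_wrapped / step) * step
```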

Fig. 2. Differentiable optimization framework of phase-coded optics for achromatic EDoF. The spectral PSFs are convolved with a batch of RGB images. The inverse imaging OTFs provide estimates of the true images. Finally, a differentiable quality loss ${\cal L}$, such as mean squared error with respect to the ground-truth image (or PSNR criterion), is defined on the reconstructed images.


It is worth mentioning that the floor and modulo functions are not differentiable; therefore, we use smoothing approximations in order to optimize the thickness and the number of levels of the MPM. The details of these approximating functions can be found in Appendix B. The physical size of the mask’s pixel is ${m_w}{\Delta _{{\rm{comp}}}}$, where ${m_w}$ is the width of the mask’s pixel with respect to the computational pixels. The mask is designed for the wavelength ${\lambda _0}$. Thus, the piecewise phase profile of the MPM is calculated as

$${\varphi _{{{\rm{MPM}}_{{\lambda _0},\lambda ,\alpha}}}}(x,y) = \frac{{{\lambda _0}(n(\lambda) - 1)}}{{\lambda (n({\lambda _0}) - 1)}}{\theta _{{\lambda _0},\alpha}}(x,y),$$
where ${\theta _{{\lambda _0},\alpha}}$ is the phase shift of the designed MPM and $n(\lambda)$ is the refractive index of the MPM material, $x \in X$, $y \in Y$. The MPM thickness $h$ in length units is of the form
$${h_{{\lambda _0}}}(x,y) = \frac{{{\lambda _0}}}{{(n({\lambda _0}) - 1)}}\frac{{{\theta _{{\lambda _0},\alpha}}}}{{2\pi}}{\rm{.}}$$
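Equations (17) and (18) can be sketched as follows (our naming; `n_of` is a stand-in for the material dispersion curve $n(\lambda)$):

```python
import numpy as np

def mpm_phase_at_wavelength(theta, lam0, lam, n_of):
    """Eq. (17): phase delay of the designed MPM at wavelength lambda,
    given its phase shift theta at the design wavelength lambda_0.
    n_of(lam) returns the refractive index of the MPM material."""
    return lam0 * (n_of(lam) - 1) / (lam * (n_of(lam0) - 1)) * theta

def mpm_thickness(theta, lam0, n_of):
    """Eq. (18): MPM surface height (length units) from the design phase."""
    return lam0 / (n_of(lam0) - 1) * theta / (2 * np.pi)
```

At the design wavelength, Eq. (17) reduces to the identity, and a full $2\pi$ phase step corresponds to a height of ${\lambda _0}/(n({\lambda _0}) - 1)$.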

C. Optimization Framework

We develop a framework, summarized in Fig. 2, to optimize the proposed optical system by stochastic gradient methods with the ADAM optimizer in PyTorch [21], an optimized tensor library for neural network (NN) learning on GPUs. We express each stage of the model described in the following subsections as differentiable modules.

1. Loss Function

Let $\Theta$ be a full set of the optimization parameters defined as

$$\Theta = (\alpha ,\beta ,{m_Q},{\rm{reg}},N,{\rho _r}).$$

Then, we use the following multi-objective formulation of our optimization goals:

$$\hat \Theta = {\rm{argma}}{{\rm{x}}_\Theta}({\rm{PSNR}}(\Theta ,\delta),\delta \in {\cal D}).$$

In this formulation, we maximize all ${\rm{PSNR}}(\Theta ,\delta)$, $\delta \in {\cal D}$, simultaneously to achieve the best accuracy for all focus and defocus situations. Here, ${\rm{PSNR}}(\Theta ,\delta)$ is calculated as the mean value of ${{\rm{PSNR}}^k}(\Theta ,\delta)$ over the set of the test images $k \in K$:

$${\rm{PSNR}}(\Theta ,\delta) = {\rm{mea}}{{\rm{n}}_{k \in K}}({{\rm{PSNR}}^k}(\Theta ,\delta)).$$

There are various formalized scalarization techniques reducing the multi-objective criterion to a scalar one; usually, this is achieved by aggregating the multiple criteria into a single one, as in [50]. In this paper, we follow a pragmatic heuristic: comparing ${\rm{PSNR}}(\hat \Theta ,\delta)$ as 1D curve functions of $\delta$, we seek to maximize ${\rm{PSNR}}(\Theta ,\delta)$ for each $\delta \in {\cal D}$. Here, $\hat \Theta$ denotes the estimates of the optimization parameters. With this heuristic, we follow the aim of the multi-objective optimization in Eq. (20). The key challenges in developing the proposed optimization framework were to satisfy manufacturing constraints, find stable optimization algorithms, and fit the models within memory limits.
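The criterion of Eqs. (20) and (21) relies on per-image PSNR averaged over the test set; a minimal sketch (our naming; images assumed normalized to a unit peak):

```python
import numpy as np

def psnr(ref, est, peak=1.0):
    """PSNR in dB between a ground-truth and a reconstructed image."""
    mse = np.mean((ref - est) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

def mean_psnr(refs, ests):
    """Eq. (21): average PSNR over the test set for a given defocus."""
    return float(np.mean([psnr(r, e) for r, e in zip(refs, ests)]))
```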

2. Parameters Setup

We simulate a sensor with a pixel size of 3.45 µm and a resolution of $512 \times 512$ pixels. All Zernike coefficients are initialized to zero at the beginning of the optimization. The optical element is discretized with a 2 µm feature size on a $3000 \times 3000$ grid. In the learning phase, which includes optimizing the optical element and finding the optimal $\alpha$, $\beta$, ${m_Q}$, $N$, and reg for the reconstruction, we use a step size of $5 \times {10^{- 3}}$ with the Adam stochastic gradient descent solver. Recall that $\alpha$ can only take values in the interval [0,1]. We limit the study of the number of levels $N$ to the interval [2,255]. We range the Fresnel order ${m_Q}$ from 2 to 16. The diameter $D$ of the aperture ${{\cal P}_A}$ is equal to 6 mm. The fixed distance ${d_2}$ between the aperture and the sensor plane was chosen to give a sharp image at 1 m, and the distance between the object and the aperture is discretized as 0.5, 1.0, 2.0, 2.4 m, and $\infty$. The focal distance of the lens is $f = 10\;{\rm{mm}}$; hence, the focal distance of the system is $d_1^0 = 1\;{\rm{m}}$. We experimentally observe that $R = 14$ (Zernike coefficients excluding the fourth polynomial) is enough, and larger values do not significantly improve the image quality. The design wavelength is ${\lambda _0} = 510\;{\rm{nm}}$. After applying the wave optics image formation model, we also include Gaussian noise in the measurements, with variance equal to $1 \times {10^{- 4}}$. We choose 31 visible wavelengths, at 10 nm intervals, in 400–700 nm to estimate the RGB imaging PSFs. The optimization stage employs 200 epochs, which takes approximately 6 h on an NVIDIA Tesla V100 with 32 GB of memory.

3. Dataset

We sample images from a dataset of 1244 high-resolution images [51] to design the proposed hybrid optics. With the designed systems, we present PSNR curves as a function of ${d_1}$ for Fresnel orders from 2 to 16. The reported PSNR for each depth ${d_1}$ is the average over the 24 RGB Kodak images [52].

Fig. 3. Reconstruction quality in PSNR (dB) versus depth ${d_1}$ using the inverse imaging OTFs with invariant/varying regularization for the proposed power-balanced hybrid and lensless systems. The red horizontal lines mark the desirable value ${\rm{PSNR}} = 30\;{\rm{dB}}$. The effect of the Fresnel order on reconstruction quality is shown; these numerical results suggest that the highest Fresnel order provides the most precise reconstruction. A gain of up to 2 dB is obtained by the inverse imaging OTF with varying regularization compared with the OTF with invariant regularization. Finally, the power-balanced hybrid outperforms its lensless counterpart by up to 5 dB of PSNR.

4. Memory Constraints

Fitting the models within memory limits was difficult due to the fine discretization of the optical elements necessary for high-fidelity Fresnel propagation. The essential insight to keep memory usage tractable was to exploit the stochastic optimization scheme by randomly binning the depths and wavelengths present in each image into a smaller set. For instance, each randomly drawn image is assigned a position over the wide DoF of interest.
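A minimal sketch of this random binning follows; the grids match the parameters setup above, but the function name, batch shape, and per-image wavelength subset size are our illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
depths = np.array([0.5, 1.0, 2.0, 2.4, np.inf])   # discretized DoF grid (m)
wavelengths = np.arange(400, 701, 10)             # 31 samples in 400-700 nm

def sample_bins(batch_size, n_wl=3):
    """Give each image in the batch one random depth and a small random
    subset of wavelengths instead of all 5 x 31 combinations."""
    d = rng.choice(depths, size=batch_size)
    wl = rng.choice(wavelengths, size=(batch_size, n_wl))
    return d, wl

d, wl = sample_bins(4)
print(d.shape, wl.shape)  # (4,) (4, 3)
```

Over many stochastic epochs every depth/wavelength pair is still visited, while each forward pass propagates only a fraction of the full grid.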

4. SIMULATION TESTS

A. MPM Parameters Effect over Power-Balanced and Lensless Systems

In what follows, the quality of imaging is evaluated by the PSNR calculated jointly for the RGB channels. Because of the large number of possible combinations of the optimized MPM parameters, we present only the most informative combinations to illustrate their importance. For instance, since the thickness of the MPM is constrained by the implementation procedure, we analyze the obtained quality by fixing the Fresnel order and optimizing the remaining parameters. These results are summarized in Fig. 3 for both the optical power-balanced hybrid and lensless systems.
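The joint-RGB PSNR used throughout this evaluation can be written compactly; the sketch below assumes images normalized to [0,1] (so the peak value is 1), which is our assumption rather than a detail stated in the text:

```python
import numpy as np

def psnr_rgb(ref, est):
    """PSNR (dB) computed jointly over the three RGB channels,
    for images scaled to [0,1]."""
    mse = np.mean((ref - est) ** 2)   # one MSE pooled over all channels
    return 10.0 * np.log10(1.0 / mse)

ref = np.zeros((4, 4, 3))
est = np.full((4, 4, 3), 0.1)        # uniform 0.1 error on every channel
print(round(psnr_rgb(ref, est)))     # 20
```

Pooling the channels into a single MSE (rather than averaging three per-channel PSNRs) penalizes a system that sacrifices one color channel for the others.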

In these figures, we present PSNR curves as a function of ${d_1}$. For the lensless setup, $\alpha$ is equal to 1 according to Eq. (3). The best performance is achieved by fixing the Fresnel order and optimizing the remaining parameters for each scenario and optical setup. The colored regions between the lines correspond to the quality obtained when the parameters $N$, $\alpha$, and the Fresnel order are fixed in advance and excluded from the set of optimized parameters, which can be considered the state-of-the-art strategy. For this second scenario, we chose 1000 random values of $N$ and $\alpha$ (for the lensless case, just $N$). These results clearly demonstrate the advantage of optimizing the number of levels, thickness, and optical power-balance parameters of the MPM.

The images in Fig. 3 allow multiple comparisons. Note that, for each of the presented scenarios, the systems are optimized in an end-to-end manner to ensure a fair comparison of the potential of the different optical setups. First, comparing the PSNR curves, we may conclude that, for the absolute phase model with $\alpha$, $\beta$, plus Zernike polynomials, the performance of both optical systems improves as the Fresnel order grows from 2 to 16. For smaller Fresnel orders, this improvement can be very large: for the optical power-balanced hybrid, we gain about 2–3 dB when the Fresnel order is changed from 2 to 4, whereas a change from 8 to 16 gives only about a 1 dB improvement. The performance gain for Fresnel orders larger than 16 is minor and nearly negligible.

Second, comparing the left and right columns of Fig. 3, we note that the OTFs with Wiener varying regularization demonstrate a performance improvement of about 1 dB over the counterpart with invariant regularization for both optical setups. In the second row, for the optical power-balanced hybrid, the peak of the upper curve for the OTF with varying regularization is about 1 dB higher than the corresponding PSNR value for the OTF with invariant regularization. Third, the PSNR curves in each image are given for different absolute phase models of the MPM design, marked by the parameters $\alpha$, $\beta$, and Zernike polynomials. Comparing these curves, we may evaluate the advantage of the more complex models; in particular, the optimized Zernike polynomials yield visible performance improvements for all algorithms. Fourth, the comparison of the lensless system versus the proposed optical power-balanced hybrid is definitely in favor of the latter, with a large advantage of about 5 dB.

Overall, the best results are clearly demonstrated by the proposed system with the phase profile defined by including $\alpha$, $\beta$, and Zernike polynomials in Eq. (14), OTFs with Wiener varying regularization in Eq. (11), and a Fresnel order equal to 16. Regarding the achieved DoF, we note that for the proposed power-balanced hybrid, the DoF covers the design interval $[0.5,\infty)$ m, with a clear advantage for the reconstruction with varying regularization. The lensless system fails to achieve a similar result, and it can be shown that the lens-only system is even further from reaching this goal.

To complement the previous numerical results on the optimized MPM parameters, we present the behavior of the design framework for the number of levels and the optical power balance versus the Fresnel order. These results are summarized in Fig. 4. We observe a clear dependence between the Fresnel order and these two variables, confirming the need to include the number of levels ($N$) and the optical power balance ($\alpha$) in the optimization pipeline; in particular, for the EDoF task, $N$ and $\alpha$ must be higher as the Fresnel order increases. These results reveal that lensless or lens-only systems (i.e., without optical power balance) are suboptimal solutions for EDoF.

Fig. 4. Average of the optimal number of levels ($N$) and optical power-balance ($\alpha$) values obtained for the proposed system with the end-to-end framework with varying/invariant regularization versus the Fresnel order. The solid line stands for the mean over all number-of-levels and power-balance values, and the colored area shows the variance. These results suggest a direct dependence between the Fresnel order, the number of levels, and the optical power balance: the number of levels and the optical power balance grow as the Fresnel order increases.

We conclude this section by illustrating profiles of the designed MPM. The design is illustrated in Fig. 5 by a 2D image plane along with the cross section of each term of the phase function (values of $\alpha$, $\beta$, and ${\rho _r}$) of the MPM profile. The variations of the MPM shape corresponding to the quadratic, cubic, and some of the most significant Zernike polynomial components are presented. It is worth mentioning that the MPM designed for the sum of these components is not additive with respect to these fragments. The results shown in Fig. 5 are given for the optical power-sharing variable $\alpha = 0.22$ under the simulation setup in Section 3 of this paper, which means that the MPM modulates 22% of the passing light to achieve achromatic EDoF.
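The decomposition illustrated in Fig. 5 can be mimicked numerically. The sketch below composes a quadratic ($\alpha$), cubic ($\beta$), and a single Zernike-like term on a unit pupil; the scalings and the chosen Zernike term are illustrative assumptions, not the exact Eq. (14):

```python
import numpy as np

n = 256
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
R = np.sqrt(x**2 + y**2)          # radial coordinate, as in the Fig. 5 caption
phi = np.arctan2(y, x)            # azimuthal coordinate

# Toy weights; alpha = 0.22 follows the designed power balance, beta is ours.
alpha, beta = 0.22, 5.0
quadratic = alpha * R**2                 # lens-like (power-balance) term
cubic = beta * (x**3 + y**3)             # cubic wavefront-coding term
astig = 0.1 * R**2 * np.cos(2 * phi)     # one example Zernike term (astigmatism)
phase = quadratic + cubic + astig        # additive phase before quantization

print(phase.shape)  # (256, 256)
```

Note that, as stated above, the quantized MPM built from this sum is not the sum of the individually quantized fragments; quantization happens after the terms are combined.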

B. Inverse Imaging

To explain the results shown in Fig. 3, we analyze the behavior of the PSFs and the OTF designed for inverse imaging. Figure 6(a) provides additional confirmation that color imaging without chromatic aberrations is indeed successfully achieved. In Fig. 6(a), the cross sections of the PSFs of the proposed system after deblurring by Eq. (11) are shown for the three RGB color channels and three different distances. This deblurring is calculated as ${{\cal F}^{- 1}}\{{\hat H_c} \cdot {{\rm{OTF}}_{c,\delta}}\}$ for $\delta = 0.5, 1.0,$ and $\infty$. The curves are shown for the proposed system with a Fresnel order equal to 16 and $\alpha = 0.22$, following Fig. 4. The curves are well concentrated around the focal point and consolidated with respect to each other, which suggests high-quality imaging for all color channels.
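The diagnostic ${{\cal F}^{-1}}\{\hat H_c \cdot {\rm OTF}_{c,\delta}\}$ can be reproduced on a toy example; the Gaussian blur and the Wiener-style $\hat H$ below are our stand-ins for one color channel and one depth, not the paper's optimized quantities:

```python
import numpy as np

n = 64
x = np.arange(n) - n // 2
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 20.0)   # toy blur kernel
psf /= psf.sum()

otf = np.fft.fft2(np.fft.ifftshift(psf))        # system OTF for one channel/depth
H = np.conj(otf) / (np.abs(otf) ** 2 + 1e-3)    # Wiener-like inverse filter
effective = np.fft.ifft2(H * otf).real          # F^{-1}{H . OTF}: effective PSF

# A good inverse filter concentrates the effective PSF near a delta at the origin
# (index (0, 0) in the unshifted FFT convention used here).
print(tuple(int(i) for i in np.unravel_index(np.argmax(effective), effective.shape)))  # (0, 0)
```

The closer this effective PSF is to a delta, and the more the curves for the different channels and depths coincide, the sharper and more color-consistent the deblurred images.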

Fig. 5. Phase profile of the designed MPM. The contribution of each term (value of $\alpha$, $\beta$, and ${\rho _r}$) to the MPM profile is illustrated; that is, the shapes of the MPM corresponding to the quadratic, cubic, and some of the most significant Zernike polynomial components are presented, where $R = \sqrt {{x^2} + {y^2}}$ and $\phi = \arctan (y/x)$.

Fig. 6. (a) Central cross sections of the PSFs of the optical system after deblurring by Eq. (11). The ideal deblurred response should be close to a $\delta$ function; we observe that the curves are well concentrated around the central point and similar to each other for all channels and distances, with the best result for the blue color channel. (b) Average of the optimal $\mu$ values obtained for the proposed and lensless systems with the end-to-end framework versus the Fresnel order. The solid line stands for the mean over all $\mu$ values, and the colored area shows the variance. These results suggest a direct dependence between the Fresnel order and the inversion rule in Eq. (13) through the value of $\mu$: $\mu$ can be discarded when the Fresnel order is large enough, meaning that the inversion rule in Eq. (13) plays an important role when the Fresnel order is small.

Fig. 7. Evaluation of achromatic EDoF imaging in simulation for two depths, ${d_1} = 0.5, 1.0$ m, using the proposed mixed OTF in Eq. (11) in terms of PSNR. We compare a Fresnel lens optimized for one of the target wavelengths, a refractive lens, a lensless setup with cubic MPM (column $\beta$), a lensless system with the phase profile in [4], an optical power-balanced hybrid without the cubic component (column $\alpha + {\rm{Zern}}{\rm{.Poly}}$), and the proposed system (column $\alpha + \beta + {\rm{Zern}}{\rm{.Poly}}$) with a Fresnel order equal to 16 and $\alpha = 0.22$, following the results in Fig. 3. The resulting PSFs are shown for all target wavelengths and depths. The sensor images and PSFs (rows 2 and 4) exhibit strong chromatic aberrations for the Fresnel lens, the refractive lens, and the lensless system with cubic MPM, whereas the lensless system in [4], the power-balanced hybrid without the cubic component, and the proposed optics mitigate wavelength-dependent image variations much better, with our optics performing best (its curves are best concentrated around the focal point). After deconvolution (rows 3 and 5), the sharpest images are obtained with the proposed optics.

Here, we also study the relation between $\mu$ and the Fresnel order obtained under the end-to-end optimization; that is, we want to determine how the Fresnel order affects the performance of the mixed OTF through the variable $\mu$. Recall that $\mu$ is needed to estimate a scene composed of several objects simultaneously located at different distances and defines the weights ${\omega _\delta}$ of the designed OTF. The desired analysis is summarized in Fig. 6(b), which presents the average of the optimal $\mu$ values obtained for the proposed and lensless systems with the end-to-end framework; these optimal values were estimated in the experiments performed for Fig. 3. The solid line in Fig. 6(b) stands for the mean over all $\mu$ values, and the colored blue area shows the variance. These results suggest a direct dependence between the Fresnel order and the inversion rule in Eq. (13) through the value of $\mu$: $\mu$ can be discarded when the Fresnel order is large enough, whereas the inversion rule in Eq. (13) plays an important role when the Fresnel order is small (less than 8).

C. Simulated Reconstructions

We present simulated reconstructions for two depths, ${d_1} = 0.5$ and $1.0\;{\rm m}$, using the proposed mixed OTF in Eq. (11); they are summarized in Fig. 7. This figure analyzes a Fresnel lens optimized for one of the target wavelengths, a refractive lens, a lensless setup with cubic MPM (column $\beta$), a lensless system with the phase profile in [4], an optical power-balanced hybrid without the cubic component (column $\alpha + {\rm{Zern}}{\rm{.Poly}}$), and the proposed system (column $\alpha + \beta + {\rm{Zern}}{\rm{.Poly}}$) with a Fresnel order equal to 16, $\alpha = 0.22$, and the number of levels equal to 62, following the results in Fig. 4. The resulting PSFs are shown for all target wavelengths and depths. The sensor images and PSFs (rows 2 and 4) exhibit strong chromatic aberrations for the Fresnel lens, the refractive lens, and the lensless system with cubic MPM, whereas the lensless system in [4], the power-balanced hybrid without the cubic component, and the proposed optics mitigate wavelength-dependent image variations much better, with our optics performing best (the PSF curves are best concentrated around the focal point). After deconvolution (rows 3 and 5), the sharpest images are obtained with the proposed optics. Additionally, comparing the results across columns, we clearly see the motivation for using the phase profile in Eq. (14).

5. EXPERIMENTAL TESTS

A. Optical Setup and Equipment

Due to the new variables included in the MPM design (number of levels, thickness, and power balance), the number of possible combinations for comparing the proposed system with the state of the art is huge. To avoid fabricating several MPMs to physically analyze the performance of our camera, we built a setup that exploits the phase capabilities of an SLM to investigate the performance of the proposed MPM design. The optical setup is depicted in Fig. 8, where Scene denotes the objects under investigation; the polarizer P provides the light polarization needed for proper wavefront modulation by the SLM; the beamsplitter BS governs the SLM illumination and further light passing; the lenses ${{\rm{L}}_1}$ and ${{\rm{L}}_2}$ form a 4f-telescopic system that transfers the light wavefront modified by the SLM to the plane of the lens ${{\rm{L}}_3}$; and the lens ${{\rm{L}}_3}$ forms an image of the Scene on the imaging detector CMOS.

Fig. 8. Experimental setup. P, polarizer; BS, beamsplitter; SLM, spatial light modulator. The lenses ${{\rm{L}}_1}$ and ${{\rm{L}}_2}$ form the 4f-telescopic system projecting the wavefront from the SLM plane to the imaging lens ${{\rm{L}}_3}$, and CMOS is the registering camera. ${d_1}$ is the distance between the scene and the plane of the hybrid imaging system (Lens & MPM), and ${d_2}$ is the distance between this optical system and the sensor.

For the MPM implementation, we use a Holoeye GAEA-2-vis phase-only SLM panel with a resolution of $4160 \times 2464$ and a pixel size of 3.74 µm; achromatic doublet lenses ${{\rm{L}}_1}$ and ${{\rm{L}}_2}$ with a diameter of 12.7 mm and a focal distance of 50 mm; a BK7 glass lens ${{\rm{L}}_3}$ with a diameter of 6 mm and a focal distance of 10.0 mm; and a CMOS Blackfly S board-level camera with the Sony IMX264 color pixel matrix (3.45 µm pixels, $2448 \times 2048$ resolution). This SLM allows us to experimentally study the optical power-balanced hybrid with the phase distribution of the designed MPM (implemented on the SLM) additive to that of the imaging lens ${{\rm{L}}_3}$. The MPM phase was created as an 8-bit *.bmp file and displayed on the SLM. We calibrated the SLM phase delay response to a maximum value of $3.6\pi$ at a wavelength of 510 nm; this $3.6\pi$ corresponds to the value 255 of the *.bmp phase image of the MPM.
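The calibration maps phase to gray levels linearly, with level 255 corresponding to the full $3.6\pi$ delay. A minimal sketch follows; wrapping the phase modulo $3.6\pi$ is our choice of convention for phases exceeding the SLM range:

```python
import numpy as np

MAX_DELAY = 3.6 * np.pi     # calibrated maximum SLM phase delay at 510 nm

def phase_to_bmp_levels(phase):
    """Map an MPM phase (radians) to 8-bit SLM gray levels, where
    level 255 corresponds to the full 3.6*pi delay."""
    wrapped = np.mod(phase, MAX_DELAY)          # fold into the realizable range
    return np.round(wrapped / MAX_DELAY * 255).astype(np.uint8)

levels = phase_to_bmp_levels(np.array([0.0, 1.8 * np.pi, 3.6 * np.pi]))
print(levels)  # [  0 128   0]
```

The resulting array is what would be written to the 8-bit *.bmp file displayed on the SLM.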

  • 1. Imaging: Test images in the Scene plane are displayed on a screen with $1440 \times 2960$ pixels and 570 ppi. The distance ${d_1}$ between the screen and the SLM is varied from 0.5 m to 2 m; 1.0 m corresponds to the in-focus plane, and the other distances are defocused. This distance range is enough to study and experimentally prove the advantages of the proposed system in terms of sharpness and low chromatic aberrations.
  • 2. SLM and MPM design: Recall that the SLM only provides a phase delay of up to $3.6\pi$, which limits a full study of the Fresnel order design variable; indeed, the simulation tests show that larger Fresnel orders result in better imaging. Despite this limitation, we study two cases, Fresnel ${\rm{order}} = 1.2$ and 1.4. Even with these small values, we demonstrate high-quality imaging and the advantage of the higher Fresnel order.
  • 3. PSF acquisition: To calibrate the system in Fig. 8, we use the flashlight of a Samsung S8 phone and a fiber (quartz fiber, SMA to SMA, 200 µm/0.22 NA), with an approximately uniform spectral distribution in the range (400–700) nm, as a white-light point source in a dark room. The PSFs are downsampled to the 3.74 µm pitch of the employed SLM. From the acquired PSFs, we run experiments for the 10 distances ${d_1}$ equal to 0.5, 0.65, 0.8, 0.9, 1.0, 1.15, 1.3, 1.5, 1.8, and 2.0 m and use these estimates to compute the deblurring varying/invariant OTF following Eqs. (11) and (12).
  • 4. Reconstruction algorithm: To estimate the scene, the experimental PSFs and blurred images are needed as input. The deblurring OTF, in either its invariant or varying form, is computed following Eqs. (12) and (11), and the image is estimated using Eq. (13). After this step, a denoising process equipped with a sharpening procedure [22] is applied to the estimated scene as a sparsity prior for all compared optical setups. This final denoised image is returned as the estimated scene from the experimental data.
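Steps 3 and 4 can be condensed into a minimal sketch of the invariant-regularization deblurring branch; the BM3D-style sharpening/denoising stage [22] is omitted, and the toy Gaussian PSF is ours, not an experimental one:

```python
import numpy as np

def wiener_deblur(blurred, psf, reg=1e-4):
    """Invariant-regularization Wiener-type deblurring (a sketch of the
    Eq. (12)/(13) step for one color channel)."""
    otf = np.fft.fft2(np.fft.ifftshift(psf), s=blurred.shape)
    H = np.conj(otf) / (np.abs(otf) ** 2 + reg)
    return np.fft.ifft2(H * np.fft.fft2(blurred)).real

# Toy check: blurring a flat scene and deblurring it should be near-identity.
n = 64
x = np.arange(n) - n // 2
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 8.0)
psf /= psf.sum()
scene = np.ones((n, n))
blurred = np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(np.fft.ifftshift(psf))).real
restored = wiener_deblur(blurred, psf)
print(abs(restored - scene).max() < 1e-2)  # True
```

In the actual pipeline, the experimental PSFs replace the toy kernel, and the denoising/sharpening prior is applied to `restored` before it is returned.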

B. Experimental Results

We present observations and images reconstructed from the blurred measurements for a depth range of 0.5–2.0 m and two Fresnel orders (1.2 and 1.4), using the varying/invariant Wiener filtering methods of Eqs. (11) and (12) and following the optical setup described in Fig. 8. We present the experimental and simulated PSFs for each system and different distances, employing the acquisition process described in the previous subsection. These experimental data are acquired for the proposed optimized optical power-balanced hybrid, lens+cubic-phase MPM, lensless with cubic-phase MPM, and lens-only systems to validate the simulated results of Section 4. The optimal value of the power-balance variable $\alpha$ is 0.05. The estimated images are obtained using the reconstruction algorithm described above. The OTF step of this algorithm requires a choice of the reg value in Eqs. (11) and (12), which we select by cross-validation in the interval $[10^{-5},10^{-3}]$ to obtain the best visual quality of the reconstructed image.

Fig. 9. Reconstructed images from experimental blurred measurements containing three objects at three different distances (${d_1} = 0.5, 1.0$, and 1.2 m) and a casual real scene (depth range 0.5–2.0 m); two Fresnel orders are compared for four optical setups: the proposed power-balanced hybrid (column $\alpha + \beta +$ Zern. Poly), lens $+$ MPM with the cubic component (column $\alpha + \beta$), lensless with cubic phase MPM (column $\beta$), and lens only (column Lens). To estimate the scene, we use varying/invariant Wiener filtering as in Eqs. (11) and (12) for the distances ${d_1} = 0.5, 0.65, 0.8, 0.9, 1.0, 1.15, 1.3, 1.5, 1.8$, and 2.0 m. The experimental and simulated PSFs are shown in the measurement rows. The imaging results confirm that the highest Fresnel order provides the best reconstruction quality and that the advantage of the varying Wiener filtering in Eq. (11) over its invariant version in Eq. (12) is the mitigation of chromatic aberrations and the reduction of noise. Additionally, we verify that the optimized proposed system is superior to its competitors in terms of sharpness and chromatic aberrations, suggesting the effectiveness of the optimization framework described in Fig. 2. Finally, we confirm the ability of the optimized OTF in Eq. (11) to estimate the longitudinal scene.

In this experiment, the first scene contains three objects (three parrot images) located simultaneously at three different distances, ${d_1} = 0.5, 1.0,$ and $1.2\;{\rm m}$, and the second is a casual real scene (depth range 0.5–2.0 m). We acquire the mixed blurred measurements to determine the effectiveness of the optimized OTF in Eq. (11). These results are summarized in Fig. 9. To complement these experiments, Supplement 1 provides additional material on the behavior of the proposed system for different values of $\alpha$. To estimate the scene, we use varying/invariant Wiener filtering with the optimized OTF in Eqs. (11) and (12), showing also the experimental and simulated PSFs for all the systems, where the value of $\mu$ in the formula of the weights ${\omega _\delta}$ in Eq. (10) is fixed at $1 \times {10^{- 3}}$. Notice that the effect of $\mu$ on the reconstruction quality can only be analyzed when the scene is composed of several objects located simultaneously at different distances, as in Fig. 9. In this scenario, the reference images (column Fresnel ${\rm{order}} = 1.4$, rows "varying Wiener filter") show that the higher Fresnel order, 1.4, provides better reconstruction quality than Fresnel ${\rm{order}} = 1.2$ in terms of sharpness and low chromatic aberrations.

The zoomed insets in Fig. 9 reveal that the lens$+$cubic MPM (column $\alpha + \beta$), lensless with cubic MPM (column $\beta$), and lens-only systems suffer from strong chromatic aberrations. The advantage of the varying Wiener filtering in Eq. (11) over its invariant version in Eq. (12) is the mitigation of chromatic aberrations and the reduction of noise: the proposed OTF partially corrects these issues for the lens$+$cubic MPM, lensless with cubic MPM, and lens-only systems, which in turn confirms the superiority of the proposed optical power-balanced hybrid. Therefore, the designed optics returned by the optimization framework in Fig. 2 does provide achromatic EDoF behavior. Finally, we verify the effectiveness of the optimized OTF in Eq. (11) for estimating the longitudinal scene and of the smoothing function for modeling the number of levels and the Fresnel order of the MPM. More results can be found in Supplement 1. We consider dynamic imaging a promising future research direction for Eqs. (11) and (12) because of their computational efficiency and scalability.

6. CONCLUSION

This paper shows that, in the scenario of achromatic EDoF imaging, the optimized power-balanced hybrid optical system composed of a refractive lens and a diffractive phase-coding MPM demonstrates advanced performance compared to its two counterparts: the single refractive lens and the lensless system with the MPM as the only optical element. The optical power balance in the proposed hybrid and the MPM design, both optimized in the end-to-end framework, are crucial elements of this advance. The multi-objective optimization algorithm balances the PSNR values of imaging at different defocus distances and in this way enables EDoF imaging. The designed hybrid optics is insensitive to defocus and thereby automatically enables achromatic imaging, because its PSFs are also insensitive to the dispersion of the spectral characteristics of the MPM and the lens. One of the original elements of this paper is the OTF (in two versions, with invariant and varying regularization) that is optimal for inverse imaging in EDoF scenarios. We also show that the Fresnel order (thickness) and the number of levels of the MPM are important design parameters, which demonstrates the effectiveness of the smoothing function; to the best of our knowledge, this is an original observation. The advanced performance of the proposed optical setup is demonstrated by numerical simulations and experimental tests. For our implementation of the MPM, we use a high-resolution SLM. In further work, we will consider the design and implementation of the optical power-balanced hybrid camera with a thick MPM.

Fig. 10. Comparison of the smoothing approximation of the floor function described in Eq. (B1) with the true floor function. We observe that Eq. (B1) provides a close approximation of the true value.

APPENDIX A: SOLUTION OF EQ. (10)

Observe that the criterion $J$ can be rewritten as

$$J = \frac{1}{{{\sigma ^2}}}\sum\limits_{\delta ,k,c} {\omega _\delta}||I_c^{o,k} - {H_c} \cdot {{\rm{OTF}}_{c,\delta}} \cdot I_c^{o,k}||_2^2 + \frac{1}{\gamma}\sum\limits_c ||{H_c}||_2^2.$$

The minimum condition for $J$ is $\frac{{\partial J({f_x},{f_y})}}{{\partial H_c^ * ({f_x},{f_y})}} = 0,$ where $(\ast)$ stands for the complex conjugate. After the derivative calculation and some manipulations, we arrive at:

$$\begin{split}&\frac{1}{{{\sigma ^2}}}\sum\limits_\delta {\omega _\delta}{H_c}({f_x},{f_y}) \cdot |{{\rm{OTF}}_{c,\delta}}({f_x},{f_y}{)|^2} \cdot \sum\limits_k |I_c^{o,k}({f_x},{f_y}{)|^2} \\&\quad {-}\frac{1}{{{\sigma ^2}}}\sum\limits_\delta {\omega _\delta}{\rm{OTF}}_{c,\delta}^ * ({f_x},{f_y}) \cdot \sum\limits_k |I_c^{o,k}({f_x},{f_y}{)|^2}\\&\quad + \frac{1}{\gamma}{H_c}({f_x},{f_y}) = 0\end{split}$$
with the solution for ${H_c}$:
$${\hat H_c}({f_x},{f_y}) = \frac{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}{\rm{OTF}}_{c,\delta}^ * ({f_x},{f_y})}}{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}|{{\rm{OTF}}_{c,\delta}}({f_x},{f_y}{{)|}^2} + \frac{{{\sigma ^2}}}{{\gamma \sum\limits_k |I_c^{o,k}({f_x},{f_y}{{)|}^2}}}}}.$$

Here, $\sum\nolimits_k |I_c^{o,k}({f_x},{f_y}{)|^2}/{\sigma ^2}$ is the SNR typical of the Wiener filter. It is assumed that $\sum\nolimits_k |I_c^{o,k}({f_x},{f_y}{)|^2} \gt 0$. We use this solution in the form

$${\hat H_c}({f_x},{f_y}) = \frac{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}{\rm{OTF}}_{c,\delta}^ * ({f_x},{f_y})}}{{\sum\limits_{\delta \in {\cal D}} {\omega _\delta}|{{\rm{OTF}}_{c,\delta}}({f_x},{f_y}{{)|}^2} + \frac{{{\rm{reg}}}}{{\sum\limits_k |I_c^{o,k}({f_x},{f_y}{{)|}^2}}}}},$$
where the regularization parameter reg replaces the ratio $\frac{{{\sigma ^2}}}{\gamma}$ and is used for tuning the filter in place of $\gamma$.
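The final filter of Eq. (A4) is straightforward to evaluate per frequency. The sketch below treats a single color channel with toy arrays; `signal_power` plays the role of $\sum_k |I_c^{o,k}|^2$, and the argument names are ours:

```python
import numpy as np

def mixed_wiener(otfs, weights, signal_power, reg=1e-4):
    """Depth-invariant inverse filter of Eq. (A4): a weighted Wiener-like
    combination of the per-depth OTFs (arrays indexed per frequency)."""
    num = sum(w * np.conj(o) for w, o in zip(weights, otfs))
    den = sum(w * np.abs(o) ** 2 for w, o in zip(weights, otfs)) + reg / signal_power
    return num / den

# Sanity check: with one depth and negligible reg, the filter inverts the OTF.
otf = np.array([0.5 + 0.5j])
H = mixed_wiener([otf], [1.0], signal_power=1e6, reg=1e-12)
print(np.allclose(H * otf, 1.0, atol=1e-5))  # True
```

With several depths, the weights ${\omega_\delta}$ trade off how strongly each depth's OTF shapes the single shared filter.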

APPENDIX B: SMOOTHING APPROXIMATION FUNCTIONS

To obtain a fully differentiable end-to-end design framework that optimizes the number of levels and the Fresnel order of the MPM, we take advantage of the following two smoothing functions approximating the floor and round operations. Specifically, we approximate $\lfloor w \rfloor$ in Eq. (16) as

$$ \lfloor w \rfloor \approx g(w) = w-\frac{1}{2}-\frac{1}{\pi }\mathop{\tan }^{-1}\left( \frac{-0.9999\sin (2\pi w)}{1-0.9999\cos (2\pi w)} \right).$$

The quality of the above approximation is illustrated in Fig. 10, from which we observe that Eq. (B1) provides a close approximation of the true $\lfloor w \rfloor$.

Now, to approximate the round function we use

$${\rm{round}}(w) \approx g(w + 0.5).$$

We remark that Eq. (B2) is used to approximate the nondifferentiable operation in Eq. (15). Specifically, we have

$${\rm{mod}}(w,z) \approx w - zg(w/z).$$

The quality of $g(w)$ as an approximation of ${\rm{round}}(w)$ is also illustrated by Fig. 10, since it depends directly on Eq. (B1).
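The three approximations translate directly into code. The sketch below transcribes Eqs. (B1)–(B3) with the paper's constant 0.9999 and evaluates them away from integer transition points, where the smoothing error is negligible:

```python
import numpy as np

def smooth_floor(w, a=0.9999):
    """Differentiable approximation of floor(w), Eq. (B1)."""
    return w - 0.5 - np.arctan(-a * np.sin(2 * np.pi * w)
                               / (1 - a * np.cos(2 * np.pi * w))) / np.pi

def smooth_round(w):
    """Differentiable approximation of round(w), Eq. (B2)."""
    return smooth_floor(w + 0.5)

def smooth_mod(w, z):
    """Differentiable approximation of mod(w, z), Eq. (B3)."""
    return w - z * smooth_floor(w / z)

w = np.array([0.3, 1.7, 2.2, 5.9])
print(np.round(smooth_floor(w)))   # [0. 1. 2. 5.]
print(np.round(smooth_round(w)))   # [0. 2. 2. 6.]
```

Because every operation here is smooth, gradients with respect to $N$ and ${m_Q}$ flow through the quantized MPM shape during the end-to-end optimization.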

Funding

Jane ja Aatos Erkon Säätiö; Teknologiateollisuuden 100-Vuotisjuhlasäätiö.

Acknowledgment

This work is supported by the CIWIL project funded by “Jane and Aatos Erkko” and “Technology Industries of Finland Centennial” Foundations, Finland. Samuel Pinilla acknowledges support from the EMET Research Institute in Colombia.

Disclosures

The authors declare no conflicts of interest.

Data Availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

Supplemental document

See Supplement 1 for supporting content.

REFERENCES

1. O. Lévêque, C. Kulcsár, A. Lee, H. Sauer, A. Aleksanyan, P. Bon, L. Cognet, and F. Goudail, “Co-designed annular binary phase masks for depth-of-field extension in single-molecule localization microscopy,” Opt. Express 28, 32426–32446 (2020). [CrossRef]  

2. S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

3. W. T. Chen, A. Y. Zhu, and F. Capasso, “Flat optics with dispersion-engineered metasurfaces,” Nat. Rev. Mater. 5, 604–620 (2020). [CrossRef]  

4. V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018). [CrossRef]  

5. X. Dun, H. Ikoma, G. Wetzstein, Z. Wang, X. Cheng, and Y. Peng, “Learned rotationally symmetric diffractive achromat for full-spectrum computational imaging,” Optica 7, 913–922 (2020). [CrossRef]  

6. B. Krajancich, N. Padmanaban, and G. Wetzstein, “Factored occlusion: Single spatial light modulator occlusion-capable optical see-through augmented reality display,” IEEE Trans. Visual. Comput. Graph. 26, 1871–1879 (2020).

7. D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).

8. J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).

9. N. Antipa, G. Kuo, R. Heckel, B. Mildenhall, E. Bostan, R. Ng, and L. Waller, “DiffuserCam: lensless single-exposure 3D imaging,” Optica 5, 1–9 (2018).

10. K. Yanny, N. Antipa, R. Ng, and L. Waller, “Miniature 3D fluorescence microscope using random microlenses,” in Optics and the Brain (Optical Society of America, 2019), paper BT3A–4.

11. S. Pinilla, J. Poveda, and H. Arguello, “Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation,” Opt. Commun. 410, 707–716 (2018).

12. C. V. Correa, H. Arguello, and G. R. Arce, “Spatiotemporal blue noise coded aperture design for multi-shot compressive spectral imaging,” J. Opt. Soc. Am. A 33, 2312–2322 (2016).

13. J. Bacca, S. Pinilla, and H. Arguello, “Super-resolution phase retrieval from designed coded diffraction patterns,” IEEE Trans. Image Process. 29, 2598–2609 (2019).

14. A. Jerez, S. Pinilla, and H. Arguello, “Fast target detection via template matching in compressive phase retrieval,” IEEE Trans. Comput. Imaging 6, 934–944 (2020).

15. G. Kuo, F. L. Liu, I. Grossrubatscher, R. Ng, and L. Waller, “On-chip fluorescence microscopy with a random microlens diffuser,” Opt. Express 28, 8384–8399 (2020).

16. V. Boominathan, J. Adams, J. Robinson, and A. Veeraraghavan, “PhlatCam: designed phase-mask based thin lensless camera,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1618–1629 (2020).

17. H. Haim, A. Bronstein, and E. Marom, “Computational multi-focus imaging combining sparse model with color dependent phase mask,” Opt. Express 23, 24547–24556 (2015).

18. H. Li, J. Peng, F. Pan, Y. Wu, Y. Zhang, and X. Xie, “Focal stack camera in all-in-focus imaging via an electrically tunable liquid crystal lens doped with multi-walled carbon nanotubes,” Opt. Express 26, 12441–12454 (2018).

19. Y. Hua, S. Nakamura, M. S. Asif, and A. C. Sankaranarayanan, “SweepCam—Depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1606–1617 (2020).

20. O. Cossairt and S. Nayar, “Spectral focal sweep: extended depth of field from chromatic aberrations,” in IEEE International Conference on Computational Photography (ICCP) (IEEE, 2010), pp. 1–8.

21. PyTorch, https://pytorch.org/.

22. K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Joint image sharpening and denoising by 3D transform-domain collaborative filtering,” in International Workshop on Spectral Methods and Multirate Signal Processing (SMMSP) (2007).

23. E. R. Dowski and W. T. Cathey, “Extended depth of field through wave-front coding,” Appl. Opt. 34, 1859–1866 (1995).

24. A. Flores, M. R. Wang, and J. J. Yang, “Achromatic hybrid refractive-diffractive lens with extended depth of focus,” Appl. Opt. 43, 5618–5630 (2004).

25. S. S. Sherif, W. T. Cathey, and E. R. Dowski, “Phase plate to extend the depth of field of incoherent hybrid imaging systems,” Appl. Opt. 43, 2709–2721 (2004).

26. Q. Yang, L. Liu, and J. Sun, “Optimized phase pupil masks for extended depth of field,” Opt. Commun. 272, 56–66 (2007).

27. F. Zhou, G. Li, H. Zhang, and D. Wang, “Rational phase mask to extend the depth of field in optical-digital hybrid imaging systems,” Opt. Lett. 34, 380–382 (2009).

28. U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).

29. E. Ben-Eliezer, N. Konforti, B. Milgrom, and E. Marom, “An optimal binary amplitude-phase mask for hybrid imaging systems that exhibit high resolution and extended depth of field,” Opt. Express 16, 20540–20561 (2008).

30. B. Milgrom, N. Konforti, M. A. Golub, and E. Marom, “Pupil coding masks for imaging polychromatic scenes with high resolution and extended depth of field,” Opt. Express 18, 15569–15584 (2010).

31. M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid Publ. 10, 15046 (2015).

32. S. Ryu and C. Joo, “Design of binary phase filters for depth-of-focus extension via binarization of axisymmetric aberrations,” Opt. Express 25, 30312–30326 (2017).

33. S. Elmalem, R. Giryes, and E. Marom, “Learned phase coded aperture for the benefit of depth of field extension,” Opt. Express 26, 15316–15331 (2018).

34. A. Fontbonne, H. Sauer, C. Kulcsár, A.-L. Coutrot, and F. Goudail, “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng. 58, 113107 (2019).

35. E. E. García-Guerrero, E. R. Méndez, H. M. Escamilla, T. A. Leskova, and A. A. Maradudin, “Design and fabrication of random phase diffusers for extending the depth of focus,” Opt. Express 15, 910–923 (2007).

36. O. Cossairt, C. Zhou, and S. Nayar, “Diffusion coded photography for extended depth of field,” in ACM SIGGRAPH (2010), pp. 1–10.

37. L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).

38. N. Caron and Y. Sheng, “Polynomial phase masks for extending the depth of field of a microscope,” Appl. Opt. 47, E39–E43 (2008).

39. F. Zhou, R. Ye, G. Li, H. Zhang, and D. Wang, “Optimized circularly symmetric phase mask to extend the depth of focus,” J. Opt. Soc. Am. A 26, 1889–1895 (2009).

40. S. Banerji, M. Meem, A. Majumder, B. Sensale-Rodriguez, and R. Menon, “Extreme-depth-of-focus imaging with a flat lens,” Optica 7, 214–217 (2020).

41. E. González-Amador, A. Padilla-Vivanco, C. Toxqui-Quitl, J. Arines, and E. Acosta, “Jacobi–Fourier phase mask for wavefront coding,” Opt. Laser Eng. 126, 105880 (2020).

42. C. J. Sheppard and S. Mehta, “Three-level filter for increased depth of focus and Bessel beam generation,” Opt. Express 20, 27212–27221 (2012).

43. V. Katkovnik, M. Ponomarenko, and K. Egiazarian, “Lensless broadband diffractive imaging with improved depth of focus: wavefront modulation by multilevel phase masks,” J. Mod. Opt. 66, 335–352 (2019).

44. M. Ponomarenko, V. Katkovnik, and K. Egiazarian, “Phase masks optimization for broadband diffractive imaging,” Electron. Imaging 2019, 258 (2019).

45. S. R. M. Rostami, V. Katkovnik, and K. Egiazarian, “Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs,” Opt. Eng. 60, 1–14 (2021).

46. S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “FlatNet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).

47. Y. Wu, V. Boominathan, H. Chen, A. Sankaranarayanan, and A. Veeraraghavan, “PhaseCam3D—learning phase masks for passive single view depth estimation,” in IEEE International Conference on Computational Photography (ICCP) (2019), pp. 1–12.

48. C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

49. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

50. M. T. Emmerich and A. H. Deutz, “A tutorial on multiobjective optimization: fundamentals and evolutionary methods,” Nat. Comput. 17, 585–609 (2018).

51. Training stage image databases, https://data.vision.ee.ethz.ch/cvl/DIV2K/; http://cv.snu.ac.kr/research/EDSR/Flickr2K.tar.

52. Kodak Lossless True Color Image Suite, http://r0k.us/graphics/kodak/.

[Crossref]

Fu, Q.

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

García-Guerrero, E. E.

Gillenwater, A. M.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Giryes, R.

Golub, M. A.

González-Amador, E.

E. González-Amador, A. Padilla-Vivanco, C. Toxqui-Quitl, J. Arines, and E. Acosta, “Jacobi–Fourier phase mask for wavefront coding,” Opt. Laser Eng. 126, 105880 (2020).
[Crossref]

Goodman, J. W.

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

Gotchev, A.

U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).
[Crossref]

Goudail, F.

O. Lévêque, C. Kulcsár, A. Lee, H. Sauer, A. Aleksanyan, P. Bon, L. Cognet, and F. Goudail, “Co-designed annular binary phase masks for depth-of-field extension in single-molecule localization microscopy,” Opt. Express 28, 32426–32446 (2020).
[Crossref]

A. Fontbonne, H. Sauer, C. Kulcsár, A.-L. Coutrot, and F. Goudail, “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng. 58, 113107 (2019).
[Crossref]

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Grossrubatscher, I.

Haim, H.

Heckel, R.

Heide, F.

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

Heidrich, W.

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

Hua, Y.

Y. Hua, S. Nakamura, M. S. Asif, and A. C. Sankaranarayanan, “SweepCam—Depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1606–1617 (2020).
[Crossref]

Ikoma, H.

X. Dun, H. Ikoma, G. Wetzstein, Z. Wang, X. Cheng, and Y. Peng, “Learned rotationally symmetric diffractive achromat for full-spectrum computational imaging,” Optica 7, 913–922 (2020).
[Crossref]

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

Jeon, D. S.

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

Jerez, A.

A. Jerez, S. Pinilla, and H. Arguello, “Fast target detection via template matching in compressive phase retrieval,” IEEE Trans. Comput. Imaging 6, 934–944 (2020).
[Crossref]

Jin, L.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Joo, C.

Katkovnik, V.

S. R. M. Rostami, V. Katkovnik, and K. Egiazarian, “Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs,” Opt. Eng. 60, 1–14 (2021).
[Crossref]

V. Katkovnik, M. Ponomarenko, and K. Egiazarian, “Lensless broadband diffractive imaging with improved depth of focus: wavefront modulation by multilevel phase masks,” J. Mod. Opt. 66, 335–352 (2019).
[Crossref]

M. Ponomarenko, V. Katkovnik, and K. Egiazarian, “Phase masks optimization for broadband diffractive imaging,” Electron. Imaging 2019, 258 (2019).
[Crossref]

K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Joint image sharpening and denoising by 3D transform-domain collaborative filtering,” in International Workshop on Spectral Methods and Multirate Signal Processing (SMMSP) (2007).

Khan, S. S.

S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “Flatnet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).
[Crossref]

Kim, M. H.

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

Konforti, N.

Koudoli, A.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Krajancich, B.

B. Krajancich, N. Padmanaban, and G. Wetzstein, “Factored occlusion: Single spatial light modulator occlusion-capable optical see-through augmented reality display,” IEEE Trans. Visual. Comput. Graph. 26, 1871–1879 (2020).
[Crossref]

Kulcsár, C.

O. Lévêque, C. Kulcsár, A. Lee, H. Sauer, A. Aleksanyan, P. Bon, L. Cognet, and F. Goudail, “Co-designed annular binary phase masks for depth-of-field extension in single-molecule localization microscopy,” Opt. Express 28, 32426–32446 (2020).
[Crossref]

A. Fontbonne, H. Sauer, C. Kulcsár, A.-L. Coutrot, and F. Goudail, “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng. 58, 113107 (2019).
[Crossref]

Kuo, G.

Lee, A.

Lee, M.-S. L.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Lemonnier, F.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Lepretre, F.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Leskova, T. A.

Lévêque, O.

Li, G.

Li, H.

Li, Y.

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

Liu, F. L.

Liu, L.

Q. Yang, L. Liu, and J. Sun, “Optimized phase pupil masks for extended depth of field,” Opt. Commun. 272, 56–66 (2007).
[Crossref]

Loiseaux, B.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Majumder, A.

Maradudin, A. A.

Marom, E.

Meem, M.

U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).
[Crossref]

S. Banerji, M. Meem, A. Majumder, B. Sensale-Rodriguez, and R. Menon, “Extreme-depth-of-focus imaging with a flat lens,” Optica 7, 214–217 (2020).
[Crossref]

Mehta, S.

Méndez, E. R.

Menon, R.

U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).
[Crossref]

S. Banerji, M. Meem, A. Majumder, B. Sensale-Rodriguez, and R. Menon, “Extreme-depth-of-focus imaging with a flat lens,” Optica 7, 214–217 (2020).
[Crossref]

Metzler, C. A.

C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

Mildenhall, B.

Milgrom, B.

Millet, P.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Mitra, K.

S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “Flatnet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).
[Crossref]

Nakamura, S.

Y. Hua, S. Nakamura, M. S. Asif, and A. C. Sankaranarayanan, “SweepCam—Depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1606–1617 (2020).
[Crossref]

Nayar, S.

O. Cossairt and S. Nayar, “Spectral focal sweep: extended depth of field from chromatic aberrations,” in IEEE International Conference on Computational Photography (ICCP) (IEEE, 2010), pp. 1–8.

O. Cossairt, C. Zhou, and S. Nayar, “Diffusion coded photography for extended depth of field,” in ACM SIGGRAPH (2010), pp. 1–10.

Ng, R.

Padilla-Vivanco, A.

E. González-Amador, A. Padilla-Vivanco, C. Toxqui-Quitl, J. Arines, and E. Acosta, “Jacobi–Fourier phase mask for wavefront coding,” Opt. Laser Eng. 126, 105880 (2020).
[Crossref]

Padmanaban, N.

B. Krajancich, N. Padmanaban, and G. Wetzstein, “Factored occlusion: Single spatial light modulator occlusion-capable optical see-through augmented reality display,” IEEE Trans. Visual. Comput. Graph. 26, 1871–1879 (2020).
[Crossref]

Pan, F.

Peng, J.

Peng, Y.

X. Dun, H. Ikoma, G. Wetzstein, Z. Wang, X. Cheng, and Y. Peng, “Learned rotationally symmetric diffractive achromat for full-spectrum computational imaging,” Optica 7, 913–922 (2020).
[Crossref]

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

Pinilla, S.

A. Jerez, S. Pinilla, and H. Arguello, “Fast target detection via template matching in compressive phase retrieval,” IEEE Trans. Comput. Imaging 6, 934–944 (2020).
[Crossref]

J. Bacca, S. Pinilla, and H. Arguello, “Super-resolution phase retrieval from designed coded diffraction patterns,” IEEE Trans. Image Process. 29, 2598–2609 (2019).
[Crossref]

S. Pinilla, J. Poveda, and H. Arguello, “Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation,” Opt. Commun. 410, 707–716 (2018).
[Crossref]

Ponomarenko, M.

V. Katkovnik, M. Ponomarenko, and K. Egiazarian, “Lensless broadband diffractive imaging with improved depth of focus: wavefront modulation by multilevel phase masks,” J. Mod. Opt. 66, 335–352 (2019).
[Crossref]

M. Ponomarenko, V. Katkovnik, and K. Egiazarian, “Phase masks optimization for broadband diffractive imaging,” Electron. Imaging 2019, 258 (2019).
[Crossref]

Poveda, J.

S. Pinilla, J. Poveda, and H. Arguello, “Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation,” Opt. Commun. 410, 707–716 (2018).
[Crossref]

Richards-Kortum, R. R.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Robinson, J.

V. Boominathan, J. Adams, J. Robinson, and A. Veeraraghavan, “PhlatCam: designed phase-mask based thin lensless camera,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1618–1629 (2020).
[Crossref]

Robinson, J. T.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).
[Crossref]

Rollin, J.

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Rostami, S. R. M.

S. R. M. Rostami, V. Katkovnik, and K. Egiazarian, “Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs,” Opt. Eng. 60, 1–14 (2021).
[Crossref]

Ryu, S.

Sahin, E.

U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).
[Crossref]

Sankaranarayanan, A.

Y. Wu, V. Boominathan, H. Chen, A. Sankaranarayanan, and A. Veeraraghavan, “PhaseCam3D—learning phase masks for passive single view depth estimation,” in IEEE International Conference on Computational Photography (ICCP) (2019), pp. 1–12.

Sankaranarayanan, A. C.

Y. Hua, S. Nakamura, M. S. Asif, and A. C. Sankaranarayanan, “SweepCam—Depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1606–1617 (2020).
[Crossref]

Sauer, H.

O. Lévêque, C. Kulcsár, A. Lee, H. Sauer, A. Aleksanyan, P. Bon, L. Cognet, and F. Goudail, “Co-designed annular binary phase masks for depth-of-field extension in single-molecule localization microscopy,” Opt. Express 28, 32426–32446 (2020).
[Crossref]

A. Fontbonne, H. Sauer, C. Kulcsár, A.-L. Coutrot, and F. Goudail, “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng. 58, 113107 (2019).
[Crossref]

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

Sensale-Rodriguez, B.

Sheng, Y.

Sheppard, C. J.

Sherif, S. S.

Sitzmann, V.

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

Sun, J.

Q. Yang, L. Liu, and J. Sun, “Optimized phase pupil masks for extended depth of field,” Opt. Commun. 272, 56–66 (2007).
[Crossref]

Sundar, V.

S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “Flatnet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).
[Crossref]

Tan, M. T.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Tang, Y.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Toxqui-Quitl, C.

E. González-Amador, A. Padilla-Vivanco, C. Toxqui-Quitl, J. Arines, and E. Acosta, “Jacobi–Fourier phase mask for wavefront coding,” Opt. Laser Eng. 126, 105880 (2020).
[Crossref]

Veeraraghavan, A.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

V. Boominathan, J. Adams, J. Robinson, and A. Veeraraghavan, “PhlatCam: designed phase-mask based thin lensless camera,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1618–1629 (2020).
[Crossref]

J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).
[Crossref]

S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “Flatnet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).
[Crossref]

Y. Wu, V. Boominathan, H. Chen, A. Sankaranarayanan, and A. Veeraraghavan, “PhaseCam3D—learning phase masks for passive single view depth estimation,” in IEEE International Conference on Computational Photography (ICCP) (2019), pp. 1–12.

Vercosa, D. G.

J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).
[Crossref]

Waller, L.

Wang, D.

Wang, M. R.

Wang, Z.

Wetzstein, G.

X. Dun, H. Ikoma, G. Wetzstein, Z. Wang, X. Cheng, and Y. Peng, “Learned rotationally symmetric diffractive achromat for full-spectrum computational imaging,” Optica 7, 913–922 (2020).
[Crossref]

B. Krajancich, N. Padmanaban, and G. Wetzstein, “Factored occlusion: Single spatial light modulator occlusion-capable optical see-through augmented reality display,” IEEE Trans. Visual. Comput. Graph. 26, 1871–1879 (2020).
[Crossref]

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

Williams, M. D.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Wu, Y.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

H. Li, J. Peng, F. Pan, Y. Wu, Y. Zhang, and X. Xie, “Focal stack camera in all-in-focus imaging via an electrically tunable liquid crystal lens doped with multi-walled carbon nanotubes,” Opt. Express 26, 12441–12454 (2018).
[Crossref]

Y. Wu, V. Boominathan, H. Chen, A. Sankaranarayanan, and A. Veeraraghavan, “PhaseCam3D—learning phase masks for passive single view depth estimation,” in IEEE International Conference on Computational Photography (ICCP) (2019), pp. 1–12.

Xie, X.

Yang, J. J.

Yang, Q.

Q. Yang, L. Liu, and J. Sun, “Optimized phase pupil masks for extended depth of field,” Opt. Commun. 272, 56–66 (2007).
[Crossref]

Yanny, K.

K. Yanny, N. Antipa, R. Ng, and L. Waller, “Miniature 3D fluorescence microscope using random microlenses,” in Optics and the Brain (Optical Society of America, 2019), paper BT3A–4.

Ye, F.

J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).
[Crossref]

Ye, R.

Yi, S.

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

Zhang, H.

Zhang, Y.

Zhao, X.

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).
[Crossref]

Zhou, C.

O. Cossairt, C. Zhou, and S. Nayar, “Diffusion coded photography for extended depth of field,” in ACM SIGGRAPH (2010), pp. 1–10.

Zhou, F.

Zhu, A. Y.

W. T. Chen, A. Y. Zhu, and F. Capasso, “Flat optics with dispersion-engineered metasurfaces,” Nat. Rev. Mater. 5, 604–620 (2020).
[Crossref]

ACM Trans. Graph. (2)

V. Sitzmann, S. Diamond, Y. Peng, X. Dun, S. Boyd, W. Heidrich, F. Heide, and G. Wetzstein, “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graph. 37, 1–13 (2018).
[Crossref]

D. S. Jeon, S.-H. Baek, S. Yi, Q. Fu, X. Dun, W. Heidrich, and M. H. Kim, “Compact snapshot hyperspectral imaging with diffracted rotation,” ACM Trans. Graph. 38, 117 (2019).
[Crossref]

Appl. Opt. (4)

Electron. Imaging (1)

M. Ponomarenko, V. Katkovnik, and K. Egiazarian, “Phase masks optimization for broadband diffractive imaging,” Electron. Imaging 2019, 258 (2019).
[Crossref]

IEEE Trans. Comput. Imaging (1)

A. Jerez, S. Pinilla, and H. Arguello, “Fast target detection via template matching in compressive phase retrieval,” IEEE Trans. Comput. Imaging 6, 934–944 (2020).
[Crossref]

IEEE Trans. Image Process. (2)

U. Akpinar, E. Sahin, M. Meem, R. Menon, and A. Gotchev, “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. Image Process. 30, 3307–3320 (2021).
[Crossref]

J. Bacca, S. Pinilla, and H. Arguello, “Super-resolution phase retrieval from designed coded diffraction patterns,” IEEE Trans. Image Process. 29, 2598–2609 (2019).
[Crossref]

IEEE Trans. Pattern Anal. Mach. Intell. (2)

Y. Hua, S. Nakamura, M. S. Asif, and A. C. Sankaranarayanan, “SweepCam—Depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1606–1617 (2020).
[Crossref]

V. Boominathan, J. Adams, J. Robinson, and A. Veeraraghavan, “PhlatCam: designed phase-mask based thin lensless camera,” IEEE Trans. Pattern Anal. Mach. Intell. 42, 1618–1629 (2020).
[Crossref]

IEEE Trans. Visual. Comput. Graph. (1)

B. Krajancich, N. Padmanaban, and G. Wetzstein, “Factored occlusion: Single spatial light modulator occlusion-capable optical see-through augmented reality display,” IEEE Trans. Visual. Comput. Graph. 26, 1871–1879 (2020).
[Crossref]

J. Eur. Opt. Soc. Rapid. Publ. (1)

M.-A. Burcklen, F. Diaz, F. Lepretre, J. Rollin, A. Delboulbe, M.-S. L. Lee, B. Loiseaux, A. Koudoli, S. Denel, P. Millet, F. Duhem, F. Lemonnier, H. Sauer, and F. Goudail, “Experimental demonstration of extended depth-of-field F/1.2 visible high definition camera with jointly optimized phase mask and real-time digital processing,” J. Eur. Opt. Soc. Rapid. Publ. 10, 15046 (2015).
[Crossref]

J. Mod. Opt. (1)

V. Katkovnik, M. Ponomarenko, and K. Egiazarian, “Lensless broadband diffractive imaging with improved depth of focus: wavefront modulation by multilevel phase masks,” J. Mod. Opt. 66, 335–352 (2019).
[Crossref]

J. Opt. Soc. Am. A (2)

Nat. Comput. (1)

M. T. Emmerich and A. H. Deutz, “A tutorial on multiobjective optimization: fundamentals and evolutionary methods,” Nat. Comput. 17, 585–609 (2018).
[Crossref]

Nat. Rev. Mater. (1)

W. T. Chen, A. Y. Zhu, and F. Capasso, “Flat optics with dispersion-engineered metasurfaces,” Nat. Rev. Mater. 5, 604–620 (2020).
[Crossref]

Opt. Commun. (2)

S. Pinilla, J. Poveda, and H. Arguello, “Coded diffraction system in X-ray crystallography using a boolean phase coded aperture approximation,” Opt. Commun. 410, 707–716 (2018).
[Crossref]

Q. Yang, L. Liu, and J. Sun, “Optimized phase pupil masks for extended depth of field,” Opt. Commun. 272, 56–66 (2007).
[Crossref]

Opt. Eng. (2)

A. Fontbonne, H. Sauer, C. Kulcsár, A.-L. Coutrot, and F. Goudail, “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng. 58, 113107 (2019).
[Crossref]

S. R. M. Rostami, V. Katkovnik, and K. Egiazarian, “Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs,” Opt. Eng. 60, 1–14 (2021).
[Crossref]

Opt. Express (10)

C. J. Sheppard and S. Mehta, “Three-level filter for increased depth of focus and Bessel beam generation,” Opt. Express 20, 27212–27221 (2012).
[Crossref]

E. E. García-Guerrero, E. R. Méndez, H. M. Escamilla, T. A. Leskova, and A. A. Maradudin, “Design and fabrication of random phase diffusers for extending the depth of focus,” Opt. Express 15, 910–923 (2007).
[Crossref]

S. Ryu and C. Joo, “Design of binary phase filters for depth-of-focus extension via binarization of axisymmetric aberrations,” Opt. Express 25, 30312–30326 (2017).
[Crossref]

S. Elmalem, R. Giryes, and E. Marom, “Learned phase coded aperture for the benefit of depth of field extension,” Opt. Express 26, 15316–15331 (2018).
[Crossref]

E. Ben-Eliezer, N. Konforti, B. Milgrom, and E. Marom, “An optimal binary amplitude-phase mask for hybrid imaging systems that exhibit high resolution and extended depth of field,” Opt. Express 16, 20540–20561 (2008).
[Crossref]

B. Milgrom, N. Konforti, M. A. Golub, and E. Marom, “Pupil coding masks for imaging polychromatic scenes with high resolution and extended depth of field,” Opt. Express 18, 15569–15584 (2010).
[Crossref]

H. Haim, A. Bronstein, and E. Marom, “Computational multi-focus imaging combining sparse model with color dependent phase mask,” Opt. Express 23, 24547–24556 (2015).
[Crossref]

H. Li, J. Peng, F. Pan, Y. Wu, Y. Zhang, and X. Xie, “Focal stack camera in all-in-focus imaging via an electrically tunable liquid crystal lens doped with multi-walled carbon nanotubes,” Opt. Express 26, 12441–12454 (2018).
[Crossref]

G. Kuo, F. L. Liu, I. Grossrubatscher, R. Ng, and L. Waller, “On-chip fluorescence microscopy with a random microlens diffuser,” Opt. Express 28, 8384–8399 (2020).
[Crossref]

O. Lévêque, C. Kulcsár, A. Lee, H. Sauer, A. Aleksanyan, P. Bon, L. Cognet, and F. Goudail, “Co-designed annular binary phase masks for depth-of-field extension in single-molecule localization microscopy,” Opt. Express 28, 32426–32446 (2020).
[Crossref]

Opt. Laser Eng. (1)

E. González-Amador, A. Padilla-Vivanco, C. Toxqui-Quitl, J. Arines, and E. Acosta, “Jacobi–Fourier phase mask for wavefront coding,” Opt. Laser Eng. 126, 105880 (2020).

Opt. Lett. (1)

Optica (3)

Proc. Natl. Acad. Sci. USA (1)

L. Jin, Y. Tang, Y. Wu, J. B. Coole, M. T. Tan, X. Zhao, H. Badaoui, J. T. Robinson, M. D. Williams, A. M. Gillenwater, R. R. Richards-Kortum, and A. Veeraraghavan, “Deep learning extended depth-of-field microscope for fast and slide-free histology,” Proc. Natl. Acad. Sci. USA 117, 33051–33060 (2020).

Sci. Adv. (1)

J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3D fluorescence microscopy with ultraminiature lensless flatscope,” Sci. Adv. 3, e1701548 (2017).

Other (12)

S.-H. Baek, H. Ikoma, D. S. Jeon, Y. Li, W. Heidrich, G. Wetzstein, and M. H. Kim, “End-to-end hyperspectral-depth imaging with learned diffractive optics,” arXiv:2009.00463 (2020).

K. Yanny, N. Antipa, R. Ng, and L. Waller, “Miniature 3D fluorescence microscope using random microlenses,” in Optics and the Brain (Optical Society of America, 2019), paper BT3A–4.

O. Cossairt, C. Zhou, and S. Nayar, “Diffusion coded photography for extended depth of field,” in ACM SIGGRAPH (2010), pp. 1–10.

O. Cossairt and S. Nayar, “Spectral focal sweep: extended depth of field from chromatic aberrations,” in IEEE International Conference on Computational Photography (ICCP) (IEEE, 2010), pp. 1–8.

PyTorch, https://pytorch.org/.

K. Dabov, A. Foi, V. Katkovnik, and K. Egiazarian, “Joint image sharpening and denoising by 3D transform-domain collaborative filtering,” in International Workshop on Spectral Methods and Multirate Signal Processing (SMMSP) (2007).

S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “FlatNet: towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. (2020).

Y. Wu, V. Boominathan, H. Chen, A. Sankaranarayanan, and A. Veeraraghavan, “PhaseCam3D—learning phase masks for passive single view depth estimation,” in IEEE International Conference on Computational Photography (ICCP) (2019), pp. 1–12.

C. A. Metzler, H. Ikoma, Y. Peng, and G. Wetzstein, “Deep optics for single-shot high-dynamic-range imaging,” in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2020).

J. W. Goodman, Introduction to Fourier Optics (Roberts and Company, 2005).

Training-stage image databases: https://data.vision.ee.ethz.ch/cvl/DIV2K/; http://cv.snu.ac.kr/research/EDSR/Flickr2K.tar.

Kodak Lossless True Color Image Suite, http://r0k.us/graphics/kodak/.

Supplementary Material (1)

Supplement 1: Detailed analysis of the convergence behavior of the proposed design in the end-to-end optimization, and additional details about the physical experiment setup.

Data Availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Figures (10)

Fig. 1. A light wave of a given wavelength, with the curvature of a point source at distance ${d_1}$, propagates to the aperture plane containing the MPM (refractive index $n$) to be designed. The MPM modulates the phase of the incident wavefront. The resulting wavefront propagates through the lens to the sensor over the aperture-to-sensor distance ${d_2}$ via the Fresnel propagation model. The intensities of the sensor-incident wavefront define the PSFs.
Fig. 2. Differentiable optimization framework of phase-coded optics for achromatic EDoF. The spectral PSFs are convolved with a batch of RGB images. The inverse imaging OTFs provide estimates of the true images. Finally, a differentiable quality loss ${\cal L}$, such as mean squared error with respect to the ground-truth image (or PSNR criterion), is defined on the reconstructed images.
Fig. 3. Reconstruction quality (accuracy) in PSNR (dB) versus depth distance ${d_1}$ using the inverse imaging OTFs with invariant/varying regularization for the proposed power-balanced hybrid and lensless systems. The red horizontal lines mark the desirable value of ${\rm{PSNR}} = 30\;{\rm{dB}}$. The effect of the Fresnel order on reconstruction quality is shown. These numerical results suggest that the highest Fresnel order provides the most precise reconstruction. A gain of up to 2 dB is obtained by the inverse imaging OTF with varying regularization compared with the OTF with invariant regularization. Finally, the power-balanced hybrid outperforms its lensless counterpart by up to 5 dB of PSNR.
Fig. 4. Average of the optimal number of levels ($N$) and optical power-balance ($\alpha$) values obtained for the proposed system with the end-to-end framework with varying/invariant regularization versus different Fresnel order values. The solid line shows the mean over all number-of-levels and power-balance values, and the colored area represents the variance. These results suggest a direct dependence between the Fresnel order, the number of levels, and the optical power balance: both the number of levels and the optical power balance increase as the Fresnel order increases.
Fig. 5. Phase profile of the designed MPM. The contribution of each term (value of $\alpha ,\beta$, and ${\rho _r}$) to the MPM profile is illustrated; that is, the shape of the MPM corresponding to the quadratic, cubic, and most significant Zernike polynomial components is presented, where $R = \sqrt {{x^2} + {y^2}}$ and $\phi = \arctan (y/x)$.
Fig. 6. (a) Central cross sections of the PSFs of the optical system after deblurring by Eq. (11). The ideal deblurring should be close to a $\delta$ function; we observe that the curves are well concentrated around the central point and differ little across channels and distances, with the best result for the blue color channel. (b) Average of the optimal $\mu$ values obtained for the proposed and lensless systems with the end-to-end framework versus different Fresnel order values. The solid line shows the mean over all $\mu$ values, and the colored area represents the variance. These results suggest a direct dependence between the Fresnel order and the inversion rule in Eq. (13) through the value of $\mu$: $\mu$ can be discarded when the Fresnel order is large enough, whereas the inversion rule in Eq. (13) plays an important role when the Fresnel order is small.
Fig. 7. Evaluation of achromatic EDoF imaging in simulation for two depths, ${d_1} = 0.5, 1.0$ m, using the proposed mixed OTF in Eq. (11) in terms of PSNR. We compare the performance of a Fresnel lens optimized for one of the target wavelengths, a refractive lens, a lensless setup with cubic MPM (column $\beta$), a lensless system with the phase profile in [4], an optical power-balanced hybrid without the cubic component (column $\alpha + {\rm{Zern}}{\rm{.Poly}}$), and the proposed system (column $\alpha + \beta + {\rm{Zern}}{\rm{.Poly}}$) with Fresnel order equal to 16 and $\alpha = 0.22$, following the results in Fig. 3. The resulting PSFs are shown for all target wavelengths and depths. Sensor images and PSFs (rows 2 and 4) exhibit strong chromatic aberrations for the Fresnel lens, the refractive lens, and the lensless system with cubic MPM, whereas the lensless system in [4], the power-balanced hybrid without the cubic component, and the proposed optics mitigate wavelength-dependent image variations much better, with our optics performing best (the curves are better concentrated around the focal point). After deconvolution (rows 3 and 5), the sharpest images are obtained with the proposed optics.
Fig. 8. Experimental setup. P, polarizer; BS, beamsplitter; SLM, spatial light modulator. The lenses ${{\rm{L}}_1}$ and ${{\rm{L}}_2}$ form a 4f-telescopic system projecting the wavefront from the SLM plane to the imaging lens ${{\rm{L}}_3}$; CMOS is the registering camera. ${d_1}$ is the distance between the scene and the plane of the hybrid imaging system (Lens & MPM), and ${d_2}$ is the distance between this optical system and the sensor.
Fig. 9. Reconstructed images from experimental blurred measurements containing three objects at distances ${d_1} = 0.5, 1.0$, and 1.2 m and a casual real scene (depth range 0.5–2.0 m). Two Fresnel orders are compared for four optical setups: the proposed power-balanced hybrid (column $\alpha + \beta +$ Zern. Poly), lens $+$ cubic MPM (column $\alpha + \beta$), lensless with cubic-phase MPM (column $\beta$), and lens only (column Lens). To estimate the scene, we use varying/invariant Wiener filtering as in Eqs. (11) and (12) for distances ${d_1} = 0.5, 0.65, 0.8, 0.9, 1.0, 1.15, 1.3, 1.5, 1.8$, and 2.0 m. The experimental and simulated PSFs are shown in the measurement rows. The imaging results confirm that the highest Fresnel order provides the best reconstruction quality and that the advantage of the varying Wiener filtering in Eq. (11) over its invariant version in Eq. (12) is a mitigation of chromatic aberrations and a reduction of noise. Additionally, we verify that the optimized proposed system is superior to its competitors in terms of sharpness and chromatic aberrations, confirming the effectiveness of the optimization framework described in Fig. 2. Finally, we confirm the ability of the optimized OTF in Eq. (11) to estimate the longitudinal scene.
Fig. 10. Comparison between the floor function and its smoothing approximation described in Eq. (26). The figure shows that Eq. (26) provides a close estimation of the true value.

Tables (1)


Table 1. Comparison of Different Phase Modulation Setups for EDoF Enabling the State-of-the-Art

Equations (28)

(1) $P_\lambda(x,y) = P_A(x,y)\, e^{\,j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}-\frac{1}{f_\lambda}\right)(x^2+y^2)\,+\,j\varphi_{\lambda_0,\lambda}(x,y)}.$
(2) $P_\lambda(x,y) = P_A(x,y)\, e^{\,j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}\right)(x^2+y^2)\,+\,j\varphi_{\lambda_0,\lambda}(x,y)},$
(3) $P_\lambda(x,y) = P_A(x,y)\, e^{\,j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}-\frac{1-\alpha}{f_\lambda}\right)(x^2+y^2)\,+\,j\varphi_{\lambda_0,\lambda,\alpha}(x,y)},$
(4) $PSF^{coh}_\lambda(u,v) = \mathcal{F}\{P_\lambda\}\!\left(\frac{u}{d_2\lambda},\frac{v}{d_2\lambda}\right),$
(5) $PSF_\lambda(u,v) = \dfrac{|PSF^{coh}_\lambda(u,v)|^2}{\iint |PSF^{coh}_\lambda(u,v)|^2\, du\, dv}.$
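Equations (4)–(5) compute the incoherent PSF as the squared magnitude of the Fourier transform of the generalized pupil, normalized to unit energy. A minimal NumPy sketch, assuming a circular aperture sampled on a square grid and using the FFT as a stand-in for the scaled continuous transform; the defocus strength and grid size below are arbitrary illustration values, not the paper's parameters:

```python
import numpy as np

def incoherent_psf(pupil):
    """Normalized incoherent PSF from a complex pupil function.

    Sketch of Eqs. (4)-(5): the coherent PSF is the (scaled) Fourier
    transform of the pupil; the incoherent PSF is its squared
    magnitude, normalized to integrate to one.
    """
    coh = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(coh) ** 2
    return psf / psf.sum()

# Toy example: clear circular aperture with a quadratic (defocus) phase.
n = 256
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2 <= 1.0).astype(float)
defocus = np.exp(1j * np.pi * 3.0 * (X**2 + Y**2))  # illustrative defocus
psf = incoherent_psf(aperture * defocus)
```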
(6) $PSF_c(u,v) = \dfrac{\int_\Lambda PSF_\lambda(u,v)\, T_c(\lambda)\, d\lambda}{\iint \int_\Lambda PSF_\lambda(u,v)\, T_c(\lambda)\, d\lambda\, du\, dv},\quad c \in \{r,g,b\},$
(7) $OTF_c(f_x,f_y) = \iint PSF_c(u,v)\, e^{-j2\pi(f_x u + f_y v)}\, du\, dv,$
(8) $I^{s}_{c,\delta}(x,y) = PSF_{c,\delta}(x,y) \ast I^{o}_c(x,y),$
(9) $\mathcal{I}^{s}_{c,\delta}(f_x,f_y) = OTF_{c,\delta}(f_x,f_y)\, \mathcal{I}^{o}_c(f_x,f_y).$
(10) $\hat{H}_c \leftarrow \arg\min_{H_c} \frac{1}{\sigma^2}\sum_{\delta,k,c}\omega_\delta \left\| \mathcal{I}^{o,k}_c - H_c\, \mathcal{I}^{s,k}_{c,\delta} \right\|_2^2 + \frac{1}{\gamma_c}\left\| H_c \right\|_2^2 \triangleq J,$
(11) $\hat{H}_c(f_x,f_y) = \dfrac{\sum_{\delta\in D}\omega_\delta\, \overline{OTF_{c,\delta}}(f_x,f_y)}{\sum_{\delta\in D}\omega_\delta\, |OTF_{c,\delta}(f_x,f_y)|^2 + \dfrac{reg}{\sum_k |\mathcal{I}^{o,k}_c(f_x,f_y)|^2}},$
(12) $\hat{H}_c(f_x,f_y) = \dfrac{\sum_{\delta\in D}\omega_\delta\, \overline{OTF_{c,\delta}}(f_x,f_y)}{\sum_{\delta\in D}\omega_\delta\, |OTF_{c,\delta}(f_x,f_y)|^2 + reg}.$
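The depth-invariant Wiener-type filter of Eq. (12) averages the conjugate OTFs over depths in the numerator and the OTF energies plus a regularizer in the denominator. A minimal NumPy sketch; `mixed_inverse_otf` and `reconstruct` are illustrative names (not from the paper), and applying the filter in the Fourier domain corresponds to Eq. (13):

```python
import numpy as np

def mixed_inverse_otf(otfs, weights, reg):
    """Depth-invariant Wiener-type inverse filter, a sketch of Eq. (12).

    otfs:    complex array (D, H, W), one OTF per depth delta.
    weights: array (D,) of depth weights omega_delta.
    reg:     scalar regularization constant.
    """
    w = weights[:, None, None]
    num = (w * np.conj(otfs)).sum(axis=0)
    den = (w * np.abs(otfs) ** 2).sum(axis=0) + reg
    return num / den

def reconstruct(blurred, h_hat):
    """Apply the inverse OTF in the Fourier domain (Eq. (13))."""
    return np.real(np.fft.ifft2(h_hat * np.fft.fft2(blurred)))
```

With a well-conditioned blur and a small `reg`, the reconstruction approaches the original image; increasing `reg` trades sharpness for noise suppression.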
(13) $\hat{I}^{o,k}_c(x,y) = \mathcal{F}^{-1}\{\hat{H}_c\, \mathcal{I}^{s,k}_{c,\delta}\},$
(14) $\varphi_{\lambda_0,\alpha}(x,y) = -\frac{\pi\alpha}{\lambda_0 f_{\lambda_0}}(x^2+y^2) + \beta(x^3+y^3) + \sum_{r=1,\, r\neq 4}^{R} \rho_r P_r(x,y),$
(15) $\hat{\varphi}_{\lambda_0,\alpha}(x,y) = mod(\varphi_{\lambda_0,\alpha}(x,y) + Q/2,\, Q) - Q/2,$
(16) $\theta_{\lambda_0,\alpha}(x,y) = \left\lfloor \hat{\varphi}_{\lambda_0,\alpha}(x,y)/N \right\rfloor N,$
(17) $\varphi^{\lambda_0,\lambda,\alpha}_{MPM}(x,y) = \dfrac{\lambda_0\,(n(\lambda)-1)}{\lambda\,(n(\lambda_0)-1)}\, \theta_{\lambda_0,\alpha}(x,y),$
(18) $h_{\lambda_0}(x,y) = \dfrac{\lambda_0}{(n(\lambda_0)-1)} \dfrac{\theta_{\lambda_0,\alpha}}{2\pi}.$
(19) $\Theta = (\alpha,\, \beta,\, m_Q,\, reg,\, N,\, \rho_r).$
(20) $\hat{\Theta} = \arg\max_{\Theta} \left( PSNR(\Theta,\delta),\ \delta \in D \right).$
(21) $PSNR(\Theta,\delta) = \underset{k\in K}{mean}\left( PSNR_k(\Theta,\delta) \right).$
(22) $J = \frac{1}{\sigma^2}\sum_{\delta,k,c}\omega_\delta \left\| \mathcal{I}^{o,k}_c - H_c\, OTF_{c,\delta}\, \mathcal{I}^{o,k}_c \right\|_2^2 + \frac{1}{\gamma_c}\left\| H_c \right\|_2^2.$
(23) $\frac{1}{\sigma^2}\sum_\delta \omega_\delta\, H_c(f_x,f_y)\, |OTF_{c,\delta}(f_x,f_y)|^2 \sum_k |\mathcal{I}^{o,k}_c(f_x,f_y)|^2 - \frac{1}{\sigma^2}\sum_\delta \omega_\delta\, \overline{OTF_{c,\delta}}(f_x,f_y) \sum_k |\mathcal{I}^{o,k}_c(f_x,f_y)|^2 + \frac{1}{\gamma} H_c(f_x,f_y) = 0,$
(24) $\hat{H}_c(f_x,f_y) = \dfrac{\sum_{\delta\in D}\omega_\delta\, \overline{OTF_{c,\delta}}(f_x,f_y)}{\sum_{\delta\in D}\omega_\delta\, |OTF_{c,\delta}(f_x,f_y)|^2 + \dfrac{\sigma^2}{\gamma \sum_k |\mathcal{I}^{o,k}_c(f_x,f_y)|^2}}.$
(25) $\hat{H}_c(f_x,f_y) = \dfrac{\sum_{\delta\in D}\omega_\delta\, \overline{OTF_{c,\delta}}(f_x,f_y)}{\sum_{\delta\in D}\omega_\delta\, |OTF_{c,\delta}(f_x,f_y)|^2 + \dfrac{reg}{\sum_k |\mathcal{I}^{o,k}_c(f_x,f_y)|^2}},$
(26) $\lfloor w \rfloor \approx g(w) = w - \frac{1}{2} + \frac{1}{\pi}\tan^{-1}\!\left(\dfrac{0.9999\,\sin(2\pi w)}{1 - 0.9999\,\cos(2\pi w)}\right),$
(27) $round(w) \approx g(w + 0.5).$
(28) $mod(w,z) \approx w - z\, g(w/z).$
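Equations (26)–(28) replace the non-differentiable floor, round, and mod operations with a smooth surrogate so the MPM quantization can be trained end to end. A minimal NumPy sketch (function names are illustrative); the approximation is accurate away from the jump points of the floor function, where it is intentionally smoothed:

```python
import numpy as np

S = 0.9999  # smoothing parameter from Eq. (26); S -> 1 recovers exact floor

def smooth_floor(w):
    """Differentiable approximation of floor(w), Eq. (26)."""
    return w - 0.5 + np.arctan(S * np.sin(2 * np.pi * w)
                               / (1 - S * np.cos(2 * np.pi * w))) / np.pi

def smooth_round(w):
    """Eq. (27): round(w) ~ g(w + 0.5)."""
    return smooth_floor(w + 0.5)

def smooth_mod(w, z):
    """Eq. (28): mod(w, z) ~ w - z * g(w / z)."""
    return w - z * smooth_floor(w / z)
```

Note that the denominator $1 - S\cos(2\pi w)$ stays strictly positive for $S < 1$, so the expression is well defined everywhere.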