
Monocular depth sensing using metalens

  • Fan Yang, Hung-I Lin, Peng Chen, Juejun Hu and Tian Gu
Published/Copyright: March 28, 2023

Abstract

3-D depth sensing is essential for many applications ranging from consumer electronics to robotics. Passive depth sensing techniques based on a double-helix (DH) point-spread-function (PSF) feature high depth estimation precision, minimal power consumption, and reduced system complexity compared to active sensing methods. Here, we propose and experimentally implement a polarization-multiplexed DH metalens designed using an autonomous direct search algorithm, which utilizes two contra-rotating DH PSFs encoded in orthogonal polarization states to enable monocular depth perception. Using a reconstruction algorithm that we developed, we demonstrate concurrent depth calculation and scene reconstruction with minimal distortion and high resolution in all three dimensions.

1 Introduction

Conventional optical imaging systems map a 3-D scene onto a flat image plane at the cost of losing depth information. The missing knowledge of object distances, however, is crucial to a variety of applications spanning autonomous driving, object recognition, gesture control, virtual/augmented reality, etc. Multiple active depth sensing mechanisms have been utilized to retrieve 3-D information, such as time-of-flight and structured light [1–9]. However, these sensing techniques require active illumination and modulation components, which add to system complexity, cost, and power consumption. Passive stereo cameras infer object distances by comparing images of the scene captured from different viewpoints. However, they are limited by the well-known trade-off between system size and depth resolution.

An alternative route is depth-from-defocus (DFD), which applies computational imaging techniques to infer depth from the defocus blur of a classical lens [10–17]. However, the defocus cue is often ambiguous and requires complementary information, such as pictorial depth cues, to determine the depth. DFD methods also have low depth estimation accuracy, since the defocused point-spread-function (PSF) of a classical lens varies slowly along the optical axis. In addition, the DFD method further suffers from limited depth range and degraded lateral resolution. To solve these issues, PSF engineering has been explored to enhance depth discrimination capability. This approach employs custom-tailored phase masks to define the PSF of the system, so that depth information is encoded directly into the captured image. PSFs designed for depth estimation include the astigmatic PSF [18], biplane PSF [19], tetrapod-like PSF [20–22], etc. Among them, the double-helix (DH) PSF [23–28] generates two rotating foci, where the rotation angle determines the object depth. This method streamlines image data post-processing given the simplicity of the PSF shape. To produce the phase mask for DH PSF generation, the classical approach places a spatial light modulator (SLM) in the Fourier plane of a 4f system, which however creates alignment challenges and significantly increases the footprint of the entire system.

Metasurfaces provide a compact and cost-effective alternative to the 4f system. They are composed of sub-wavelength nanostructures that furnish on-demand control of the outgoing wavefront [29–36]. Compared to diffractive optical elements (DOEs) [37–39], metasurfaces offer much higher spatial resolution in phase mask definition, which contributes to higher efficiency, elimination of high-order diffraction, reduced PSF aberrations, and increased degrees of freedom for wavefront modulation. A monocular DH metasurface was experimentally realized by Jin et al. [40]. The image captured by the DH metasurface is the convolution of the scene with the DH PSF, and thus the depth information can only be estimated with prior knowledge of the original object. Colburn et al. coupled a DH metasurface with an extended depth-of-focus (EDOF) metasurface in a binocular setting to resolve this ambiguity [41]. Multiplexing presents a way to combine the two metasurfaces into one aperture to realize monocular depth estimation (MDE) [42]. Along this line, MDE was recently demonstrated with a decoupled pair of conjugate single-helix PSFs [43].

In this paper, we demonstrate a polarization-multiplexed DH metasurface design for MDE using a single metasurface. Two DH PSFs with opposite rotating directions are each encoded with a linear polarization. Importantly, the focal point rotation angles of the two PSFs always add up to 90°, a feature that allows computationally efficient and unambiguous reconstruction of both the depth and image. As one specific example, we experimentally implemented the design at 635 nm wavelength within the depth range of 45–212 mm and rotation angles of up to 80°. This concept is generically applicable to other wavelengths and depth ranges.

2 Design principle of metalens with contra-rotating DH PSFs

The DH phase mask is constructed based on a superposition of Laguerre–Gaussian modes [44–46]. It modifies the wavefront emitted from a point source to generate two foci on the image plane, where the orientation of the line connecting the foci depends on the point source distance. The rotation rate and depth range are determined by the choice of the Laguerre–Gaussian mode set. In previous implementations, a block-iterative weighted projection algorithm was utilized to optimize the DH phase mask and suppress the focal spot sidelobes, thereby improving depth estimation accuracy [41, 46, 47]. However, that optimization procedure is a time-consuming empirical process which requires constant human intervention. Here, we use a direct search (DS) algorithm [48–52] to optimize the DH phase mask. The figure-of-merit (FOM) for the DS is defined in Eqs. (1)–(3). It comprises the average of the lower of the two focal intensities over a discrete set of sampled rotation angles, minus the variance of these intensities across angles. Here, I₁(ϕ) and I₂(ϕ) are the intensities of the two foci at a given rotation angle ϕ, c(ϕ) compensates for the decrease in illumination intensity as the point source distance increases, z(ϕ) denotes the distance of the point source, N represents the total number of rotation angles sampled, S²{I(ϕ)} is the variance of the intensities among different angles, and κ = 1 is a weighting factor. Equation (4) gives the relationship between rotation angle and distance using the same notations as in Ref. [41], with V₁ and ω₀ being free parameters that control the rotation rate and depth range.

(1) \mathrm{FOM} = \frac{1}{N}\sum_{i=1}^{N} I(\phi_i) - \kappa\, S^2\{I(\phi_i)\}

(2) I(\phi) = \min\{I_1(\phi),\, I_2(\phi)\}\, c(\phi)

(3) c(\phi) = z(\phi)^2

(4) z = \frac{\pi \omega_0^2}{\lambda} \tan\!\left(\frac{\phi_0 - \phi}{V_1}\right)
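For illustration, a minimal Python/NumPy sketch of Eqs. (1)–(4) is given below. The function and variable names are ours, ω₀ is assumed to be expressed in micrometers, and the focal intensities would in practice come from a diffraction simulation of the candidate phase mask:

```python
import numpy as np

def depth_from_angle(phi_deg, omega0=125e-6, lam=635e-9, phi0_deg=180.0, v1=2.0):
    """Eq. (4): object distance z (meters) for a PSF rotation angle phi (degrees)."""
    return np.pi * omega0**2 / lam * np.tan(np.deg2rad(phi0_deg - phi_deg) / v1)

def fom(i1, i2, phi_deg, kappa=1.0):
    """Eqs. (1)-(3): average of the weaker, distance-compensated focal intensity
    over the sampled rotation angles, minus kappa times its variance."""
    c = depth_from_angle(phi_deg) ** 2          # Eq. (3): c(phi) = z(phi)^2
    i_weak = np.minimum(i1, i2) * c             # Eq. (2): weaker focus, compensated
    return i_weak.mean() - kappa * i_weak.var() # Eq. (1)

# toy usage: 41 sampled rotation angles and dummy focal intensities
phi = np.linspace(40.0, 120.0, 41)
print(fom(np.full(41, 0.8), np.full(41, 0.7), phi))
```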

With a DH phase mask, an image formed through the metalens is the convolution of the DH PSF and the object. Without prior knowledge of the object, it is not possible to extract the true object information from a single-shot image. We eliminate this ambiguity by multiplexing two DH phase masks with opposite rotation directions into a single metalens. The two phase masks share an identical layout, albeit with their x and y coordinates swapped, and therefore only one phase mask design is required. After acquiring two images corresponding to the two polarization states, a series of deconvolution operations is performed on them assuming DH PSFs of varying rotation angles. Since the two images capture the same object, their deconvolved outcomes should be identical provided that the correct rotation angle is used. Therefore, by identifying the deconvolved image pair with maximum similarity, the rotation angle and hence object depth can be unambiguously determined. More details of the depth retrieval algorithm are discussed in Section 4. Further improvement of the design is possible by leveraging end-to-end optimization frameworks developed in recent years, which provide an alternative approach to jointly optimize the meta-optical frontend and the reconstruction algorithm [53–55].
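The coordinate swap between the two multiplexed masks amounts to a reflection about the diagonal, which is why the two PSFs rotate in opposite senses and their rotation angles are complementary. A minimal sketch of this relation (the array name and size are illustrative; the actual mask comes from the DS optimization):

```python
import numpy as np

# phase_x: DH phase mask (radians) seen by x-polarized light; placeholder values here
phase_x = np.random.default_rng(0).uniform(0, 2 * np.pi, (1000, 1000))

# Swapping the x and y coordinates (a transpose) reflects the mask about the
# diagonal; the corresponding PSF is mirrored, so a rotation angle of phi for
# x-polarization maps to 90 deg - phi for y-polarization.
phase_y = phase_x.T
```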

An approach similar to that in Ref. [56] is employed to design polarization-multiplexed meta-atoms. The meta-atom structure is schematically depicted in Figure 1(a), comprising a 450 nm thick rectangular amorphous silicon nano-post sitting on a fused silica substrate. The geometries of the nano-posts are designed according to the polarization directions of the incident light. The meta-atom pitch is fixed at 300 nm. The finite-difference time-domain (FDTD) method is used to analyze the meta-atom responses. The phase delay and transmittance of the meta-atoms under x-polarized incident light are shown in Figure 1(b) and (c). Data pertinent to y-polarized light can be trivially obtained by swapping the x and y coordinates. The phase difference between two polarization states is shown in Figure 1(d). A 2-bit design [57] containing 16 meta-atom structures of different lateral dimensions is chosen to allow independent control of the metalens’ phase profiles in both polarization states. The meta-atom dimensions are listed in the Appendix.
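As an illustration of how such a 2-bit library can be used, the sketch below assigns, at a single lattice site, the library element whose simulated phase pair best matches the target phases for the two polarizations. The phase pairs are taken from Table 1 in the Appendix; the function names are ours and the matching criterion is one plausible choice, not necessarily the one used in the actual layout generation:

```python
import numpy as np

# (x-pol phase, y-pol phase) pairs of the 16 meta-atoms, in degrees (Table 1)
LIBRARY_DEG = [(-3, 5), (-17, 125), (-4, 183), (-10, 273),
               (125, -17), (87, 87), (83, 180), (98, 274),
               (183, -4), (180, 83), (185, 185), (186, 261),
               (273, -10), (274, 98), (261, 186), (264, 264)]

def wrapped_phase_error(a_deg, b_deg):
    """Smallest absolute phase difference in degrees, accounting for 360 deg wrap."""
    return abs((a_deg - b_deg + 180) % 360 - 180)

def pick_meta_atom(target_x_deg, target_y_deg):
    """Index (0-15) of the library element best matching the target phase pair."""
    cost = [wrapped_phase_error(px, target_x_deg) + wrapped_phase_error(py, target_y_deg)
            for px, py in LIBRARY_DEG]
    return int(np.argmin(cost))

print(pick_meta_atom(90, 180))  # e.g., targets of 90 deg (x-pol) and 180 deg (y-pol)
```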

Figure 1: 
Polarization-multiplexed meta-atom design. (a) Illustration of the meta-atom structure. (b) Phase delay and (c) transmittance of the meta-atoms with x-polarized incident light. (d) Phase delay difference between the two polarization states.

3 Metalens fabrication and PSF characterization

As a proof of concept, we designed a metalens with a 1 mm aperture size and 5 mm focal length. Parameters in Eq. (4) are taken as ω₀ = 125, ϕ₀ = 180° and V₁ = 2, identical to those in Ref. [41]. The DS optimization was performed with N = 41 in Eq. (1), i.e., with 41 evenly spaced discrete rotation angles at a step size of 2°. The designed metalens supports rotation angles of up to 80°, corresponding to a depth sensing range of 45–212 mm.
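The DS optimization itself can be organized as a coordinate-wise search over discrete phase levels, keeping any single-pixel change that improves the FOM. The sketch below shows this generic structure with a stand-in objective; in the actual design the FOM of Eq. (1) would be evaluated from a diffraction simulation of the candidate mask, and the phase would be quantized to the 2-bit levels of the meta-atom library:

```python
import numpy as np

def direct_search(init_levels, fom_fn, n_levels=4, n_sweeps=3, seed=0):
    """Coordinate-wise direct search: visit each design pixel in random order,
    try every allowed discrete level, and keep the change if the FOM improves."""
    rng = np.random.default_rng(seed)
    levels = init_levels.copy()
    best = fom_fn(levels)
    for _ in range(n_sweeps):
        for idx in rng.permutation(levels.size):
            keep = levels.flat[idx]
            for trial in range(n_levels):
                if trial == keep:
                    continue
                levels.flat[idx] = trial
                score = fom_fn(levels)
                if score > best:
                    best, keep = score, trial
            levels.flat[idx] = keep
    return levels, best

# toy usage with a stand-in objective (a separable target-matching score)
target = np.random.default_rng(1).integers(0, 4, (16, 16))
toy_fom = lambda lv: -float(np.sum((lv - target) ** 2))
solution, score = direct_search(np.zeros((16, 16), dtype=int), toy_fom)
print(score)  # reaches 0.0 once the toy target is recovered
```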

The designed metalens was fabricated through electron-beam lithography followed by reactive-ion etching. Figure 2 presents optical microscope and scanning electron microscope (SEM) images of the fabricated metalens, showing excellent uniformity and pattern fidelity.

Figure 2: 
Images of the fabricated metalens. (a) Fabricated metalens on the silica substrate with metal mask. (b) Optical microscope image of the metalens. (Scale bar: 200 μm). (c) Scanning electron microscope image of the metalens. (Scale bar: 1 μm).

The PSF of the fabricated metalens was first characterized. Figure 3 compares the simulated and measured PSFs at different rotation angles. In the simulation, the Kirchhoff diffraction integral [58] is used to transform the near-field wavefront exiting the metasurface into the intensity distribution on the image plane. During the PSF measurement, a monochrome micro-LED display (Jade Bird Display 5000DPI AMuLED Panel) is placed in front of the metalens at different distances, and a 40 μm diameter circular spot is displayed to emulate a point object. A telescope assembly with a calibrated magnification of 5 is placed between the DH metalens and an image sensor (Arducam MT9J001). A polarizer mounted in front of the sensor selects the polarization state of the incident light. As seen from Figure 3, excellent agreement is obtained between our design and experiment throughout the entire depth range.
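The PSF simulation requires propagating the field just behind the metasurface to the image plane. The paper uses the Kirchhoff diffraction integral [58]; the sketch below uses the closely related scalar angular-spectrum method as a stand-in, with illustrative sampling parameters and an ideal (non-DH) lens phase as the test pupil:

```python
import numpy as np

def angular_spectrum_propagate(field, dx, wavelength, distance):
    """Propagate a sampled complex field by `distance` using the scalar
    angular-spectrum method (evanescent components are discarded)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fxx, fyy = np.meshgrid(fx, fx)
    kz_sq = (1.0 / wavelength) ** 2 - fxx ** 2 - fyy ** 2
    kz = 2 * np.pi * np.sqrt(np.maximum(kz_sq, 0.0))
    transfer = np.exp(1j * kz * distance) * (kz_sq > 0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# toy usage: 1 mm aperture with an ideal lens phase (f = 5 mm) focused at 635 nm
lam, dx, n, f = 635e-9, 1.5e-6, 1024, 5e-3
x = (np.arange(n) - n // 2) * dx
xx, yy = np.meshgrid(x, x)
pupil = ((xx**2 + yy**2) <= (0.5e-3) ** 2).astype(complex)
pupil *= np.exp(-1j * 2 * np.pi / lam * (np.sqrt(xx**2 + yy**2 + f**2) - f))
psf = np.abs(angular_spectrum_propagate(pupil, dx, lam, f)) ** 2  # focal-plane intensity
```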

Figure 3: 
Metalens PSF. (a) Simulation and (b) experimental measurement of PSFs for different source distances in the x polarization state. (c) Simulation and (d) experimental measurement of PSFs in the y polarization state. (Scale bar: 20 μm).

4 Concurrent imaging and depth mapping demonstration

In the experiment, the micro-LED display, placed at varying distances, projects the ‘+’-shaped pattern shown in Figure 4(a). The images captured by the DH metalens are shown in Figure 4(b) and (c) for the two polarization states, respectively. To extract depth information and reconstruct the scene, these images are deconvolved using a set of PSF pairs with different rotation angles. Since the two phase masks corresponding to orthogonal polarizations are linked via a reflection transformation, the sum of the rotation angles of every PSF pair must equal 90°. We use the Wiener deconvolution given in Eq. (5), where H is the Fourier transform of the image formed by the lens, G is the Fourier transform of the DH PSF (i.e., a cosine function), F denotes the Fourier transform of the deconvolved image, and SNR = 0.1 is the signal-to-noise ratio, an intrinsic parameter of the image sensor.

(5) F = \frac{H\, G^{*}}{|G|^{2} + \mathrm{SNR}}
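A minimal sketch of Eq. (5) using FFTs is given below; the array names are illustrative, the PSF is assumed to be sampled on the same grid as the image and centered, and the conjugate in the numerator follows the standard Wiener form:

```python
import numpy as np

def wiener_deconvolve(image, psf, snr_term=0.1):
    """Wiener deconvolution, Eq. (5): F = H conj(G) / (|G|^2 + SNR term)."""
    H = np.fft.fft2(image)
    G = np.fft.fft2(np.fft.ifftshift(psf))   # PSF centered in the array
    F = H * np.conj(G) / (np.abs(G) ** 2 + snr_term)
    return np.real(np.fft.ifft2(F))

# toy usage: blur a bar with a centered two-focus PSF, then deconvolve
img = np.zeros((128, 128)); img[40:88, 60:68] = 1.0
psf = np.zeros((128, 128)); psf[64, 54] = psf[64, 74] = 0.5
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deconvolve(blurred, psf)
```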

Figure 4: 
Experimental demonstration of image deconvolution to enable concurrent depth mapping and scene reconstruction. (a) A ‘+’ pattern on the micro-LED display emulates an object. (b, c) Images in the (b) x-polarization state and (c) y-polarization state with different object distances. (d) Deconvolved images of the object at 5.5 cm distance using DH PSF rotation angles of 90°, 110°, and 130° (left to right), respectively. (e) Deconvolved images of the object at 5.5 cm distance with DH PSF rotation angles of 0°, −20°, and −40° (left to right), respectively. (f) Similarity of image pairs deconvolved using different DH PSF rotation angles. The maximum point corresponds to the correct rotation angle. (g) Object depth estimation based on analytical expression (solid line) and experimental measurement (red dots). (Scale bar: 40 μm).

Three pairs of deconvolved images in the two polarization states are shown in Figure 4(d) and (e), each assuming a different rotation angle. We then computed the image correlation map between each image pair using Eq. (6), where ‘corr’ stands for the image correlation map, and h₁ and h₂ represent the deconvolved image pair. We further define the similarity parameter as the maximum value within the image correlation map, and Figure 4(f) plots this parameter as a function of the rotation angle. Since the pair of images depicts the same object, the similarity curve reaches its maximum when the rotation angles of the DH PSFs used in the deconvolution are correct. The object depth can then be inferred from the correct rotation angles.

(6) \mathrm{corr}(x, y) = \iint h_1(x', y')\, h_2(x' - x,\, y' - y)\, \mathrm{d}x'\, \mathrm{d}y'
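Combining Eqs. (5), (6), and (4), the depth retrieval reduces to a one-dimensional search over candidate rotation angles. A sketch of this loop is given below; the candidate PSF pairs (whose angles sum to 90°) are assumed to be precomputed, the function names are ours, and the mean subtraction and normalization of the correlation map are choices of this sketch rather than details specified in the text:

```python
import numpy as np

def similarity(h1, h2):
    """Peak of the cross-correlation map of Eq. (6), computed via FFTs on
    mean-subtracted, normalized images."""
    a = (h1 - h1.mean()) / (np.linalg.norm(h1 - h1.mean()) + 1e-12)
    b = (h2 - h2.mean()) / (np.linalg.norm(h2 - h2.mean()) + 1e-12)
    corr = np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
    return corr.max()

def estimate_rotation_angle(img_x, img_y, psf_pairs, angles_deg, snr_term=0.1):
    """Deconvolve the two polarization images with each candidate PSF pair and
    return the rotation angle that maximizes the similarity of the pair."""
    def deconv(img, psf):
        H, G = np.fft.fft2(img), np.fft.fft2(np.fft.ifftshift(psf))
        return np.real(np.fft.ifft2(H * np.conj(G) / (np.abs(G) ** 2 + snr_term)))
    scores = [similarity(deconv(img_x, p_x), deconv(img_y, p_y))
              for p_x, p_y in psf_pairs]
    return angles_deg[int(np.argmax(scores))]
```

The returned angle is then mapped to an object distance through Eq. (4).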

The protocol described above was applied to depth estimation of objects placed at different distances, and the measured depth values are shown as red dots in Figure 4(g). The analytical expression of Eq. (4), on which our lens design is based, is also plotted as a solid line, showing excellent agreement.

To further characterize the lateral spatial resolution of the DH metalens, we replaced the ‘+’ pattern on the micro-LED display with a standard USAF resolution chart. As an example, the captured images under x- and y-polarized light are shown in Figure 5(a) and (b) for an object distance of 5.5 cm. The same deconvolution algorithm was applied to reconstruct the scene shown in Figure 5(c). The modulation transfer function (MTF) at different spatial frequencies was obtained from the image contrast of the reconstructed resolution chart. In addition to the direct MTF measurement, we also evaluated the MTF from the Fourier transform of the measured DH PSFs. Both sets of results are plotted in Figure 5(d) with excellent agreement, which attests to the accuracy of our reconstruction algorithm.
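The PSF-based MTF evaluation mentioned above amounts to taking the normalized magnitude of the optical transfer function; a brief sketch (variable names illustrative):

```python
import numpy as np

def mtf_from_psf(psf, dx):
    """MTF as the normalized magnitude of the optical transfer function,
    i.e. the 2-D Fourier transform of the (centered, normalized) PSF."""
    otf = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf / psf.sum())))
    mtf = np.abs(otf) / np.abs(otf).max()
    freqs = np.fft.fftshift(np.fft.fftfreq(psf.shape[0], d=dx))  # cycles per unit length
    return freqs, mtf
```

A cut through the center row or column of `mtf` gives 1-D curves comparable to those plotted in Figure 5(d).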

Figure 5: 
Imaging of the USAF target for evaluating the lateral resolution. Experimentally captured images in the (a) x-polarization and (b) y-polarization states. (c) Reconstructed image of the USAF resolution target pattern. (d) Measured MTF of the metalens at an object distance of 5.5 cm. The red dots correspond to the MTF measured from the USAF target and the blue dots are the MTF calculated from the experimentally measured PSFs via Fourier transform. The solid line gives the MTF inferred from PSFs simulated by the diffraction integral. (Scale bar: 80 μm).

Lastly, real-world imaging was demonstrated using the letters ‘M’, ‘I’, and ‘T’ printed on cardboard. The letters were placed at different distances as shown in Figure 6(a). An LED light source with a center wavelength of 625 nm and a 20 nm full-width-at-half-maximum (FWHM) spectral bandwidth was used to illuminate the scene, and a filter with a 635 nm center wavelength and a 10 nm FWHM bandwidth was placed in front of the image sensor to reject out-of-band light. The images recorded in the two polarization states are shown in Figure 6(b) and (c). The aforementioned algorithm was implemented to infer the depths of the objects, with the caveat that oblique incidence onto the metalens (which leads to additional phase delay) was accounted for and corrected following the procedures outlined in Ref. [41]. The extracted depths are shown in Figure 6(d) and agree well with the ground truth. The average depth estimation error is 2.7%.

Figure 6: 
Experimental demonstration of depth sensing. (a) Photos of printed letters of ‘M’, ‘I’, and ‘T’, each placed at a different distance. (b, c) Captured images in (b) the x-polarization state and (c) the y-polarization state. (d) Inferred object distances (red dots) compared to the ground truth (solid line). (Scale bar: 80 μm).

5 Conclusions

We demonstrated a monocular metalens design capable of performing depth sensing and scene reconstruction concurrently. The design leverages polarization multiplexing to encode two phase masks, each generating an optimized DH PSF. Our design ensures that the rotation angles of the two contra-rotating PSFs are always complementary, which enables unambiguous depth perception without prior knowledge of the scene. Compared to other depth sensing methods, our MDE approach features a single aperture, compact footprint, high depth and lateral resolution, and passive operation. These advantages point to vast potential applications of our technology in areas such as microscopy, medical imaging, virtual/augmented reality, automotive/robotic sensing, and beyond.


Corresponding authors: Juejun Hu and Tian Gu, Department of Materials Science and Engineering, Massachusetts Institute of Technology, Cambridge, MA 02139, USA; and Materials Research Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139, USA, E-mail: (J. Hu), (T. Gu)

Fan Yang and Hung-I Lin contributed equally to this work.


Funding source: MIT Skoltech Seed Fund Program

Funding source: Defense Advanced Research Projects Agency Defense Sciences Office Program: EXTREME Optics and Imaging (EXTREME) under agreement no. HR00111720029

Acknowledgment

The authors acknowledge fabrication facility support by MIT.nano and the Center for Nanoscale Systems at Harvard University.

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: This work was funded by Defense Advanced Research Projects Agency Defense Sciences Office Program: EXTREME Optics and Imaging (EXTREME) under agreement no. HR00111720029 and by the MIT Skoltech Seed Fund Program. The views, opinions, and/or findings expressed are those of the authors and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.

  3. Conflict of interest statement: The subject matter is part of a patent application that has been licensed to 2Pi Inc., a company founded by the authors.

  4. Data availability: Data underlying the results presented in this paper may be obtained from the authors upon reasonable request.

Appendix A: Polarization-multiplexed meta-atom design

The meta-atom dimensions and their phase/transmittance responses are listed in Table 1.

Table 1:

Polarization-multiplexed meta-atoms.

Meta-atom index 1 2 3 4 5 6 7 8
X dimension [nm] 136 159 159 148 90 98 90 90
Y dimension [nm] 140 90 98 109 159 98 117 132
X-pol phase [°] −3 −17 −4 −10 125 87 83 98
Y-pol phase [°] 5 125 183 273 −17 87 180 274
X-pol transmittance 0.87 0.89 0.89 0.89 0.81 0.88 0.89 0.86
Y-pol transmittance 0.88 0.81 0.76 0.76 0.89 0.88 0.74 0.78
Meta-atom index 9 10 11 12 13 14 15 16
X dimension [nm] 98 117 228 105 109 132 121 117
Y dimension [nm] 159 90 228 121 148 90 105 117
X-pol phase [°] 183 180 185 186 273 274 261 264
Y-pol phase [°] −4 83 185 261 −10 98 186 264
X-pol transmittance 0.76 0.74 0.70 0.74 0.76 0.78 0.77 0.77
Y-pol transmittance 0.89 0.89 0.70 0.77 0.89 0.86 0.74 0.77

References

[1] G. Kim, Y. Kim, J. Yun, et al., “Metasurface-driven full-space structured light for three-dimensional imaging,” Nat. Commun., vol. 13, no. 1, p. 5920, 2022. https://doi.org/10.1038/s41467-022-32117-2.

[2] Y. Ni, S. Chen, Y. Wang, Q. Tan, S. Xiao, and Y. Yang, “Metasurface for structured light projection over 120° field of view,” Nano Lett., vol. 20, no. 9, pp. 6719–6724, 2020. https://doi.org/10.1021/acs.nanolett.0c02586.

[3] J. Wang, “Metasurfaces enabling structured light manipulation: advances and perspectives,” Chin. Opt. Lett., vol. 16, no. 5, p. 050006, 2018. https://doi.org/10.3788/col201816.050006.

[4] A. H. Dorrah and F. Capasso, “Tunable structured light with flat optics,” Science, vol. 376, no. 6591, p. eabi6860, 2022. https://doi.org/10.1126/science.abi6860.

[5] C. He, Y. Shen, and A. Forbes, “Towards higher-dimensional structured light,” Light Sci. Appl., vol. 11, no. 1, p. 205, 2022. https://doi.org/10.1038/s41377-022-00897-3.

[6] N. Li, C. P. Ho, J. Xue, et al., “A progress review on solid-state lidar and nanophotonics-based lidar sensors,” Laser Photon. Rev., vol. 16, no. 11, p. 2100511, 2022. https://doi.org/10.1002/lpor.202100511.

[7] J. Park, B. G. Jeong, S. I. Kim, et al., “All-solid-state spatial light modulator with independent phase and amplitude control for three-dimensional lidar applications,” Nat. Nanotechnol., vol. 16, no. 1, pp. 69–76, 2021. https://doi.org/10.1038/s41565-020-00787-y.

[8] R. J. Martins, E. Marinov, M. A. B. Youssef, et al., “Metasurface-enhanced light detection and ranging technology,” Nat. Commun., vol. 13, no. 1, p. 5724, 2022. https://doi.org/10.1038/s41467-022-33450-2.

[9] I. Kim, R. J. Martins, J. Jang, et al., “Nanophotonics for light detection and ranging technology,” Nat. Nanotechnol., vol. 16, no. 5, pp. 508–524, 2021. https://doi.org/10.1038/s41565-021-00895-3.

[10] M. Subbarao and T.-C. Wei, “Depth from defocus and rapid autofocusing: a practical approach,” in Proceedings 1992 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, IEEE Computer Society, 1992, pp. 773–774. https://doi.org/10.1109/CVPR.1992.223176.

[11] M. Subbarao and G. Surya, “Depth from defocus: a spatial domain approach,” Int. J. Comput. Vis., vol. 13, no. 3, pp. 271–294, 1994. https://doi.org/10.1007/bf02028349.

[12] S. Chaudhuri and A. N. Rajagopalan, Depth from Defocus: A Real Aperture Imaging Approach, Berlin, Springer Science & Business Media, 1999. https://doi.org/10.1007/978-1-4612-1490-8.

[13] Y. Y. Schechner and N. Kiryati, “Depth from defocus vs. stereo: how different really are they?” Int. J. Comput. Vis., vol. 39, no. 2, pp. 141–162, 2000. https://doi.org/10.1023/a:1008175127327.

[14] C. Zhou, S. Lin, and S. K. Nayar, “Coded aperture pairs for depth from defocus and defocus deblurring,” Int. J. Comput. Vis., vol. 93, no. 1, pp. 53–72, 2011. https://doi.org/10.1007/s11263-010-0409-8.

[15] Q. Guo, Z. Shi, Y.-W. Huang, et al., “Compact single-shot metalens depth sensors inspired by eyes of jumping spiders,” Proc. Nat. Acad. Sci., vol. 116, no. 46, pp. 22959–22965, 2019. https://doi.org/10.1073/pnas.1912154116.

[16] M. Y. Shalaginov, H.-I. Lin, F. Yang, et al., “Metasurface-enabled wide-angle stereoscopic imaging,” in Frontiers in Optics, paper JTu7B.2, Optica Publishing Group, 2022. https://doi.org/10.1364/FIO.2022.JTu7B.2.

[17] S. Tan, F. Yang, V. Boominathan, A. Veeraraghavan, and G. V. Naik, “3d imaging using extreme dispersion in optical metasurfaces,” ACS Photonics, vol. 8, no. 5, pp. 1421–1429, 2021. https://doi.org/10.1021/acsphotonics.1c00110.

[18] B. Huang, W. Wang, M. Bates, and X. Zhuang, “Three-dimensional super-resolution imaging by stochastic optical reconstruction microscopy,” Science, vol. 319, no. 5864, pp. 810–813, 2008. https://doi.org/10.1126/science.1153529.

[19] M. F. Juette, T. J. Gould, M. D. Lessard, et al., “Three-dimensional sub–100 nm resolution fluorescence microscopy of thick samples,” Nat. Methods, vol. 5, no. 6, pp. 527–529, 2008. https://doi.org/10.1038/nmeth.1211.

[20] Y. Shechtman, S. J. Sahl, A. S. Backer, and W. E. Moerner, “Optimal point spread function design for 3d imaging,” Phys. Rev. Lett., vol. 113, no. 13, p. 133902, 2014. https://doi.org/10.1103/physrevlett.113.133902.

[21] Y. Shechtman, L. E. Weiss, A. S. Backer, S. J. Sahl, and W. E. Moerner, “Precise three-dimensional scan-free multiple-particle tracking over large axial ranges with tetrapod point spread functions,” Nano Lett., vol. 15, no. 6, pp. 4194–4199, 2015. https://doi.org/10.1021/acs.nanolett.5b01396.

[22] Y. Shechtman, A.-K. Gustavsson, P. N. Petrov, et al., “Observation of live chromatin dynamics in cells via 3d localization microscopy using tetrapod point spread functions,” Biomed. Opt. Express, vol. 8, no. 12, pp. 5735–5748, 2017. https://doi.org/10.1364/boe.8.005735.

[23] S. R. P. Pavani, M. A. Thompson, J. S. Biteen, et al., “Three-dimensional, single-molecule fluorescence imaging beyond the diffraction limit by using a double-helix point spread function,” Proc. Nat. Acad. Sci., vol. 106, no. 9, pp. 2995–2999, 2009. https://doi.org/10.1073/pnas.0900245106.

[24] M. A. Thompson, J. M. Casolari, M. Badieirostami, P. O. Brown, and W. E. Moerner, “Three-dimensional tracking of single mRNA particles in Saccharomyces cerevisiae using a double-helix point spread function,” Proc. Nat. Acad. Sci., vol. 107, no. 42, pp. 17864–17871, 2010. https://doi.org/10.1073/pnas.1012868107.

[25] M. P. Backlund, R. Joyner, K. Weis, and W. E. Moerner, “Correlations of three-dimensional motion of chromosomal loci in yeast revealed by the double-helix point spread function microscope,” Mol. Biol. Cell, vol. 25, no. 22, pp. 3619–3629, 2014. https://doi.org/10.1091/mbc.e14-06-1127.

[26] B. Yu, J. Yu, W. Li, et al., “Nanoscale three-dimensional single particle tracking by light-sheet-based double-helix point spread function microscopy,” Appl. Opt., vol. 55, no. 3, pp. 449–453, 2016. https://doi.org/10.1364/ao.55.000449.

[27] Z. Wang, Y. Cai, Y. Liang, et al., “Single shot, three-dimensional fluorescence microscopy with a spatially rotating point spread function,” Biomed. Opt. Express, vol. 8, no. 12, pp. 5493–5506, 2017. https://doi.org/10.1364/boe.8.005493.

[28] B. Ghanekar, V. Saragadam, D. Mehra, A.-K. Gustavsson, A. C. Sankaranarayanan, and A. Veeraraghavan, “Ps2f: polarized spiral point spread function for single-shot 3d sensing,” IEEE Trans. Pattern Anal. Mach. Intell., pp. 1–12, 2022. https://doi.org/10.1109/tpami.2022.3202511.

[29] N. Yu, P. Genevet, M. A. Kats, et al., “Light propagation with phase discontinuities: generalized laws of reflection and refraction,” Science, vol. 334, no. 6054, pp. 333–337, 2011. https://doi.org/10.1126/science.1210713.

[30] F. Capasso, “The future and promise of flat optics: a personal perspective,” Nanophotonics, vol. 7, no. 6, pp. 953–957, 2018. https://doi.org/10.1515/nanoph-2018-0004.

[31] S. M. Kamali, E. Arbabi, A. Arbabi, and A. Faraon, “A review of dielectric optical metasurfaces for wavefront control,” Nanophotonics, vol. 7, no. 6, pp. 1041–1068, 2018. https://doi.org/10.1515/nanoph-2017-0129.

[32] A. V. Kildishev, A. Boltasseva, and V. M. Shalaev, “Planar photonics with metasurfaces,” Science, vol. 339, no. 6125, p. 1232009, 2013. https://doi.org/10.1126/science.1232009.

[33] P. Lalanne and P. Chavel, “Metalenses at visible wavelengths: past, present, perspectives,” Laser Photon. Rev., vol. 11, no. 3, p. 1600295, 2017. https://doi.org/10.1002/lpor.201600295.

[34] S. B. Glybovski, S. A. Tretyakov, P. A. Belov, Y. S. Kivshar, and C. R. Simovski, “Metasurfaces: from microwaves to visible,” Phys. Rep., vol. 634, pp. 1–72, 2016. https://doi.org/10.1016/j.physrep.2016.04.004.

[35] M. L. Tseng, H.-H. Hsiao, C. H. Chu, et al., “Metalenses: advances and applications,” Adv. Opt. Mater., vol. 6, no. 18, p. 1800554, 2018. https://doi.org/10.1002/adom.201800554.

[36] T. Gu, H. J. Kim, C. Rivero-Baleine, and J. Hu, “Reconfigurable metasurfaces towards commercial success,” Nat. Photonics, vol. 17, pp. 48–58, 2022. https://doi.org/10.1038/s41566-022-01099-4.

[37] L. Loetgering, M. Baluktsian, K. Keskinbora, et al., “Generation and characterization of focused helical x-ray beams,” Sci. Adv., vol. 6, no. 7, p. eaax8836, 2020. https://doi.org/10.1126/sciadv.aax8836.

[38] R. Orange-Kedem, E. Nehme, L. E. Weiss, et al., “3d printable diffractive optical elements by liquid immersion,” Nat. Commun., vol. 12, no. 1, p. 3067, 2021. https://doi.org/10.1038/s41467-021-23279-6.

[39] N. Dubey, V. Anand, S. Khonina, R. Kumar, A. N. K. Reddy, and J. Rosen, “Incoherent digital holography using spiral rotating point spread functions created by double-helix beams,” in Digital Holography and Three-Dimensional Imaging, paper M1A.3, Optica Publishing Group, 2022. https://doi.org/10.1364/DH.2022.M1A.3.

[40] C. Jin, J. Zhang, and C. Guo, “Metasurface integrated with double-helix point spread function and metalens for three-dimensional imaging,” Nanophotonics, vol. 8, no. 3, pp. 451–458, 2019. https://doi.org/10.1515/nanoph-2018-0216.

[41] S. Colburn and A. Majumdar, “Metasurface generation of paired accelerating and rotating optical beams for passive ranging and scene reconstruction,” ACS Photonics, vol. 7, no. 6, pp. 1529–1536, 2020. https://doi.org/10.1021/acsphotonics.0c00354.

[42] Z. Ma, S. Dong, X. Dun, Z. Wei, Z. Wang, and X. Cheng, “Reconfigurable metalens with phase-change switching between beam acceleration and rotation for 3d depth imaging,” Micromachines, vol. 13, no. 4, p. 607, 2022. https://doi.org/10.3390/mi13040607.

[43] Z. Shen, F. Zhao, C. Jin, S. Wang, L. Cao, and Y. Yang, “Monocular metasurface camera for passive single-shot 4d imaging,” Nat. Commun., vol. 14, no. 1, p. 1035, 2023. https://doi.org/10.1038/s41467-023-36812-6.

[44] Y. Y. Schechner, R. Piestun, and J. Shamir, “Wave propagation with rotating intensity distributions,” Phys. Rev. E, vol. 54, no. 1, p. R50, 1996. https://doi.org/10.1103/physreve.54.r50.

[45] S. N. Khonina, V. V. Kotlyar, V. A. Soifer, M. Honkanen, J. Lautanen, and J. Turunen, “Generation of rotating Gauss–Laguerre modes with binary-phase diffractive optics,” J. Mod. Opt., vol. 46, no. 2, pp. 227–238, 1999. https://doi.org/10.1080/095003499149890.

[46] S. R. P. Pavani and R. Piestun, “High-efficiency rotating point spread functions,” Opt. Express, vol. 16, no. 5, pp. 3484–3489, 2008. https://doi.org/10.1364/oe.16.003484.

[47] R. Piestun and J. Shamir, “Control of wave-front propagation with diffractive elements,” Opt. Lett., vol. 19, no. 11, pp. 771–773, 1994. https://doi.org/10.1364/ol.19.000771.

[48] P. Wang and R. Menon, “Optimization of periodic nanostructures for enhanced light-trapping in ultra-thin photovoltaics,” Opt. Express, vol. 21, no. 5, pp. 6274–6285, 2013. https://doi.org/10.1364/oe.21.006274.

[49] P. Wang and R. Menon, “Optical microlithography on oblique and multiplane surfaces using diffractive phase masks,” J. Micro/Nanolithogr. MEMS MOEMS, vol. 14, no. 2, p. 023507, 2015. https://doi.org/10.1117/1.jmm.14.2.023507.

[50] P. Wang, N. Mohammad, and R. Menon, “Chromatic-aberration-corrected diffractive lenses for ultra-broadband focusing,” Sci. Rep., vol. 6, no. 1, p. 21545, 2016. https://doi.org/10.1038/srep21545.

[51] F. Yang, S. An, M. Y. Shalaginov, et al., “Design of broadband and wide-field-of-view metalenses,” Opt. Lett., vol. 46, no. 22, pp. 5735–5738, 2021. https://doi.org/10.1364/ol.439393.

[52] F. Yang, S. An, M. Y. Shalaginov, H. Zhang, J. Hu, and T. Gu, “Understanding wide field-of-view flat lenses: an analytical solution [invited],” Chin. Opt. Lett., vol. 21, no. 2, p. 023601, 2023. https://doi.org/10.3788/col202321.023601.

[53] G. Arya, W. F. Li, C. Roques-Carmes, M. Soljačić, S. G. Johnson, and Z. Lin, “End-to-end optimization of metasurfaces for imaging with compressed sensing,” arXiv preprint arXiv:2201.12348, 2022.

[54] Z. Lin, R. Pestourie, C. Roques-Carmes, et al., “End-to-end metasurface inverse design for single-shot multi-channel imaging,” Opt. Express, vol. 30, no. 16, pp. 28358–28370, 2022. https://doi.org/10.1364/oe.449985.

[55] E. Tseng, S. Colburn, J. Whitehead, et al., “Neural nano-optics for high-quality thin lens imaging,” Nat. Commun., vol. 12, no. 1, p. 6493, 2021. https://doi.org/10.1038/s41467-021-26443-0.

[56] Y. Fan, H.-I. Lin, M. Y. Shalaginov, et al., “Reconfigurable parfocal zoom metalens,” Adv. Opt. Mater., vol. 10, no. 17, p. 2200721, 2022. https://doi.org/10.1002/adom.202200721.

[57] M. Y. Shalaginov, S. D. Campbell, S. An, et al., “Design for quality: reconfigurable flat optics based on active metasurfaces,” Nanophotonics, vol. 9, no. 11, pp. 3505–3534, 2020. https://doi.org/10.1515/nanoph-2020-0033.

[58] M. Born and E. Wolf, Principles of Optics, 7th ed., Cambridge, Cambridge University Press, 1999.

Received: 2023-02-11
Accepted: 2023-03-17
Published Online: 2023-03-28

© 2023 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
