Articles

Multi-aperture optics for wafer-level cameras

Andreas Brückner, Robert Leitel, Alexander Oberdörster, Peter Dannberg, Frank Wippermann, and Andreas Bräuer

Fraunhofer Institute for Applied Optics and Precision Engineering, Albert-Einstein-Str. 7, 07745 Jena, Germany

J. Micro/Nanolith. MEMS MOEMS. 10(4), 043010 (November 21, 2011). doi:10.1117/1.3659144
History: Received June 22, 2011; Revised September 28, 2011; Accepted October 17, 2011; Published November 21, 2011; Online November 21, 2011

Open Access

Wafer-level optics is considered to yield imaging lenses for cameras of the smallest possible form factor. The high accuracy of the applied microsystem technologies and the parallel fabrication of thousands of modules on the wafer level make it attractive for high-volume applications with respect to quality and cost. However, the adaptation of existing materials and technologies from microoptics to the manufacturing of millimeter-scale lens diameters has led to yield problems due to material shrinkage and z-height accuracy. A multi-aperture approach to real-time vision systems is proposed that overcomes these issues because it relies on microlens arrays. The demonstrated prototype achieves VGA (Video Graphics Array, 640×480 pixels) resolution with a thickness of 1.4 mm, which is a thickness reduction of 50% compared to single-aperture equivalents. The partial images that are separately recorded in different channels are stitched together by image processing to form a final image of the whole field of view. Distortion is corrected within the processing chain. The microlens arrays are realized by state-of-the-art micro-optical fabrication techniques on wafer level that are suitable for a potential application in high volume, e.g., for consumer electronic products.


Miniature video cameras are becoming a standard feature of information and communication devices, such as mobile phones, PDAs, and laptops. Besides applications such as video telephony or live broadcasting via the Internet, new applications in machine vision, medical imaging, and safety will also profit from the miniaturization trend. This is enabled by efforts of the semiconductor industry in downscaling pixels and thus shrinking image sensor formats and costs. Even though the size of today's commercially available camera modules ranges down to 2.5 × 3 × 2.5 mm,1,2 manufacturers strive for even smaller packages. In particular, the module height is the target of further miniaturization in order to allow integration into the latest ultraslim products.

At present, the pixel pitch of the smallest image sensor formats is between 1.1 and 1.75 μm. A further downscaling yields pixel sizes below the diffraction limit and, thus, a reduced spatial resolution as well as increased noise because the signal-to-noise ratio is directly proportional to pixel size. Although this might be tolerable in consumer digital cameras, the effect on the performance of subsequent image processing and analysis is not acceptable in machine vision, industrial inspection, and medical imaging. Lower f-numbers are hard to achieve with standard fabrication techniques, so that the diffraction limit cannot be shifted.3
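To put the diffraction-limit argument in numbers, the Airy-disk radius can be compared to the quoted pixel pitches; the wavelength (550 nm, green) and f-number (2.8) below are assumptions chosen only for illustration:

```python
# Diffraction-limited spot size vs. pixel pitch (back-of-the-envelope check).
wavelength_um = 0.55    # green light (assumed)
f_number = 2.8          # typical mobile-camera lens speed (assumed)

# Radius of the Airy disk: r = 1.22 * lambda * F/#
airy_radius_um = 1.22 * wavelength_um * f_number
# ~1.9 um: already comparable to the 1.1 to 1.75 um pixel pitches quoted above
```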

Injection molding is still the dominating fabrication technology for the millimeter-scale lens fabrication in high volume. However, wafer-level optics (WLO) is seen as a promising candidate to decrease the fabrication and packaging costs for lens diameters of <3 mm. At that scale, the hybrid manufacturing and packaging techniques, which are standard for larger injection-molded lenses, lead to insufficient performance when considering the fabrication tolerances and the larger image resolution that is required for the shrinking pixel size.3 During the last few years, wafer-level optics evolved as an alternative fabrication technique from microtechnology. It has been successfully established for the fabrication of low-end camera modules with lenses of <3 mm diameter.4,5 A major advantage over standard techniques is that the fabrication of single lenses and the integration with spacers is carried out on wafer level for thousands of lenses in parallel. However, the adoption of UV-molding techniques from the world of micro-optics poses several challenges for the replication of millimeter-scale lenses with large sags. The accuracy of mold wafer mastering, material shrinkage, and tight z-height tolerances for the whole objective are issues that put a strong limitation on the performance and yield of WLO cameras. To date, commercially available WLO solutions hardly achieve a resolution beyond VGA.

The presented approach uses an optical setup that is inspired by the insect compound eye. A multi-aperture imaging system is proposed in which individual optical channels capture different portions of the overall field of view (FOV). This segmentation relaxes the trade-off between focal length and size of the field of view, so that a small thickness can be achieved while each individual channel covers only a small field of view. The miniaturization of the individual optical channels allows the use of microlens arrays that, in contrast to single-aperture wafer-level optics, have small lens sags and suffer less from shrinkage during molding. Additionally, lens master generation is carried out by state-of-the-art processes such as photolithography and reflow of photoresist, both on wafer level.6,7

When searching for inspiration from nature, the basic feature of a multi-aperture vision system is the segmentation of the FOV, wherein different parts are transferred through different optical channels. Hence, the size of the field of view per channel is small, but the FOV that can be imaged by all channels in total (or the total space-bandwidth product8) can be made large. This has the advantage that the optical setup of each individual channel can be made simple and optimized according to its central viewing direction and, optionally, also its imaged spectral range. In addition, the effective sampling in object space can be increased by combining the contributions of several optical channels, which results in a shorter focal length for each individual channel.9 This is referred to as braided sampling. Both effects contribute to the reduction of z-height of multi-aperture optics when compared to single-aperture optics.

The idea of using multi-aperture optics for increasing the effective sampling has already been proposed by other groups. Two basic schemes can be distinguished in the literature: In the first, each optical channel transfers a low-resolution image of the full FOV. An image-registration and fusion algorithm is used to create the final image of high resolution from all partial images (Fig. 1, outer right). This approach uses methods of superresolution and has been applied for infrared imaging systems10 and 3-D imaging of close objects.11 In the second approach, each optical channel captures a part of the whole FOV and a final image is created by the stitching of all partial images with the help of software image processing (Fig. 1, middle right). This working principle has been proposed to create a compact imaging system for the infrared spectral range.12 It is inspired by a special type of compound eye of a wasp parasite called Xenos peckii.13 The imaging system that is described here uses the latter principle to combine the advantages of compact size together with a large FOV and a good sensitivity in the visible spectral range. Therefore, it is a suitable candidate for an ultraslim camera module, which is further called electronic cluster eye (or eCley for short). For completeness, a third type of multi-aperture imaging system is mentioned that has also been developed for miniature camera applications.14 However, it contains a more complex optical setup, which creates an optical stitching of segments of the field of view that have been transferred by different optical channels before (Fig. 1, middle left).

Fig. 1: Overview of the different types of miniaturized camera modules. A schematic side view of the optical system is shown on top of each field and a view of the image sensor on the bottom.

In order to derive the theoretical decrease of the focal length of the eCley with respect to a comparable single-aperture system, a simple relationship [Eq. (1)] is used that describes the trade-off between the effective focal length (f_eff), the diameter of the image circle (I), and the FOV (represented by its half maximum angle of incidence α) of a single-aperture lens:

f_eff = I / (2 tan α).   (1)
The specifications of the FOV and the image sensor resolution (number of pixels and pixel pitch p_px) result directly from the application. Thus, the targeted angular sampling (Δφ) requires a certain focal length (f_sa) of a single-aperture optical system, which is given by

f_sa = p_px / tan Δφ.   (2)

The diameter of the aperture stop (D) is then derived by inverting F/# = f/D. Equations (1) and (2) demonstrate that the focal length of a conventional camera lens can only be decreased by reducing the pixel size because other parameters, such as the number of pixels and the FOV, are fixed for the particular application. This fact is commonly exploited in mobile-phone camera optics. The lower limit of the size of a single-aperture camera is therefore a question of how small image sensor pixels can be. However, decreasing the pixel size leads to a decreased dynamic range, which causes unfavorably noisy images,15 and to a reduced image resolution when the pixel size drops below the optical diffraction limit.
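As a worked illustration of Eqs. (1) and (2), the sketch below plugs in numbers quoted elsewhere in this article (700 × 550 final pixels, 3.2-μm pixel pitch, 70-deg diagonal FOV); the result is only a paraxial estimate, not the actual design value:

```python
import math

# Illustrative inputs taken from the prototype specs in this article:
# 700 x 550 final pixels, 3.2-um pixel pitch, 70-deg diagonal FOV.
n_x, n_y = 700, 550
p_px_mm = 3.2e-3            # pixel pitch in mm
alpha_deg = 35.0            # half of the 70-deg diagonal FOV

# Eq. (1): image-circle diameter and the required focal length
image_circle_mm = math.hypot(n_x, n_y) * p_px_mm
f_sa_mm = image_circle_mm / (2 * math.tan(math.radians(alpha_deg)))

# Eq. (2), inverted: the angular sampling this focal length delivers
dphi_deg = math.degrees(math.atan(p_px_mm / f_sa_mm))

print(f"f_sa = {f_sa_mm:.2f} mm, angular sampling = {dphi_deg:.3f} deg")
```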

The segmentation of the FOV by a multi-aperture optical setup gives a new degree of freedom: the viewing direction of each channel in the FOV may be chosen by applying a pitch difference (Δp_k = p_L − p_k) between the microlens pitch (p_L) and the partial image pitch (p_k). Hence, the effective angular sampling (Δφ_eff) of the object space is not necessarily limited by the pixel pitch and the focal length of a single channel. Instead, an effective pixel pitch (Δp_eff) can be designed in a way that the partial images of adjacent channels are displaced by a fraction of the pixel pitch p_px and hence allow a smaller effective sampling angle according to

tan Δφ_eff = Δp_eff / f = p_px / (k f).   (3)
The integer k ⩾ 1 is the so-called braiding factor and f the focal length of each optical channel. The angle Δφ_eff should match Δφ in order to achieve the same angular sampling, and it follows that

f = f_sa / k.   (4)
The effective focal length of the multi-aperture system can thus be decreased to 1/k of that of a single-aperture objective, whereas the angular sampling is kept constant. Thus, the scaling limitation of single-aperture optics can be circumvented. In other words, the optical system can be scaled axially without the need to scale the pixel pitch, so that the dynamic range is maintained.
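Continuing along these lines, Eqs. (3) and (4) with a braiding factor of k = 2 halve the per-channel focal length while leaving the effective angular sampling unchanged; the single-aperture focal length of about 2 mm below is an assumed, illustrative value:

```python
import math

f_sa_mm = 2.03          # assumed single-aperture focal length (illustrative)
p_px_mm = 3.2e-3        # pixel pitch in mm
k = 2                   # braiding factor

f_mm = f_sa_mm / k      # Eq. (4): per-channel focal length

# Eq. (3): effective sampling with braiding equals the single-aperture sampling
dphi_eff = math.degrees(math.atan(p_px_mm / (k * f_mm)))
dphi_sa = math.degrees(math.atan(p_px_mm / f_sa_mm))
assert abs(dphi_eff - dphi_sa) < 1e-12

print(f"per-channel f = {f_mm:.3f} mm")   # half the single-aperture value
```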

Figure 2 illustrates the relationship between the sampling in object and image space by multiple optical channels for a braiding factor of k = 2.

Fig. 2: Visualization of a braided sampling of the object space by multiple optical channels. The different colors illustrate the correlation between patches on the object plane, optical channels, and partial images in the image plane. In this example, the effective sampling is increased by a factor of k = 2 when compared to the sampling of a single optical channel. The number of pixels along one dimension of a (squared) partial image is denoted by Ng.

For a practical demonstration of an electronic cluster eye, an image sensor with 3-MP (megapixels) resolution and a pixel pitch of 3.2 μm (Aptina Imaging Corp., San Jose, CA, USA, Micron MT9T001 P12STC) was chosen. The other input parameters for the optical design, such as FOV, f-number, and final image resolution, have been selected according to the specifications of a video telephone camera with VGA resolution. The first-order parameters of the system, such as the number of channels, partial image diameter, focal length, microlens diameter, lens pitch, and partial image pitch, were calculated from this input using analytical paraxial equations. A braiding factor of k = 2 was chosen that theoretically leads to an eCley that has half the thickness with respect to a comparable single-aperture lens of the same FOV, image resolution, and f-number.

Subsequently, a numerical model of the system was created and optimized with the sequential ray-tracing software ZEMAX (Radiant ZEMAX LLC, Bellevue, WA, USA). A single microlens in combination with an aperture stop is chosen for the optical setup of each channel (Fig. 3) in order to keep the technological effort low. The aperture stop of each channel was positioned in front of the related microlens so that optical aberrations, such as coma, astigmatism, and field curvature, could be reduced at a reasonable f-number. A second advantage of the reversed setup is a low angle of incidence on the image plane, which results from the refraction of ray bundles at the front face before hitting the individual microlens. Hence, the relative illumination on the image sensor array and also the light sensitivity for larger field angles is increased.

Fig. 3: Schematic cross section of the electronic cluster eye prototype illustrating the layer structure of the optical module, which is directly attached to an image sensor (shown in gray). Glass substrates are indicated in light blue, and microlenses and polymer spacers made from UV-polymer are in dark blue and dark gray, respectively. Drawing is not to scale.

The central viewing direction of each channel is defined by the pitch difference between the microlens array and the photodiode group. It has been designed in a way that the central viewing directions of all channels form an equidistant grid on a distant object plane and thus are free of distortion. Additionally, the tangential and sagittal radii of curvature of the lenslets were optimized independently according to the angular viewing direction of each individual microlens in object space. Such a “chirped” microlens array compensates for astigmatism under oblique incidence provided that only a small object field is imaged in each channel.16 The basic parameters of the prototype system are listed in Table 1.

Table 1: Parameters of the eCley prototype with VGA resolution.
Table footnote: Denotes the range of values in the chirped array.

A nonsequential ray-tracing simulation has been carried out in order to simulate and avoid optical cross talk in the multi-aperture system. Optical cross talk causes ghost images due to light that is imaged by one microlens onto the photodiode group of a neighboring channel. For its suppression, horizontal diaphragm arrays made of black-matrix polymer are integrated in the layer stack of the current module. For that purpose, the axial positions and the sizes of the openings of three diaphragm arrays of absorbing material were optimized in the simulation in order to achieve an optimal suppression of cross talk. The aperture stop array in front of the microlens array acts as an absorber for light that would otherwise cause unwanted stray light when passing through the intermediate space between lenses.

A nonsequential model has been used to perform image simulations because it accounts for effects such as refraction, reflection, and absorption as well as ghost images, stray light, and cross talk. A gray-scale image modeled with Lambertian radiation characteristics acted as object plane in these simulations, which were primarily used for testing the image-stitching and distortion-correction algorithms [Fig. 4]. A simulation of diffraction effects was subsequently added by a convolution of each of the partial images with its diffraction limited point-spread function [Fig. 4].
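The diffraction step can be approximated in a few lines: convolve each simulated partial image with a point-spread function. Below is a minimal sketch that uses a Gaussian as a stand-in for the true diffraction-limited PSF of a channel (an assumption made only for illustration), with the convolution done via FFT:

```python
import numpy as np

def gaussian_psf(size, sigma_px):
    """Gaussian stand-in for a channel's diffraction-limited PSF (assumption)."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2 * sigma_px**2))
    return psf / psf.sum()

def convolve_fft(img, psf):
    """Circular FFT convolution with the kernel re-centered at the origin."""
    pad = np.zeros_like(img)
    k = psf.shape[0]
    pad[:k, :k] = psf
    pad = np.roll(pad, (-(k // 2), -(k // 2)), axis=(0, 1))
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)))

# A single point source stands in for one simulated partial image.
partial_image = np.zeros((64, 64))
partial_image[32, 32] = 1.0

blurred = convolve_fft(partial_image, gaussian_psf(15, 1.5))
```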

Fig. 4: Image simulation of a CCTV test chart as seen through the electronic cluster eye using a nonsequential ray-tracing method. The effect of sensor sampling is included. (a) Result of the pure ray-tracing simulation and (b) with added diffraction blur for each individual channel.

In order to demonstrate wafer-level fabrication, the entire production process has been performed on 4-in. D263T glass substrates (supplied by Schott AG). The current module size (listed in Table 1) allows 102 units to be handled simultaneously on this scale. The applied materials are compatible with consecutive assembly routines, particularly reflow soldering.

At first, the generation of the masks for photolithographic patterning of diaphragm layers and the photoresist for lens reflow was done by electron-beam writing. The material of choice for diaphragm layers is a photosensitive negative-tone black matrix polymer, which is common in pixel mask patterning of liquid-crystal displays and provides a high optical density of 2.8 over a broad spectral range. Buried diaphragm layers have been achieved by laminating two thin glass substrates, having the structured layer in between. An epoxy glue of low viscosity is advantageous for bonding in order to obtain low strain with thin adhesive layers. UV-curable adhesives are useful for high throughput in mass production. The subsequent patterning of diaphragm array layers requires an accurate alignment. The utilized mask aligner [Suss MicroTec, Fig. 5] is equipped with top- and back-side microscopes, which allow for a lateral precision up to 1 μm. Finally, the lower block of the optical module (see Fig. 3) consists of a stack of three substrates, 300 and 400 μm thick, and three diaphragm layers.

Fig. 5: (a) Mask aligner (SUSS MA6), which is used for the lithography, replication, and wafer-level packaging of the microoptics, (b) image of an imprint wafer tool filled with chirped microlens arrays for eCleys, (c) microscope image of one of the chirped microlens arrays.

The microlens arrays were generated by reflow of photolithographically patterned resist.6 Toroidal lenses can be achieved by forming elliptic patterns in the resist structure. Because the optical and mechanical properties of the resist make it unsuitable for direct use in an application, the microlens structure must be transferred or replicated. A stamp of negative shape has been molded using polydimethylsiloxane (PDMS) elastomer. After that, the stamp was used for aligned imprinting in a highly transparent UV-curable polymer onto the back-side-patterned glass substrate [see Fig. 5].

For buried microlenses, a meshed spacer structure is required, which keeps the central areas of the microlenses in a certain distance with respect to the opposite wafer stack. This spacer structure has been generated by photolithographic patterning of the same polymer that has been applied for the lenslets. Both units, the stack of glass wafers and the wafer carrying the microlenses, were bonded under alignment in the mask aligner using a UV-curable epoxy glue. In the final step, the optics modules have been separated by dicing, which is shown in Fig. 6. They are then ready for the chip-scale assembly processes.

Fig. 6: (a) Close-up of the diced optics modules of the eCley, which are still on dicing tape and (b) image showing a size comparison between the fully assembled electronic cluster eye with VGA resolution on the packaged image sensor and a commercial plastic VGA lens.

The focus distance along the z-axis (normal to the sensor array), rotation around the z-axis, and tilt between both planes were monitored by reading out all partial images during the active alignment of the optical module relative to the image sensor. Finally, the micro-optical module was adhered to the image sensor array in the aligned position [Fig. 6].

The optical system creates a number of non-overlapping microimages on the image sensor [Fig. 7]. Each of the microimages, which are formed by related optical channels, represents part of the total FOV of the camera. All microimages must be combined for reconstructing the complete image of the total FOV. The FOVs of adjacent channels overlap; thus, the recombination process must be carried out on the pixel level. The viewing directions of the channels are tuned so that pixels from one channel look into the sampling gaps between pixels of the neighboring channel. This is essential for achieving the resolution enhancement that, in turn, enables the short axial length of the eCley.

Fig. 7: Image of a CCTV test chart that has been acquired by the prototype of the eCley: (a) Image as it is recorded by the image sensor array with a full matrix resolution of 3 megapixels (inset shows a magnified image section of the 17×13 partial images of the individual channels) and (b) final image of the prototype after the application of the image postprocessing. The final image resolution is 700×550 pixels.

The recombination process can be described as braiding: Pixels are alternately selected from different microimages and placed next to each other. However, in practice, the optical system exhibits both parallax and distortion, which must be taken into account during recombination. For visualization, an object plane at a certain distance is considered that is parallel to the sensor plane. Each pixel views a point on that plane. Together, these points form a 2-D point cloud that illustrates how the object space is sampled by the eCley. Without parallax and distortion, each channel can be represented by a small number of points, arranged on a square grid. In combination, these squares are interleaved so that they form a regular grid with half the spacing of the sampling grids of the individual channels.

Parallax shifts the sampling points of adjacent channels relative to each other [compare Fig. 8], whereas geometric distortion deforms the point distribution. Together, both effects lead to an irregular point cloud, representing irregular sampling of the object space [Fig. 8]. The effect of parallax and distortion can be calculated from the optical design of the system. In this way, each available pixel from the individual microimages is assigned to a location on the object-space plane. This establishes a mapping from sensor image to target image, which makes the reconstruction a simple resorting of pixels.
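In the idealized case (no parallax, no distortion) the braiding for k = 2 is literally pixel interleaving. The sketch below illustrates this for a 2 × 2 group of channels; `braid_k2` is a hypothetical helper, and the real system uses the precomputed per-pixel mapping described above instead:

```python
import numpy as np

def braid_k2(partials):
    """Interleave a 2x2 group of equally sized partial images (k = 2),
    assuming ideal sampling with no parallax or distortion."""
    ng = partials[0][0].shape[0]
    out = np.empty((2 * ng, 2 * ng), dtype=partials[0][0].dtype)
    for cy in range(2):
        for cx in range(2):
            # channel (cy, cx) fills every second row/column of the result
            out[cy::2, cx::2] = partials[cy][cx]
    return out

# Four 2x2 partial images, labeled by channel index for clarity
a, b = np.full((2, 2), 0), np.full((2, 2), 1)
c, d = np.full((2, 2), 2), np.full((2, 2), 3)
final = braid_k2([[a, b], [c, d]])
# rows of `final` alternate 0,1,0,1 and 2,3,2,3
```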

Fig. 8: (a) Viewing directions of neighboring channels are tuned so that they sample object space regularly at a distance of infinity. (b) At closer distances, parallax shifts the apparent viewing directions, shifting the regular grids relative to each other. (c) Distortion of the channels leads to irregular sampling of object space. Different colors represent different optical channels.

Various kinds of interpolation can be used to display the resulting irregular point cloud.17 We have implemented the reconstruction process in a C program that continuously reads images from the image sensor and displays the reconstructed images in real time. On a standard PC with a 2.66-GHz Core 2 Duo processor, reconstruction takes 10 ms per frame, enabling display at video rates. We also tested the viability of our approach for embedded platforms, such as cell phones. We selected a system based on the Texas Instruments OMAP3 processor because it is used in many current smart phones. On this platform, with the OMAP3530 running at 720 MHz, processing takes 260 ms per frame. With lower-quality interpolation, the processing time can be reduced to 110 ms, which is acceptable for preview of the image on smart-phone displays. Please refer to our previous work17 for more details.

Because of the large number of optical elements in electronic cluster eyes, a full characterization of each lenslet would be impractical. However, it is sufficient to examine a characteristic part of the full chirped microlens array, which is implied by the parallel fabrication technique of the microlens arrays. The surface accuracy is assessed by a stylus measurement (Taylor Hobson Talysurf, Leicester, United Kingdom) of the radii of curvature along a horizontal section through the arrays on the wafer. Figure 9 shows an example comparison between the intended radii of curvature and the measurement results of that section for six microlens arrays. The scan was prepared by a careful alignment during the setup of the measurement using iterative prescanning. The results indicate an absolute radius error of <1% and a root-mean-square (rms) form deviation from the best-fit radii of curvature of ∼55 nm. Additional microscope inspections are made for the selection of chips that are in good cosmetic condition (amount of contamination, microlens overflow, quality of diaphragms). For a real series production, the characterization should include all modules on a full wafer, e.g., by a measurement of the optical MTF for each module in order to mark outliers.
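The radius-of-curvature check can be reproduced in a few lines: fit a circle to a profile section by linear least squares and report the rms form deviation. The snippet below uses a synthetic spherical profile in place of the Talysurf scan data (all values are illustrative):

```python
import numpy as np

def fit_radius_rms(x, z):
    """Least-squares circle fit via x^2 + z^2 + a*x + b*z + c = 0.
    Returns the fitted radius and the rms radial form deviation."""
    A = np.column_stack([x, z, np.ones_like(x)])
    a, b, c = np.linalg.lstsq(A, -(x**2 + z**2), rcond=None)[0]
    xc, zc = -a / 2, -b / 2
    R = np.sqrt(xc**2 + zc**2 - c)
    residuals = np.hypot(x - xc, z - zc) - R   # radial form deviation
    return R, np.sqrt(np.mean(residuals**2))

# Synthetic stand-in for a stylus scan: ideal spherical sag with R = 1.05 mm
x = np.linspace(-0.2, 0.2, 201)
z = 1.05 - np.sqrt(1.05**2 - x**2)
R_fit, rms = fit_radius_rms(x, z)   # recovers R, rms deviation near zero
```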

Fig. 9: Measured (black) and designed (red) radii of curvature along one dimension of six different microlens arrays. The x-axis shows the microlens count across the wafer. The absolute error for the radius of curvature is <1%. An average rms deviation from the best-fit radii of curvature of 55 nm was measured.

The imaging properties of the electronic cluster eye are verified by measuring the modulation transfer function (MTF) of the bare optics module. This can be done for selected optical channels by using relay optics with large magnification to image a single partial image on another CCD image sensor. The MTF of the optical channel is then derived with a standard method (according to ISO 12233) from relaying and analyzing the image of a slanted edge target with strong initial contrast. Such a setup has been built with a 10 × 10 cm LED array, including diffuser plate, that has been masked with a thin steel plate to create a high-contrast edge in the object plane. The microoptical module has been fixed on a rotation stage, and a camera with microscope objective on a translation stage has been used to relay single partial images from the module. The results of the measurement can be seen in Fig. 10.
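A minimal, non-normative version of the slanted-edge method reduces to: edge-spread function, derivative, line-spread function, Fourier transform. The full ISO 12233 procedure additionally projects the slanted edge onto a super-sampled grid; the sketch below skips that step and uses a synthetic edge:

```python
import numpy as np

def mtf_from_esf(esf):
    """Estimate the MTF from a 1-D edge-spread function (simplified;
    no edge projection/oversampling as in the full ISO 12233 method)."""
    lsf = np.diff(esf)                 # line-spread function
    lsf = lsf / lsf.sum()              # normalize so that MTF(0) = 1
    return np.abs(np.fft.rfft(lsf))

# Synthetic blurred edge as a stand-in for the measured slanted-edge image
x = np.arange(-32, 32)
esf = 0.5 * (1 + np.tanh(x / 2.0))
mtf = mtf_from_esf(esf)                # mtf[0] = 1, falling toward Nyquist
```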

Fig. 10: (a) Comparison between different on-axis MTF curves and (b) the same for an off-axis field with an angle of incidence of 29 deg. Solid red line: Simulated MTF for the specific angle of incidence. Black squares: MTF of the optics without image sensor, which was measured by relaying the image plane of the related optical channel of the eCley. Blue triangles: MTF measured in the final image of the fully assembled prototype (optics + image sensor), including image postprocessing. Thin solid gray line: MTF measured from a commercial WLO camera module using the same method.

After a stand-alone camera device is created by aligning and attaching the microoptical module to the image sensor, the electronic cluster eye can be characterized like any conventional camera. However, an array of partial images is captured on the image sensor and image postprocessing is needed in order to achieve a regular image from the entirety of partial images. The final MTF performance of the system depends, among other factors, on the postprocessing algorithm. An example of an image captured by the demonstration board as well as the final (postprocessed) image of the eCley are shown in Fig. 7.

In order to quantify the resolution of the camera prototype, the MTF was measured for the fully assembled eCley. The results of several measurements for on- and off-axis field positions are shown in Fig. 10. The simulated MTF curves were derived from the ray-tracing software ZEMAX (Radiant ZEMAX LLC, Bellevue, WA, USA) for an angle of incidence of 0 deg (on axis) and 29 deg (off axis) using a calculation method that accounts for diffraction. The simulated off-axis curve is an average of the tangential and sagittal sections of the 2-D MTF function. The single-channel MTF for the bare optical module was measured with a relay lens as explained before. For the measurement of the MTF of the final image, we captured an image of a slanted edge object within the center and at the edge of the field of view with the fully assembled eCley. The frequency axes in Fig. 10 are limited to 312 cycles per millimeter, which is the maximum sampling frequency according to the pixel pitch of the image sensor. At the same time, it is the effective Nyquist frequency of the eCley due to the used braiding factor of k = 2. In the on-axis case, the measured single-channel MTF is systematically up to 10% lower than the simulated MTF, which is caused by fabrication tolerances, especially in the back focal length of the module, and residual misalignment. In the off-axis case, the difference between measured and simulated single-channel MTF is primarily due to the difference of tangential and sagittal MTF and to the fact that only a section is measured with the proposed method. The side lobe in the measured final MTF in Fig. 10 indicates a slight defocus, which is probably due to a too-large adhesive layer thickness when bonding the optical module to the image sensor chip.
The MTF of the final (postprocessed) image shows a marked drop-off, which is caused by the coarse sampling of the image sensor (Nyquist frequency at 156 cycles/mm), the pixel cross talk, and the image postprocessing. Another MTF curve (thin gray line) of a commercial, single-aperture WLO camera (Omnivision CameraCube Model OVM7690, OmniVision, Santa Clara, CA, USA) has been added to Fig. 10 in order to compare the results to the state of the art. It is about 10% higher at mid-spatial frequencies than the MTF of the final (braided) image of the eCley. Thus, the CameraCube demonstrates a slightly higher resolution. However, the images of the WLO camera contain considerably more noise, which is probably due to the small pixel size of 1.75 μm, and its diagonal FOV of 62 deg is smaller than the 70 deg of the eCley. Although it uses a comparatively large pixel pitch of 3.2 μm, the thickness of the eCley is as small as 1.4 mm compared to 2.2 mm for the WLO camera module. However, the advantages of the multi-aperture approach come at the expense of lateral image sensor format, so that about eight times the active area is needed to achieve the demonstrated resolution.

In addition to the investigations in the lab, field tests have been carried out by capturing videos and images in an outdoor environment (see Fig. 11).

Fig. 11: Image of a construction site in front of the Fraunhofer Institute IOF captured with the eCley prototype.

Image distortion has been analyzed by imaging a regular grid and locating the image coordinates of the individual grid lines and knots. A mean distance between adjacent grid lines of 38.87 pixels was measured and compared to the designed value (39 pixels), which gives a deviation of 0.3%. No spatially varying residual distortion was found across the final image, so that the eCley can be considered nearly free of distortion.
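The quoted deviation is easy to verify from the two numbers stated above:

```python
measured_px = 38.87   # mean measured grid spacing, from the text
designed_px = 39.0    # designed grid spacing, from the text
deviation = abs(measured_px - designed_px) / designed_px
print(f"relative deviation = {deviation:.2%}")   # 0.33%, i.e., ~0.3%
```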

An ultrathin camera module with VGA resolution (700 × 550 pixels) has been demonstrated that applies multi-aperture microoptics, which is inspired by insect compound eyes. The so-called electronic cluster eye (eCley) captures different portions of the object field of view within multiple separated optical channels. These different partial images are stitched by means of software postprocessing to form a final image of the full field of view, which spans 70 deg on the diagonal. The segmentation of the full field of view decouples the focal length from the size of the field of view. Thus, a small thickness of only 1.4 mm was realized for the module with a footprint of 6.8 × 5.2 mm. The short focal length and small field size of each optical channel enabled the application of simple optical components, such as UV-replicated microlenses with small sag. These were fabricated by state-of-the-art micro-optical fabrication techniques, such as UV lithography, lens reflow, and UV molding, which show submicron precision and are cost efficient due to wafer-level manufacturing. Alignment and assembly of the two optical component wafers was carried out on wafer level for hundreds of eCleys in parallel, which decreases assembly efforts and, therefore, potential production costs. In comparison to single-aperture WLO camera modules, the thickness of an eCley is reduced by 50% at slightly lower resolution. At the same time, it achieves a larger FOV and images that are nearly free of distortion. Hence, this example of multi-aperture optics has become more than an academic experiment: it is a practical alternative to single-aperture wafer-level optics modules. Future work will address the technological challenge of increasing the optical fill factor of the multi-aperture setup in order to make more efficient use of the active image-sensor area.

The described work is part of the project “Insect inspired imaging” (Project No. FKZ 01RB0705A), which is funded by the German Federal Ministry of Education and Research (BMBF). The authors appreciate the contributions of Sylke Kleinle, Antje Oelschläger, and Christian König to the fabrication and assembly of the eCley prototypes.



Andreas Brückner graduated in physics from the Friedrich-Schiller-University Jena in 2006. Since then, he has been working as an optical scientist in the field of microoptical systems at the Fraunhofer Institute for Applied Optics and Precision Engineering (IOF) in Jena. His research interests include optical design for imaging systems, microtechnology, and image processing. His current research focuses on the miniaturization of imaging optics by applying multi-aperture optical systems in order to create a benefit for applications in consumer electronics, automotive, or medical imaging.


Robert Leitel received his PhD from the Friedrich-Schiller-University Jena in 2008 working on the formation and characterization of stochastic subwavelength structures on polymer surfaces. In 2004, he obtained his diploma in physics, studying inhomogeneous optical coatings using in situ and ex situ spectrophotometric characterization. During his four years at the Optical Coating Department of the Fraunhofer Institute for Applied Optics and Precision Engineering, he gained experience in manipulating the spectral characteristics of surfaces and interfaces by plasma treatment and thin-film deposition. He joined the Micro-optical Imaging Group in 2009 to reinforce the micro-optics technology line dealing with lens reflow, UV molding, and hybrid integration.


Alexander Oberdörster graduated in physics from the University of Düsseldorf and the Fraunhofer ISE in Freiburg. At the ISE and at Spheron VR AG, he studied and designed devices for measuring optical scattering properties of surfaces (BRDFs). At the Fraunhofer IIS, he developed cameras for digital cinematography and related technologies. Currently, he is working on image-processing algorithms for multi-aperture imaging systems at the Fraunhofer IOF. His research interests include computational photography, nonuniform sampling, and reconstruction.


Peter Dannberg received his diploma and PhD in physics from Jena University in 1983 and 1987, respectively. Since 1993, he has been with the Fraunhofer Institute for Applied Optics and Precision Engineering, where he currently heads the Micro-optics Technology Group. His interests involve micro-optical elements and systems as well as polymer optical materials and micro-/nanotechnology.


Frank Wippermann graduated from the University of Applied Sciences Jena in 1999. He worked in the field of fiber-optical system design for sensing and telecommunication applications and was a founder of the company Pyramid Optics, dedicated to optical fiber switches. In 2004, he joined the Fraunhofer Institute for Applied Optics and Precision Engineering (IOF). He received his PhD from Ilmenau Technical University in 2007. His main professional areas include the design and technology of imaging optics on wafer level. Currently, he heads the group for micro-optical imaging systems.


Andreas Bräuer graduated in physics (PhD) from the University of Jena, Germany, in 1978. Over the past 18 years, he has gained experience in micro-optics and related project acquisition and management. He is head of the Department of Micro-Optical Systems at the Fraunhofer Institute for Applied Optics and Precision Engineering (IOF) in Jena, Germany. His current interests are in applied research on micro-optical solutions for imaging and illumination, such as single and multi-aperture miniaturized cameras, LED illumination modules, and array projectors.

© 2011 Society of Photo-Optical Instrumentation Engineers (SPIE)



Figures

Figure 1:

Overview of the different types of miniaturized camera modules. For each type, a schematic side view of the optical system is shown at the top and a view of the image sensor at the bottom.

Figure 2:

Visualization of a braided sampling of the object space by multiple optical channels. The different colors illustrate the correlation between patches on the object plane, optical channels, and partial images in the image plane. In this example, the effective sampling is increased by a factor of k = 2 when compared to the sampling of a single optical channel. The number of pixels along one dimension of a (squared) partial image is denoted by Ng.
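The braided sampling described in the caption above can be illustrated with a minimal one-dimensional sketch (illustrative values only): two channels sample the same object line on grids offset by half a sample, and interleaving their pixels doubles the effective sampling density.

```python
import numpy as np

k, n_g = 2, 4                       # braiding factor, pixels per channel
object_line = np.arange(k * n_g)    # finely sampled object line: 0..7

# Each channel sees only every k-th sample, on mutually offset grids.
channel_a = object_line[0::k]       # samples 0, 2, 4, 6
channel_b = object_line[1::k]       # samples 1, 3, 5, 7

# Stitching: interleave the partial images back into one full-resolution line.
final = np.empty(k * n_g, dtype=object_line.dtype)
final[0::k], final[1::k] = channel_a, channel_b
print(final)                        # recovers the full-resolution line
```

In the real system, the interleaving step must additionally account for parallax and per-channel distortion (cf. Fig. 8) before the samples land on a regular grid.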

Figure 3:

Schematic cross section of the electronic cluster eye prototype illustrating the layer structure of the optical module, which is directly attached to an image sensor (shown in gray). Glass substrates are indicated in light blue, and microlenses and polymer spacers made from UV-polymer are in dark blue and dark gray, respectively. Drawing is not to scale.

Figure 4:

Image simulation of a CCTV test chart as seen through the electronic cluster eye using a nonsequential ray-tracing method. The effect of sensor sampling is included. (a) Result of the pure ray-tracing simulation and (b) with added diffraction blur for each individual channel.

Figure 5:

(a) Mask aligner (SUSS MA6), which is used for the lithography, replication, and wafer-level packaging of the microoptics, (b) image of an imprint wafer tool filled with chirped microlens arrays for eCleys, (c) microscope image of one of the chirped microlens arrays.

Figure 6:

(a) Close-up of the diced optics modules of the eCley, which are still on dicing tape and (b) image showing a size comparison between the fully assembled electronic cluster eye with VGA resolution on the packaged image sensor and a commercial plastic VGA lens.

Figure 7:

Image of a CCTV test chart acquired with the eCley prototype: (a) image as recorded by the image sensor array with a full matrix resolution of 3 megapixels (the inset shows a magnified section of the 17×13 partial images of the individual channels) and (b) final image of the prototype after image postprocessing. The final image resolution is 700×550 pixels.

Figure 8:

(a) Viewing directions of neighboring channels are tuned so that they sample object space regularly at a distance of infinity. (b) At closer distances, parallax shifts the apparent viewing directions, shifting the regular grids relative to each other. (c) Distortion of the channels leads to irregular sampling of object space. Different colors represent different optical channels.
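The parallax effect in panel (b) can be estimated with the standard triangulation relation: two channels a baseline b apart see an object at distance z shifted by roughly f·b/z on the image side, i.e., f·b/(z·p) pixels. The numbers below are assumed, module-scale values for illustration, not the prototype's design parameters:

```python
# Assumed, illustrative values (not the prototype design data).
f_mm = 1.4          # channel focal length
b_mm = 0.4          # baseline between neighboring channels
p_mm = 3.2e-3       # pixel pitch

# Image-side parallax shift in pixels for a few object distances.
for z_mm in (1e9, 1000.0, 300.0):   # quasi-infinity, 1 m, 0.3 m
    shift_px = f_mm * b_mm / (z_mm * p_mm)
    print(f"z = {z_mm:9.0f} mm -> parallax shift of about {shift_px:.2f} px")
```

Under these assumptions the shift is negligible at infinity but reaches a substantial fraction of a pixel at close range, which is why the stitching must compensate for the distance-dependent shift of the sampling grids.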

Figure 9:

Measured (black) and designed (red) radii of curvature along one dimension of six different microlens arrays. The x-axis gives the microlens index across the wafer. The absolute error of the radius of curvature is <1%. An average rms deviation of 55 nm from the best-fit radii of curvature was measured.

Figure 10:

(a) Comparison between different on-axis MTF curves and (b) the same for an off-axis field with an angle of incidence of 29 deg. Solid red line: Simulated MTF for the specific angle of incidence. Black squares: MTF of the optics without image sensor, which was measured by relaying the image plane of the related optical channel of the eCley. Blue triangles: MTF measured in the final image of the fully assembled prototype (optics + image sensor), including image postprocessing. Thin solid gray line: MTF measured from a commercial WLO camera module using the same method.

Figure 11:

Image of a construction site in front of the Fraunhofer Institute IOF captured with the eCley prototype.

Tables

Table 1: Parameters of the eCley prototype with VGA resolution.
Footnote: denotes the range of values in the chirped array.

References

Aptina Imaging Corp., “Products,” Aptina Imaging Corp. (2010), http://www.aptina.com/products/waferlevelcameras (6 December 2010).
OmniVision Technologies, Inc., “Products,” OmniVision Technologies, Inc. (2009), http://www.ovt.com/products/category.php?id=21 (6 December 2010).
S. Mathur, M. Okincha, and M. Walters, “What camera manufacturers want,” presented at the International Image Sensor Workshop, Ogunquit, ME, June 7–10, 2007.
M. Rossi, “Wafer-level optics for miniature cameras,” presented at SPIE Photonics Europe – Optics, Photonics and Digital Technologies for Multimedia Applications, Brussels, April 12–16, 2010.
E. Wolterink and K. Demeyer, “WaferOptics mass volume production and reliability,” Proc. SPIE 7716, 771614 (2010).
D. Daly, R. F. Stevens, M. C. Hutley, and N. Davies, “The manufacture of microlenses by melting photoresist,” Meas. Sci. Technol. 1, 759–766 (1990).
P. Dannberg, L. Erdmann, R. Bierbaum, A. Krehl, A. Bräuer, and E. B. Kley, “Micro-optical elements and their integration to glass and optoelectronic wafers,” Microsyst. Technol. 6, 41–47 (1999).
A. Lohmann, “Scaling laws for lens systems,” Appl. Opt. 28, 4996–4998 (1989).
A. Brückner, J. Duparré, R. Leitel, P. Dannberg, A. Bräuer, and A. Tünnermann, “Thin wafer-level camera lenses inspired by insect compound eyes,” Opt. Express 18, 24379–24394 (2010).
M. Shankar, R. Willett, N. Pitsianis, T. Schulz, R. Gibbons, R. T. Kolste, J. Carriere, C. Chen, D. Prather, and D. Brady, “Thin infrared imaging systems through multichannel sampling,” Appl. Opt. 47, B1–B10 (2008).
R. Horisaki, S. Irie, Y. Nakao, Y. Ogura, T. Toyoda, Y. Masaki, and J. Tanida, “3D information acquisition using a compound imaging system,” Proc. SPIE 6695, 66950F (2007).
G. Druart, N. Guérineau, R. Haidar, S. Thétas, J. Taboury, S. Rommeluère, J. Primot, and M. Fendler, “Demonstration of an infrared microcamera inspired by Xenos peckii vision,” Appl. Opt. 48, 3368–3374 (2009).
E. Buschbeck, B. Ehmer, and R. Hoy, “Chunk versus point sampling: visual imaging in a small insect,” Science 286, 1178–1179 (1999).
J. Meyer, A. Brückner, R. Leitel, P. Dannberg, A. Bräuer, and A. Tünnermann, “Optical cluster eye fabricated on wafer-level,” Opt. Express 19, 17506–17519 (2011).
T. Chen, P. Catrysse, A. El Gamal, and B. Wandell, “How small should pixel size be?,” Proc. SPIE 3965, 451–459 (2000).
J. Duparré, F. Wippermann, P. Dannberg, and A. Reimann, “Chirped arrays of refractive ellipsoidal microlenses for aberration correction under oblique incidence,” Opt. Express 13, 10539–10551 (2005).
A. Oberdörster, A. Brückner, F. C. Wippermann, and A. Bräuer, “Correcting distortion and braiding of micro-images from multi-aperture imaging systems,” Proc. SPIE 7875 (2010).
