Optica Publishing Group

Motion detection based on 3D-printed compound eyes

Open Access

Abstract

A biologically inspired compound eye system is fabricated for the detection of object motion without the need for sophisticated image processing. An array of artificial optical units, called ommatidia, structurally and functionally mimics natural compound eyes for motion detection. Each artificial ommatidium consists of polymer lenses, a light-guiding polymer cone, a 3D-printed cladding, and a light intensity sensor that measures the change of light intensity during motion detection. To simplify signal processing and improve system reliability, low-cost light sensors, instead of CMOS/CCD arrays, are used to measure the light intensity changes caused by object movement. The distance and speed of the moving metal ball of a pendulum were measured using the compound eye system. The measured results agree well with the theoretical analyses; the error between the measured and calculated speed is less than 2%.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Bio-inspired artificial compound eye systems have attracted considerable interest in recent years, thanks to their unique optical properties, such as a wide field of view, infinite depth of field, and high sensitivity for object motion detection [1]. An imaging and sensing system with these advantages has significant potential for applications in visually controlled navigation [2], machine vision sensor systems [3], and biomedical sensing and imaging [4]. Unlike the human eye, a compound eye has a different architecture, consisting of tens to thousands of small integrated optical units called ommatidia. These units are arranged on a convex curved surface, and each unit has its own lens, crystalline cone, and photoreceptor cells with a waveguiding rhabdom. Several artificial micro-compound eye systems have been developed based on the 3D structure of the ommatidium. Viollet et al. [5] reported an artificial compound eye system fabricated by assembling and aligning a flexible PCB board with a polymer lens array. Song et al. [6] proposed digital arthropod-inspired cameras with nearly full hemispherical shapes. Jeong et al. [7] presented a biologically inspired artificial compound eye made by two light/thermal polymerizations. Other examples of artificial compound eye systems can be found in review papers [1,8–11].

Biological compound eyes have high sensitivity in object motion detection. One mechanism recognized by researchers is that motion detection by insects is based on the flicker effect [12,13]. In general, the flicker effect means that, as an object moves into and out of the compound eye's field of view, the ommatidia are progressively turned on and off, with each individual ommatidium detecting the light change. In the meantime, insects can sense the object's distance directly from the change in light intensity. Due to the flicker effect, insects respond much better to moving objects than to stationary ones.

Artificial compound eye systems have been developed to mimic the fast motion detection capability of insects [14–17]. Most studies have focused on the analysis of multiple images received by an artificial compound eye imaging system using charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) arrays. Pericet-Camara et al. [17] presented an optical flow system inspired by the compound eye, which extracts optic flow over multiple visual directions. Kagawa et al. [18] showed a compact and thin compound eye camera called TOMBO, which uses video geometry to construct motion equations and optimizes an error function to acquire motion parameters. Lin and Cheng [12] presented a superposition-type compound eye (SSCE) for tracking pan-tilt rotation motion. Although these methods have good accuracy in motion detection, they require digital hardware for image read-out and computational time for motion-detection signal processing.

In this work, we focus on developing an alternative system to detect object motion without the need for sophisticated image processing. We use light intensity sensors, instead of CCD/CMOS image sensors, to perform signal detection and processing. Each artificial ommatidium consists of polymer lenses, a light-guiding polymer cone, a 3D-printed cladding, and a light intensity sensor that measures the change of light intensity during motion detection. The ommatidia are fabricated using a simple process that combines an SLA 3D-printing method with a polymer-filling technique, producing artificial compound eyes with good optical lenses. The motion distance and speed are studied by analyzing the light intensity outputs of a row of 5 ommatidia, which replicates the flicker effect of an insect's eye for motion detection.

2. Experiment

2.1 Materials and optical design

The structure of a single ommatidium is shown in Fig. 1. In the experiment, a ProJet 1200 micro-SLA 3D printer from 3D Systems was used to print the cladding with different design parameters. The resin used for the cladding was VisiJet FTX Green, which has a dark green color and a lower refractive index (n = 1.47). Different geometric designs were easily obtained by changing the relevant parameters in the AutoCAD software. For surface wettability control, a 0.1% Teflon AF 2400 amorphous fluoroplastic resin was used for the hydrophobic treatment, and corona treatment was applied for the hydrophilic treatment. Two light-sensitive polymers with different refractive indices (n) and good optical properties, NOA164 (n = 1.64) and NOA81 (n = 1.56), were selected as the filling polymers. A laboratory-made syringe pump system was used for polymer filling. In this system, a 1-ml syringe with a 0.46 mm diameter needle was used to add the polymer. A motorized x-y stage, controlled by the Repetier-Host software, set the polymer loading location. A UV exposure chamber was used to cure the light-sensitive liquid polymer. The NOA polymers are sensitive to UV light from 320 to 380 nm, with peak sensitivity around 365 nm. The cure time depends on the light intensity and the thickness of the polymer lens; in our experiment, it is about 15 minutes with a UV lamp in the chamber.

Fig. 1. Artificial ommatidium consisting of polymer lenses, a light-guiding polymer cone, and a 3D-printed waveguide cladding. ($\Delta x$ = 4 mm, R1 = 1.45 mm, R2 = 1.08 mm, D = 2 mm, d = 0.8 mm)

It should be noted that this work investigates only the artificial apposition compound eye. In this type of compound eye, the ommatidia are optically isolated, meaning each light intensity sensor collects light only from its own lens. The polymer cone focuses the light onto a small spot, and the 3D-printed cladding, with its lower refractive index (n = 1.47), provides optical isolation between neighboring ommatidia.

The fabricated polymer lenses have much better surface smoothness than directly 3D-printed lenses. For comparison, the surface roughness of the 3D-printed lenses (fabricated using the same 3D-printing process as the cladding) and of the polymer lenses was measured with a Dektak surface profiler. The results are shown in Fig. 2. It can be observed that the polymer lens has a much smoother surface, with a root mean square (RMS) roughness of 70 nm, compared to that of the directly 3D-printed lens (RMS of 14 µm).

Fig. 2. Surface roughness of a 3D-printed lens vs. an optical polymer lens. The polymer lens has a much smoother surface (red curve), with an RMS value of 70 nm, compared to the directly 3D-printed lens (blue curve), with an RMS value of 14 µm.

The radius of lens curvature primarily depends on the wetting conditions of the contact surface [19]. Two regions of different wettability were created in our experiment to form the lenses by surface tension and to avoid bubble generation between the liquid polymer and the sidewall of the hollow during the filling process. Corona plasma was first applied to make the surface more hydrophilic. Then the surfaces were coated with a thin layer of Teflon to reduce the surface energy and make them more hydrophobic. Figure 3 shows the contact angles of water and of the NOA series polymers on the 3D-printed material surface before and after the treatments. The new surface wetting conditions confirmed that the contact angles for both water and the NOA polymers were increased by the Teflon coating and decreased by the corona plasma treatment. The mean water contact angle $\theta$ under the Teflon treatment increased from $75^\circ$ to $90^\circ$. Corona plasma treatment led to an inversion of the hydrophilicity of the surface, with the contact angle reduced from $75^\circ$ to $20^\circ$. For the NOA series UV-sensitive polymers, the contact angle on the same treated surface was smaller than that of water due to their lower surface tension. The focal length of the polymer lenses is controlled by varying the surface tension and the radius of lens curvature [20,21] using the above procedure.

Fig. 3. Pictures of water and polymer beads on three substrate surfaces and measured contact angles ($\theta$).

For motion detection, the main step is to measure the change of light intensity at each sensor, which provides information about the distance and velocity of the moving object. An optical simulation was used to optimize the optical design for best focusing. The ommatidia used in this study contain two lenses and a polymer cone of length L (Fig. 4). According to the surface tension and contact-angle measurements, the radii of the lenses, R1 and R2 (Fig. 1), were found to be 1.45 mm and 1.08 mm, respectively. The width (D) of the front lens is 2 mm, and the width (d) of the backside lens is 0.8 mm. The backside lens is part of a thick lens, so changing L changes the total focal length. The distance between adjacent lenses ($\Delta x$) is 4 mm (Fig. 1). These values were used in the simulation with different polymer cone lengths. It can be observed that changes in the polymer cone length (L) of each ommatidium lead to different focal points. The light intensity at the sensor reaches its maximum when the polymer cone length is around 1.6 mm.
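The role of the cone length can be illustrated with a simple paraxial estimate (not the paper's ray-tracing simulation): each polymer surface is approximated by the thin-lens lensmaker's equation, and the two lenses separated by L are combined. The refractive index n = 1.64 (NOA164) is an assumption here, since the text does not state which NOA polymer formed which lens.

```python
# Paraxial sketch of the thick-lens behavior: thin-lens lensmaker's equation
# per surface, then the two-lens combination formula with separation L.
# n = 1.64 (NOA164) is an illustrative assumption, not stated in the paper.

def surface_focal_length(n: float, R: float) -> float:
    """Focal length (mm) of a plano-convex polymer lens of radius R (mm):
    1/f = (n - 1) / R."""
    return R / (n - 1.0)

def combined_focal_length(f1: float, f2: float, L: float) -> float:
    """Effective focal length of two thin lenses separated by L (mm):
    1/f = 1/f1 + 1/f2 - L / (f1 * f2)."""
    return 1.0 / (1.0 / f1 + 1.0 / f2 - L / (f1 * f2))

n = 1.64                              # assumed filling polymer (NOA164)
f1 = surface_focal_length(n, 1.45)    # front lens, R1 = 1.45 mm
f2 = surface_focal_length(n, 1.08)    # back lens,  R2 = 1.08 mm

for L in (1.0, 1.6, 2.0):             # candidate polymer cone lengths (mm)
    print(f"L = {L} mm -> effective f = {combined_focal_length(f1, f2, L):.2f} mm")
```

This kind of back-of-the-envelope sweep shows how the effective focal point shifts with L, which is what the full simulation in Fig. 4 optimizes.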

Fig. 4. Optical simulation for different polymer cone lengths (L) of the 3D-printed cladding.

2.2 Device fabrication and optical measurement

The fabrication process of the artificial compound eye is shown in Fig. 5. First, different designs of 3D cladding structures were directly printed with an SLA printer. Second, two surface treatments were applied: corona treatment of all surfaces of the 3D-printed cladding, followed by coating the front and back surfaces of the structure with a thin layer of Teflon. Next, the light-sensitive polymer was used to fill the 3D-printed cladding and form the lenses by surface tension. The radius of lens curvature can be controlled by varying the polymer cone length (L) through the controlled volume of polymer filling. After that, the 3D artificial ommatidium structure was solidified by UV exposure in a UV chamber. Finally, the artificial compound eye system was assembled on a lab-made stage, which helps align the optical units to the light intensity sensors. More details of the 3D printing process and surface treatment were reported in previous work [22]. It should be noted that 3D printing can be used to produce optical elements such as waveguides [23]; for optical lenses, the polymer-filling method shown above yields better optical quality than direct 3D printing (Fig. 2).

Fig. 5. The fabrication process of the compound eye for motion detection. (A) 3D printing of the compound eye cladding structure. (B) Corona treatment of all open cladding surfaces. (C) Teflon treatment of the front and back surfaces. (D) Optical polymer filling. (E) UV exposure to form the solid polymer lenses. (F) Assembly of the device on a stage with light intensity sensors.

The optical set-up of the experimental system is shown in Fig. 6. A linear array of the artificial optical system was used to detect a moving metal ball in the x and y directions. The light sensors are photoresistors (GL5528 by Ardest) aligned with the lenses; each sensor can collect light only through its own lens. The light sensors continuously read intensity values during the measurement, with 1 ms between readings. The velocity of the moving ball can then be calculated from the measured light intensity change. The measurement mimics the flicker effect of natural compound eyes: as the object moves into the visual field of the device, the system senses the intensity change, which "turns on" the ommatidium; as the object moves out of the field of view, the intensity readings remain constant and the ommatidium "turns off." For the purpose of comparing the measured velocity with theoretical analyses, a pendulum with a 3 mm diameter metal ball is used as the object for the measurements.
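The flicker-style on/off behavior described above can be sketched as a simple threshold loop over the 1 ms intensity samples of one sensor. The baseline handling, threshold value, and sample data below are illustrative assumptions, not the system's actual read-out code.

```python
# Minimal sketch of the flicker-style detection loop: an ommatidium "turns on"
# when its intensity departs from the no-object baseline by more than a
# threshold, and "turns off" when the reading returns to baseline.
# Threshold and sample values are illustrative assumptions.

def detect_on_off(samples, baseline, threshold=5):
    """Return a list of (sample_index, state) transitions for one sensor.

    samples   : intensity readings taken every 1 ms
    baseline  : intensity with no object in the field of view
    threshold : minimum |change| from baseline that counts as "on"
    """
    events, on = [], False
    for i, value in enumerate(samples):
        active = abs(value - baseline) > threshold
        if active and not on:
            events.append((i, "on"))    # object enters this ommatidium's view
            on = True
        elif not active and on:
            events.append((i, "off"))   # object leaves the field of view
            on = False
    return events

# Example: object passes over the sensor between samples 3 and 7.
readings = [100, 100, 101, 120, 135, 130, 118, 101, 100]
print(detect_on_off(readings, baseline=100))  # -> [(3, 'on'), (7, 'off')]
```

Running the same loop over all five sensors gives the on/off timing pattern from which object position and speed are later derived.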

Fig. 6. (A) Schematic of intensity measurement for motion detection. (B) The optical set-up. (C) GL5528 output circuit diagram.

3. Results and discussion

We have measured object movement in a plane above the linear array of the compound eye system to demonstrate the working principle of the proposed method. The linear array contains 5 independent ommatidium units. The schematic of the measurement is shown in Fig. 6(A), where the coordinate center is also identified. The relationship between the change of intensity and the distance along both the x-axis and the y-axis needs to be found first. The light intensity change is measured by comparing the values with and without the object in the field of view of the compound eye. A 3 mm diameter ball is used as the object, which moves in the x and y directions on the plane. Figure 7 shows the light intensity change (arbitrary units) as the object moves along the y-axis of the ommatidia with a 2 mm polymer cone length. The ommatidia are labeled 1 to 5 from left to right in Fig. 6(A). The object is set on top of ommatidium 3 [the middle one in Fig. 6(A)]. It can be observed that ommatidium 3 has the maximum change of light intensity. The uniformity of the ommatidium structure can be seen indirectly from the overlapping curves of ommatidia at symmetrical locations: the curves of ommatidia 2 and 4 almost coincide, as do the curves of ommatidia 1 and 5. Figure 8 shows the measured results as the object moves in the x-axis direction. The width of the distribution is related to the system field of view, and the maximum intensity change is related to the focal length of each ommatidium. As a result, the path of a moving object in the plane can be tracked by reading the detected light intensity changes. Two more comparative experiments were performed to study the effect of focal length on the light intensity change and optimize the optical design. Figure 9 shows the light intensity change as the object moves along the x-axis of the same ommatidium array but without the polymer filling. Figure 10 shows the light intensity change as the object moves along the x-axis of the ommatidia with a 1 mm polymer cone length. The results indicate that the device with ommatidium lenses improves both the field of view (width of the intensity curves) and the range of intensity change. The measured results of ommatidium 3 for the three different designs are shown in Fig. 11. Comparing the ommatidia with 1 mm and 2 mm polymer cone lengths, the 1-mm design has a similar field of view to the 2-mm version but a lower range of intensity change. This can be explained by the lens simulation: the 1-mm design focuses less light onto the light sensor than the 2-mm version (Fig. 4).
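Locating the object from the five intensity changes can be sketched as follows. Taking the strongest-responding ommatidium as the position estimate follows directly from the measurements above; the parabolic sub-pitch refinement is our illustrative addition, not a method described in the paper.

```python
# Sketch of object localization along x from the five intensity changes.
# The argmax step mirrors the comparison described in the text; the
# parabolic peak interpolation is an illustrative refinement (assumption).

OMMATIDIUM_PITCH_MM = 4.0   # center-to-center lens spacing (Δx in Fig. 1)

def locate_object_x(changes):
    """Estimate object x-position in mm (ommatidium 1 at x = 0) from the
    intensity changes of ommatidia 1..5."""
    k = max(range(len(changes)), key=lambda i: changes[i])
    x = k * OMMATIDIUM_PITCH_MM
    # Optional sub-pitch refinement: fit a parabola through the peak and
    # its two neighbors (skipped at the array edges).
    if 0 < k < len(changes) - 1:
        left, mid, right = changes[k - 1], changes[k], changes[k + 1]
        denom = left - 2 * mid + right
        if denom != 0:
            x += 0.5 * (left - right) / denom * OMMATIDIUM_PITCH_MM
    return x

print(locate_object_x([2, 8, 20, 8, 2]))   # symmetric peak over ommatidium 3 -> 8.0
```

Repeating this estimate at each 1 ms sample yields the tracked path of the object across the plane.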

Fig. 7. Change of light intensity vs. the distance between the object and the device along the y-axis. The object was located on top of ommatidium 3 (middle) and moved along the y-axis.

Fig. 8. Change of light intensity vs. the distance between the object and the device along the x-axis. The ommatidia have a 2 mm polymer cone length.

Fig. 9. Change of light intensity vs. the distance between the object and the device along the x-axis. The ommatidia do not have polymer lenses.

Fig. 10. Change of light intensity vs. the distance between the object and the device along the x-axis. The ommatidia have a 1 mm polymer cone length.

Fig. 11. Comparison of the measured results of ommatidium 3 in three different designs (Figs. 8–10).

As described above, a pendulum with a 3 mm diameter metal ball is used as the object for comparing the measured velocity with theoretical analyses (Fig. 12). The measured light intensity changes versus detection time as the pendulum moves back and forth across the device are shown in Fig. 13. The maximum change of intensity value for each lens indicates the moment when the object moves to the location directly above that lens. At a given time, the object location on the x-y plane can be found by comparing the five measured intensity values, and the object velocity can be calculated from the change of distance over the detection time.

Fig. 12. Schematic of the velocity measurement.

Fig. 13. Light intensity change vs. detection time over a cycle of pendulum swing (the numbers indicate the ommatidium position order).

For a specific size of a pendulum ball, the theoretical velocity function can be written as:

$${V_t} = \sqrt {2g\Delta h} $$
Here $g$ is the local acceleration of gravity and $\Delta h$ is the height difference in the y direction during the movement of the ball. $\Delta h$ can be calculated from the following equations:
$$\Delta h = R({1 - \cos \alpha } )$$
$$\alpha = \arcsin \left( {\frac{s}{R}} \right)$$
Here $\alpha$ and $s$ are the angular amplitude and the width of the pendulum's swing, respectively, and $R$ is the length of the pendulum. In the measurement, R is 47.5 cm and s is 5 cm. The coordinate center is identified in Fig. 6. The starting point of the ball is at x = −4 cm and y = 4 cm.
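The theoretical speed follows directly from these three equations. As a numerical sketch, plugging in the stated values R = 47.5 cm and s = 5 cm:

```python
# Numerical evaluation of the theoretical pendulum velocity,
# V_t = sqrt(2 g Δh), with Δh = R (1 - cos α) and α = arcsin(s / R),
# using the paper's values R = 47.5 cm and s = 5 cm.
import math

def theoretical_velocity(R_m: float, s_m: float, g: float = 9.81) -> float:
    """Speed (m/s) of the pendulum ball at the bottom of a swing of
    half-width s_m (m) on an arm of length R_m (m)."""
    alpha = math.asin(s_m / R_m)          # angular amplitude (rad)
    delta_h = R_m * (1.0 - math.cos(alpha))  # height drop Δh (m)
    return math.sqrt(2.0 * g * delta_h)

print(f"V_t = {theoretical_velocity(0.475, 0.05):.3f} m/s")
```

This value serves as the reference against which the optically measured velocities in Table 1 are compared.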

The instantaneous velocity of motion can be derived from the velocities in x and y directions:

$${V_m} = \sqrt {V_x^2 + V_y^2} $$
Here $V_x$ and $V_y$ are the instantaneous velocities in the x and y directions. $V_m$ can be calculated from the time difference and the distance of the object movement. The relationships among $V_x$, $V_y$, $\Delta x$, and $\Delta y$ are easily found:
$${V_x} = \Delta x/t$$
$${V_y} = \Delta y/t$$
Here $t$ is the time difference during the movement.

For a given value of R, $\Delta y$ can be calculated when $\Delta x$ and the starting point are known. In the experiment, we measured the time difference ($t$) between the moments when two adjacent lenses had their maximum intensity changes. Since the pendulum ball does not move uniformly, each time difference is obtained by counting the measured points between two adjacent intensity maxima. The adjacent lenses are 4 mm apart from center to center, so $\Delta x$ = 4 mm. The measured and calculated velocity results of the pendulum movement are listed in Table 1. The results indicate that the system measures velocity with good accuracy (error less than 2%). It should be noted that there are errors in the use of the ideal pendulum model; these errors are nevertheless included in the final error analyses in Table 1, which shows that the current pendulum model is reasonable for this study.
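The measured-velocity computation can be sketched in a few lines: the peak-to-peak time t between adjacent ommatidia (Δx = 4 mm apart) gives V_x, and Δy over the same interval follows from the pendulum geometry. The peak time in the example is illustrative, not measured data from Table 1.

```python
# Sketch of the measured velocity V_m = sqrt(V_x^2 + V_y^2) from the
# displacement between adjacent ommatidia and the peak-to-peak time t.
# The example time difference (18 ms) is an illustrative assumption.
import math

DX_MM = 4.0   # center-to-center lens spacing (Δx)

def measured_velocity(t_s: float, dx_mm: float = DX_MM, dy_mm: float = 0.0) -> float:
    """V_m in m/s from displacements in mm over time t_s in seconds."""
    vx = dx_mm / 1000.0 / t_s
    vy = dy_mm / 1000.0 / t_s
    return math.hypot(vx, vy)   # sqrt(vx**2 + vy**2)

# Example: adjacent intensity peaks 18 ms apart near the bottom of the
# swing, where the path is nearly horizontal (Δy ≈ 0).
print(f"V_m = {measured_velocity(0.018):.3f} m/s")
```

Comparing such a V_m against the theoretical V_t from Eqs. (1)–(3) gives the per-segment errors reported in Table 1.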

Table 1. The measured (Vm) and theoretical (Vt) velocities of the pendulum movement across the field of ommatidia. The numbers listed indicate the order of the ommatidium from left to right.

In our system, the ball size is selected to match the size of the ommatidium lenses. The ball needs to cause enough variation in the detected light intensity and, at the same time, produce different intensity values among the ommatidia. With the selected ball, the maximum detectable object distance of the system is 12 cm in the vertical direction. Increasing the ball size leads to a higher maximum detectable distance along the y-axis but reduces the intensity variation among the 5 ommatidia.

The system samples the intensity every 1 ms, which is stable and suitable for measuring the corresponding object speed. A printed circuit board (PCB) would further improve system stability. For practical and commercial applications, real devices should use a PCB as well as a large number of ommatidia on a curved surface.

4. Conclusions

We have demonstrated a simple artificial compound eye system for the motion detection of an object without the need for sophisticated image processing. We used 3D printing and integrated polymer lenses to fabricate the ommatidia. This method provides a direct route to making a 3D artificial compound eye system, and the design parameters can be easily modified and controlled through the 3D-printing process. Unlike compound-eye motion detection based on stereo imaging, this system uses light sensors instead of CMOS/CCD arrays. Both the distance and speed of a moving object were experimentally measured. The relationship between the object distance and the change of light intensity was found by placing a 3-mm ball above the compound eye system. The velocity of a moving object was studied using a pendulum swinging across the field of view. Using the pendulum movement model and the measured temporal information of the light intensity change of each ommatidium, the velocity of the object was obtained. The measured results agree well with the theoretical analyses, and the error between the measured and calculated speed is less than 2%. It should be noted that the prototype is suited to measuring object movement on a plane above the linear array. Future devices can use 2D arrays, which will be suitable for 3D motion detection.

Disclosures

The authors declare no conflicts of interest.

References

1. J. J. Kim, H. Liu, A. O. Ashtiani, and H. Jiang, “Biologically inspired artificial eyes and photonics,” Rep. Prog. Phys. 83(4), 047101 (2020). [CrossRef]  

2. K. Y. Ma, P. Chirarattananon, S. B. Fuller, and R. J. Wood, “Controlled flight of a biologically inspired, insect-scale robot,” Science 340(6132), 603–607 (2013). [CrossRef]  

3. J. D. Davis, S. F. Barrett, C. H. G. Wright, and M. Wilcox, “A bio-inspired apposition compound eye machine vision sensor system,” Bioinspiration Biomimetics 4(4), 046002 (2009). [CrossRef]  

4. J. Tanida, H. Mima, K. Kagawa, C. Ogata, and M. Umeda, “Application of a compound imaging system to odontotherapy,” Opt. Rev. 22(2), 322–328 (2015). [CrossRef]  

5. S. Viollet, S. Godiot, R. Leitel, W. Buss, P. Breugnon, M. Menouni, R. Juston, F. Expert, F. Colonnier, G. l’Eplattenier, and A. Brückner, “Hardware architecture and cutting-edge assembly process of a tiny curved compound eye,” Sensors 14(11), 21702–21721 (2014). [CrossRef]  

6. Y. M. Song, Y. Xie, V. Malyarchuk, J. Xiao, I. Jung, K. J. Choi, Z. Liu, H. Park, C. Lu, R. H. Kim, and R. Li, “Digital cameras with designs inspired by the arthropod eye,” Nature 497(7447), 95–99 (2013). [CrossRef]  

7. K. H. Jeong, J. Kim, and L. P. Lee, “Biologically inspired artificial compound eyes,” Science 312(5773), 557–561 (2006). [CrossRef]  

8. J. W. Duparré and F. C. Wippermann, “Micro-optical artificial compound eyes,” Bioinspiration Biomimetics 1(1), R1–R16 (2006). [CrossRef]  

9. R. Stevens and T. Miyashita, “Review of standards for microlenses and microlens arrays,” Imaging Sci. J. 58(4), 202–212 (2010). [CrossRef]  

10. W. Yuan, L. H. Li, W. B. Lee, and C. Y. Chan, “Fabrication of microlens array and its application: a review,” Chin. J. Mech. Eng. 31(1), 16 (2018). [CrossRef]  

11. J. Duparré, P. Dannberg, P. Schreiber, A. Bräuer, and A. Tünnermann, “Artificial apposition compound eye fabricated by micro-optics technology,” Appl. Opt. 43(22), 4303–4310 (2004). [CrossRef]  

12. G. L. Lin and C. C. Cheng, “An Artificial Compound Eye Tracking Pan-Tilt Motion,” IAENG Intl. J. Comput. Sci. 35, 2 (2008).

13. P. Qu, F. Chen, H. Liu, Q. Yang, J. Lu, J. Si, Y. Wang, and X. Hou, “A simple route to fabricate artificial compound eye structures,” Opt. Express 20(5), 5775–5782 (2012). [CrossRef]  

14. Y. Zheng, L. Song, J. Huang, H. Zhang, and F. Fang, “Detection of the three-dimensional trajectory of an object based on a curved bionic compound eye,” Opt. Lett. 44(17), 4143–4146 (2019). [CrossRef]  

15. M. Ma, F. Guo, Z. Cao, and K. Wang, “Development of an artificial compound eye system for three-dimensional object detection,” Appl. Opt. 53(6), 1166–1172 (2014). [CrossRef]  

16. D. Floreano, R. Pericet-Camara, S. Viollet, F. Ruffier, A. Brückner, R. Leitel, W. Buss, M. Menouni, F. Expert, R. Juston, and M. K. Dobrzynski, “Miniature curved artificial compound eyes,” Proc. Natl. Acad. Sci. 110(23), 9267–9272 (2013). [CrossRef]  

17. R. Pericet-Camara, M. K. Dobrzynski, R. Juston, S. Viollet, R. Leitel, H. A. Mallot, and D. Floreano, “An artificial elementary eye with optic flow detection and compositional properties,” J. R. Soc. Interface 12(109), 20150414 (2015). [CrossRef]  

18. K. Kagawa, E. Tanaka, K. Yamada, S. Kawahito, and J. Tanida, “Deep-focus compound-eye camera with polarization filters for 3D endoscopes,” Proc. SPIE 8227, 822714 (2012). [CrossRef]  

19. A. Shahini, J. Xia, Z. Zhou, Y. Zhao, and M. M. C. Cheng, “Versatile miniature tunable liquid lenses using transparent graphene electrodes,” Langmuir 32(6), 1658–1665 (2016). [CrossRef]  

20. H. Liu, Y. Huang, and H. Jiang, “Artificial eye for scotopic vision with bioinspired all-optical photosensitivity enhancer,” Proc. Natl. Acad. Sci. U. S. A. 113(15), 3982–3985 (2016). [CrossRef]  

21. A. Shahini, H. Jin, Z. Zhou, Y. Zhao, P.-Y. Chen, J. Hua, and M. M. C. Cheng, “Toward individually tunable compound eyes with transparent graphene electrode,” Bioinspiration Biomimetics 12(4), 046002 (2017). [CrossRef]  

22. B. Zhang, J. Hua, Y. Zhao, Y. Chen, J. C. Ching-Ming Chen, and M. M. C. Cheng, “Fabrication of Biomimetic Artificial Compound Eyes,” in 2019 20th International Conference on Solid-State Sensors, Actuators and Microsystems & Eurosensors XXXIII (TRANSDUCERS & EUROSENSORS XXXIII) (IEEE, 2019), pp. 1503–1506.

23. K. Akşit, P. Chakravarthula, K. Rathinavel, Y. Jeong, R. Albert, H. Fuchs, and D. Luebke, “Manufacturing application-driven foveated near-eye displays,” IEEE Trans. Visual. Comput. Graphics 25(5), 1928–1939 (2019). [CrossRef]  
