Abstract

We propose a new panoramic optical system that provides an additional field-of-view (FOV) channel without expanding the physical size of a conventional panoramic annular lens (PAL). The two channels are contained within one PAL, their optical paths do not interfere with each other, and the two images are formed on a single image plane. A prototype panoramic lens was developed that provides a 360° × (38–80°) front FOV channel and a 360° × (102–140°) back FOV channel.

© 2017 Optical Society of America

1. Introduction

Panoramic and fisheye lenses can realize a large field of view (FOV) and are widely applied in many areas, including security and surveillance systems, unmanned vehicle robotic systems, and automotive inspection systems [1].

A panoramic lens system is typically composed of a panoramic annular lens (PAL), an aperture stop, and relay lenses. The rays that enter the PAL system form an annular image that contains a blind area on the image plane [2]. Compared with conventional large-FOV optical systems, such as fisheye lenses, PALs have fewer elements and a more compact structure because the ray path is folded several times within the lens [3]. Researchers have endeavored to expand the capability of this basic PAL structure, for example, by adding an additional FOV channel to PAL lenses.

At present, the primary focus of researchers has been to take advantage of the central blank zone of PAL lenses, which refers to the unused area near the front of the PAL [4,5]. Huang et al. proposed a novel panoramic stereo imaging system that improved the FOV by incorporating coaxial double panoramic annular lenses. In this system, the FOV rays individually enter each PAL and are imaged on a single image plane. In the final system layout, the ray channel of the first PAL passes through the central blank area of the second PAL before reaching the image plane [6]. Cheng et al. proposed a new type of lens that combined refractive and catadioptric lenses to extend the FOV. The field angle of the refractive lens was 0–50° × 360°, while that of the catadioptric lens was 50–135° × 360°. The rays are reflected once inside the catadioptric lenses [7]. Luo et al. employed refractive lenses in front of panoramic lenses to reduce the central FOV blindness of PAL lenses. In this design, the two channels overlap on the front surface of the PAL because the central unused portion of the first surface of the PAL is not large enough. To prevent energy from being lost, the authors used an adjustable dichroic filter to select the passing channel. However, the FOV of the compensation channel is limited [8]. The researchers above added an additional channel by inserting it into the central portion of either catadioptric or panoramic lenses, which has proven very beneficial for large-FOV systems. However, the additional channel increases the size of the PALs and introduces potential alignment problems, because the additional channel is not an integral part of the PAL structure. As demonstrated by these studies, it is difficult to add an additional field channel while simultaneously maintaining a compact structure.

In this paper, we present the design of a panoramic lens that incorporates two ray channels with FOVs of 360° × (38–80°) and 360° × (102–140°). The F-number of both channels is 5, and their focal lengths are 2.75 mm and 3.12 mm, respectively. The two channels are integrated into a single PAL with shared common relay lenses, which ensures a compact overall structure and simplifies alignment and manufacturing in practical applications.

The remainder of this paper is organized as follows. The design principles are introduced in Section 2. The detailed design process is described in Section 3. In Section 4, the implementation of a proof-of-concept prototype is discussed and analyzed. Finally, the paper is concluded in Section 5.

2. Design principles

2.1 Background of the two FOV channel PAL system

Unmanned aerial vehicles (UAVs) play an increasingly important role in surveillance and environmental monitoring [9]. Integrating a large-FOV optical system into a UAV makes it convenient to acquire more useful information. The optical design in this paper is intended for UAV surveillance. The system is required to be compact in both the axial and radial dimensions so that it can be flexibly combined with a UAV, and a large FOV within this small, compact volume is also desired. We therefore decided to use one PAL lens to receive two fields of view and transmit the rays into one common aperture stop. The two channels share common relay lenses and are imaged on a single image plane, which makes the whole system concise and compact.

2.2 Design principle of the two FOV channel PAL system

The primary advantage of PALs is that they contain a folded optical path, which results in a compact structure [10,11]. A conventional panoramic optical system is composed of a PAL, aperture stop, and relay lenses. The intent of a PAL is for all the ray paths to be contained in the PAL lenses before they enter the aperture stop. This allows the PAL to maintain an efficient contour profile.

In this paper, we incorporate one conventional channel and one additional channel into a single PAL system in which all the rays enter the aperture stop after several reflections in the PAL. For convenience, we define the conventional and new channels as the front and back FOV channels, respectively, as shown in Fig. 1.

Fig. 1 Principle of the proposed panoramic lenses: (a) the proposed lenses have two FOV channels, namely, the front FOV and back FOV channels; α and β are the minimal and maximal angles of the front FOV, respectively, γ and δ are the minimal and maximal angles of the back FOV, respectively, and θ1 and θ2 are the values of the front and back FOVs, respectively; (b) the imaging plane of the panoramic lenses: the outer red ring area represents the back FOV image, the blue ring area represents the front FOV image, and the white ring area between the red and blue rings represents the blind field area between angles β and γ. To achieve a more efficient implementation, the central and ring blind areas should be as small as possible.

The PAL itself has a significant transverse chromatic aberration because it is not symmetric. This may be resolved by either cementing the elements in place in the PAL or by incorporating more pieces into the relay lenses to compensate for the aberration; however, this increases the complexity [12,13]. In the proposed system, we adopted a quasi-symmetrical form for the PAL to facilitate transverse chromatic aberration correction, which results in a more concise and compact system. To eliminate any stray light and ensure that the system is compact, the aperture stop is located near the rear surface of the PAL [14]. Accordingly, the entrance pupil position is situated near the front surface so that the system has a quasi-symmetrical structure, which allows it to benefit from the transverse chromatic aberration correction [15]. In addition, by adopting a quasi-symmetrical structure, the rays only occupy a small portion of the first incident surface area. This structure provides sufficient space to accommodate two channels in the same PAL.

Figure 2 illustrates the ray channels of a conventional quasi-symmetrical PAL system. The surfaces that interact with the rays are shown in purple. In PALs, the reflective and refractive areas must not overlap to prevent energy from being lost; thus, the only available space that can be used for the back FOV channel lies in blank area1, blank area2, blank area3, and blank area4. To maximize space utilization, the proposed design also uses the face3 and face4 regions for the back FOV channel, since the corresponding rays also enter the aperture stop. However, this configuration also consumes the space of blank area1 and blank area2, which reduces the available design space; thus, we also focus on the utilization of blank area3 and blank area4. The ray entrance position of the back FOV channel is located at the outside edge of face2 because the space of blank area2 is too narrow. The spaces of blank area3 and blank area4 are subsequently used to reflect the rays internally, which allows the rays to reach face3 and then follow a ray path similar to that of the front FOV channel. To maintain the quasi-symmetrical principle of the design, the center of the entrance pupil should be located near face1, face5, and their conjugate surfaces, as shown in Fig. 3.

Fig. 2 Ray path of a conventional quasi-symmetrical PAL system.

Fig. 3 Schematic of the compound PAL system. The purple surfaces represent those used by the front FOV channel, the blue surfaces represent those used by the back FOV channel, and the green surfaces represent those used by both channels.

Figure 4 illustrates the unfolded optical paths of the PAL lens. The entrance pupil and exit pupil are located near the front and rear surfaces of the PAL lens, respectively; tracing the rays through the PAL shows that the system takes a quasi-symmetrical form.

Fig. 4 Schematic of optical paths in the PAL. (a) Unfolded ray path of the front FOV channel. (b) Unfolded ray path of the back FOV channel. The dashed line represents the optical axis of the system.

3. Design process

In this section, an overview of the design process is provided for the front and back FOV channels, followed by a description of several key optimizations that were applied to ensure the design met the performance and size objectives.

3.1 Design of the front FOV channel

To simplify the calculations, the entrance pupil is located on the front surface; thus, all the primary field rays encounter the PAL surface at a single point. The rays enter face1 at point A, are reflected by face2, and converge at virtual point O. The two marginal points where the rays intersect face2 are points B and C, and α and β are the angles between the optical axis and the minimum and maximum FOV chief rays, respectively, just after reflection by face2. In order to realize a higher sensor surface utilization ratio, it is necessary to shrink the central blind area by minimizing the value of α.

We define point O as the origin of the coordinate system; θ1 and θ2 are the angles of the minimum and maximum FOVs, respectively, and θ3 and θ4 are the corresponding angles of θ1 and θ2 after refraction at face1. The coordinates of points A, B, and C are (Ax, Ay), (Bx, By), and (Cx, Cy), respectively. From Fig. 5, we can infer that θ3 equals arctan((Cy − Ay)/(Cx − Ax)), and, according to the law of refraction, the angular width of the FOV after transmission through face1 is approximately arcsin((θ2 − θ1)/n), where n is the refractive index of the PAL material. We therefore obtain:

Fig. 5 Schematic diagram of the front FOV channel in the proposed PAL system: the purple surfaces represent those that react only to the rays from the front FOV channel, while the green surfaces react to the rays from the combined front and back FOV channels.

θ4 = arctan((Cy − Ay)/(Cx − Ax)) + arcsin((θ2 − θ1)/n).   (1)

We consider the ray path from point A to virtual point O as a perfect imaging system, because the rays that are initially transmitted from point A converge at point O. We define the distance between points A and B as L; then, according to the principle of equal optical path lengths in a perfect imaging system [16]:

L + dis(B, O) = dis(C, O) + dis(C, A) = Ltotal,   (2)

where dis(A, B) denotes the distance between points A and B, and Ltotal denotes the total optical path length. By expanding Eq. (2), we obtain:

L + sqrt((L cos(θ4) + Ax)^2 + (L sin(θ4) − Ay)^2) = sqrt(Cx^2 + Cy^2) + sqrt((Cx − Ax)^2 + (Cy − Ay)^2) = Ltotal.   (3)
After transformation, we obtain:

L = (Ltotal^2 − (Ax^2 + Ay^2)) / (2(cos(θ4)·Ax − sin(θ4)·Ay + Ltotal)).   (4)

From Eq. (4), we can calculate the values of Bx and By, which are Ax + L cos(θ4) and Ay − L sin(θ4), respectively. The aperture of face4 should be small because it is very close to the aperture stop; thus, Cy can be assigned a suitable value a priori.
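The construction of Eqs. (1)–(4) can be sanity-checked numerically. The sketch below is illustrative only: the function name and all coordinate values are placeholders (loosely following the ranges quoted in Fig. 6), not the parameters of the actual design:

```python
import math

def front_channel(Ax, Ay, Cx, Cy, theta1, theta2, n):
    """Sketch of Eqs. (1)-(4): locate point B on face2 from point A on
    face1, point C on face4, the front FOV limits (radians), and the
    refractive index n. All inputs here are illustrative placeholders."""
    # Eq. (1): direction of the bundle reflected toward virtual point O
    theta4 = math.atan((Cy - Ay) / (Cx - Ax)) \
             + math.asin((theta2 - theta1) / n)
    # Eq. (2): total optical path Ltotal = dis(C, O) + dis(C, A)
    Ltotal = math.hypot(Cx, Cy) + math.hypot(Cx - Ax, Cy - Ay)
    # Eq. (4): L = dis(A, B)
    L = (Ltotal**2 - (Ax**2 + Ay**2)) / \
        (2 * (math.cos(theta4) * Ax - math.sin(theta4) * Ay + Ltotal))
    Bx = Ax + L * math.cos(theta4)
    By = Ay - L * math.sin(theta4)
    return L, Bx, By

# Illustrative values: Ax = 16 mm, Cy = -3 mm, n = 1.6 (as in Fig. 6)
L, Bx, By = front_channel(16.0, -2.0, 6.0, -3.0,
                          math.radians(38), math.radians(80), 1.6)
```

Because Eq. (4) follows from the equal-optical-path condition of Eq. (2), a quick self-check is that L + dis(B, O) reproduces Ltotal.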

If the horizontal distance between points A and O is not large enough, the rays will intersect face3 at a low height. This causes a large optical power at face3, which may cause the surface of face3 to become twisted. On the other hand, if the horizontal distance between points A and O is too great, the aperture of face3 will become very large; in addition, the rays from the back FOV will also occupy a large area on face3. Therefore, some space should be reserved in advance at face3 when designing the front FOV channel, so that Ax can be assigned a suitable value.

The main uncertainty factors are Cx and Ay. We now determine the influence of Cx and Ay on the coordinates of point B.

By analyzing the above equations, we can determine the relationship between the given parameters. As shown in Fig. 6(a), the value of |By| is directly related to the value of Ay, and the value of |Bx − Ax| is inversely related to the value of Ay. The value of Cx has a larger influence on the coordinates of point B than does Ay.

Fig. 6 (a) The solid lines represent the value of |By| versus the value of Ay for different Cx values, and the dashed lines represent the value of |Bx − Ax| versus the value of Ay for different Cx values. (b) The values of |By| and |Bx − Ax| versus the value of (θ2 − θ1). Here, Ax = 16 mm, Cy = −3 mm, and n = 1.6.

To determine a suitable radial dimension and provide sufficient space to design the back FOV channel, we anticipate a relatively large value of |Bx − Ax| and small values of |By| and Ay. We are willing either to sacrifice some of the front FOV to achieve that goal, as shown in Fig. 6(b), or to change the values of Ay and Cx. As shown in Fig. 6(a), we first select a low value of Ay. Note that the values of |By| and |Bx − Ax| are both directly related to the value of Cx, and that tradeoffs must be made when choosing suitable design parameters, because a large value of Cx will increase the axial and radial dimensions of the PAL, while a low value of Cx will shrink the axial design space for the back FOV channel of the PAL.

From Fig. 6(a), we can see that the value of |By| is very close to the value of Ay when the value of Cx is small. In the subsequent calculations for the back FOV channel, we therefore take the value of |By| to be close to that of Ay.

3.2 Design of the back FOV channel

The design of the back FOV channel is similar to that of the front channel (see Fig. 7). The rays enter face5 at point A′, and are reflected by face6 and face7 in succession before forming an image at virtual point O and entering the common channel. To improve the utilization of the sensor surface, the unused space between the front and back FOV imaging areas should be minimized. Therefore, the value of γ should be close to, but larger than, the value of β mentioned in the design process of the front FOV channel, to avoid overlapping of the two imaging areas. The marginal points at which the rays intersect face6 are points B′ and C′, and the marginal points at which the rays intersect face7 are points D′ and E′. In order to increase the space utilization, points A′ and E′ are located at the outer margin of face2, and point C′ is located at the outer margin of face1. Thus, point E′ can be considered as a conjugate point of A′, and its coordinates are very close to those of point B, which was mentioned in the previous design process.

Fig. 7 Schematic diagram of the back FOV channel in the PAL system. The blue surfaces are those that only react to the back FOV channel.

We now extract the ray path from point A′ to point O for analysis.

We define point O as the origin of the coordinate system for calculation purposes. Angles θ1′ and θ2′ are the supplementary angles of the maximum and minimum field angles of the back FOV channel, respectively; θ3′ and θ4′ are the corresponding angles of θ1′ and θ2′ after refraction at face5; the radius of face6 is R1; and the radius of face7 is R2. The value of (θ4′ − θ3′) is approximately equal to arcsin((θ2′ − θ1′)/n) after transmission through face5, where n is the refractive index of the PAL material.

As shown in Fig. 8, because points A′ and B, and points C′ and A, are conjugate pairs, θ3′ is equal to the angle θ4 discussed in the previous design step, and (θ4′ − θ3′) is approximately equal to arcsin((θ2′ − θ1′)/n). We therefore obtain:

Fig. 8 Schematic of the ray path from point A′ to point O in the back FOV channel.

θ4′ = θ3′ + arcsin((θ2′ − θ1′)/n).   (5)

We define the distance between points A′ and C′ as L, which is identical to the value of L referred to in the previous design step, since points A′ and B, and points C′ and A, are conjugate pairs. The distance between points A′ and B′ is defined as L′. Recall that we already set the value of |By| close to the value of Ay in the previous design step. Here, the vertical heights of points C′ and E′ are quite similar, so the rays transmitted from face6 to face7 can be assumed to be parallel; thus, face6 and face7 approximate paraboloids. Considering that point E′ is very close to point B, we can determine the values of R1 and R2 from the paraboloid equations:

R1 = L cos(θ3′) + L,   (6)
R2 = Bx + sqrt(Bx^2 + By^2).   (7)

According to the sag equation sag = r^2/(2R), the sag value for R1 can be expressed as:

R1/2 − L′ cos(θ4′) = (L′ sin(θ4′))^2 / (2R1).   (8)

Upon reorganization, the value of L’ can be determined using Eq. (9).

L′ = R1(1 − cos(θ4′)) / sin^2(θ4′) = R1 / (1 + cos(θ4′)).   (9)

As can be seen in Fig. 8, Δz1 and Δz2 can be expressed as:

Δz1 = ((L sin(θ3′))^2 − (L′ sin(θ4′))^2) / (2R1),   (10)
Δz2 = ((L sin(θ3′))^2 − (L′ sin(θ4′))^2) / (2R2),   (11)

where the values of L, L′, R1, and R2 can be calculated using Eqs. (4), (9), (6), and (7), respectively. Because the horizontal distance between points E′ and C′ equals (Bx − Ax), the edge thickness of the PAL equals (Bx − Ax) − Δz1 − Δz2. In Eqs. (5)–(11), except for the FOV value, all of the remaining parameters can be derived from the previously described design process. Here, we analyze the value of Cx, which is the dominant factor in the front FOV design.
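The back-channel bookkeeping of Eqs. (5)–(11) can likewise be sketched in code; the function name and all numeric inputs below are illustrative placeholders, not the design values:

```python
import math

def edge_thickness(Ax, L, Bx, By, theta3p, theta1p, theta2p, n):
    """Sketch of Eqs. (5)-(11): edge thickness (Bx - Ax) - dz1 - dz2 of
    the PAL from the back-channel construction. Inputs (in mm and
    radians) are illustrative placeholders."""
    # Eq. (5): refraction at face5
    theta4p = theta3p + math.asin((theta2p - theta1p) / n)
    # Eqs. (6)-(7): vertex radii of the approximate paraboloids
    R1 = L * math.cos(theta3p) + L
    R2 = Bx + math.hypot(Bx, By)
    # Eq. (9): distance from O to face6 along the theta4' ray
    Lp = R1 * (1 - math.cos(theta4p)) / math.sin(theta4p) ** 2
    # Eqs. (10)-(11): sag differences on face6 and face7
    h2 = (L * math.sin(theta3p)) ** 2 - (Lp * math.sin(theta4p)) ** 2
    dz1 = h2 / (2 * R1)
    dz2 = h2 / (2 * R2)
    return (Bx - Ax) - dz1 - dz2
```

Sweeping (θ2′ − θ1′) in such a sketch qualitatively reproduces the trend of Fig. 9: the edge thickness shrinks as the back FOV widens, so Cx must be chosen to keep it positive.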

As shown in Fig. 9, the edge thickness of the PAL decreases as the size of the back FOV increases. Thus, we should control the value of Cx to avoid a negative edge thickness; however, a large value of Cx will lead to an excessive PAL volume. With this in mind, during the design process, we must weigh the relevant tradeoffs when determining the value of Cx to ensure a reasonable edge thickness.

Fig. 9 Edge thickness of the PAL system versus the value of (θ2 - θ1) for different values of Cx.

Now that the design process is complete, a suitable initial structure has been defined. The channels contained in the PAL system are shown in Fig. 10. The two channels do not overlap on the outer surface of the PAL; thus, there is no need for semi-reflective or semi-refractive surfaces, which simplifies the coating process during actual fabrication.

Fig. 10 Schematic showing the space occupied by the two channels within the PAL system.

3.3 Optimization

During the design process, we adopted even ogive surfaces in the optical design [15]. As can be seen from Section 3.2, the vertex of face6 does not lie on the optical axis; however, all the surfaces used in the design have axial and rotational symmetry, which means that only the portion of the R1 profile below the optical axis is used when constructing the surfaces. As mentioned above, an even ogive surface was used during the design process, which is defined as:

z = c·rg^2 / (1 + sqrt(1 − (1 + k)·c^2·rg^2)) + Σ_{i=1}^{4} αi·rg^(2i),   (12)

where the value of rg is:

rg = ro + sqrt(x^2 + y^2),

and c is the curvature, k is the conic constant, αi are the aspheric coefficients, and ro is the compensation value, as illustrated in Fig. 8. The vertical distance from the vertex of R1 to the optical axis is equal to ro. This type of surface expression can be applied to surfaces that require a high optical power near a portion of the optical axis. We employed an even ogive surface to construct face2 because its inner margin requires a strong ray-bending capability. However, the even ogive surface is not included among the available surfaces in the ZEMAX software library, which was the software package used during the design process. Therefore, we defined a new surface type within the ZEMAX tool and applied it during the design process [17].
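The even ogive sag is simply an even-asphere sag evaluated at the shifted radial coordinate rg; a minimal sketch (the function name and coefficient values are placeholders, and setting ro = 0 recovers the conventional even asphere):

```python
import math

def even_ogive_sag(x, y, c, k, alphas, ro):
    """Sag of the even ogive surface described above: the even-asphere
    formula evaluated at rg = ro + sqrt(x^2 + y^2), which displaces the
    surface vertex radially by ro from the optical axis. With ro = 0
    this reduces to the conventional even aspheric surface."""
    rg = ro + math.hypot(x, y)
    # conic base term
    z = c * rg**2 / (1 + math.sqrt(1 - (1 + k) * c**2 * rg**2))
    # polynomial terms alpha_i * rg^(2i)
    for i, a in enumerate(alphas, start=1):
        z += a * rg**(2 * i)
    return z
```

The only difference from a standard even asphere is the ro offset, which moves the high-power region of the profile off the optical axis, matching the behavior described above.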

Except for the special surfaces, namely, face2 and face6, all the other surfaces utilized a conventional even aspheric surface, as shown in Eq. (13).

z = c·r^2 / (1 + sqrt(1 − (1 + k)·c^2·r^2)) + Σ_{i=1}^{4} αi·r^(2i),  r = sqrt(x^2 + y^2).   (13)

We used the even ogive surface when optimizing the PAL so that the rays exit the final surface of the PAL in parallel. This made the design of the relay lenses more convenient and allowed the various PAL surfaces to remain continuous. Because the rays incident on the relay lenses were all parallel, the relay lenses could easily be designed as a typical eyepiece structure. Once suitable image quality was achieved from the relay lenses, the PAL and the relay lenses were combined.

The front PAL channel was designed to have a quasi-symmetrical form, which resulted in only minor transverse chromatic aberration. However, some chromatic aberration still remained in the front PAL system. Therefore, after combining the front PAL system and relay lenses, it was important to optimize the relay system once again to ensure continued aberration compensation.

During the PAL design process, we first used the lower orders of αi to complete the basic structure before adding higher orders of αi to realize certain capabilities, such as allowing the rays to exit the PAL lenses in parallel. For the design of the relay lenses, we first used the lower orders of αi to optimize the size of the image spot. Then, if the image quality was not acceptable, we added higher orders of αi to the lenses of the relay system.

A flowchart of the design process for the entire system is shown in Fig. 11.

Fig. 11 Design process for the PAL lenses.

4. Analysis of the design result

To validate the correctness of the proposed design method, we constructed a prototype using the parameters listed in Table 1.


Table 1. Design parameters of optical system

The final layout of the complete PAL lens system is shown in Fig. 12. The system is composed of one PAL and five relay lenses, with a total length of 71.1 mm and a rear working distance of 4.4 mm. The structure of the entire system is compact, and the two channels can be used simultaneously. The careful arrangement of the two channels ensures that they do not overlap along the outer surface.

Fig. 12 Final layout and ray trace of the panoramic lens

As can be seen in Figs. 13 and 14, good images were obtained from both channels, and all modulation transfer function (MTF) values were higher than 0.3 at 60 lp/mm. The ray fan plots of both channels are given in Figs. 15 and 16.

Fig. 13 Spot diagram of the optical system. The blue spots represent the 486-nm image spots, the grey spots represent the 587-nm image spots, and the red spots represent the 656-nm image spots.

Fig. 14 PAL MTF values: (a) MTF of the front channel, (b) MTF of the back channel. The solid lines represent the tangential MTF, the dashed lines represent the sagittal MTF, and the black lines represent the diffraction limit.

Fig. 15 Ray fan plot of the PAL front channel: (a) 38°, (b) 50°, (c) 65°, (d) 80°. The maximum vertical scale of the plots is 50 µm.

Fig. 16 Ray fan plot of the PAL back channel: (a) 104°, (b) 115°, (c) 130°, (d) 140°. The maximum vertical scale of the plots is 50 µm.

A tolerance analysis of the optical design was also performed using the ZEMAX software, taking into consideration the tolerances on surface curvature, lens thickness, tilt, and decenter.

The analysis showed that the most sensitive element is the second relay lens behind the PAL; a 0.05 mm decenter of this lens degrades the MTF at 60 lp/mm by 0.215 for the front channel and 0.098 for the back channel. The image quality is comparatively less sensitive to the other system parameters, and the degradation can be compensated by slightly adjusting the back working distance of the system.

Figure 17(a) shows the imaging relationship of the complete system. We can also see that the blind area is very small, and thus the sensor surface utilization ratio is high. Because the half FOV of the system is greater than 90°, we applied the f-θ mapping to the entire system. The distortion of the whole system is negative, and only a small central area remains obscured, as shown in Figs. 17(a) and 17(b).
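As a note on the mapping evaluation, f-θ distortion is the relative deviation of the real image height from the ideal height f·θ; a trivial sketch (the function name and the sample numbers are illustrative, not taken from the prototype data):

```python
import math

def f_theta_distortion(h_real, f, theta):
    """Relative f-theta distortion: deviation of the real image height
    h_real from the ideal height f * theta (theta in radians).
    A negative value means the real height falls short of f * theta."""
    ideal = f * theta
    return (h_real - ideal) / ideal

# Illustrative: real height 2.8 mm at a 60 deg half-field, f = 2.75 mm
d = f_theta_distortion(2.8, 2.75, math.radians(60))  # negative, i.e. barrel-type
```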

Fig. 17 (a) Imaging relationship between the FOV and Image Height. (b) Distortion.

5. Conclusion

In this paper, we proposed a new type of panoramic lens with two FOV channels. The advantage of the proposed system is that it includes an additional channel without adversely affecting the conventional PAL lens structure. The system uses one PAL lens to receive two fields of view and transmit the rays into one common aperture stop. The two channels share common relay lenses, and the final images are formed on a single image plane, which allows the complete structure to be more concise and compact. The design process was described in detail in this paper. First, we described the construction of the two FOV channels, reviewed the relationship between the two channels, and highlighted the key points that must be respected when designing the initial PAL structure. Subsequently, we described the surface types used in the design process. As discussed, we proposed an even ogive surface for the lenses, which is an extension of the ogive surface, and explained the optimization method used on the lens surfaces in the PAL. Finally, we analyzed a design prototype in detail to validate the correctness of the proposed design method. In future work, we plan to fabricate and assemble the whole system, with emphasis on its integration with an unmanned aerial vehicle to achieve effective navigation vision. This system can be used in a variety of applications, including surveillance systems and automotive inspection systems.

Funding

National Natural Science Foundation of China (NSFC) (11474037).

References and links

1. S. Thibault, “Panoramic lens applications revisited,” Proc. SPIE 7000, 70000L1 (2008).

2. I. Powell, “Design study of an infrared panoramic optical system,” Appl. Opt. 35(31), 6190–6194 (1996).

3. T. Doi, “Panoramic imaging lens,” United States patent US 6646818 B2 (2003).

4. V. A. Solomatin, “A panoramic video camera,” J. Opt. Technol. 74(12), 815–817 (2007).

5. D. Hui, M. Zhang, Z. Geng, Y. Zhang, J. Duan, A. Shi, L. Hui, Q. Fang, and Y. Liu, “Designs for high performance PAL-based imaging systems,” Appl. Opt. 51(21), 5310–5317 (2012).

6. Z. Huang, J. Bai, and X. Y. Hou, “Design of panoramic stereo imaging with single optical system,” Opt. Express 20(6), 6085–6096 (2012).

7. D. Cheng, C. Gong, C. Xu, and Y. Wang, “Design of an ultrawide angle catadioptric lens with an annularly stitched aspherical surface,” Opt. Express 24(3), 2664–2677 (2016).

8. Y. Luo, J. Bai, X. Zhou, X. Huang, Q. Liu, and Y. Yao, “Non-blind area PAL system design based on dichroic filter,” Opt. Express 24(5), 4913–4923 (2016).

9. M. Saska, “Autonomous deployment of swarms of micro-aerial vehicles in cooperative surveillance,” in International Conference on Unmanned Aircraft Systems IEEE (2014), 584–595.

10. C. B. Martin, “Design issues of a hyper-field fisheye lens,” Proc. SPIE 5524, 84–92 (2004).

11. I. Powell, “Panoramic lens,” Appl. Opt. 33(31), 7356–7361 (1994).

12. S. Niu, J. Bai, X. Y. Hou, and G. G. Yang, “Design of a panoramic annular lens with a long focal length,” Appl. Opt. 46(32), 7850–7857 (2007).

13. W. J. Smith, Modern Lens Design (McGraw-Hill, 2005).

14. Z. Huang, J. Bai, T. X. Lu, and X. Y. Hou, “Stray light analysis and suppression of panoramic annular lens,” Opt. Express 21(9), 10810–10820 (2013).

15. J. Wang, Y. Liang, and M. Xu, “Design of panoramic lens based on ogive and aspheric surface,” Opt. Express 23(15), 19489–19499 (2015).

16. V. N. Mahajan, Optical Imaging and Aberrations (SPIE Press, 2013).

17. Zemax Optical Design Program User's Guide, Zemax Development Corporation.


Cited By

OSA participates in Crossref's Cited-By Linking service. Citing articles from OSA journals and other participating publishers are listed here.

Alert me when this article is cited.


Figures (17)

Fig. 1 Principle of the proposed panoramic lenses: (a) the proposed lenses have two FOV channels, the front FOV and back FOV channels; α and β are the minimal and maximal angles of the front FOV, γ and δ are the minimal and maximal angles of the back FOV, and θ1 and θ2 are the values of the front and back FOVs, respectively. (b) The imaging plane of the panoramic lenses: the outer red ring area represents the back FOV image, the blue ring area represents the front FOV image, and the white ring area between the red and blue rings represents the blind field area between angles β and γ. To achieve a more efficient implementation, the central and ring blind spots should be as small as possible.
Fig. 2 Ray path of a conventional quasi-symmetrical PAL system.
Fig. 3 Schematic of the compound PAL system. The purple surfaces represent those used by the front FOV channel, the blue surfaces represent those used by the back FOV channel, and the green surfaces represent those used by both channels.
Fig. 4 Schematic of the optical paths in the PAL: (a) unfolded ray path of the front FOV channel; (b) unfolded ray path of the back FOV channel. The dashed line represents the optical axis of the system.
Fig. 5 Schematic diagram of the front FOV channel in the proposed PAL system: the purple surfaces represent those that react only to the rays from the front FOV channel, while the green surfaces react to the rays from the combined front and back FOV channels.
Fig. 6 (a) The solid lines represent the value of |By| versus the value of Ay for different Cx values, and the dashed lines represent the value of |Bx − Ax| versus the value of Ay for different Cx values. (b) The values of |By| and |Bx − Ax| versus the value of (θ2 − θ1). Given that the value of Ax is 16 mm, the value of Cy is −3 mm, and the value of n is 1.6.
Fig. 7 Schematic diagram of the back FOV channel in the PAL system. The blue surfaces are those that react only to the back FOV channel.
Fig. 8 Schematic of the ray path from point A to point O in the back FOV channel.
Fig. 9 Edge thickness of the PAL system versus the value of (θ2 − θ1) for different values of Cx.
Fig. 10 Schematic showing the space occupied by the two channels within the PAL system.
Fig. 11 Design process for the PAL lenses.
Fig. 12 Final layout and ray trace of the panoramic lens.
Fig. 13 Spot diagram of the optical system. The blue spots represent the 486-nm image spots, the grey spots represent the 587-nm image spots, and the red spots represent the 656-nm image spots.
Fig. 14 PAL MTF values: (a) MTF of the front channel; (b) MTF of the back channel. The solid lines represent the tangential MTF, the dashed lines represent the sagittal MTF, and the black lines represent the diffraction limit.
Fig. 15 Ray fan plot of the PAL front channel: (a) 38°, (b) 50°, (c) 65°, (d) 80°. The maximum vertical scale for the plots is 50 µm.
Fig. 16 Ray fan plot of the PAL back channel: (a) 104°, (b) 115°, (c) 130°, (d) 140°. The maximum vertical scale for the plots is 50 µm.
Fig. 17 (a) Imaging relationship between the FOV and image height. (b) Distortion.

Tables (1)

Table 1 Design parameters of the optical system

Equations (14)


\theta_4 = \arctan\!\left(\frac{C_y - A_y}{C_x - A_x}\right) + \arcsin\!\left(\frac{\theta_2 - \theta_1}{n}\right) \qquad (1)
L + \mathrm{dis}(B, O) = \mathrm{dis}(C, O) + \mathrm{dis}(C, A) = L_{\mathrm{total}} \qquad (2)
L + \sqrt{(L\cos\theta_4 + A_x)^2 + (L\sin\theta_4 - A_y)^2} = \sqrt{C_x^2 + C_y^2} + \sqrt{(C_x - A_x)^2 + (C_y - A_y)^2} = L_{\mathrm{total}} \qquad (3)
L = \frac{L_{\mathrm{total}}^2 - (A_x^2 + A_y^2)}{2\,(\cos\theta_4\, A_x - \sin\theta_4\, A_y + L_{\mathrm{total}})} \qquad (4)
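As a quick numerical sanity check, the closed-form expression for L above can be substituted back into the total-path-length condition it was derived from. A minimal Python sketch follows; Ax = 16 mm and Cy = −3 mm are taken from Fig. 6, while Ay, Cx, and the field angles are assumed for illustration only.

```python
import math

# Hypothetical point coordinates (mm): Ax = 16 and Cy = -3 follow Fig. 6;
# Ay, Cx, and the field angles below are assumed for illustration only.
Ax, Ay = 16.0, 10.0
Cx, Cy = 2.0, -3.0
n = 1.6                                   # refractive index of the PAL glass
theta1, theta2 = math.radians(38.0), math.radians(80.0)

# Direction of the ray inside the lens (first equation of the derivation).
theta4 = math.atan((Cy - Ay) / (Cx - Ax)) + math.asin((theta2 - theta1) / n)

# Total path length: dis(C, O) + dis(C, A).
L_total = math.hypot(Cx, Cy) + math.hypot(Cx - Ax, Cy - Ay)

# Closed-form solution for L.
L = (L_total**2 - (Ax**2 + Ay**2)) / (
    2 * (math.cos(theta4) * Ax - math.sin(theta4) * Ay + L_total))

# Substituting L back into the path-length condition should recover L_total.
residual = L + math.hypot(L * math.cos(theta4) + Ax,
                          L * math.sin(theta4) - Ay) - L_total
print(f"L = {L:.4f} mm, residual = {residual:.2e}")
```

The residual is zero to machine precision, confirming that the closed form is an exact rearrangement of the path-length condition rather than an approximation.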
\theta_4' = \theta_3' + \arcsin\!\left(\frac{\theta_2' - \theta_1'}{n}\right) \qquad (5)
R_1 = L\cos\theta_3' + L \qquad (6)
R_2 = B_x + \sqrt{B_x^2 + B_y^2} \qquad (7)
\frac{R_1}{2} + L'\cos\theta_4' = \frac{(L'\sin\theta_4')^2}{2 R_1} \qquad (8)
L' = \frac{\cos\theta_4' + 1}{(\sin\theta_4')^2}\, R_1 \qquad (9)
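The closed-form solution for L' above can likewise be substituted back into the quadratic sag condition it solves. A short Python check under assumed values of R1 and θ4' (both hypothetical, not from the paper's design data):

```python
import math

# Assumed values for illustration: surface radius R1 (mm) and the
# in-lens ray angle theta4p (rad); neither is from the paper's data.
R1 = 30.0
theta4p = 0.9

# Closed-form intersection length L' = (cos(t) + 1) / sin(t)^2 * R1.
Lp = (math.cos(theta4p) + 1.0) / math.sin(theta4p) ** 2 * R1

# Substitute back into the quadratic sag condition:
#   R1/2 + L' cos(t) = (L' sin(t))^2 / (2 R1)
lhs = R1 / 2 + Lp * math.cos(theta4p)
rhs = (Lp * math.sin(theta4p)) ** 2 / (2 * R1)
print(f"L' = {Lp:.4f} mm, |lhs - rhs| = {abs(lhs - rhs):.2e}")
```

Note that (cos t + 1)/sin²t simplifies to 1/(1 − cos t), so the positive root of the quadratic is recovered exactly for any angle in (0, π).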
\Delta z_1 = \frac{1}{2 R_1}\left((L\sin\theta_3')^2 - (L'\sin\theta_4')^2\right) \qquad (10)
\Delta z_2 = \frac{1}{2 R_2}\left((L\sin\theta_3')^2 - (L'\sin\theta_4')^2\right) \qquad (11)
z = \frac{c\, r_g^2}{1 + \sqrt{1 - (1+k)\, c^2 r_g^2}} + \sum_{i=1}^{4} \alpha_i\, r_g^{2i} \qquad (12)
r_g = r_o + \sqrt{x^2 + y^2} \qquad (13)
z = \frac{c\, r^2}{1 + \sqrt{1 - (1+k)\, c^2 r^2}} + \sum_{i=1}^{4} \alpha_i\, r^{2i}, \qquad r = \sqrt{x^2 + y^2} \qquad (14)
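The even-asphere sag expression above (conic base term plus polynomial terms up to 8th order) is straightforward to evaluate. A minimal Python sketch follows, with the radius of curvature assumed for illustration; with k = 0 and all polynomial coefficients zero, the formula reduces to the exact spherical sag R − √(R² − r²), which gives a convenient check.

```python
import math

def asphere_sag(r, c, k, alphas):
    """Sag of an even asphere: conic base term plus polynomial terms
    alpha_i * r^(2i) for i = 1..len(alphas)."""
    z = c * r**2 / (1 + math.sqrt(1 - (1 + k) * c**2 * r**2))
    z += sum(a * r ** (2 * (i + 1)) for i, a in enumerate(alphas))
    return z

# Sanity check with an assumed radius: k = 0 and zero coefficients
# reduce the formula to the exact spherical sag R - sqrt(R^2 - r^2).
R = 50.0                                   # assumed radius of curvature, mm
z = asphere_sag(1.0, c=1.0 / R, k=0.0, alphas=[0.0, 0.0, 0.0, 0.0])
z_sphere = R - math.sqrt(R**2 - 1.0**2)
print(abs(z - z_sphere))
```

The annular version of the surface differs only in using the shifted radial coordinate r_g = r_o + √(x² + y²) in place of r, so the same function applies after that substitution.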
