Abstract

Integral imaging systems are imaging devices that provide 3D images of 3D objects. When integral imaging systems work in their standard configuration, the reconstructed images are pseudoscopic, that is, reversed in depth. In this paper we present, for the first time to our knowledge, a technique for the formation of real, undistorted, orthoscopic integral images by direct pickup. The technique is based on a smart mapping of the pixels of an elemental-image set. Simulated imaging experiments are presented to support the proposal.

©2005 Optical Society of America

1. Introduction

Integral imaging (InI) is a three-dimensional (3D) imaging technique that works with incoherent light and provides auto-stereoscopic images without the need for special glasses. InI is based on integral photography (IP), which was proposed by Lippmann in 1908 [1]. In the past few years the technology has approached the level required to apply the IP concept to 3D TV and display [2,3]. In the capture stage of an InI system, an array of microlenses forms a collection of planar micro-images onto a CCD. These micro-images will be referred to hereafter as elemental images. In the reconstruction stage the set of elemental images is displayed in front of another microlens array, providing the observer with a reconstructed 3D image that is seen with horizontal and vertical parallax.

Since its birth, InI has satisfactorily tackled many of its challenges. Noteworthy in this sense are the methods proposed to improve the spatial resolution [4–8], to enhance the limited depth of field [9–12], or to expand the viewing area [13]. One of the main problems in InI is the overlapping between adjacent elemental images in the pickup stage. To overcome this drawback, the use of gradient-index microlenses [14] or of a barrier array [15] has been proposed. Moreover, InI systems have proven useful not only for pure 3D imaging but also for other applications, such as object recognition [16,17], the acquisition of 3D maps of polarization distributions [18], or the provision of the input signal for electro-floating display systems [19].

As is well known, in their standard configuration InI systems provide the observer with real, pseudoscopic images, that is, with a 3D reconstruction that is reversed in depth. This problem has attracted many research efforts but, to our knowledge, has not yet been solved. As we explain in detail in the following section, many different techniques have been proposed to overcome this drawback. The optimum solution is a pseudoscopic-to-orthoscopic (PO) conversion. At present, however, no technique has been reported that allows the undistorted reconstruction of real, orthoscopic images after a single recording stage. This is precisely the aim of our proposal. Specifically, we show that after a single-step pickup process and a smart pixel mapping in the display stage, undistorted, real, orthoscopic integral images can be obtained, thus avoiding the resolution deterioration inherent in two-step recording processes.

2. Review of pseudoscopic-to-orthoscopic conversion techniques

We start by considering the standard configuration of an InI system, as shown in Fig. 1. In the pickup stage the system is adjusted so that a representative plane of the object neighborhood, referred to here as the reference object plane, and the CCD plane are conjugated through the microlenses. Distances d and g are then related through the lens law, 1/d + 1/g = 1/f, f being the focal length of the lenslets. A collection of 2D elemental images, each with a different perspective of the 3D object, is thus generated onto the pickup device. In the reconstruction process the recorded elemental images are displayed by an optical device (such as an LCD) placed in front of another microlens array. The LCD and the microlens array are adjusted so that the gap, g_r, equals the pickup gap, g. As shown in the figure, this architecture permits the reconstruction of a real but pseudoscopic image at a distance d_r = d. The first attempt to overcome this problem was made by Ives [20] who, in the context of IP, proposed to record a second set of elemental images by using the reconstructed image as the object. As we show in Fig. 2, when this second set of elemental images is used in a second reconstruction stage, a real, undistorted, orthoscopic 3D image is reconstructed. This proposal does not constitute an effective solution to the PO conversion problem, because the two-step recording process leads to important image degradation due to diffraction effects and the pixelated structure of the CCD and the LCD [21].
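To make this geometry concrete, the following minimal Python sketch (illustrative only; the numerical values are simply those of the experiment in Section 4, and the function name is a choice made here) evaluates the lens-law gap and the standard display condition g_r = g, d_r = d:

```python
# Minimal sketch of the standard InI pickup/display geometry (illustrative only).

def pickup_gap(d: float, f: float) -> float:
    """Gap g between the microlenses and the CCD, from the lens law 1/d + 1/g = 1/f."""
    return d * f / (d - f)

d = 102.5            # distance from the reference object plane to the lenslet array (mm)
f = 5.0              # lenslet focal length (mm)
g = pickup_gap(d, f)

g_r = g              # standard display: same gap as in the pickup
d_r = d              # the real, depth-reversed (pseudoscopic) image appears at d_r = d

print(f"pickup gap g = {g:.3f} mm, pseudoscopic image at d_r = {d_r:.1f} mm")
```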

Fig. 1. Scheme of a standard InI system. In the pickup stage a set of 2D elemental images is recorded on the CCD. In the display stage a real, depth-reversed image is provided to the observer.

Fig. 2. PO conversion method proposed by Ives. The reconstructed image is used as the object for a second pickup. The second display provides the observer with a real, undistorted orthoscopic image.

A different approach was suggested by Davies et al. [22], who proposed the use of an auto-collimating transmission screen placed between the object and the InI microlenses. The auto-collimating screen has the property of bending any incoming ray so that the incident angle equals the emerging angle, thus producing a depth-reversed, undistorted image of the 3D object (see Fig. 3). When this image is used as the object for the pickup stage of an InI system, the reconstruction system provides a virtual, undistorted, orthoscopic 3D image. Although this scheme does not require two recording stages, it still suffers from some drawbacks. On the one hand, the auto-collimating screen enhances the detrimental effect of diffraction on resolution. On the other hand, the reconstructed image is virtual, whereas the formation of a real image of the object is often preferable, because such a floating image provides the observer with an impressive sensation of depth, since it appears to be located in the free space near the observer [23].

Fig. 3. The auto-collimating screen produces undistorted, depth-reversed images of 3D objects.

A very smart and simple scheme was suggested by Okano and co-workers [2]. They proposed to capture the elemental images with the standard pickup architecture and then to rotate each elemental image by 180° about the center of its elemental cell. Given the pixelated structure of the elemental images, this operation simply amounts to a local pixel mapping. As we show in Fig. 4, when these rotated elemental images are displayed at a gap g_v = g − 2f²/(d − f) [24], a virtual, undistorted, orthoscopic image is obtained at a distance d_v = d − 2f from the lenslet array. Note that to obtain a reconstructed virtual image a double condition must hold. On the one hand, each microlens must produce in the reference plane a virtual image of its corresponding elemental image. On the other hand, the set of ray cones corresponding to the same point of the object must intersect at the same virtual point of the reconstructed image, as shown in the figure. Although in this scheme there is no image degradation due to the introduction of additional elements or stages, it still has the drawback that the reconstructed image is virtual.
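A compact NumPy sketch of this rotation and of the display geometry quoted above is given below; the [cell_row, cell_col, pixel_row, pixel_col] array layout and the function names are assumptions made here for illustration, not part of the original method:

```python
import numpy as np

def okano_rotate(elemental: np.ndarray) -> np.ndarray:
    """Rotate every elemental image by 180 degrees about the center of its cell.

    `elemental` is assumed to be stored as a 4D array indexed as
    [cell_row, cell_col, pixel_row, pixel_col]; flipping both pixel axes
    rotates each cell by 180 degrees without mixing pixels between cells.
    """
    return elemental[:, :, ::-1, ::-1]

def okano_display_geometry(d: float, g: float, f: float) -> tuple[float, float]:
    """Display gap g_v = g - 2f^2/(d - f) and virtual-image distance d_v = d - 2f."""
    return g - 2.0 * f**2 / (d - f), d - 2.0 * f
```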

Fig. 4. Schematic drawing of the orthoscopic, virtual reconstruction.

Other methods have been proposed for the PO conversion, either in a transmission architecture [25] or by using micro-convex mirrors [26]. Finally, let us concentrate on the method suggested by Jang and Javidi [27]. Their scheme is similar to the one proposed by Davies et al., but the auto-collimating device is replaced by a converging lens. As shown in Fig. 5, in the pickup the elemental images of a depth-reversed version of the object are recorded at a gap g_v = df/(d + f). In the display the elemental images are rotated by 180° and the gap is shifted to g_r = df/(d − f). Then a real, orthoscopic image of the 3D object is reconstructed. However, due to the non-constant lateral magnification of the converging lens, the reconstructed image appears clearly distorted.
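For completeness, a trivial helper evaluating both gaps as quoted above (a sketch only; the interpretation of d as the distance of the depth-reversed intermediate image from the lenslet array is an assumption made here):

```python
def jang_javidi_gaps(d: float, f: float) -> tuple[float, float]:
    """Pickup gap g_v = d*f/(d + f) and display gap g_r = d*f/(d - f)."""
    return d * f / (d + f), d * f / (d - f)
```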

Fig. 5. This technique provides real, orthoscopic but distorted 3D images.

3. PO conversion by smart pixel mapping

Among the techniques analyzed in the previous section, the only one that satisfies the requirement of providing real, orthoscopic 3D images is the one proposed by Ives. However, the practical application of this technique is not very advantageous, because the two-step recording/displaying process produces a strong degradation of the image. What we suggest here is to make use of Ives' PO concept but, in order to avoid the disadvantages of the two-step process, to perform the intermediate stage (Display 1 + Pickup 2) by a digital procedure. On the basis of the scheme shown in Fig. 6, we explain the digital PO procedure for the 1D case; the extension to the 2D case is straightforward. Since the optically acquired set of elemental images (Set 1) has a pixelated structure, the conversion to the second set of elemental images (Set 2) simply consists of a pixel mapping. To avoid typical aliasing problems in the pixel mapping, it is necessary to assume that the following relations are satisfied:

N = mM and M = 2d/g,

where N stands for the number of pixels per elemental cell, M is the number of elemental images, and m is a positive integer. Since there is no loss of energy in the digital pickup, we perform the digital Pickup 2 through a virtual pinhole array, which simplifies the process.
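As a sketch, a small helper that verifies these sampling relations before the mapping is applied; the function name, signature, and tolerance are choices made here, not part of the original procedure:

```python
def check_mapping_conditions(N: int, M: int, d: float, g: float,
                             rel_tol: float = 1e-6) -> int:
    """Check N = m*M (m a positive integer) and M = 2*d/g; return m if both hold."""
    if M <= 0 or N <= 0 or N % M != 0:
        raise ValueError("N must be a positive integer multiple of M")
    if abs(M - 2.0 * d / g) > rel_tol * M:
        raise ValueError("geometry does not satisfy M = 2d/g")
    return N // M

# Example with the values of the numerical experiment in Section 4:
# N = M = 39, d = 102.5 mm, f = 5 mm, so g = d*f/(d - f) and 2d/g = 39.
m = check_mapping_conditions(N=39, M=39, d=102.5, g=102.5 * 5.0 / 97.5)
```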

Fig. 6. Scheme of the PO digital conversion method.

In Fig. 6 we show a scheme of the procedure for the case N = M = 5. Let us concentrate on the formation of the central micro-image of Set 2. In Pickup 1, a set of 5 elemental images is optically recorded on the CCD. Each elemental image records a different perspective of the object, and all the pixels of each cell are impressed. When, in Display 1, these elemental images are viewed through the central pinhole, a faceting effect occurs: a different portion of the reconstructed image is seen through each microlens [24]. Through the upper microlens the central pinhole can only see the “blue” part of the reconstructed image, which was indeed recorded in the “blue” pixel of the upper micro-cell. In other words, the blue part of the reconstructed image is stored in pixel 1 of cell 1, that is, in the element O_{1,1} of Set 1. Here the first index refers to the cell and the second to the pixel within the cell. Similarly, the “black” region of the reconstructed image (as seen by the central pinhole) is stored in the element O_{2,2}, and so on. Then, as we see in the scheme, the collection of pixels that constitutes the central elemental image of Set 2 is obtained through the mapping D_{3,j} = O_{6−j,6−j}. A similar process occurs when the reconstructed image is seen through the other pinholes. We can therefore conclude that the mapping, to which we will refer as the smart mapping, that provides Set 2 of elemental images (and thus permits, after the optical Display 2, the reconstruction of a real, undistorted, orthoscopic image) can be expressed as

D_{i,j} = O_{k,l}, where k = i + (M+1)/2 − j and l = (M+1) − j.

In this equation we have assumed that the number of microlenses is odd; in the case of an even number of lenses a similar equation holds. Besides, if for a given value of j the corresponding k < 1 or k > M, the pixel D_{i,j} is set to zero.
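The following NumPy sketch implements this smart mapping for the case N = M (m = 1), which is the case of the example of Fig. 6 and of the numerical experiment in Section 4; the 2D version is obtained by applying the 1D rule independently to the row and column indices. The 4D array layout (M × M cells of M × M pixels) and all function names are choices made here for illustration:

```python
import numpy as np

def smart_map_1d(M: int):
    """1D smart mapping D[i, j] = O[k, l], with k = i + (M+1)/2 - j and l = (M+1) - j.

    Assumes an odd number M of microlenses and N = M pixels per cell (m = 1).
    Returns 0-based source-cell indices k (shape M x M), source-pixel indices l
    (shape M), and a mask marking display pixels whose source cell falls outside
    the array (those pixels are set to zero).
    """
    if M % 2 == 0:
        raise ValueError("this sketch assumes an odd number of microlenses")
    i = np.arange(1, M + 1)[:, None]      # display cell index, 1..M
    j = np.arange(1, M + 1)[None, :]      # display pixel index, 1..M
    k = i + (M + 1) // 2 - j              # source cell (1-based)
    l = (M + 1) - j[0]                    # source pixel (1-based)
    valid = (k >= 1) & (k <= M)
    return k - 1, l - 1, valid            # convert to 0-based indices

def smart_mapping(O: np.ndarray) -> np.ndarray:
    """Apply the smart pixel mapping to a set of elemental images.

    O is assumed to be indexed as [cell_row, cell_col, pixel_row, pixel_col],
    with shape (M, M, M, M), i.e. N = M along each direction.
    """
    M = O.shape[0]
    k, l, valid = smart_map_1d(M)
    D = np.zeros_like(O)
    for ir in range(M):                   # display cell row
        for ic in range(M):               # display cell column
            for jr in range(M):           # display pixel row within the cell
                for jc in range(M):       # display pixel column within the cell
                    if valid[ir, jr] and valid[ic, jc]:
                        D[ir, ic, jr, jc] = O[k[ir, jr], k[ic, jc], l[jr], l[jc]]
    return D

# Sanity check against the worked example of Fig. 6 (M = 5): for the central
# display cell (i = 3, 1-based) the mapping must give D_{3,j} = O_{6-j,6-j}.
k, l, valid = smart_map_1d(5)
pairs = [(kk + 1, ll + 1) for kk, ll in zip(k[2], l)]
assert pairs == [(5, 5), (4, 4), (3, 3), (2, 2), (1, 1)]
```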

4. The numerical experiment

To illustrate our approach we have performed a numerical experiment. In the simulation, Pickup 1 is performed by a microlens array composed of 39×39 square microlenses of 1 mm × 1 mm in size and focal length f = 5 mm (see Fig. 7). Each elemental cell consists of 39×39 pixels. We calculated the set of elemental images of a scene composed of two letters placed at different depths. The set of elemental images obtained with this setup, together with the elemental images obtained after the smart pixel mapping, are shown in Fig. 8. We have also simulated the reconstruction stage. In our calculations the observer is at a distance D = 500 mm from the microlens array and sees the reconstructed pseudoscopic (Fig. 9(a)) or orthoscopic (Fig. 9(b)) image. We have simulated a lateral displacement of the observer from x = −25 mm to x = +25 mm, where x = 0 corresponds to the observer's eye centered on the optical axis of the central microlens. Note how the smart pixel mapping allows the orthoscopic reconstruction without the need for an additional second recording stage.
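As a quick consistency check (a sketch, not part of the simulation code), the geometry of Fig. 7 satisfies the sampling relations of Section 3: with d = 102.5 mm and f = 5 mm the pickup gap is g = df/(d − f) ≈ 5.26 mm, so that 2d/g = 39 = M = N, i.e. m = 1:

```python
d, f = 102.5, 5.0                  # reference-plane distance and focal length (mm), from Fig. 7
g = d * f / (d - f)                # pickup gap from the lens law, about 5.256 mm
M, N = 39, 39                      # microlenses per side and pixels per elemental cell

assert abs(2 * d / g - M) < 1e-9   # the relation M = 2d/g holds for this geometry
assert N % M == 0 and N // M == 1  # N = mM with m = 1
```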

Fig. 7. Scheme, not to scale, of the pickup numerical experiment. The CCD is adjusted so that the reference object plane is at 102.5 mm. The letters are separated by 60 mm in depth.

Fig. 8. (a) Collection of 39 × 39 elemental images obtained with the setup of the previous figure; (b) elemental images obtained after applying the smart pixel mapping.

Fig. 9. (a) Real, pseudoscopic image calculated from the elemental images in Fig. 8(a); (b) real, orthoscopic image calculated from the elemental images in Fig. 8(b). In both reconstructions the observer is placed at 500 mm from the microlenses and laterally displaces the eye from left to right, from x = −25 mm to x = +25 mm. (Video file of 0.27 MB.)

5. Conclusions

We have presented, for the first time to our knowledge, a technique for the formation of real, undistorted, orthoscopic integral images by direct pickup. The technique is simply based on a smart mapping of the pixels of an elemental-image set and therefore does not require any additional reconstruction/recording stage. The pixel mapping obeys a very simple formula, so it can be applied in real time if required for video applications. We have performed a numerical experiment showing that our technique provides the desired reconstruction without any degradation of resolution.

Acknowledgments

This work has been funded in part by grant DPI2003-4698, Ministerio de Ciencia y Tecnología, Spain. R. Martínez-Cuenca acknowledges funding from the Universitat de València (Cinc Segles grant).

References and Links

1. M. G. Lippmann, “Épreuves réversibles donnant la sensation du relief,” J. Phys. (Paris) 7, 821–825 (1908).

2. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36, 1598–1603 (1997).

3. H. Arimoto and B. Javidi, “Integral three-dimensional imaging with digital reconstruction,” Opt. Lett. 26, 157–159 (2001).

4. L. Erdman and K. J. Gabriel, “High resolution digital photography by use of a scanning microlens array,” Appl. Opt. 40, 5592–5599 (2001).

5. S. Kishk and B. Javidi, “Improved resolution 3D object sensing and recognition using time multiplexed computational integral imaging,” Opt. Express 11, 3528–3541 (2003), http://www.opticsexpress.org/abstract.cfm?URI=OPEX-11-26-3528.

6. A. Stern and B. Javidi, “Three-dimensional image sensing and reconstruction with time-division multiplexed computational integral imaging,” Appl. Opt. 42, 7036–7042 (2003).

7. J.-S. Jang and B. Javidi, “Three-dimensional synthetic aperture integral imaging,” Opt. Lett. 27, 1144–1146 (2002).

8. J.-S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. 27, 324–326 (2002).

9. J.-S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28, 1924–1926 (2003).

10. M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, “Integral imaging with improved depth of field by use of amplitude modulated microlens array,” Appl. Opt. 43(31) (2004).

11. R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “Enhanced depth of field integral imaging with sensor resolution constraints,” Opt. Express 12, 5237–5242 (2004).

12. M. Hain, W. von Spiegel, M. Schmiedchen, T. Tschudi, and B. Javidi, “3D integral imaging using diffractive Fresnel lens array,” Opt. Express 13, 315–326 (2005).

13. Y. Kim, J.-H. Park, S.-W. Min, S. Jung, H. Choi, and B. Lee, “Wide-viewing-angle integral three-dimensional imaging system by curving a screen and a lens array,” Appl. Opt. 44, 546–552 (2005).

14. J. Arai, F. Okano, H. Hoshino, and I. Yuyama, “Gradient-index lens-array method based on real-time integral photography for three-dimensional images,” Appl. Opt. 37, 2035–2045 (1998).

15. H. Choi, S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Multiple-viewing-zone integral imaging using dynamic barrier array for three-dimensional displays,” Opt. Express 11, 927–932 (2003).

16. Y. Frauel, O. Matoba, E. Tajahuerce, and B. Javidi, “Comparison of passive ranging integral imaging and active imaging digital holography for 3D object recognition,” Appl. Opt. 43, 452–462 (2004).

17. S. Yeom and B. Javidi, “Three-dimensional distortion-tolerant object recognition using integral imaging,” Opt. Express 12, 5795–5809 (2004).

18. O. Matoba and B. Javidi, “Three-dimensional polarimetric integral imaging,” Opt. Lett. 29, 2375–2377 (2004).

19. S.-W. Min, M. Hahn, J. Kim, and B. Lee, “Three-dimensional electro-floating display system using an integral imaging method,” Opt. Express 13, 4358–4369 (2005).

20. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. 21, 171–176 (1931).

21. T. Okoshi, “Three-dimensional displays,” Proc. IEEE 68, 548–564 (1980).

22. N. Davies, M. McCormick, and L. Yang, “Three-dimensional imaging systems: a new development,” Appl. Opt. 27, 4520–4528 (1988).

23. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Appl. Opt. 40, 5217–5232 (2001).

24. M. Martínez-Corral, B. Javidi, R. Martínez-Cuenca, and G. Saavedra, “Multifacet structure of observed reconstructed integral images,” J. Opt. Soc. Am. A 22, 597–603 (2005).

25. J.-S. Jang and B. Javidi, “Two-step integral imaging for orthoscopic three-dimensional imaging with improved viewing resolution,” Opt. Eng. 41, 2568–2571 (2002).

26. J.-S. Jang and B. Javidi, “Three-dimensional projection integral imaging using micro-convex-mirror arrays,” Opt. Express 12, 1077–1083 (2004).

27. J.-S. Jang and B. Javidi, “Formation of orthoscopic three-dimensional real images in direct pickup one-step integral imaging,” Opt. Eng. 42, 1869–1870 (2003).
