Abstract

With recent advances in visualization devices, the market for stereoscopic content is growing. To convey 3D content on stereoscopic displays, at least two views of the video content must be transmitted and displayed. This has profound implications for the resources required to transmit the content, as well as for the complexity of the visualization system. Stereoscopic images are known to be redundant, which may prove useful for compression and may simplify the design of the visualization device. In this paper we describe an experimental evaluation of data redundancy in color stereoscopic images. In experiments with computer-generated and real-life test stereo images, several observers visually tested the stereopsis threshold and the accuracy of parallax measurement in anaglyphs and stereograms as functions of the degree of blur of one of the two stereo images. In addition, we tested the color saturation threshold in one of the two stereo images for which full-color 3D perception with no visible color degradation was maintained. The experiments support a theoretical estimate that, to the data required to reproduce one of the two stereoscopic images, one has to add only a few percent of that amount of data in order to achieve stereoscopic perception.
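The blur experiments described above reduce one image of the stereo pair to a fraction of its base band. As a rough illustration of this kind of band limitation, the sketch below (assuming numpy; the function name and the ideal-filter choice are illustrative, not the paper's exact implementation) applies an ideal 2-D low-pass filter in the DFT domain:

```python
import numpy as np

def ideal_lowpass(image, fraction=0.25):
    """Keep only the lowest `fraction` of the image base band
    (an ideal 2-D low-pass filter applied in the DFT domain)."""
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    mask = np.zeros_like(f)
    ch, cw = h // 2, w // 2
    rh, rw = int(h * fraction / 2), int(w * fraction / 2)
    # Pass band: a centered rectangle covering `fraction` of each axis
    mask[ch - rh:ch + rh, cw - rw:cw + rw] = 1.0
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

# Example: blur one image of a stereo pair to 1/4 of its base band
right_view = np.random.rand(64, 64)
right_blurred = ideal_lowpass(right_view, fraction=0.25)
```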

© 2005 Optical Society of America


References


  1. B. Blundell and A. Schwarz, Volumetric Three-Dimensional Display Systems (J. Wiley & Sons, New York, 2000).
  2. M. Halle, "Autostereoscopic displays and computer graphics," Computer Graphics, 58-62 (1997).
  3. K. Choi, H. Kim, and B. Lee, "Full-color autostereoscopic 3D display system using color-dispersion-compensated synthetic phase holograms," Opt. Express 12, 5229-5236 (2004).
  4. G. Mather and D. R. R. Smith, "Depth cue integration: stereopsis and image blur," Vision Research 40, 3501-3506 (2000).
  5. L. Yaroslavsky, Digital Holography and Digital Image Processing (Kluwer Academic Publishers, Boston, 2004).
  6. B. Julesz, Foundations of Cyclopean Perception (The University of Chicago Press, 1971), p. 96.
  7. L. P. Yaroslavsky, "On redundancy of stereoscopic pictures," Acta Polytechnica Scandinavica, no. 149, Image Science '85 Proceedings, Helsinki, Finland, 11-14 June 1985, Vol. 1, pp. 82-85.
  8. L. P. Yaroslavsky, "The theory of optimal methods for localization of objects in pictures," in Progress in Optics, E. Wolf, ed., Vol. XXXII (Elsevier Science Publishers, Amsterdam, 1993).
  9. S. Westland and C. Ripamonti, "Characterization of computer displays," in Computational Colour Science Using MATLAB (John Wiley & Sons, 2004).




Figures (10)

Fig. 1.

Test images used in the experiments (a-h). Image (h) exemplifies low-pass filtering to 1/4 of the image base band. Note that when images (g) and (h) are fused using a stereoscope or by crossing the eyes, a sharp 3-D image can be seen.

Fig. 2.

Depth maps used in the experiments

Fig. 3.

Examples of computer-generated random-dot patterns used in the experiments: (a), (b) right and left images for viewing with a stereoscope or crossed eyes; (c) anaglyph (left eye: red, right eye: blue). The right images exemplify low-pass filtering to 1/4 of the image base band. Note that when the images are fused using a stereoscope or crossed eyes and, correspondingly, red-blue glasses, a 3-D test circle target can be seen without visible loss of image resolution.

Fig. 4.

Root-mean-square error of parallax measurements (in microns) as a function of the decimation factor, measured on 31 randomly selected fragments of a training stereo air photograph analyzed by a professional human operator.

Fig. 5.

Stereopsis threshold as a function of image blur: (a) test image "random dots," ideal low-pass filter, 10 experiments (dots) with one viewer, with data average (red) and median (blue); (b) test image "color random patches," low-pass filter, 3 viewers (dots), with data average (red) and median (blue); (c) test image "random dots," Haar wavelet low-pass filter; (d) test image "random dots," Daubechies-1 wavelet low-pass filter. In (c) and (d), blue curves represent mean values of the stereopsis threshold measurements and green curves represent the standard deviation of the measurements. One pixel corresponds to 0.87 mrad.

Fig. 6, a-b.

Mean square of 3-D target localization error as a function of image blur using the approximated low-pass filter. Error bars indicate the standard deviation of the measurements.

Fig. 6, e-f.

Mean square of 3-D target localization error as a function of image blur using the approximated low-pass filter. Error bars indicate the standard deviation of the measurements. For stereoscopic images one pixel corresponds to 0.87 mrad; for anaglyphs, to 0.588 mrad.

Fig. 7.

Mean square of 3-D target localization error as a function of image blur using the Daubechies-1 wavelet low-pass filter, depth = 4: (a) all data; (b) data for the first three scales. Error bars indicate the standard deviation of the measurements. One pixel corresponds to 0.87 mrad.

Fig. 8.

An example of color stereo images with one image of low saturation.

Fig. 9.

Results on the color saturation threshold (left: random patches image, Fig. 1(f); right: real-life image, Figs. 1(g) and 1(h)).

Tables (2)


Table 1. Measured tristimulus values (X, Y, Z) obtained for different digital values (d_r, d_g, d_b) sent to the monitor


Table 2. Gamma and Gain values obtained for the different RGB channels

Equations (5)


$$R_{\mathrm{mod}} = \overline{RGB} + g\,(R - \overline{RGB}), \qquad G_{\mathrm{mod}} = \overline{RGB} + g\,(G - \overline{RGB}), \qquad B_{\mathrm{mod}} = \overline{RGB} + g\,(B - \overline{RGB})$$
$$\overline{RGB} = (R + G + B)/3$$
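The saturation control above scales each channel's deviation from the per-pixel mean by a factor g. A minimal sketch in numpy (the function name is illustrative; g = 1 leaves the image unchanged, g = 0 yields a grayscale image):

```python
import numpy as np

def adjust_saturation(rgb, g):
    """Scale the color saturation of an RGB image by factor g.

    Implements R_mod = mean + g*(R - mean) per channel, where
    mean = (R + G + B)/3 is computed per pixel.
    rgb: float array of shape (H, W, 3).
    """
    mean = rgb.mean(axis=-1, keepdims=True)  # \overline{RGB}
    return mean + g * (rgb - mean)

# Example: reduce the saturation of one stereo-pair image to 25%
img = np.random.rand(4, 4, 3)
low_sat = adjust_saturation(img, 0.25)
```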
$$R = \left(a_r d_r + (1 - a_r)\right)^{\gamma_r}, \qquad G = \left(a_g d_g + (1 - a_g)\right)^{\gamma_g}, \qquad B = \left(a_b d_b + (1 - a_b)\right)^{\gamma_b}$$
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} X_{r,\max} & X_{g,\max} & X_{b,\max} \\ Y_{r,\max} & Y_{g,\max} & Y_{b,\max} \\ Z_{r,\max} & Z_{g,\max} & Z_{b,\max} \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$
$$e_r = \sum_k \left| R^{(k)} - \left(a_r d_r^{(k)} + (1 - a_r)\right)^{\gamma_r} \right|^2, \qquad e_g = \sum_k \left| G^{(k)} - \left(a_g d_g^{(k)} + (1 - a_g)\right)^{\gamma_g} \right|^2, \qquad e_b = \sum_k \left| B^{(k)} - \left(a_b d_b^{(k)} + (1 - a_b)\right)^{\gamma_b} \right|^2$$
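The gain-gamma display model and its least-squares error can be fitted per channel from measured data. A minimal sketch, assuming numpy and a simple brute-force grid search over (a, gamma) rather than whatever optimizer the authors used; function names and grid ranges are illustrative:

```python
import numpy as np

def channel_response(d, a, gamma):
    """Gain-gamma display model: R = (a*d + (1 - a))**gamma, d in [0, 1]."""
    return (a * d + (1.0 - a)) ** gamma

def fit_channel(d, measured,
                gammas=np.linspace(1.0, 3.0, 201),
                gains=np.linspace(0.8, 1.0, 201)):
    """Brute-force least-squares fit of gain a and exponent gamma,
    minimizing e = sum_k |measured_k - (a*d_k + 1 - a)**gamma|**2."""
    best_err, best_a, best_g = np.inf, None, None
    for a in gains:
        # Evaluate the model for all candidate gammas at once
        model = (a * d[None, :] + (1.0 - a)) ** gammas[:, None]
        err = ((measured[None, :] - model) ** 2).sum(axis=1)
        i = err.argmin()
        if err[i] < best_err:
            best_err, best_a, best_g = err[i], a, gammas[i]
    return best_a, best_g

# Synthetic check: recover known parameters from noiseless "measurements"
d = np.linspace(0.0, 1.0, 32)
a_true, g_true = 0.95, 2.2
measured = channel_response(d, a_true, g_true)
a_hat, g_hat = fit_channel(d, measured)
```

With the fitted per-channel (a, gamma) and the matrix of maximum tristimulus values, the XYZ coordinates of any displayed color follow from the matrix equation above.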
