## Abstract

Non-planar screens are increasingly used in mobile projectors and virtual reality environments. When the screen is modeled as a second-order polynomial surface, a quadric transfer method can be employed to compensate for image distortion. This method uses the quadric matrix that models the 3D surface of a quadric screen. However, if the screen changes shape or moves, the 3D shape of the screen must be measured again to update the quadric matrix. We propose a new method of compensating for image distortion resulting from such variation of the quadric screen. The proposed method is simpler and faster than remeasuring the 3D screen and recomputing the quadric matrix.

© 2011 OSA

## 1. Introduction

Projector technology is widely used with curved or other non-planar screens in virtual reality systems and mobile projectors. For large displays, curved screens are most often used in immersive display systems [1]. Much research has been conducted on compensating for the geometric distortion of projected images [2–6]. If the screen used in a virtual reality display is non-planar, the geometric distortion must be compensated by measuring and modeling the screen. Many methods exist to measure 3D screen geometry: some are based on structured light [7], others on binary patterns that require synchronizing the camera and projector [8], and still others on 2D gray codes [9].

Shashua and Toelg developed the theory of the relationship between two perspective views of quadric surfaces [10]. Raskar *et al*. proposed the quadric transfer, a geometric compensation method for images projected on a quadric curved screen, and used a GPU vertex shader for its real-time implementation [11]. Emori and Saito proposed a stereo texture-overlay system with an HMD that warps the projected images adaptively to the surface of the projected object in real time, using a quadratic or cubic geometric transformation [12].

When images are projected on a quadric screen and a camera observes the screen, the quadric matrix of the screen is used to compensate for the geometric distortion on the camera image plane. Raskar's quadric transfer corrects the image distortion using the relationship among the projector, the camera, and the quadric screen. The camera is located at the observer's position. Figure 1 shows an example of a projector-camera system used for the quadric transfer.

For mobile projectors, screens and projectors are not anchored stably, so image distortions may arise from relative movements between them after the initial calibration. Consider a particular case: the touch screen of an interactive projector. Correcting image distortion is critical for accurate interaction. Image warping can be caused by slight movements of the projector or by screen deformation.

In this paper, we extend the quadric transfer method using a projector-camera system. When the curved screen moves or its curvature changes, the image observed by the camera is distorted accordingly. To show distortion-free images to an observer even after the screen is altered, the changed 3D surface parameters must be estimated. Conventionally, this requires measuring the 3D coordinates of the screen points and recalculating the quadric parameters from the 3D surface positions.

If the parameter changes are small enough that the change of the quadric transfer can be approximated by a first-order Taylor series, then we can calculate the change of the quadric matrix from the 2D shift of the camera images. We propose a method of compensating for image distortion using 2D image coordinates instead of measuring 3D screen coordinates: our method estimates the perturbation of the quadric matrix from 2D measurements of the distorted image. The proposed method is simpler and faster than calculating a new quadric matrix from 3D measurements, and real-time monitoring of the image distortion is possible when watermarks are employed.

The remainder of this paper is structured as follows: Section 2 describes the quadric matrix of the curved screen and that of the changed screen. The linear approximation of curved screen change is presented in Section 3. Simulations and experimental results are shown in Section 4, and conclusions are presented in the final section.

## 2. Quadric Transfer and Screen Change

Since the proposed method relies on the quadric transfer, we first describe the compensation of projected-image distortion on a curved screen using the quadric transfer proposed by Raskar *et al*. [11]. The quadric transfer is the mapping between the image coordinates of two views of a quadric curved screen.

Let $x$ represent the 3D coordinates of the first view, $x'$ the 3D coordinates of the second view, and $e = [e_x \;\; e_y \;\; e_z]^T$ the epipole, i.e., the projection center of the first view in the second view. The quadric transfer maps $x$ to $x'$ as

$$x' \cong Ax \pm \sqrt{x^T E\, x}\; e,$$

where the ± sign indicates whether the screen type is concave or convex. We can determine the sign using one point correspondence. The matrices $A$ and $E$ are defined as

$$A = B - e\, q^T, \qquad E = q\, q^T - Q_{33},$$

where $B$ is a 3 × 3 homography matrix between the two coordinates, and $Q$ is the quadric matrix of the screen. $Q_{33}$ and $q$ are the submatrices of $Q$ defined by

$$Q = \begin{bmatrix} Q_{33} & q \\ q^T & 1 \end{bmatrix}.$$
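To make the definitions concrete, the construction of $A$ and $E$ from $B$, $e$, and the submatrices of $Q$ can be sketched as follows (a minimal numpy illustration; the sphere helper and all numeric values are our own examples, not taken from the paper):

```python
import numpy as np

def quadric_transfer_params(B, e, Q):
    """Build A = B - e q^T and E = q q^T - Q33 from the homography B,
    the epipole e, and the 4x4 symmetric screen quadric
    Q = [[Q33, q], [q^T, 1]]."""
    Q33 = Q[:3, :3]
    q = Q[:3, 3]
    A = B - np.outer(e, q)
    E = np.outer(q, q) - Q33
    return A, E

def sphere_quadric(c, r):
    """Quadric of the sphere |X - c|^2 = r^2, normalized so that the
    lower-right element of Q is 1 (assumes c.c != r^2)."""
    s = c @ c - r * r
    Q = np.eye(4)
    Q[:3, :3] = np.eye(3) / s
    Q[:3, 3] = Q[3, :3] = -c / s
    return Q
```

With these matrices, the transfer of a projector point $x$ is $Ax \pm \sqrt{x^T E x}\, e$ up to the homogeneous scale.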

When the quadric screen sways or changes shape, the new quadric matrix of the altered screen must be found. The conventional method uses the 3D coordinates of the screen: for real-time compensation, the 3D coordinates must be measured continuously and the quadric matrix recalculated. In this paper, however, we propose a new compensation method for image distortion that uses the 2D image coordinates observed by a camera, and calculates the change of the quadric matrix to correct the quadric transfer.

Let the changed quadric matrix of the curved screen be ${Q}_{ch}=Q+\Delta Q$. Then the changed 3D coordinates of the second view can be expressed as

$$x'_{ch} = x' + \Delta x',$$

where $\Delta x'$ is the change of coordinates in the second view induced by the change of the quadric matrix $\Delta Q$. We assume that the centers and the orientations of the two views are fixed. The changes in the quadric transfer, $\Delta A$ and $\Delta E$, then follow from the definitions of $A$ and $E$:

$$\Delta A = -e\, \Delta q^T, \qquad \Delta E = \Delta q\, q^T + q\, \Delta q^T + \Delta q\, \Delta q^T - \Delta Q_{33}.$$

#### 2.1 Quadric Transfer

The quadric transfer maps a projector 3D point $\tilde{x}$ to a camera 3D point $\tilde{x}'$:

$$\tilde{x}' \cong A\tilde{x} \pm \sqrt{\tilde{x}^T E\, \tilde{x}}\; e,$$

where the sign ≅ denotes equality up to a scale factor for the homogeneous coordinates, and the tilde on $\tilde{x}'$ and $\tilde{x}$ indicates that the coordinates are homogeneous. That is, the above equation specifies the mapping between the projection ray $\overline{OX}$ and the camera image line $\overline{O'X}$, as shown in Fig. 2. Let $x'$ be the image position on the $z = 1$ plane, i.e., the projection of $\tilde{x}'$ on the $z = 1$ camera image plane. Similarly, we define $\hat{x}$ as the projection of $\tilde{x}$ on the $z = 1$ plane.
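The ray-intersection interpretation above can be checked numerically: intersecting the projector ray with the screen quadric and projecting the intersection point into the second view must reproduce the quadric transfer. A minimal numpy sketch, assuming a spherical screen, a projector at the origin, and a camera translated to `Op` with the same orientation (so that $B = I$ and $e = -O'$; all numeric values are illustrative):

```python
import numpy as np

def transfer(B, e, Q, x, sign=1.0):
    """Quadric transfer x' ~ A x +/- sqrt(x^T E x) e, normalized to z = 1."""
    Q33, q = Q[:3, :3], Q[:3, 3]
    A = B - np.outer(e, q)
    E = np.outer(q, q) - Q33
    xp = A @ x + sign * np.sqrt(x @ E @ x) * e
    return xp / xp[2]

def transfer_by_intersection(c, r, Op, x, sign=1.0):
    """Same mapping computed geometrically: intersect the projector ray
    through x with the sphere (center c, radius r), then project the hit
    point into a camera at Op with identical orientation."""
    d = x / np.linalg.norm(x)
    b = d @ c
    t = b - sign * np.sqrt(b * b - (c @ c - r * r))  # ray-sphere hit distance
    v = t * d - Op                                   # point in camera coordinates
    return v / v[2]
```

The two routines agree because, for the normalized sphere quadric, the homogeneous scale of $A\hat{x} \pm \sqrt{\hat{x}^T E \hat{x}}\, e$ is exactly the inverse depth of the ray-quadric intersection.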

#### 2.2 Quadric Transfer after Screen Change

After screen deformation, a projected point on the screen moves from $X$ to $X'$, and $\hat{x}$ is projected to $\hat{x}'_{ch}$. The quadric transfer of the changed screen is

$$\tilde{x}'_{ch} \cong (A + \Delta A)\,\hat{x} \pm \sqrt{\hat{x}^T (E + \Delta E)\, \hat{x}}\; e,$$

and $\hat{x}'_{ch}$ is its projection on the $z = 1$ plane. Then, using the scale factor $h$ that makes the third component unity, $\hat{x}'_{ch}$ can be represented as

$$\hat{x}'_{ch} = \frac{1}{h}\left(x' + \alpha\, e\right), \tag{1}$$

where $\alpha$ is defined as $\alpha \equiv \pm \left(\sqrt{\hat{x}^T (E + \Delta E)\, \hat{x}} - \sqrt{\hat{x}^T E\, \hat{x}}\right) - \Delta q^T \hat{x}$.

The geometrical meaning of Eq. (1) is the intersection of the line $\overline{{O}^{\prime}{X}^{\prime}}$ with the line through ${x}^{\prime}$ having direction vector $e$. The scale parameter *h* can be calculated using the minimum mean square error criterion.

Let ${{\widehat{x}}^{\prime}}_{ch}\equiv {x}^{\prime}+\Delta {x}^{\prime}$. Then,

$$\Delta x' = \frac{1 - h}{h}\, x' + \frac{\alpha}{h}\, e.$$
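The shifted image point can be evaluated directly from $\alpha$; a minimal numpy sketch (the perturbations $\Delta E$ and $\Delta q$ below are arbitrary illustrative values, not from the paper):

```python
import numpy as np

def alpha(E, dE, dq, x, sign=1.0):
    """alpha = +/-(sqrt(x^T (E+dE) x) - sqrt(x^T E x)) - dq^T x."""
    return sign * (np.sqrt(x @ (E + dE) @ x) - np.sqrt(x @ E @ x)) - dq @ x

def changed_image_point(xp, e, a):
    """x'_ch = (x' + alpha e) / h, with h the scale making the z component 1."""
    v = xp + a * e
    return v / v[2]
```

When the screen is unchanged ($\Delta E = 0$, $\Delta q = 0$), $\alpha$ vanishes and the image point is unmoved.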

## 3. Change of Quadric Matrix

To simplify the derivation of the perturbation of the screen quadric matrix, let us omit the superscript ^; we will use a prime notation for transferred coordinates. With $m \equiv x^T E\, x$ and $m + \Delta m \equiv x^T (E + \Delta E)\, x$, the original and the changed quadric transfer equations are succinctly represented as

$$x' \cong Ax \pm \sqrt{m}\; e, \qquad x'_{ch} \cong Ax \pm \sqrt{m + \Delta m}\; e - \left(\Delta q^T x\right) e.$$

The change of the camera image coordinates is represented by $\Delta {x}^{\prime}={[\begin{array}{ccc}\Delta {x}^{\prime}& \Delta {y}^{\prime}& \Delta {z}^{\prime}\end{array}]}^{T}$, and the epipole is $e={[\begin{array}{ccc}{e}_{x}& {e}_{y}& {e}_{z}\end{array}]}^{T}$. Then we have

$$\Delta x' = \left(\pm\left(\sqrt{m + \Delta m} - \sqrt{m}\right) - \Delta q^T x\right) e.$$

To obtain a linear solution for the quadric matrix change, we take the first-order Taylor expansion of $\sqrt{m+\Delta m}$,

$$\sqrt{m + \Delta m} \approx \sqrt{m} + \frac{\Delta m}{2\sqrt{m}},$$

assuming $\left|\Delta m/m\right| \ll 1$. We also split $\Delta m$ into its first-order part $\Delta m_a = 2(q^T x)(\Delta q^T x) - x^T \Delta Q_{33}\, x$ and its second-order remainder $\Delta m_r = (\Delta q^T x)^2$. Writing $\epsilon$ for the scalar shift along $e$ estimated from the measured $\Delta x'$, we obtain Eq. (4):

$$\pm 2\sqrt{m}\,\epsilon = -\left(x^T \Delta Q_{33}\, x + 2k\, \Delta q^T x\right) + \Delta m_r, \tag{4}$$

where $k\equiv \pm \sqrt{m}-{q}^{T}x$. By omitting the second-order term $\Delta m_r$ in Eq. (4), $\pm 2\sqrt{m}\,\epsilon \approx -\left(x^T \Delta Q_{33}\, x + 2k\, \Delta q^T x\right)$, we obtain a linear relation. With $\Delta Q_{33} = \begin{bmatrix}\Delta a & \Delta b & \Delta c\\ \Delta b & \Delta d & \Delta e\\ \Delta c & \Delta e & \Delta f\end{bmatrix}$ and $\Delta q = [\Delta g \;\; \Delta h \;\; \Delta i]^T$, we rearrange it as a multiplication of the quadric matrix perturbation $\Delta \phi = [\Delta a \;\; \cdots \;\; \Delta i]^T$ and the projector coordinates $x = [x \;\; y \;\; 1]^T$:

$$\pm 2\sqrt{m}\,\epsilon = -\left[\begin{array}{ccccccccc} x^2 & 2xy & 2x & y^2 & 2y & 1 & 2kx & 2ky & 2k \end{array}\right] \Delta\phi. \tag{5}$$

Since the above equation is linear in the quadric matrix change, we can find the perturbation from the changes of the camera coordinates and the projector coordinates. The changes of the nine quadric matrix parameters, $\Delta a$ to $\Delta i$, are calculated from nine or more correspondences between the camera and projector images. The parameters of the quadric transfer, **A** and **E**, are then corrected using the linear solution of the change of the quadric matrix. With the corrected quadric transfer, we can compensate for the change and the movement of the screen.
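The linear system can be assembled and solved in the least-squares sense from nine or more correspondences. A hypothetical numpy sketch, assuming $\Delta Q_{33}$ is parameterized symmetrically with entries $\Delta a$–$\Delta f$, $\Delta q = [\Delta g\ \Delta h\ \Delta i]^T$ (our labeling; the paper's exact ordering of the nine parameters is not preserved in the text), and $\epsilon_i$ is the scalar shift measured for each projector point $(x_i, y_i, 1)$:

```python
import numpy as np

def solve_dphi(pts, eps, ms, ks, sign=1.0):
    """Least-squares estimate of the nine perturbation parameters
    dphi = [da, ..., di] from n >= 9 correspondences.
    pts: (n, 2) projector points on the z = 1 plane; eps: measured shifts;
    ms: m = x^T E x per point; ks: k = +/-sqrt(m) - q^T x per point."""
    rows, rhs = [], []
    for (x, y), ep, m, k in zip(pts, eps, ms, ks):
        # row encodes x^T dQ33 x + 2 k dq^T x as a dot product with dphi
        rows.append([x * x, 2 * x * y, 2 * x, y * y, 2 * y, 1.0,
                     2 * k * x, 2 * k * y, 2 * k])
        rhs.append(-sign * 2.0 * np.sqrt(m) * ep)
    dphi, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return dphi
```

Because the system is linear, noisy measurements are handled naturally by using more than nine correspondences.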

## 4. Simulations and Experimental Results

We verify the accuracy of the proposed correction method from simulations and experiments.

#### 4.1. Simulations

For the simulation, we use a spherical screen for simplicity. The radius of the sphere is 50 units, and its center is at (−1, 0, 50). Figure 3 shows the simulation setup. A projector on the left side projects a test pattern on a spherical screen and the right-side camera captures the pattern.

Figure 4(a) shows the test pattern observed after the quadric transfer. Figure 4(b) is the distorted pattern resulting from translating the sphere 5 units to the left. Figure 4(c) is the pattern corrected with the quadric-matrix perturbation estimated from the observed image change between Fig. 4(a) and Fig. 4(b). When the width of the pattern is normalized to unit length, the mean absolute difference (MAD) of the distorted positions is 1.3%, while the MAD after the incremental compensation is 0.09%; the error is thus reduced by a factor of about 15.

Figure 5 shows a plot of the mean absolute errors of the test patterns when the screen is shifted by $\Delta t$ along the *x*-axis. When $\Delta t$ varies from −15 to 15 units, the MAD increases to up to 4 pixels; after compensation with the proposed method, the MAD is reduced to under 1 pixel.
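The error metric used above can be computed for corresponding point sets as follows (a trivial numpy sketch; the width normalization matches the unit-length convention used for the simulation figures):

```python
import numpy as np

def mad(pts_a, pts_b, width=1.0):
    """Mean absolute difference between corresponding 2D point sets,
    expressed as a fraction of the pattern width."""
    return np.mean(np.abs(np.asarray(pts_a) - np.asarray(pts_b))) / width
```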

We compare the calculation times of the conventional and the proposed methods: calculating a new quadric matrix from the 3D positions of the screen versus the proposed 2D perturbation correction. The number of multiplications is reduced to about one fifth of the conventional method; however, because the proposed method involves a square root, the overall computation time is reduced to about one third on a Core2Duo E6600 CPU and an nVidia GeForce 8800GTX GPU. The reduced computation time enables rapid correction of image distortion.

#### 4.2. Experimental Results

We used a Flea® miniature IEEE-1394 camera and an Infocus LP600 projector. The resolution of both the camera and the projector is 1024 × 768 pixels. The experimental quadric curved screen is cylindrical. The experimental setup, shown in Fig. 6, works in real time.

We adopt SIFT [13] to extract feature points from real images. We map the projection image using the quadric transfer so that the ideal pattern appears on the camera image plane, and we use a GPU pixel shader coded in Cg to generate the pre-warped pattern. Figures 7(a) and 7(b) show the pre-warped image and the compensated image. If the screen deforms or moves after the quadric transfer compensation, the quadric matrix must be updated. However, we do not need to calculate a new quadric matrix from 3D screen coordinates; instead, we calculate the perturbation of the quadric matrix from the 2D image coordinate changes captured by the camera, using Eq. (5). The distorted camera image after the screen change is shown in Fig. 8(a): the upper left corner of the image is lower than the upper right corner. The image compensated with the proposed method, using the change of the quadric matrix, is shown in Fig. 8(b). As shown in Fig. 7(b) and Fig. 8(b), the images corrected for curvature change and translation are almost identical.

## 5. Conclusion

In this paper, we proposed a compensation method for geometric distortion due to the change of a quadric curved screen. The method does not measure a new quadric matrix after a screen change; instead, it estimates the perturbation of the quadric matrix from changes of 2D image coordinates. Therefore, neither the 3D shape of the screen nor the calculation of a new quadric matrix is required. The proposed method is simpler and faster than recalculating the quadric matrix from 3D screen measurements, enabling more frequent updates. In the future, we plan to use watermarks to monitor the deformation of the screen in real time.

## Acknowledgments

This research was partly supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (2011-0010378), and the Human Resource Training Project for Strategic Technology through the Korea Institute for Advancement of Technology (KIAT) funded by the Ministry of Knowledge Economy, the Republic of Korea.

## References and links

**1. **J. van Baar, T. Willwacher, S. Rao, and R. Raskar, “Seamless multi-projector display on curved screens,” Eurographics Workshop on Virtual Environments, 281–286 (2003).

**2. **R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin and H. Fuchs, “The office of the future: a unified approach to image-based modeling and spatially immersive displays,” SIGGRAPH, 179–188 (1998).

**3. **R. Yang, M. S. Brown, W. B. Seales, and H. Fuchs, “Geometrically correct imagery for teleconferencing,” in *Proceedings of ACM Multimedia*, 179–186 (1999).

**4. **R. Yang and G. Welch, “Automatic and continuous projector display surface calibration using every-day imagery,” in *Proceedings of 9th Int. Conf. in Central Europe in Computer Graphics, Visualization, and Computer Vision* (2001).

**5. **S. Webb and C. Jaynes, “The DOME: a portable multi-projector visualization system for digital artifacts,” IEEE Workshop on Emerging Display Technologies (2005).

**6. **Y. Oyamada and H. Saito, “Focal pre-correction of projected image for deblurring screen image,” IEEE Int. Workshop on Projector-Camera systems (2007).

**7. **R. Raskar, M. Brown, R. Yang, W. Chen, G. Welch, H. Towles, B. Seales, and H. Fuchs, “Multi-projector displays using camera-based registration,” in *Proceedings of IEEE Visualization*, 161–168 (1999).

**8. **S. Zollmann, T. Langlotz, and O. Bimber, “Passive-active geometric calibration for view-dependent projections onto arbitrary surfaces,” Workshop on Virtual and Augmented Reality of the GI-Fachgruppe AR/VR (2006).

**9. **S. Jordan and M. Greenspan, “Projector optical distortion calibration using gray code patterns,” IEEE Int. Workshop on Projector-Camera systems (2010).

**10. **A. Shashua and S. Toelg, “The quadric reference surface: theory and applications,” Int. J. Comput. Vis. **23**(2), 185–198 (1997). [CrossRef]

**11. **R. Raskar, J. van Baar, T. Willwacher, and S. Rao, “Quadric transfer for immersive curved screen displays,” Comput. Graph. Forum **23**(3), 451–460 (2004). [CrossRef]

**12. **M. Emori and H. Saito, “Texture overlay onto deformable surface using HMD,” in *Proceedings of IEEE Virtual Reality*, 221–222 (2004).

**13. **D. G. Lowe, “Object recognition from local scale-invariant features,” in *Proceedings of ICCV*, 1150–1157 (1999).