Abstract

Measuring the three-dimensional (3D) deformation of submerged objects through different media with the stereo digital image correlation (stereo-DIC) method involves a refractive imaging problem, in which non-linear transmission of light is induced by changes of medium density. This problem invalidates the single-viewpoint assumption underlying the perspective camera model in regular stereo-DIC, resulting in erroneous measurements of 3D shape and deformation. In this work, we propose a refractive stereo-DIC method that overcomes the problem by accounting for light refraction in 3D reconstruction. We formulate a complete refractive reconstruction geometry based on Snell’s law for flat refraction and regular triangulation. This allows the true shape to be reconstructed effectively by tracing and establishing the refracted ray-paths from the regular 3D reconstruction, without reformulating the camera model or the image formation. The refractive stereo-DIC is then established by integrating the refractive 3D reconstruction into the regular DIC framework for accurately measuring the 3D shape and deformation of submerged objects. We test the proposed approach with underwater 3D shape and deformation measurements. Both results demonstrate its feasibility and correctness, indicating that our approach is a flexible solution that can readily extend stereo-DIC to fluid-immersed 3D deformation characterization.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Three-dimensional (3D) deformation measurement is a fundamental problem in photo-mechanics and related fields. In the past decade, it has been significantly advanced by the establishment of stereo digital image correlation (stereo-DIC) [1,2]. As an optical measurement technique, stereo-DIC measures full-field 3D deformation with a simple optical arrangement, and thus surpasses traditional techniques (e.g., strain gauges and mechanical extensometers) in many land-based tasks. A large share of daily experiments and engineering tests, such as characterization of mechanical and geometrical properties of materials [3,4], biomechanical analysis [5,6], deformation control in industrial fabrication [7,8], and structural health monitoring [9,10], can be powered by stereo-DIC. When it comes to fluid-immersed applications (such as characterization of submerged biological tissues and materials, measurement of marine structures, etc.), regular stereo-DIC faces challenges posed by light refraction.

In fluid-immersed measurements, the cameras are often placed outside a fluid-filled tank or in a sealed housing, and observe the object to be measured through a transparent window. In such scenarios, light rays reflected from the surface of the object pass through media with graded refractive indices, such as water, glass, and air. Because of light refraction, the rays gathered by the cameras do not travel in straight lines but are bent at the interfaces between different media. However, cameras in a regular stereo-DIC system are represented by the perspective model (also known as the pinhole camera model) to describe the projection transformation from 3D points in the world to 2D points in images [1,11]. The vision geometry in both forward and backward projection is assumed to obey the single-viewpoint assumption, in which light travels in straight lines. If the light rays are bent by refraction, this assumption does not hold [12], leading to erroneous 3D reconstruction. This further produces errors in deformation estimates, since stereo-DIC relies heavily on the recovered object shape when retrieving 3D displacement. According to the law of refraction, i.e., Snell’s law [13], the refractive distortion depends on the refractive indices of the media and the incident angle of the entering ray. Adopting dome windows is accordingly a natural and efficient way to compensate for the refractive error. However, dome windows need to be carefully fabricated and assembled to fit the lens optics, and they exhibit undesirable field curvature that causes a plane to be projected as a paraboloid [14,15], thereby limiting their use in daily stereo-DIC measurement tasks.

In contrast, flat windows are more flexible and practical, and are therefore the focus of this study. Although the literature has shown that the refractive distortion can be partially corrected by adjusting the focal length and lens distortion parameters [16,17], these approximate methods are insufficient because the flat refractive effect is highly nonlinear and depends on the location of the object [18,19]. A more attractive approach to eliminating the errors caused by light refraction is to use specific image formation models, such as refractive camera models [12,20–22], generalized camera models [18,23], and the combined pinax camera model [15], to explicitly describe the effect of light refraction. These methods have driven great progress in refractive imaging and measurement; however, the complexity of the theoretical models means they are rarely adopted in practical stereo-DIC applications. Related work can nonetheless be found in the stereo-DIC literature. For example, the authors of [24] developed a non-linear stereo calibration method by explicitly modeling the refraction at the refracting interface(s). The framework is theoretically comprehensive in representing refractive effects, but finding the optimal calibration is complicated by the high nonlinearity of light refraction. Instead of developing a specific calibration model, the refraction error can be partially eliminated by carefully arranging the cameras with their optical axes perpendicular to the refracting interface and then performing submerged stereo calibration with the conventional calibration framework [25,26]. In addition, the refraction error can be suppressed to some degree by using a specific image registration method that takes into account the refraction-induced image distortion [27]. Though both methods are simple and practical, they, like the aforementioned approximate methods, cannot compensate for the refractive distortion completely.
More recently, a single-camera stereo-DIC was proposed specifically for underwater 3D deformation measurement by combining a bilateral telecentric lens and a bi-prism [28]. This specific optical configuration allows the device to perform measurements without stereo camera calibration, but accordingly limits it to small-scale objects. A notable limitation of most existing methods for measuring submerged objects is that they are rarely compatible with the popular stereo-DIC systems that rely on perspective cameras; this is a major practical drawback, given that building an easy-to-use stereo-DIC system for fluid-immersed measurement in daily experiments is not easy. Flexible, accurate, and compatible methods for measuring submerged objects with stereo-DIC are therefore an important area for further development in fluid-immersed full-field optical deformation characterization.

In this paper, we propose a flexible and accurate stereo-DIC method based on the flat refractive geometry to measure the 3D shape and deformation of fluid-immersed objects. We refer to it as refractive stereo-DIC to distinguish it from the regular one. In the proposed refractive stereo-DIC, we account for the refractive distortion by reformulating the 3D reconstruction process in stereo-DIC. Based on Snell’s law, we establish a complete refractive geometry for parallel flat refracting interfaces using backward ray tracing and, accordingly, propose a refractive 3D reconstruction that traces the true object shape from the result of regular triangulation. Given the media, the introduced refractive geometry depends only on the refracting interface(s) and the 3D shape recovered by triangulation, allowing the refractive 3D reconstruction to be performed flexibly without modifying the camera model or the image formation. By building the refractive 3D deformation measurement procedure, we show it can be conveniently integrated into existing frameworks to establish a refractive stereo-DIC system. To demonstrate and highlight the contributions of our method, we evaluate its feasibility and correctness with experiments on measuring the 3D shape and displacement of underwater specimens, respectively.

The rest of this paper is organized as follows. Section 2 presents the principles of refractive stereo-DIC, including geometry of flat refraction in Section 2.1, refractive 3D shape reconstruction in Section 2.2, and deformation measurement procedures in Section 2.3. Experimental verifications and results are shown in Section 3. We conclude in Section 4.

2. Refractive stereo-DIC measurement

2.1 Geometry of flat refraction

The geometry of refractive stereo-DIC is established with backward ray tracing of geometric optics based on the law of refraction. Without loss of generality, we consider two refracting interfaces between three transmission media with different optical densities. For simplicity of exposition, we suppose both interfaces are flat surfaces parallel to each other. Given a stereo-DIC system composed of left and right cameras, the refractive ray geometry is built with the complete Snell’s law in Fig. 1. Details are presented as follows.


Fig. 1. Refractive geometry for two flat interfaces. $O-XYZ$ and $O'-X'Y'Z'$ are the reference 3D coordinate frame and the right camera frame, respectively. The end node of each $Y$-axis is visualized with a green circle containing a cross. See text for more details.


In Fig. 1, both cameras are placed in air, with refractive index $n_1 = 1$, to image a randomly speckled object surface through two media with refractive indices $n_2$ and $n_3$, respectively (most commonly glass and water). The two interfaces are denoted by $\Pi _1$ and $\Pi _2$, respectively, and are assumed to be parallel at an interval of $d$. To observe a 3D point $\mathbf {Q}$ on the surface, we suppose the left and right cameras cast two light rays $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$, respectively. If there is no refraction (i.e., $n_1=n_2=n_3$), they travel in straight lines and converge at the point $\mathbf {P}$, corresponding to the regular stereo-DIC measurement. In the refractive scenario, $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$ are bent at points $\mathbf {P}_1$ and $\mathbf {P}_1^\prime$, respectively, when they pass the interface $\Pi _1$. According to Snell’s law [13], their propagation directions change, and they travel along the lines $\mathbf {L}_2$ and $\mathbf {L}_2^\prime$, correspondingly. Subsequently, $\mathbf {L}_1$ bends again at point $\mathbf {P}_2$ and travels along the line $\mathbf {L}_3$ after passing through the interface $\Pi _2$; likewise, the ray $\mathbf {L}_1^\prime$ bends at $\mathbf {P}_2^\prime$ and its refracted path is denoted by line $\mathbf {L}_3^\prime$. Finally, $\mathbf {Q}$ is observed and imaged by both cameras when the rays hit the object along $\mathbf {L}_3$ and $\mathbf {L}_3^\prime$, respectively. The corresponding angles of refraction for the two light rays are denoted by $\theta _i$ and $\theta _i^\prime ~(i=1,2,3)$, respectively.

The geometry in Fig. 1 shows that the refracted ray-path pair, denoted by $\mathbf {L}_1-\mathbf {L}_2-\mathbf {L}_3$ and $\mathbf {L}_1^\prime -\mathbf {L}_2^\prime -\mathbf {L}_3^\prime$, for an observed object point is clearly different from the straight-line paths (i.e., $\mathbf {OP}$ and $\mathbf {O}^\prime \mathbf {P}$) without refraction. It varies not only with the distance from the optical axis, but also with the positions and orientation of the interfaces, as well as the position of the object point. Therefore, for stereo-DIC based on the perspective camera model, the presence of refraction leads to erroneous location and displacement estimates of the measured points if the regular 3D reconstruction is applied.

2.2 Refractive 3D reconstruction

With the refractive geometry in Fig. 1, our goal is to recover the true 3D coordinates of $\mathbf {Q}$ in the reference coordinate frame $O-XYZ$ from its projections, thereby avoiding the distortion caused by light refraction. We suppose the stereo camera system is fully calibrated as usual. The reference frame $O-XYZ$ is assumed to be aligned with the coordinate system of the left camera, so that the optical centers of the left and right cameras are $\mathbf {O} = (0, 0, 0)$ and $\mathbf {O}^\prime = \mathbf {t} = (t_x, t_y, t_z)$, respectively, where $\mathbf {t}$ is the calibrated translation vector of the right camera relative to the reference frame. According to the refractive geometry, $\mathbf {Q}$ is the intersection of lines $\mathbf {L}_3$ and $\mathbf {L}_3^\prime$; thus its coordinates can be determined by establishing the refracted ray-paths $\mathbf {L}_1-\mathbf {L}_2-\mathbf {L}_3$ and $\mathbf {L}_1^\prime -\mathbf {L}_2^\prime -\mathbf {L}_3^\prime$ in 3D space. The detailed procedure is given below.

Given a stereo-pair of the speckled object surface captured simultaneously by the left and right cameras, the projections corresponding to $\mathbf {Q}$ are first found by DIC-based stereo matching, constructing a stereo correspondence in pixel coordinates. We can then adopt the regular triangulation algorithms [2,29] to estimate the point $\mathbf {P}$ directly. Because of the light refraction, the point $\mathbf {P}$ is the distorted position of the true object point $\mathbf {Q}$. To recover the point $\mathbf {Q}$, we need to build the refracted ray paths from $\mathbf {P}$ with backward ray tracing.

The refractive geometry in Fig. 1 shows $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$ are a segment of the lines $\mathbf {OP}$ and $\mathbf {O}^\prime \mathbf {P}$, respectively. Therefore, the normalized direction vectors of $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$ are respectively given by

$$\left\lbrace \begin{array}{l} \boldsymbol{\alpha} = (\alpha_x, \alpha_y, \alpha_z) = \dfrac{\mathbf{P} - \mathbf{O}}{||\mathbf{P} - \mathbf{O}||} \\ \boldsymbol{\alpha}' = (\alpha_x', \alpha_y', \alpha_z') = \dfrac{\mathbf{P} - \mathbf{O}'}{||\mathbf{P} - \mathbf{O}'||} \end{array}\right..$$

Let $\mathbf {O}$ be the origin of $\mathbf {L}_1$ and $\mathbf {O}^\prime$ be the origin of $\mathbf {L}_1^\prime$. In the reference frame $O-XYZ$, the rays $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$ can be determined by the following equations:

$$\left\lbrace \begin{array}{l} \mathbf{L}_1: \dfrac{X}{\alpha_x} = \dfrac{Y}{\alpha_y} = \dfrac{Z}{\alpha_z} \\ \mathbf{L}_1': \dfrac{X-t_x}{\alpha_x'} = \dfrac{Y-t_y}{\alpha_y'} = \dfrac{Z-t_z}{\alpha_z'} \end{array}\right..$$
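As a concrete illustration, the construction of Eqs. (1) and (2) can be sketched in a few lines of Python. This is a minimal sketch under our own assumptions: the helper name `ray_direction` and the numeric baseline and point values are illustrative, not from the paper's implementation.

```python
# Sketch of Eqs. (1)-(2): back-project the triangulated point P into the
# two camera rays L1 and L1'.
import numpy as np

def ray_direction(origin, P):
    """Unit direction vector from a camera center toward the point P."""
    d = np.asarray(P, float) - np.asarray(origin, float)
    return d / np.linalg.norm(d)

# Left camera at the reference-frame origin O; right camera at O' = t.
O = np.zeros(3)
t = np.array([100.0, 0.0, 0.0])      # example baseline (mm), hypothetical
P = np.array([20.0, 10.0, 650.0])    # point from regular triangulation

alpha = ray_direction(O, P)          # direction of L1
alpha_p = ray_direction(t, P)        # direction of L1'
# Any point on L1 is s*alpha; any point on L1' is t + s'*alpha_p (Eq. (2)).
```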

When the light rays pass through the interface $\Pi _1$, they are bent to form the refracted rays $\mathbf {L}_2$ and $\mathbf {L}_2^\prime$. Both can be determined by computing the incident points $\mathbf {P}_1$ and $\mathbf {P}_1^\prime$ and the directions of both refracted rays.

According to Fig. 1, $\mathbf {P}_1$ and $\mathbf {P}_1^\prime$ are the intersections of $\mathbf {L}_1$ and $\mathbf {L}_1^\prime$ and the interface $\Pi _1$, respectively. Suppose the normal vector of the interface $\Pi _1$ is $\mathbf {N} = (N_x, N_y, N_z)$ and the distance from the origin is $D$. The equation of $\Pi _1$ is given by the following vector form

$$\mathbf{N}^T\mathbf{X}+D = 0.$$

Note that the interface needs to be reconstructed with the calibrated stereo-DIC system in advance; this is described at the end of this section. Combining Eqs. (2) and (3), we obtain the expressions of the incidence points $\mathbf {P}_1$ and $\mathbf {P}_1^\prime$:

$$\left\lbrace \begin{array}{l} \mathbf{P}_1 ={-}\dfrac{D}{\mathbf{N}^T\boldsymbol{\alpha}}\boldsymbol{\alpha} \\ \mathbf{P}_1^\prime = \dfrac{1}{\mathbf{N}^T\boldsymbol{\alpha}'}\left(\left[t_x\boldsymbol{\alpha}'-\alpha_x'\mathbf{t},t_y\boldsymbol{\alpha}'-\alpha_y'\mathbf{t},t_z\boldsymbol{\alpha}'-\alpha_z'\mathbf{t}\right]^T\mathbf{N}-D\boldsymbol{\alpha}'\right) \end{array}\right..$$
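The ray–plane intersections in Eq. (4) reduce to a one-line computation once the ray is written parametrically. The following sketch shows this for a general ray; the function name and the example plane (a hypothetical interface at $z = 500$ mm) are ours.

```python
# Sketch of Eq. (4): intersect a ray X = origin + s*d with the plane
# N^T X + D = 0 to find the incidence point (assumes the ray is not
# parallel to the interface).
import numpy as np

def intersect_ray_plane(origin, d, N, D):
    """Incidence point of the ray origin + s*d with the plane N.X + D = 0."""
    s = -(N @ origin + D) / (N @ d)
    return origin + s * d

# Example: left-camera ray from O = (0,0,0) hitting a plane z = 500 mm,
# i.e. N = (0,0,1), D = -500 (illustrative values only).
N = np.array([0.0, 0.0, 1.0])
D = -500.0
alpha = np.array([0.1, 0.05, 1.0]); alpha /= np.linalg.norm(alpha)
P1 = intersect_ray_plane(np.zeros(3), alpha, N, D)  # incidence point on Pi_1
```

The same helper applies verbatim to the second interface by substituting $D + d$ for $D$, as in Eq. (9).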

Subsequently, we identify the directions of the refracted rays $\mathbf {L}_2$ and $\mathbf {L}_2^\prime$, denoted by unit vectors $\boldsymbol{\beta }$ and $\boldsymbol{\beta }'$, respectively. Because each refracted ray is coplanar with its incident ray and the interface normal, the vectors $\boldsymbol{\beta }$ and $\boldsymbol{\alpha }$ can be decomposed into components parallel and perpendicular to the normal $\mathbf {N}$, so that

$$\begin{aligned} \boldsymbol{\alpha} = \boldsymbol{\alpha}_{{\perp}} + \textrm{cos}\theta_1 \mathbf{N}, \end{aligned}$$
$$\begin{aligned} \boldsymbol{\beta} = \boldsymbol{\beta}_{{\perp}} + \textrm{cos}\theta_2 \mathbf{N}, \end{aligned}$$
where the subscript $\perp$ denotes the vertical projections, $\textrm {cos}\theta _1 = \mathbf {N}^T\boldsymbol{\alpha }$ and $\textrm {cos}\theta _2 = \mathbf {N}^T\boldsymbol{\beta }$. Similarly, the following equations are obtained for $\boldsymbol{\beta }'$ and $\boldsymbol{\alpha }'$:
$$\begin{aligned} \boldsymbol{\alpha}' = \boldsymbol{\alpha}'_{{\perp}} + \textrm{cos}\theta'_1 \mathbf{N}, \end{aligned}$$
$$\begin{aligned} \boldsymbol{\beta}' = \boldsymbol{\beta}'_{{\perp}} + \textrm{cos}\theta'_2 \mathbf{N}, \end{aligned}$$
with $\textrm {cos}\theta _1' = \mathbf {N}^T\boldsymbol{\alpha }'$ and $\textrm {cos}\theta _2' = \mathbf {N}^T\boldsymbol{\beta }'$. It is worth mentioning that all the vertical projections are perpendicular to $\mathbf {N}$, so the two coplanar projections are parallel to each other. This implies that $\boldsymbol{\beta }_{\perp }$ and $\boldsymbol{\beta }'_{\perp }$ can be determined by scaling $\boldsymbol{\alpha }_{\perp }$ and $\boldsymbol{\alpha }'_{\perp }$ according to their length ratios. Since each direction vector is a unit vector, the length of its vertical component is the sine of the corresponding incident or refracted angle. Therefore, we have $\boldsymbol{\beta }_{\perp } = \frac {\textrm {sin}\theta _2}{\textrm {sin}\theta _1}\boldsymbol{\alpha }_{\perp }$ and $\boldsymbol{\beta }'_{\perp } = \frac {\textrm {sin}\theta '_2}{\textrm {sin}\theta '_1}\boldsymbol{\alpha }'_{\perp }$ according to the fundamental rule of orthogonal projection. The former links Eqs. (5a) and (5b), leading to $\boldsymbol{\beta } = \frac {\textrm {sin}\theta _2}{\textrm {sin}\theta _1}(\boldsymbol{\alpha } - \textrm {cos}\theta _1\mathbf {N}) + \textrm {cos}\theta _2\mathbf {N}$; the latter connects Eqs. (6a) and (6b), so that $\boldsymbol{\beta }' = \frac {\textrm {sin}\theta _2'}{\textrm {sin}\theta _1'}(\boldsymbol{\alpha }' - \textrm {cos}\theta _1'\mathbf {N}) + \textrm {cos}\theta _2'\mathbf {N}$. With Snell’s law, i.e., $n_1\textrm {sin}\theta _1 = n_2\textrm {sin}\theta _2$ and $n_1\textrm {sin}\theta _1' = {n_2}\textrm {sin}\theta _2'$, we finally obtain the unit direction vectors of the refracted rays $\mathbf {L}_2$ and $\mathbf {L}_2^\prime$ as follows:
$$\left\lbrace \begin{array}{l} \boldsymbol{\beta} = \dfrac{n_1}{n_2}\boldsymbol{\alpha}~~- \left(\dfrac{n_1}{n_2}\textrm{cos}\theta_1 - \textrm{cos}\theta_2\right)\mathbf{N} \\ \boldsymbol{\beta}' = \dfrac{n_1}{n_2}\boldsymbol{\alpha}' - \left(\dfrac{n_1}{n_2}\textrm{cos}\theta_1' - \textrm{cos}\theta_2'\right)\mathbf{N} \end{array}\right..$$
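The vector form of Snell's law in Eq. (7) can be sketched directly in code. This is an illustrative helper under our own naming, assuming the normal is oriented along the direction of propagation and no total internal reflection occurs.

```python
# Sketch of Eq. (7): bend a unit ray direction d at a flat interface with
# unit normal N, using the vector form of Snell's law.
import numpy as np

def refract(d, N, n_in, n_out):
    """Refracted unit direction for incident unit direction d at normal N."""
    r = n_in / n_out
    cos_i = N @ d                      # cos(theta_1), N oriented along d
    sin_t2 = r**2 * (1.0 - cos_i**2)   # sin^2(theta_2) from Snell's law
    cos_t = np.sqrt(1.0 - sin_t2)      # assumes no total internal reflection
    return r * d - (r * cos_i - cos_t) * N

# Air -> glass at 30 degrees incidence about N = (0, 0, 1):
N = np.array([0.0, 0.0, 1.0])
d = np.array([np.sin(np.radians(30)), 0.0, np.cos(np.radians(30))])
beta = refract(d, N, 1.0, 1.52)
```

The same function with `n_in = n2`, `n_out = n3` gives Eq. (10) at the second interface.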

Denote the determined coordinates of $\mathbf {P}_1$ and $\mathbf {P}_1'$ as $(X_1, Y_1, Z_1)$ and $(X_1', Y_1', Z_1')$ respectively, the equations for $\mathbf {L}_2$ and $\mathbf {L}_2^\prime$ are established as follows:

$$\left\lbrace \begin{array}{l} \mathbf{L}_2: \dfrac{X-X_1}{\beta_x} = \dfrac{Y-Y_1}{\beta_y} = \dfrac{Z-Z_1}{\beta_z} \\ \mathbf{L}_2': \dfrac{X-X_1'}{\beta_x'} = \dfrac{Y-Y_1'}{\beta_y'} = \dfrac{Z-Z_1'}{\beta_z'} \end{array}\right..$$

Revisiting the process above, we find that Eqs. (4)–(8) give the general rule for establishing the refracted ray-path for a binocular stereo camera system, thereby forming the baseline formulations for reconstructing an observed 3D point in the refracting media. Therefore, the refracted rays $\mathbf {L}_3$ and $\mathbf {L}_3'$ at the interface $\Pi _2$ are determined directly by following the baseline formulations. The interface $\Pi _2$ is expressed by $\mathbf {N}^T\mathbf {X} + D + d = 0$, since it is parallel to $\Pi _1$. The two incident points $\mathbf {P}_2$ and $\mathbf {P}_2'$ on $\Pi _2$ are computed as:

$$\left\lbrace \begin{array}{l} \mathbf{P}_2 = \dfrac{1}{\mathbf{N}^T\boldsymbol{\beta}}\left(\left[X_1\boldsymbol{\beta}-\beta_x\mathbf{P}_1,Y_1\boldsymbol{\beta}-\beta_y\mathbf{P}_1,Z_1\boldsymbol{\beta}-\beta_z\mathbf{P}_1\right]^T\mathbf{N}-(D+d)\boldsymbol{\beta}\right) \\ \mathbf{P}_2' = \dfrac{1}{\mathbf{N}^T\boldsymbol{\beta}'}\left(\left[X_1'\boldsymbol{\beta}'-\beta_x'\mathbf{P}_1',Y_1'\boldsymbol{\beta}'-\beta_y'\mathbf{P}_1',Z_1'\boldsymbol{\beta}'-\beta_z'\mathbf{P}_1'\right]^T\mathbf{N}-(D+d)\boldsymbol{\beta}'\right) \end{array}\right..$$

The direction vectors of the rays $\mathbf {L}_3$ and $\mathbf {L}_3'$ are respectively determined by:

$$\left\lbrace \begin{array}{l} \boldsymbol{\gamma} = \dfrac{n_2}{n_3}\boldsymbol{\beta}~~- \left(\dfrac{n_2}{n_3}\textrm{cos}\theta_2 - \textrm{cos}\theta_3\right)\mathbf{N} \\ \boldsymbol{\gamma}' = \dfrac{n_2}{n_3}\boldsymbol{\beta}' - \left(\dfrac{n_2}{n_3}\textrm{cos}\theta_2' - \textrm{cos}\theta_3'\right)\mathbf{N} \end{array}\right..$$

With the computed incident points and direction vectors, the refracted rays $\mathbf {L}_3$ and $\mathbf {L}_3'$ are determined according to Eq. (8). The coordinates of the observed 3D point $\mathbf {Q}$ are finally identified as the intersection of $\mathbf {L}_3$ and $\mathbf {L}_3'$, in the form $\mathbf {Q} = \mathbf {P}_2 + s\boldsymbol{\gamma }$ or $\mathbf {Q} = \mathbf {P}_2' + s'\boldsymbol{\gamma }'$, where $s$ and $s'$ are determined by solving

$$[\boldsymbol{\gamma}, -\boldsymbol{\gamma}'] \left(\begin{array}{c} s \\ s' \end{array}\right) = \mathbf{P}_2' - \mathbf{P}_2.$$

Because the two rays may not intersect exactly in practice, this overdetermined system can be solved in the least-squares sense.
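The final intersection step can be sketched as a small least-squares solve. Since two noisy rays are generally skew, a common choice (assumed here, not prescribed by the paper) is to take the midpoint of the closest points; the function name is ours.

```python
# Sketch of Eq. (11): locate Q as the least-squares intersection of the
# final refracted rays L3: P2 + s*g and L3': P2' + s'*g'.
import numpy as np

def intersect_rays(P2, g, P2p, gp):
    """Midpoint of the closest points of two rays (least-squares s, s')."""
    A = np.column_stack([g, -gp])     # 3x2 system  [g, -g'](s, s')^T = P2' - P2
    s, sp = np.linalg.lstsq(A, P2p - P2, rcond=None)[0]
    return 0.5 * ((P2 + s * g) + (P2p + sp * gp))

# Two illustrative rays that intersect exactly at (0, 0, 600):
Q_true = np.array([0.0, 0.0, 600.0])
P2 = np.array([-10.0, 0.0, 500.0]); g = Q_true - P2; g /= np.linalg.norm(g)
P2p = np.array([90.0, 0.0, 500.0]); gp = Q_true - P2p; gp /= np.linalg.norm(gp)
Q = intersect_rays(P2, g, P2p, gp)
```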

Although the equations above appear complicated, they are all analytic and can be evaluated straightforwardly without any iterative loops. Meanwhile, the baseline formulations of the backward ray tracing give a concise procedure for determining the ray paths in the refractive 3D reconstruction. In Section 2.3, we show the full measurement procedure obtained by integrating the refractive 3D reconstruction into the stereo-DIC framework.

It is noteworthy that the orientation and position, i.e., the normal vector $\mathbf {N}$ and the distance $D$, of the interface $\Pi _1$ play a crucial role in the refractive 3D reconstruction process, but are often unknown in daily experiments. We therefore propose to determine $\mathbf {N}$ and $D$ by reconstructing the flat interface $\Pi _1$. The method pastes a set of removable circular markers on the interface $\Pi _1$ and then reconstructs the 3D positions of all markers relative to the frame $O-XYZ$. To tackle possible defocusing effects, we recommend detecting the markers by thresholding, followed by localization with the center-of-gravity method. For images blurred by serious defocusing or poor illumination, several improved detection algorithms [30,31] can be adopted to extract the markers. The equation of $\Pi _1$ is finally fitted by identifying the normal vector $\mathbf {N}$ and the distance $D$. Although only three non-collinear markers are enough to determine them, more markers allow robust estimates of $\mathbf {N}$ and $D$ to be found by embedding the interface fitter in a hypothesize-and-test framework such as RANSAC [32]. In this way, the 3D shape of fluid-immersed objects can be reconstructed completely and flexibly, enabling 3D shape and deformation calculations without reformulating the camera model or relying on specific optics.
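The interface fitter at the core of this step can be sketched as a least-squares plane fit via SVD; the RANSAC wrapper of [32] is omitted for brevity, and the synthetic marker positions below are illustrative only.

```python
# Sketch of the interface estimation: fit N and D of the plane
# N^T X + D = 0 to the reconstructed 3D marker centers.
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3D points via SVD."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    N = Vt[-1]                  # normal = direction of least variance
    D = -N @ centroid
    return N, D

# Synthetic markers on a hypothetical interface z = 500 mm:
rng = np.random.default_rng(0)
xy = rng.uniform(-100, 100, size=(44, 2))
markers = np.column_stack([xy, np.full(44, 500.0)])
N, D = fit_plane(markers)
```

Inside a RANSAC loop, `fit_plane` would be applied to random minimal subsets of three markers and the consensus set re-fitted at the end.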

2.3 Deformation measurement procedures

In this section, we describe the refractive stereo-DIC obtained by integrating the above refractive 3D reconstruction into the regular 3D shape and deformation measurement procedures. As in the regular case, the pipeline of refractive stereo-DIC measurement consists of five major steps: (1) speckle image acquisition, (2) calibration of the stereo camera system, (3) stereo and temporal image matching, (4) 3D reconstruction with the method in the previous section, and (5) estimation of deformation fields. An overview of the measurement procedure is shown in Fig. 2. The first step is straightforward: simultaneously capture a stereo-pair of a randomly speckled object at each measurement stage. Each stereo-pair comprises a left image and a right image taken by the left and right cameras, respectively, and the stereo-pair acquired at the initial state is defined as the reference. Steps 2 to 5 are described as follows.

  • Stereo camera calibration. This step determines the internal parameters of each camera and the external parameters between the left and right cameras. Generally, it can be done with regular calibration methods, such as the well-known Zhang’s method [33]. However, for some applications (such as marine blade measurement in a cavitation channel), stereo calibration is not as trivial as in land-based cases, since calibration targets may not be suitable or allowed. In that case, we recommend calibrating the internal parameters of each camera with Zhang’s method and then determining the external parameters with relative pose estimation methods, such as the five-point algorithm [34]. In practical stereo-DIC measurement, bundle adjustment can be applied to refine the solved external parameters for a high-accuracy guarantee [35].
  • Stereo and temporal image matching. First, a number of points of interest (POIs) are selected in the left image of the reference stereo-pair. Stereo matching is applied to find the stereo correspondences of the POIs in the right image, and temporal matching is carried out to track the positions of each stereo correspondence in the subsequent deformed stereo-pairs. As shown in Fig. 2, both are implemented by DIC algorithms. More details of stereo and temporal DIC matching can be found in [36,37]. Once the stereo correspondences of all POIs are established for each measurement stage, the initial and deformed 3D shapes of the object being measured are ready to be recovered.
  • Refractive 3D reconstruction. Following the method in Section 2.2, this step is performed as follows. First, the refracting interfaces (i.e., $\Pi _1$ and $\Pi _2$) are determined with the help of the optical markers; this is done only once, prior to loading. Subsequently, the distorted 3D shape is estimated from the stereo correspondences for each state by the linear triangulation method [29] or the conditional one [2]; this process is the same as in regular stereo-DIC. Finally, the true 3D profiles are reconstructed with the proposed method in Section 2.2. The 3D shape recovered in the reference state is set as the initial configuration, and those reconstructed in the loading phases are the deformed configurations in order. The only additional operation, compared with land-based measurement, is the reconstruction of the refracting interfaces. This implies that we can readily extend existing stereo-DIC systems to perform submerged 3D measurement. We have implemented this step as an independent module for refractive stereo-DIC measurement; the source code is available in our GitHub repository Matrice.
  • Deformation field estimation. Once the initial and deformed configurations of the object being measured are determined, the 3D displacement fields are computed by comparing the deformed configurations with the initial one. For example, for a measured point on the object, suppose its 3D coordinates on the initial and deformed configurations are reconstructed as $\mathbf {Q}$ and $\mathbf {Q}'$. The 3D displacements at the point are then estimated as:
    $$(U, V, W) = \mathbf{Q}' - \mathbf{Q},$$
    where $U, V$ and $W$ are the displacement components in the X-, Y- and Z-directions, respectively. Applying Eq. (12) to every measured point at all measurement stages, we obtain the 3D displacement fields, i.e., the U, V, and W fields, for subsequent dynamic and/or static analysis. If necessary, strain fields can be derived by differentiating the displacement fields [1].
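Applied to whole point clouds, Eq. (12) is a single vectorized subtraction; the point coordinates and imposed shift below are illustrative.

```python
# Sketch of Eq. (12): per-point 3D displacements from the reconstructed
# initial and deformed configurations (arrays of shape (num_points, 3)).
import numpy as np

Q_init = np.array([[0.0, 0.0, 600.0],
                   [5.0, 2.0, 601.0]])            # illustrative points (mm)
Q_def = Q_init + np.array([1.0, -0.5, 0.2])       # a hypothetical rigid shift

U, V, W = (Q_def - Q_init).T    # displacement fields in X, Y, Z
```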


Fig. 2. Overview of the refractive stereo-DIC measurement pipeline.


3. Experiments and results

The performance of the proposed refractive stereo-DIC method has been evaluated on both 3D shape reconstruction and 3D deformation estimation. Both tests consider the scenario of a pair of cameras measuring a standard specimen placed in a glass tank filled with water. The experimental setup is shown in Fig. 3(a). The stereo measurement system was composed of two cameras, each equipped with an imaging sensor with a resolution of $2048 \times 2048$ pixels and a 25 mm prime lens. The working distance and the horizontal field of view were about 550 mm and 240 mm, respectively. The thickness of the glass is 5 mm, and the refractive indices of the glass and water are 1.52 and 1.33, respectively. The left and right cameras were calibrated with Zhang’s method using a chessboard with $12 \times 9$ corners prior to the experiments. To reconstruct the refracting interfaces, we adhered 44 circular markers to the front face of the water tank. A pair of images of the markers is shown in (b). Subsequently, a standard sphere with a diameter of 40 mm and a disk were used to evaluate the performance in 3D shape reconstruction and deformation measurement, respectively. Random speckle patterns were fabricated on each specimen surface. Speckle image samples for both specimens are shown in (c). Details of the experiments and results are given in the following subsections.


Fig. 3. (a) Experimental setup. (b) Stereo-pair of 44 circular markers. (c) Speckle images of the underwater sphere and disk specimens.


3.1 Underwater 3D shape reconstruction

We reconstructed the 3D shape of the underwater sphere to validate that the presented refractive geometry provides a good approximation to a complex refractive optical system. A stereo-pair of the randomly speckled sphere was captured by the left and right cameras, and 5876 stereo correspondences were established with the foregoing stereo-DIC matching algorithm. Following the refractive 3D reconstruction method described in Section 2.2, the observed segment of the sphere surface was then recovered.

Figure 4 shows the measured point clouds for the sphere with the proposed and regular stereo-DIC methods. Both were reconstructed in the reference coordinate frame(i.e., $O-XYZ$ in Fig. 1). For comparison and visualization, they were moved from the measured centers $(-3.50, 10.22, 654.74)$ mm and $(-4.82, 10.15, 621.00)$ mm to the origin respectively and then, were plotted in the same viewport with reversed $Z$-coordinates. Intuitively, the measured profile in (b) has a narrower depth range than that in (a), implying that the former seems larger than the latter in terms of geometrical size. To verify this point, two spheres were fitted from the measured point clouds. Results show that both point clouds were fitted well to the spheres respectively, but the resulted sphere radii were different. The radius corresponding to the profile in (a) is 20.08 mm, which is close to the ground-truth 20 mm (half of the true diameter), while that of the profile in (b), 25.98 mm, is significantly larger than the ground-truth. In addition, we generated an ideal surface of the measured sphere at the centers of the reconstructed 3D profiles, respectively, for evaluating the discrepancies of the measure profiles. The discrepancy maps are illustrated in Fig. 5(a) and (b). We find discrepancies in (a) almost range from 0 to 0.1 mm, while in (b) are obviously large, ranging from 4.35 to 7.11 mm. In (c), we compare the the spherical surface measured by our method with that measured in air with the regular stereo-DIC by computing the differences in radial. The comparison shows that the 3D profiles measured by both methods are in good agreement. The results clearly show that a good reconstruction of the observed sphere surface could be obtained by using the proposed method. 
From a purely geometric point of view, the 3D shape reconstruction results confirm that the introduced flat refractive geometry is reasonable and correct, and demonstrate that our reconstruction method based on this geometry can retrieve the true 3D shape of underwater objects with high accuracy.
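
The sphere fitting used above can be sketched as an algebraic least-squares fit, which reduces to a single linear solve; this is an illustrative implementation applied to a synthetic 20 mm spherical cap, not necessarily the exact fitting routine used in the experiment.

```python
import numpy as np

def fit_sphere(pts):
    """Algebraic least-squares sphere fit: |x - c|^2 = r^2 is linear in
    (2c, r^2 - |c|^2), so center and radius follow from one lstsq solve."""
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
    b = (pts ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    c, k = sol[:3], sol[3]
    return c, np.sqrt(k + c @ c)

# Synthetic cap of a 20 mm-radius sphere (only part of the surface is
# visible to the cameras), perturbed by mild measurement noise
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 2000)
phi = rng.uniform(0.0, 0.45 * np.pi, 2000)
pts = 20.0 * np.column_stack([np.sin(phi) * np.cos(theta),
                              np.sin(phi) * np.sin(theta),
                              np.cos(phi)])
pts += rng.normal(0.0, 0.02, pts.shape)
center, radius = fit_sphere(pts)
# radius is recovered close to the 20 mm ground truth
```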

Fig. 4. 3D sphere shape measured by the refractive stereo-DIC (a) and the regular version (b), respectively.

Fig. 5. Discrepancy maps between the 3D shapes measured by our method (a) and the regular stereo-DIC method (b) and the ideal spherical surface; (c) the discrepancy map between the spheres measured in water with our method and in air with the regular stereo-DIC.

3.2 Underwater 3D deformation evaluation

The performance in underwater 3D deformation measurement was evaluated with the disk specimen, whose thickness is $\delta = 0.5$ mm and diameter $\phi = 80$ mm. To verify our method as clearly as possible, in-plane and out-of-plane rigid-body translations and an out-of-plane deformation of the disk specimen were conducted, respectively.

3.2.1 In-plane and out-of-plane rigid translation

Here the specimen was moved by a two-axis translation stage with a precision of $\pm$0.05 mm, as shown in Fig. 3(a). Both tests used the same large displacement range, i.e., from -5 mm to 5 mm. For each test, we first moved the disk specimen to the zero position of the stage and captured a pair of images as the reference stereo-pair. We then shifted the specimen to the -5 mm position and moved it stepwise to the 5 mm position with a stride of 1 mm. At every step, a stereo-pair of the disk specimen was recorded for displacement computation. With the measurement procedures in Section 2.3, both in-plane and out-of-plane displacements were estimated, and the absolute errors relative to the imposed translation values were computed. The displacement results and errors were finally averaged; the results are shown in Fig. 6(a). For comparison, the displacements in both tests were also estimated by the regular stereo-DIC, with the corresponding results shown in Fig. 6(b).

Fig. 6. Measured mean displacements versus the imposed translations for the proposed refractive stereo-DIC (a) and the regular stereo-DIC (b). In (b), the SD errors are magnified ten-fold for visualization.

On close inspection of Fig. 6, we see that, for in-plane rigid displacement, both methods produced similar results that are consistent with the imposed displacements; the absolute errors of both range approximately from -0.01 to 0.02 mm with a maximum standard deviation (SD) of 0.01 mm. Comparison of (a) and (b) shows, however, that the measured results differ significantly for out-of-plane rigid displacement. The out-of-plane displacements measured by our method agree well with the actual translations of the disk specimen and maintain a reasonable error fluctuation from -0.06 to 0.02 mm with a maximum SD of 0.01 mm. In contrast, the displacement errors produced by the regular stereo-DIC increase linearly with the distance from the reference position of the specimen; these errors, ranging from -1.15 to 1.26 mm with a maximum SD of 0.03 mm, are much larger than those of our method. These results imply that our method delivers the expected performance in fluid-immersed 3D deformation measurement, even when the object undergoes large displacements. Beyond this, we find that the influence of light refraction is mainly manifested in the out-of-plane direction, while it hardly affects the in-plane displacement measurement. The main reason is that, in the in-plane case, the specimen was moved in a plane almost parallel to the refracting interface (i.e., the front face of the water tank). This phenomenon is consistent with the flat refractive geometry presented in Section 2.1.
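
This directional asymmetry can be illustrated with a back-of-the-envelope apparent-depth calculation. The sketch below, which assumes a single air-water interface and a near-paraxial ray, shows that refraction compresses the perceived depth by roughly the refractive index ratio while leaving the lateral crossing point on the interface unchanged; this is why ignoring refraction mainly biases the out-of-plane component.

```python
import numpy as np

n_water = 1.33
z_true = 100.0                # depth of an underwater point below the interface (mm)
theta_w = 1e-3                # near-paraxial ray angle in water (rad)
theta_a = np.arcsin(n_water * np.sin(theta_w))  # Snell: angle of the same ray in air
x_cross = z_true * np.tan(theta_w)              # lateral crossing point on the interface
z_apparent = x_cross / np.tan(theta_a)          # depth of the back-extended air ray

# The lateral crossing point x_cross is untouched by refraction, but the
# perceived depth shrinks to roughly z_true / n_water, so a triangulation
# that ignores refraction is biased mainly along Z.
```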

3.2.2 Out-of-plane deformation

To measure the out-of-plane deformation, we applied a concentrated displacement load perpendicular to the disk surface at the center of the disk specimen. The loading device is a 0.01 mm micrometer with a precision of $\pm$0.004 mm, as shown in Fig. 3(a). Before loading the specimen, a pair of speckle images of the disk was captured as the reference stereo-pair. Loading then proceeded through ten states with a known displacement step of 0.1 mm, and the corresponding speckled stereo-pairs were recorded at each state. A sample of the speckled disk image is shown in Fig. 3(c). Following the measurement procedures in Section 2.3, the displacement fields in the X-, Y- and Z-directions were computed.

We first evaluated the total central displacement since its actual value at each state is known. Figure 7 shows the measured central displacements and the absolute errors versus the applied displacements. For comparison, the results computed by the regular stereo-DIC are also plotted in the figure. We find that, as the out-of-plane load increases, the displacements measured by our method in (a) remain in good agreement with the actual ones. The corresponding absolute errors in (b) stay within a small yet stable range, with a maximum of 0.01 mm and an SD of 0.003 mm. These results validate both the correctness and the accuracy of our method. For the regular stereo-DIC, the measured displacements in (a) are significantly smaller in amplitude under the influence of light refraction, and the errors in (b) increase linearly with the load; this mirrors the out-of-plane rigid translation test in the previous section. Both the translation and deformation experiments suggest that, when measuring the movement or deformation of an object through media of different refractive indices, light refraction causes non-negligible measurement errors whenever the direction of movement or deformation is not parallel to the refracting interface(s). For example, for the out-of-plane translation in the previous section, the maximum absolute error is 1.26 mm, about 25.20% of the corresponding translation; for the test in this section, the maximum error is 0.27 mm, about 27.00% of the maximum displacement.
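
The relative errors quoted above follow directly from the reported maxima:

```python
# Relative error of the regular stereo-DIC for out-of-plane motion,
# computed from the maxima reported above
rel_translation = 1.26 / 5.0  # max error over the 5 mm translation -> 25.2 %
rel_deflection = 0.27 / 1.0   # max error over the 1.0 mm deflection -> 27.0 %
```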

Fig. 7. (a) Measured displacements versus the out-of-plane deformation and (b) the corresponding absolute errors.

Based on the aforementioned validations, we show the U, V and W displacement fields measured at the load states of 0.1 mm and 1.0 mm in Fig. 8. Comparing the displacements in the three directions in both states, we find the disk mainly undergoes an out-of-plane displacement, shown as the W fields in (c) and (f), when subjected to a concentrated load at the center, while the displacements in the other two directions are very small; see the U fields in (a) and (d) and the V fields in (b) and (e). The measured results accord with theory. Since the ratio of the characteristic dimensions $\delta$ and $\phi$ of the disk specimen is small (1/160) and the disk was fixed with eight equally spaced bolts (see Fig. 3(c)), this experiment can be approximated as the bending of an edge-clamped circular plate under a central concentrated load. According to the theory of thin plates [38], the in-plane displacements $U$ and $V$ are approximately zero if the thickness effect is ignored, and the out-of-plane displacement (also known as deflection) $W$ shows a centrosymmetric distribution that reaches its maximum at the center. To show the radial displacement distribution of the disk, we further estimated and inspected the total displacements along a horizontal line passing through the center of the disk for the 0.1 mm and 1.0 mm load states, respectively. The total displacements, denoted by $T$, and their radial distribution curves are shown in Fig. 9. Although the displacement curve sampled at the 0.1 mm load state is not perfectly smooth, its overall radial distribution agrees well with the theoretical prediction and reaches the maximum at the center with an error of 0.01 mm; the radial distribution sampled at the 1.0 mm load state is in excellent agreement with the theoretical analysis, and the error of the peak displacement at the center is also 0.01 mm.
These analyses confirm that the measured displacement fields are reasonable and correct, thereby validating the performance of our method from the perspective of full-field displacement estimation.
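
The theoretical profile referred to above can be written out explicitly. For an edge-clamped circular plate of radius $a$ under a central point load $P$, the classical Kirchhoff solution [38] gives $w(r) = \frac{Pa^2}{16\pi D}\left[1 - \rho^2 + 2\rho^2\ln\rho\right]$ with $\rho = r/a$ and $D$ the flexural rigidity. The sketch below evaluates the normalized deflection and checks the properties quoted above (maximum at the center, zero deflection at the clamped edge, monotonic radial decrease); it is an idealization that ignores the bolt spacing and thickness effects.

```python
import numpy as np

def clamped_plate_deflection(rho):
    """Normalized deflection w(r)/w(0) of an edge-clamped circular plate under
    a central point load (Kirchhoff theory): 1 - rho^2 + 2 rho^2 ln(rho)."""
    rho = np.asarray(rho, dtype=float)
    safe = np.where(rho > 0.0, rho, 1.0)          # avoid log(0); the rho^2 factor kills the term anyway
    return 1.0 - rho**2 + 2.0 * rho**2 * np.log(safe)

rho = np.linspace(0.0, 1.0, 101)
w = clamped_plate_deflection(rho)
# w peaks at the center (w = 1 at rho = 0), decreases monotonically along
# the radius, and vanishes at the clamped edge (w = 0 at rho = 1)
```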

Fig. 8. 3D displacement fields measured by the proposed refractive stereo-DIC: (a)-(c) for the load of 0.1 mm and (d)-(f) for 1.0 mm.

Fig. 9. Measured total displacement fields and the distribution curves in radial direction at the load states of 0.1 mm (a) and 1.0 mm (b).

4. Conclusion

In this work, we propose refractive stereo-DIC, a new extension of the widely used land-based full-field optical deformation measurement technique, to measure the 3D shape and deformation of fluid-immersed objects. We describe a refractive geometry based on backward ray tracing and provide a formulation of it that forms the basis of an accurate refractive stereo reconstruction for 3D measurement in refracting media. The method retains the perspective camera model adopted in regular stereo-DIC and can thus be treated as a flexible post-processing approach that readily extends the existing framework to fluid-immersed optical 3D deformation characterization. We demonstrate the necessity and performance of our method in refractive 3D shape and deformation measurements. Experimental validation on underwater 3D shape reconstruction shows that our method correctly describes a complex refractive stereo imaging system without modifying the image formation. Tests on rigid translation and deformation of an underwater specimen reveal that our method is accurate and precise for 3D displacement measurements of both large and small amplitudes. In particular, we found that light refraction has nearly no effect on deformation in a plane parallel to the refracting interface(s). Beyond these validations, we expect our method to promote further development of 3D deformation characterization and stereo reconstruction for objects in refracting media by extending existing stereo-DIC systems and general stereo vision systems.

Funding

National Key Research and Development Program of China (2018YFF01014200); National Natural Science Foundation of China (12002197, 11602056, 11727804, 12072184); China Postdoctoral Science Foundation (2020M671070); Shanghai Post-doctoral Excellence Program (2019192).

Disclosures

The authors declare no conflicts of interest.

References

1. M. A. Sutton, J.-J. Orteu, and H. Schreier, Image Correlation for Shape, Motion and Deformation Measurements: Basic Concepts, Theory and Applications (Springer, 2009), 1st ed.

2. Z. Su, L. Lu, F. Yang, X. He, and D. Zhang, “Geometry constrained correlation adjustment for stereo reconstruction in 3d optical deformation measurements,” Opt. Express 28(8), 12219–12232 (2020). [CrossRef]  

3. X. Shao, M. M. Eisa, Z. Chen, S. Dong, and X. He, “Self-calibration single-lens 3d video extensometer for high-accuracy and real-time strain measurement,” Opt. Express 24(26), 30124–30138 (2016). [CrossRef]  

4. Z. Hu, T. Xu, H. Luo, R. Z. Gan, and H. Lu, “Measurement of thickness and profile of a transparent material using fluorescent stereo microscopy,” Opt. Express 24(26), 29822–29829 (2016). [CrossRef]  

5. M. Palanca, G. Tozzi, and L. Cristofolini, “The use of digital image correlation in the biomechanical area: a review,” Int. Biomech. 3(1), 1–21 (2016). [CrossRef]  

6. Z. Chen, X. Shao, X. He, J. Wu, X. Xu, and J. Zhang, “Noninvasive, three-dimensional full-field body sensor for surface deformation monitoring of human body in vivo,” J. Biomed. Opt. 22(9), 1–10 (2017). [CrossRef]  

7. M. Malesa, K. Malowany, U. Tomczak, B. Siwek, M. Kujawinska, and A. Sieminska-Lewandowska, “Application of 3d digital image correlation in maintenance and process control in industry,” Comput. Ind. 64(9), 1301–1315 (2013). [CrossRef]  

8. A. Piekarczuk, “Experimental and numerical studies of double corrugated steel arch panels,” Thin-Walled Struct. 140, 60–73 (2019). [CrossRef]  

9. D. Reagan, A. Sabato, and C. Niezrecki, “Feasibility of using digital image correlation for unmanned aerial vehicle structural health monitoring of bridges,” Struct. Health Monit. 17(5), 1056–1072 (2018). [CrossRef]  

10. L. Ngeljaratan and M. A. Moustafa, “Structural health monitoring and seismic response assessment of bridge structures using target-tracking digital image correlation,” Eng. Struct. 213, 110551 (2020). [CrossRef]  

11. J.-J. Orteu, “3-d computer vision in experimental mechanics,” Opt. Lasers Eng. 47(3-4), 282–291 (2009). [CrossRef]  

12. F. Chadebecq, F. Vasconcelos, R. Lacher, E. Maneas, A. Desjardins, S. Ourselin, T. Vercauteren, and D. Stoyanov, “Refractive two-view reconstruction for underwater 3d vision,” Int. J. Comput. Vis. 128(5), 1101–1117 (2020). [CrossRef]  

13. F. Bryant, “Snell's law of refraction,” Phys. Bull. 9(12), 317 (1958). [CrossRef]  

14. F. Menna, E. Nocerino, F. Fassi, and F. Remondino, “Geometric and optic characterization of a hemispherical dome port for underwater photogrammetry,” Sensors 16(1), 48 (2016). [CrossRef]  

15. T. Luczynski, M. Pfingsthorn, and A. Birk, “The pinax-model for accurate and efficient refraction correction of underwater cameras in flat-pane housings,” Ocean Eng. 133, 9–22 (2017). [CrossRef]  

16. J. M. Lavest, G. Rives, and J. T. Lapresté, “Underwater camera calibration,” in Computer Vision — ECCV 2000, D. Vernon, ed. (Springer, 2000), pp. 654–668.

17. L. Kang, L. Wu, and Y.-H. Yang, “Experimental study of the influence of refraction on underwater three-dimensional reconstruction using the svp camera model,” Appl. Opt. 51(31), 7591–7603 (2012). [CrossRef]  

18. T. Treibitz, Y. Schechner, C. Kunz, and H. Singh, “Flat refractive geometry,” IEEE Trans. Pattern Anal. Mach. Intell. 34(1), 51–65 (2012). [CrossRef]  

19. L. Kang, L. Wu, Y. Wei, S. Lao, and Y.-H. Yang, “Two-view underwater 3d reconstruction for cameras with unknown poses under flat refractive interfaces,” Pattern Recognit. 69, 251–269 (2017). [CrossRef]  

20. A. Agrawal, S. Ramalingam, Y. Taguchi, and V. Chari, “A theory of multi-layer flat refractive geometry,” in 2012 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2012), pp. 3346–3353.

21. T. Yau, M. Gong, and Y. Yang, “Underwater camera calibration using wavelength triangulation,” in 2013 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2013), pp. 2499–2506.

22. X. Chen and Y. Yang, “Two-view camera housing parameters calibration for multi-layer flat refractive interface,” in 2014 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2014), pp. 524–531.

23. V. Chari and P. Sturm, “Multi-view geometry of the refractive plane,” in Proceedings of the British Machine Vision Conference, (British Machine Vision Association, 2009), pp. 56.1–56.11.

24. X. Ke, M. A. Sutton, S. M. Lessner, and M. Yost, “Robust stereo vision and calibration methodology for accurate three-dimensional digital image correlation measurements on submerged objects,” The J. Strain Analysis for Eng. Des. 43(8), 689–704 (2008). [CrossRef]  

25. S. Gupta, V. Parameswaran, M. A. Sutton, and A. Shukla, “Study of dynamic underwater implosion mechanics using digital image correlation,” Proc. R. Soc. A 470(2172), 20140576 (2014). [CrossRef]  

26. S. Kishore, K. Senol, P. Naik Parrikar, and A. Shukla, “Underwater implosion pressure pulse interactions with submerged plates,” J. Mech. Phys. Solids 143, 104051 (2020). [CrossRef]  

27. M. A. Haile and P. G. Ifju, “Application of elastic image registration and refraction correction for non-contact underwater strain measurement,” Strain 48(2), 136–142 (2012). [CrossRef]  

28. B. Chen and B. Pan, “Calibration-free single camera stereo-digital image correlation for small-scale underwater deformation measurement,” Opt. Express 27(8), 10509–10523 (2019). [CrossRef]  

29. R. I. Hartley and P. Sturm, “Triangulation,” Comput. Vis. Image Underst. 68(2), 146–157 (1997). [CrossRef]  

30. Y. Wang, Y. Wang, L. Liu, and X. Chen, “Defocused camera calibration with a conventional periodic target based on fourier transform,” Opt. Lett. 44(13), 3254–3257 (2019). [CrossRef]  

31. S. Dong, J. Ma, Z. Su, and C. Li, “Robust circular marker localization under non-uniform illuminations based on homomorphic filtering,” Measurement 170, 108700 (2021). [CrossRef]  

32. M. A. Fischler and R. C. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM 24(6), 381–395 (1981). [CrossRef]  

33. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000). [CrossRef]  

34. D. Nister, “An efficient solution to the five-point relative pose problem,” IEEE Trans. Pattern Anal. Machine Intell. 26(6), 756–770 (2004). [CrossRef]  

35. Z. Su, L. Lu, S. Dong, F. Yang, and X. He, “Auto-calibration and real-time external parameter correction for stereo digital image correlation,” Opt. Lasers Eng. 121, 46–53 (2019). [CrossRef]  

36. B. Pan, K. Li, and W. Tong, “Fast, robust and accurate digital image correlation calculation without redundant computations,” Exp. Mech. 53(7), 1277–1289 (2013). [CrossRef]  

37. Y. Gao, T. Cheng, Y. Su, X. Xu, Y. Zhang, and Q. Zhang, “High-efficiency and high-accuracy digital image correlation for three-dimensional measurement,” Opt. Lasers Eng. 65, 73–80 (2015). [CrossRef]  

38. J. Reddy, Theory and Analysis of Elastic Plates and Shells (CRC, 2006), 2nd ed.

References

  • View by:

  1. M. A. Sutton, J.-J. Orteu, and H. Schreier, Image Correlation for Shape, Motion and Deformation Measurements: Basic Concepts, Theory and Applications (Springer, 2009), 1st ed.
  2. Z. Su, L. Lu, F. Yang, X. He, and D. Zhang, “Geometry constrained correlation adjustment for stereo reconstruction in 3d optical deformation measurements,” Opt. Express 28(8), 12219–12232 (2020).
    [Crossref]
  3. X. Shao, M. M. Eisa, Z. Chen, S. Dong, and X. He, “Self-calibration single-lens 3d video extensometer for high-accuracy and real-time strain measurement,” Opt. Express 24(26), 30124–30138 (2016).
    [Crossref]
  4. Z. Hu, T. Xu, H. Luo, R. Z. Gan, and H. Lu, “Measurement of thickness and profile of a transparent material using fluorescent stereo microscopy,” Opt. Express 24(26), 29822–29829 (2016).
    [Crossref]
  5. M. Palanca, G. Tozzi, and L. Cristofolini, “The use of digital image correlation in the biomechanical area: a review,” Int. Biomech. 3(1), 1–21 (2016).
    [Crossref]
  6. Z. Chen, X. Shao, X. He, J. Wu, X. Xu, and J. Zhang, “Noninvasive, three-dimensional full-field body sensor for surface deformation monitoring of human body in vivo,” J. Biomed. Opt. 22(9), 1–10 (2017).
    [Crossref]
  7. M. Malesa, K. Malowany, U. Tomczak, B. Siwek, M. Kujawinska, and A. Sieminska-Lewandowska, “Application of 3d digital image correlation in maintenance and process control in industry,” Comput. Ind. 64(9), 1301–1315 (2013).
    [Crossref]
  8. A. Piekarczuk, “Experimental and numerical studies of double corrugated steel arch panels,” Thin-Walled Struct. 140, 60–73 (2019).
    [Crossref]
  9. D. Reagan, A. Sabato, and C. Niezrecki, “Feasibility of using digital image correlation for unmanned aerial vehicle structural health monitoring of bridges,” Struct. Heal. Monit. 17(5), 1056–1072 (2018).
    [Crossref]
  10. L. Ngeljaratan and M. A. Moustafa, “Structural health monitoring and seismic response assessment of bridge structures using target-tracking digital image correlation,” Eng. Struct. 213, 110551 (2020).
    [Crossref]
  11. J.-J. Orteu, “3-d computer vision in experimental mechanics,” Opt. Lasers Eng. 47(3-4), 282–291 (2009).
    [Crossref]
  12. F. Chadebecq, F. Vasconcelos, R. Lacher, E. Maneas, A. Desjardins, S. Ourselin, T. Vercauteren, and D. Stoyanov, “Refractive two-view reconstruction for underwater 3d vision,” Int. J. Comput. Vis. 128(5), 1101–1117 (2020).
    [Crossref]
  13. F. Bryant, “Snell's law of refraction,” Phys. Bull. 9(12), 317 (1958).
    [Crossref]
  14. F. Menna, E. Nocerino, F. Fassi, and F. Remondino, “Geometric and optic characterization of a hemispherical dome port for underwater photogrammetry,” Sensors 16(1), 48 (2016).
    [Crossref]
  15. T. Luczynski, M. Pfingsthorn, and A. Birk, “The pinax-model for accurate and efficient refraction correction of underwater cameras in flat-pane housings,” Ocean Eng. 133, 9–22 (2017).
    [Crossref]
  16. J. M. Lavest, G. Rives, and J. T. Lapresté, “Underwater camera calibration,” in Computer Vision — ECCV 2000, D. Vernon, ed. (Springer, 2000), pp. 654–668.
  17. L. Kang, L. Wu, and Y.-H. Yang, “Experimental study of the influence of refraction on underwater three-dimensional reconstruction using the svp camera model,” Appl. Opt. 51(31), 7591–7603 (2012).
    [Crossref]
  18. T. Treibitz, Y. Schechner, C. Kunz, and H. Singh, “Flat refractive geometry,” IEEE Trans. Pattern Anal. Mach. Intell. 34(1), 51–65 (2012).
    [Crossref]
  19. L. Kang, L. Wu, Y. Wei, S. Lao, and Y.-H. Yang, “Two-view underwater 3d reconstruction for cameras with unknown poses under flat refractive interfaces,” Pattern Recognit. 69, 251–269 (2017).
    [Crossref]
  20. A. Agrawal, S. Ramalingam, Y. Taguchi, and V. Chari, “A theory of multi-layer flat refractive geometry,” in 2012 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2012), pp. 3346–3353.
  21. T. Yau, M. Gong, and Y. Yang, “Underwater camera calibration using wavelength triangulation,” in 2013 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2013), pp. 2499–2506.
  22. X. Chen and Y. Yang, “Two-view camera housing parameters calibration for multi-layer flat refractive interface,” in 2014 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2014), pp. 524–531.
  23. V. Chari and P. Sturm, “Multi-view geometry of the refractive plane,” in Proceedings of the British Machine Vision Conference, (British Machine Vision Association, 2009), pp. 56.1–56.11.
  24. X. Ke, M. A. Sutton, S. M. Lessner, and M. Yost, “Robust stereo vision and calibration methodology for accurate three-dimensional digital image correlation measurements on submerged objects,” The J. Strain Analysis for Eng. Des. 43(8), 689–704 (2008).
    [Crossref]
  25. S. Gupta, V. Parameswaran, M. A. Sutton, and A. Shukla, “Study of dynamic underwater implosion mechanics using digital image correlation,” Proc. R. Soc. A 470(2172), 20140576 (2014).
    [Crossref]
  26. S. Kishore, K. Senol, P. Naik Parrikar, and A. Shukla, “Underwater implosion pressure pulse interactions with submerged plates,” J. Mech. Phys. Solids 143, 104051 (2020).
    [Crossref]
  27. M. A. Haile and P. G. Ifju, “Application of elastic image registration and refraction correction for non-contact underwater strain measurement,” Strain 48(2), 136–142 (2012).
    [Crossref]
  28. B. Chen and B. Pan, “Calibration-free single camera stereo-digital image correlation for small-scale underwater deformation measurement,” Opt. Express 27(8), 10509–10523 (2019).
    [Crossref]
  29. R. I. Hartley and P. Sturm, “Triangulation,” Comput. Vis. Image Underst. 68(2), 146–157 (1997).
    [Crossref]
  30. Y. Wang, Y. Wang, L. Liu, and X. Chen, “Defocused camera calibration with a conventional periodic target based on fourier transform,” Opt. Lett. 44(13), 3254–3257 (2019).
    [Crossref]
  31. S. Dong, J. Ma, Z. Su, and C. Li, “Robust circular marker localization under non-uniform illuminations based on homomorphic filtering,” Measurement 170, 108700 (2021).
    [Crossref]
  32. M. A. Fischler and R. C. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM 24(6), 381–395 (1981).
    [Crossref]
  33. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000).
    [Crossref]
  34. D. Nister, “An efficient solution to the five-point relative pose problem,” IEEE Trans. Pattern Anal. Machine Intell. 26(6), 756–770 (2004).
    [Crossref]
  35. Z. Su, L. Lu, S. Dong, F. Yang, and X. He, “Auto-calibration and real-time external parameter correction for stereo digital image correlation,” Opt. Lasers Eng. 121, 46–53 (2019).
    [Crossref]
  36. B. Pan, K. Li, and W. Tong, “Fast, robust and accurate digital image correlation calculation without redundant computations,” Exp. Mech. 53(7), 1277–1289 (2013).
    [Crossref]
  37. Y. Gao, T. Cheng, Y. Su, X. Xu, Y. Zhang, and Q. Zhang, “High-efficiency and high-accuracy digital image correlation for three-dimensional measurement,” Opt. Lasers Eng. 65, 73–80 (2015).
    [Crossref]
  38. J. Reddy, Theory and Analysis of Elastic Plates and Shells (CRC, 2006), 2nd ed.

2021 (1)

S. Dong, J. Ma, Z. Su, and C. Li, “Robust circular marker localization under non-uniform illuminations based on homomorphic filtering,” Measurement 170, 108700 (2021).
[Crossref]

2020 (4)

S. Kishore, K. Senol, P. Naik Parrikar, and A. Shukla, “Underwater implosion pressure pulse interactions with submerged plates,” J. Mech. Phys. Solids 143, 104051 (2020).
[Crossref]

Z. Su, L. Lu, F. Yang, X. He, and D. Zhang, “Geometry constrained correlation adjustment for stereo reconstruction in 3d optical deformation measurements,” Opt. Express 28(8), 12219–12232 (2020).
[Crossref]

L. Ngeljaratan and M. A. Moustafa, “Structural health monitoring and seismic response assessment of bridge structures using target-tracking digital image correlation,” Eng. Struct. 213, 110551 (2020).
[Crossref]

F. Chadebecq, F. Vasconcelos, R. Lacher, E. Maneas, A. Desjardins, S. Ourselin, T. Vercauteren, and D. Stoyanov, “Refractive two-view reconstruction for underwater 3d vision,” Int. J. Comput. Vis. 128(5), 1101–1117 (2020).
[Crossref]

2019 (4)

A. Piekarczuk, “Experimental and numerical studies of double corrugated steel arch panels,” Thin-Walled Struct. 140, 60–73 (2019).
[Crossref]

B. Chen and B. Pan, “Calibration-free single camera stereo-digital image correlation for small-scale underwater deformation measurement,” Opt. Express 27(8), 10509–10523 (2019).
[Crossref]

Y. Wang, Y. Wang, L. Liu, and X. Chen, “Defocused camera calibration with a conventional periodic target based on fourier transform,” Opt. Lett. 44(13), 3254–3257 (2019).
[Crossref]

Z. Su, L. Lu, S. Dong, F. Yang, and X. He, “Auto-calibration and real-time external parameter correction for stereo digital image correlation,” Opt. Lasers Eng. 121, 46–53 (2019).
[Crossref]

2018 (1)

D. Reagan, A. Sabato, and C. Niezrecki, “Feasibility of using digital image correlation for unmanned aerial vehicle structural health monitoring of bridges,” Struct. Heal. Monit. 17(5), 1056–1072 (2018).
[Crossref]

2017 (3)

Z. Chen, X. Shao, X. He, J. Wu, X. Xu, and J. Zhang, “Noninvasive, three-dimensional full-field body sensor for surface deformation monitoring of human body in vivo,” J. Biomed. Opt. 22(9), 1–10 (2017).
[Crossref]

T. Luczynski, M. Pfingsthorn, and A. Birk, “The pinax-model for accurate and efficient refraction correction of underwater cameras in flat-pane housings,” Ocean Eng. 133, 9–22 (2017).
[Crossref]

L. Kang, L. Wu, Y. Wei, S. Lao, and Y.-H. Yang, “Two-view underwater 3d reconstruction for cameras with unknown poses under flat refractive interfaces,” Pattern Recognit. 69, 251–269 (2017).
[Crossref]

2016 (4)

X. Shao, M. M. Eisa, Z. Chen, S. Dong, and X. He, “Self-calibration single-lens 3d video extensometer for high-accuracy and real-time strain measurement,” Opt. Express 24(26), 30124–30138 (2016).
[Crossref]

Z. Hu, T. Xu, H. Luo, R. Z. Gan, and H. Lu, “Measurement of thickness and profile of a transparent material using fluorescent stereo microscopy,” Opt. Express 24(26), 29822–29829 (2016).
[Crossref]

M. Palanca, G. Tozzi, and L. Cristofolini, “The use of digital image correlation in the biomechanical area: a review,” Int. Biomech. 3(1), 1–21 (2016).
[Crossref]

F. Menna, E. Nocerino, F. Fassi, and F. Remondino, “Geometric and optic characterization of a hemispherical dome port for underwater photogrammetry,” Sensors 16(1), 48 (2016).
[Crossref]

2015 (1)

Y. Gao, T. Cheng, Y. Su, X. Xu, Y. Zhang, and Q. Zhang, “High-efficiency and high-accuracy digital image correlation for three-dimensional measurement,” Opt. Lasers Eng. 65, 73–80 (2015).
[Crossref]

2014 (1)

S. Gupta, V. Parameswaran, M. A. Sutton, and A. Shukla, “Study of dynamic underwater implosion mechanics using digital image correlation,” Proc. R. Soc. A 470(2172), 20140576 (2014).
[Crossref]

2013 (2)

B. Pan, K. Li, and W. Tong, “Fast, robust and accurate digital image correlation calculation without redundant computations,” Exp. Mech. 53(7), 1277–1289 (2013).
[Crossref]

M. Malesa, K. Malowany, U. Tomczak, B. Siwek, M. Kujawinska, and A. Sieminska-Lewandowska, “Application of 3d digital image correlation in maintenance and process control in industry,” Comput. Ind. 64(9), 1301–1315 (2013).
[Crossref]

2012 (3)

L. Kang, L. Wu, and Y.-H. Yang, “Experimental study of the influence of refraction on underwater three-dimensional reconstruction using the svp camera model,” Appl. Opt. 51(31), 7591–7603 (2012).
[Crossref]

T. Treibitz, Y. Schechner, C. Kunz, and H. Singh, “Flat refractive geometry,” IEEE Trans. Pattern Anal. Mach. Intell. 34(1), 51–65 (2012).
[Crossref]

M. A. Haile and P. G. Ifju, “Application of elastic image registration and refraction correction for non-contact underwater strain measurement,” Strain 48(2), 136–142 (2012).
[Crossref]

2009 (1)

J.-J. Orteu, “3-d computer vision in experimental mechanics,” Opt. Lasers Eng. 47(3-4), 282–291 (2009).
[Crossref]

2008 (1)

X. Ke, M. A. Sutton, S. M. Lessner, and M. Yost, “Robust stereo vision and calibration methodology for accurate three-dimensional digital image correlation measurements on submerged objects,” The J. Strain Analysis for Eng. Des. 43(8), 689–704 (2008).
[Crossref]

2004 (1)

D. Nister, “An efficient solution to the five-point relative pose problem,” IEEE Trans. Pattern Anal. Machine Intell. 26(6), 756–770 (2004).
[Crossref]

2000 (1)

Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000).
[Crossref]

1997 (1)

R. I. Hartley and P. Sturm, “Triangulation,” Comput. Vis. Image Underst. 68(2), 146–157 (1997).
[Crossref]

1981 (1)

M. A. Fischler and R. C. Bolles, “Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM 24(6), 381–395 (1981).
[Crossref]

1958 (1)

F. Bryant, “Snell's law of refraction,” Phys. Bull. 9(12), 317 (1958).
[Crossref]



Other (7)

M. A. Sutton, J.-J. Orteu, and H. Schreier, Image Correlation for Shape, Motion and Deformation Measurements: Basic Concepts, Theory and Applications (Springer, 2009), 1st ed.

A. Agrawal, S. Ramalingam, Y. Taguchi, and V. Chari, “A theory of multi-layer flat refractive geometry,” in 2012 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2012), pp. 3346–3353.

T. Yau, M. Gong, and Y. Yang, “Underwater camera calibration using wavelength triangulation,” in 2013 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2013), pp. 2499–2506.

X. Chen and Y. Yang, “Two-view camera housing parameters calibration for multi-layer flat refractive interface,” in 2014 IEEE Conference on Computer Vision and Pattern Recognition, (IEEE, 2014), pp. 524–531.

V. Chari and P. Sturm, “Multi-view geometry of the refractive plane,” in Proceedings of the British Machine Vision Conference, (British Machine Vision Association, 2009), pp. 56.1–56.11.

J. M. Lavest, G. Rives, and J. T. Lapresté, “Underwater camera calibration,” in Computer Vision — ECCV 2000, D. Vernon, ed. (Springer, 2000), pp. 654–668.

J. Reddy, Theory and Analysis of Elastic Plates and Shells (CRC, 2006), 2nd ed.



Figures (9)

Fig. 1. Refractive geometry for two flat interfaces. $O-XYZ$ and $O'-X'Y'Z'$ are the reference 3D coordinate frame and the right camera frame, respectively. The end node of each $Y$-axis is visualized with a green circle containing a cross. See text for more details.

Fig. 2. Overview of the refractive stereo-DIC measurement pipeline.

Fig. 3. (a) Experimental setup. (b) Stereo-pair of 44 circular markers. (c) Speckle images of the underwater sphere and disk specimens.

Fig. 4. 3D sphere shape measured by the refractive stereo-DIC (a) and the regular version (b), respectively.

Fig. 5. Discrepancy maps between the ideal spherical surface and the 3D shapes measured by our method (a) and the regular stereo-DIC method (b); (c) the discrepancy map between the spheres measured in water with our method and in air with the regular stereo-DIC.

Fig. 6. Measured mean displacements versus the imposed translations for the proposed refractive stereo-DIC (a) and the regular stereo-DIC (b). In (b), the SD errors are magnified ten-fold for visualization.

Fig. 7. (a) Measured displacements versus the out-of-plane deformation and (b) the corresponding absolute errors.

Fig. 8. 3D displacement fields measured by the proposed refractive stereo-DIC: (a)-(c) for the load of 0.1 mm and (d)-(f) for 1.0 mm.

Fig. 9. Measured total displacement fields and the distribution curves in radial direction at the load states of 0.1 mm (a) and 1.0 mm (b).

Equations (14)


$$\begin{cases}\boldsymbol{\alpha}=(\alpha_x,\alpha_y,\alpha_z)=\dfrac{\overrightarrow{PO}}{\|\overrightarrow{PO}\|}\\[4pt]\boldsymbol{\alpha}'=(\alpha'_x,\alpha'_y,\alpha'_z)=\dfrac{\overrightarrow{PO'}}{\|\overrightarrow{PO'}\|}\end{cases}\tag{1}$$

$$\begin{cases}L_1:\ \dfrac{X}{\alpha_x}=\dfrac{Y}{\alpha_y}=\dfrac{Z}{\alpha_z}\\[4pt]L_1':\ \dfrac{X-t_x}{\alpha'_x}=\dfrac{Y-t_y}{\alpha'_y}=\dfrac{Z-t_z}{\alpha'_z}\end{cases}\tag{2}$$

$$\mathbf{N}^T\mathbf{X}+D=0.\tag{3}$$

$$\begin{cases}\mathbf{P}_1=-\dfrac{D}{\mathbf{N}^T\boldsymbol{\alpha}}\,\boldsymbol{\alpha}\\[4pt]\mathbf{P}'_1=\dfrac{1}{\mathbf{N}^T\boldsymbol{\alpha}'}\left(\left[t_x\boldsymbol{\alpha}'-\alpha'_x\mathbf{t},\ t_y\boldsymbol{\alpha}'-\alpha'_y\mathbf{t},\ t_z\boldsymbol{\alpha}'-\alpha'_z\mathbf{t}\right]^T\mathbf{N}-D\boldsymbol{\alpha}'\right)\end{cases}\tag{4}$$

$$\boldsymbol{\alpha}_{\parallel}=\boldsymbol{\alpha}+\cos\theta_1\,\mathbf{N},\tag{5}$$

$$\boldsymbol{\beta}_{\parallel}=\boldsymbol{\beta}+\cos\theta_2\,\mathbf{N},\tag{6}$$

$$\boldsymbol{\alpha}'_{\parallel}=\boldsymbol{\alpha}'+\cos\theta'_1\,\mathbf{N},\tag{7}$$

$$\boldsymbol{\beta}'_{\parallel}=\boldsymbol{\beta}'+\cos\theta'_2\,\mathbf{N},\tag{8}$$

$$\begin{cases}\boldsymbol{\beta}=\dfrac{n_1}{n_2}\boldsymbol{\alpha}+\left(\dfrac{n_1}{n_2}\cos\theta_1-\cos\theta_2\right)\mathbf{N}\\[4pt]\boldsymbol{\beta}'=\dfrac{n_1}{n_2}\boldsymbol{\alpha}'+\left(\dfrac{n_1}{n_2}\cos\theta'_1-\cos\theta'_2\right)\mathbf{N}\end{cases}\tag{9}$$

$$\begin{cases}L_2:\ \dfrac{X-X_1}{\beta_x}=\dfrac{Y-Y_1}{\beta_y}=\dfrac{Z-Z_1}{\beta_z}\\[4pt]L_2':\ \dfrac{X-X'_1}{\beta'_x}=\dfrac{Y-Y'_1}{\beta'_y}=\dfrac{Z-Z'_1}{\beta'_z}\end{cases}\tag{10}$$

$$\begin{cases}\mathbf{P}_2=\dfrac{1}{\mathbf{N}^T\boldsymbol{\beta}}\left(\left[X_1\boldsymbol{\beta}-\beta_x\mathbf{P}_1,\ Y_1\boldsymbol{\beta}-\beta_y\mathbf{P}_1,\ Z_1\boldsymbol{\beta}-\beta_z\mathbf{P}_1\right]^T\mathbf{N}-(D+d)\boldsymbol{\beta}\right)\\[4pt]\mathbf{P}'_2=\dfrac{1}{\mathbf{N}^T\boldsymbol{\beta}'}\left(\left[X'_1\boldsymbol{\beta}'-\beta'_x\mathbf{P}'_1,\ Y'_1\boldsymbol{\beta}'-\beta'_y\mathbf{P}'_1,\ Z'_1\boldsymbol{\beta}'-\beta'_z\mathbf{P}'_1\right]^T\mathbf{N}-(D+d)\boldsymbol{\beta}'\right)\end{cases}\tag{11}$$

$$\begin{cases}\boldsymbol{\gamma}=\dfrac{n_2}{n_3}\boldsymbol{\beta}+\left(\dfrac{n_2}{n_3}\cos\theta_2-\cos\theta_3\right)\mathbf{N}\\[4pt]\boldsymbol{\gamma}'=\dfrac{n_2}{n_3}\boldsymbol{\beta}'+\left(\dfrac{n_2}{n_3}\cos\theta'_2-\cos\theta'_3\right)\mathbf{N}\end{cases}\tag{12}$$

$$\left[\boldsymbol{\gamma},\,-\boldsymbol{\gamma}'\right]\begin{pmatrix}s\\s'\end{pmatrix}=\mathbf{P}'_2-\mathbf{P}_2.\tag{13}$$

$$(U,V,W)=\mathbf{Q}'-\mathbf{Q},\tag{14}$$
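The refractive reconstruction chain of Eqs. (4), (9), (11), (12), and (13) can be sketched in a few lines of numpy: intersect each camera ray with the first flat interface, refract it with the vector form of Snell's law, intersect the refracted ray with the second interface, refract again, and triangulate the two in-water rays. This is a minimal illustrative sketch, not the authors' implementation; the function names, the normal orientation (pointing toward the cameras), and the default indices $n_1=1.0$, $n_2=1.5$, $n_3=1.33$ are assumptions for the example.

```python
# Minimal sketch of refractive two-view triangulation through two flat
# interfaces (air -> glass -> water). Names and conventions are illustrative.
import numpy as np

def intersect_plane(P0, v, N, D):
    """Intersection of the ray P0 + s*v with the plane N.X + D = 0 (cf. Eqs. (3), (4), (11))."""
    s = -(N @ P0 + D) / (N @ v)
    return P0 + s * v

def refract(v, N, n1, n2):
    """Vector form of Snell's law (cf. Eqs. (9), (12)); v, N are unit vectors, N opposes v."""
    cos1 = -N @ v                                      # cosine of incidence angle
    cos2 = np.sqrt(1.0 - (n1 / n2) ** 2 * (1.0 - cos1 ** 2))
    return (n1 / n2) * v + ((n1 / n2) * cos1 - cos2) * N

def triangulate(P, u, Q, w):
    """Least-squares intersection of rays P + s*u and Q + s'*w (cf. Eq. (13))."""
    A = np.stack([u, -w], axis=1)                      # [gamma, -gamma']
    s, t = np.linalg.lstsq(A, Q - P, rcond=None)[0]
    return 0.5 * (P + s * u + Q + t * w)               # midpoint of closest approach

def refractive_point(O, alpha, Op, alphap, N, D, d, n1=1.0, n2=1.5, n3=1.33):
    """Trace both camera rays through the two interfaces, then triangulate."""
    paths = []
    for C, a in ((O, alpha), (Op, alphap)):
        P1 = intersect_plane(C, a, N, D)               # first interface, Eq. (4)
        beta = refract(a, N, n1, n2)                   # air -> glass, Eq. (9)
        P2 = intersect_plane(P1, beta, N, D + d)       # second interface, Eq. (11)
        gamma = refract(beta, N, n2, n3)               # glass -> water, Eq. (12)
        paths.append((P2, gamma))
    (P2, g), (P2p, gp) = paths
    return triangulate(P2, g, P2p, gp)                 # Eq. (13)
```

A quick sanity check of the design: with $n_1=n_2=n_3$ the two refractions are identity maps and the routine reduces to regular triangulation, while with water behind the interface the reconstructed point moves farther from the cameras than the naive straight-ray intersection, which is exactly the bias the refractive model removes.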
