## Abstract

This paper demonstrates experimental three-dimensional (3D) image reconstruction of optically heterogeneous turbid media from near-infrared continuous-wave measurements. Successful reconstruction is achieved through a full 3D finite-element based, Newton-type reconstruction algorithm. Our experimental evidence shows that both absorption and scattering images of a 10×15 mm cylindrical object embedded in a 50×50 mm cylindrical background can be reconstructed using our algorithm.

©2000 Optical Society of America

## 1. Introduction

Reconstruction-based optical diffusion tomography offers clinical potential for applications such as breast cancer detection and brain functional imaging, primarily due to its capability to image the spatial distribution of the absorption and scattering properties of tissue, providing both structural and functional information [1]. In this type of indirect imaging method, an effective reconstruction algorithm is crucial. While various linear algorithms have been developed [2–4], attention is increasingly turning to iterative, nonlinear reconstruction methods [5–8], which can overcome the systematic errors caused by the linear approximation [9].

To date, nonlinear methods have been limited experimentally to two-dimensional (2D) problems [5–8]. While quantitative 2D images have been achieved using tissue phantoms in the laboratory [5,6], “infinitely” long cylinders were used as the objects in these experiments. Since these 2D schemes and experiments ignore the essential 3D nature of light scattering in tissue, some degree of distortion of the object shape and size, as well as degradation of the recovered absorption and scattering coefficients, would be anticipated when the object has a finite volume, such as a tumor in a human breast. This necessitates our consideration of optical image reconstruction in a 3D framework. To date, 3D optical image reconstructions have been primarily limited to simulations [10–12]. 3D experimental work from time-resolved data has begun to appear very recently [13]; however, the data collection relied on an expensive time-domain system. In this paper, we report for the first time an experimental confirmation of full 3D reconstruction using continuous-wave or dc data. Compared with time-domain methods, the continuous-wave method uses the simplest, most economical steady-state optical components. In this optical system, we need only measure the diffuse light intensity for image reconstruction, without expensive detection techniques or complicated data-acquisition software. Owing to this simplicity and the high signal-to-noise ratio available from time-independent light intensity measurements, imaging with intensity data only may prove considerably easier to implement clinically, and more robust with respect to the reliability of the data collected in the clinical environment, than its time-domain counterpart. Both absorption and scattering images are obtained using absolute dc data without invoking a calibration measurement on a homogeneous phantom.

## 2. Reconstruction algorithm

Our 3D nonlinear reconstruction algorithm is based on the finite element method. It uses a regularized Newton method to iteratively update an initial (guess) optical property distribution in order to minimize an objective function composed of a weighted sum of the squared differences between computed and measured data. While the details of the algorithm can be found in Ref. 10, here we present a brief overview for context.
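In the standard least-squares form used by this family of algorithms [5,6,10], the objective function described above can be written as (the weights w_{i} are not specified in this paper; uniform weights w_{i}=1 recover an unweighted least-squares fit):

$$F(D,{\mu}_{a})=\sum_{i=1}^{M}w_{i}\left({\mathrm{\Phi}}_{i}^{c}-{\mathrm{\Phi}}_{i}^{o}\right)^{2}$$

where ${\mathrm{\Phi}}_{i}^{c}$ and ${\mathrm{\Phi}}_{i}^{o}$ are the computed and measured data at the M boundary measurement sites.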

Given a time-independent incident beam, light propagation in tissue can be described by the well-known diffusion equation [1]:

$$\nabla\cdot D(\mathbf{r})\nabla\mathrm{\Phi}(\mathbf{r})-{\mu}_{a}(\mathbf{r})\mathrm{\Phi}(\mathbf{r})=-S(\mathbf{r})\qquad(1)$$

where Φ is the photon density, D is the diffusion coefficient, µ_{a} is the absorption coefficient, and S is the source term. The diffusion coefficient can be written as D=1/[3(${\mu}_{\mathrm{s}}^{\prime}$+µ_{a})], where ${\mu}_{\mathrm{s}}^{\prime}$ is the reduced scattering coefficient (since ${\mu}_{\mathrm{s}}^{\prime}$>>µ_{a} in turbid media, we work directly with D and µ_{a} during reconstruction). In this study, a point source, S=S_{o}δ(x-x_{o}, y-y_{o}, z-z_{o}), and Type III boundary conditions (BCs), $-D\nabla\mathrm{\Phi}\cdot\hat{n}=\alpha\mathrm{\Phi}$, are used, where S_{o} is the strength of the source and α is related to the internal reflection at the boundary.
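As a numerical illustration of these quantities (not the paper's finite-element solver), the sketch below computes D and the effective attenuation coefficient for the background phantom values reported later in this paper, together with the textbook closed-form CW fluence of a point source in an infinite homogeneous medium:

```python
import math

def diffusion_coefficient(mu_a, mu_s_prime):
    """D = 1 / (3 * (mu_s' + mu_a)), units of mm when inputs are /mm."""
    return 1.0 / (3.0 * (mu_s_prime + mu_a))

def cw_fluence_infinite(r, mu_a, mu_s_prime, S0=1.0):
    """CW point-source fluence in an infinite homogeneous medium:
    Phi(r) = S0 * exp(-mu_eff * r) / (4 * pi * D * r),
    with mu_eff = sqrt(mu_a / D). Illustrative only; the paper's
    phantom is a finite cylinder with Type III boundary conditions."""
    D = diffusion_coefficient(mu_a, mu_s_prime)
    mu_eff = math.sqrt(mu_a / D)
    return S0 * math.exp(-mu_eff * r) / (4.0 * math.pi * D * r)

# Background phantom values from Section 3: mu_a = 0.005/mm, mu_s' = 1.0/mm
D = diffusion_coefficient(0.005, 1.0)        # ~0.332 mm
mu_eff = math.sqrt(0.005 / D)                # ~0.123 /mm
```

The steep exponential decay governed by µ_eff is what makes the CW intensity measurements both high in signal-to-noise near the source and strongly attenuated across the 50 mm phantom.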

Making use of a finite element discretization, we can obtain a discrete matrix equation for (1) and, through differentiation, derived matrix relations that lead to a set of equations capable of inverse-problem solution [5,6,14]:

$$[A]\{\mathrm{\Phi}\}=\{b\}\qquad(2)$$

$$[A]\left\{\frac{\partial\mathrm{\Phi}}{\partial\chi}\right\}=\left\{\frac{\partial b}{\partial\chi}\right\}-\left[\frac{\partial A}{\partial\chi}\right]\{\mathrm{\Phi}\}\qquad(3)$$

$$\left(\mathfrak{I}^{T}\mathfrak{I}+\lambda I\right)\Delta\chi=\mathfrak{I}^{T}\left(\mathrm{\Phi}^{o}-\mathrm{\Phi}^{c}\right)\qquad(4)$$

where the elements of matrix [A] are a_{ij}=<-D∇ϕ_{j}·∇ϕ_{i}-µ_{a}ϕ_{j}ϕ_{i}>, with < > symbolizing integration over the 3D problem domain; ϕ_{i} and ϕ_{j} are locally spatially-varying Lagrangian basis functions; χ expresses D or µ_{a}; ℑ is the Jacobian matrix formed from ∂Φ/∂χ at the boundary measurement sites; λ is the regularization parameter; Δχ=(ΔD_{1},ΔD_{2},…ΔD_{N},Δµ_{a,1},Δµ_{a,2},…Δµ_{a,N})^{T} is the update vector for the optical property profiles, where N is the total number of nodes in the finite element mesh; and Φ^{o}=(${\mathrm{\Phi}}_{1}^{\mathrm{o}}$,${\mathrm{\Phi}}_{2}^{\mathrm{o}}$,…${\mathrm{\Phi}}_{\mathrm{M}}^{\mathrm{o}}$)^{T} and Φ^{c}=(${\mathrm{\Phi}}_{1}^{\mathrm{c}}$,${\mathrm{\Phi}}_{2}^{\mathrm{c}}$,…${\mathrm{\Phi}}_{\mathrm{M}}^{\mathrm{c}}$)^{T}, where ${\mathrm{\Phi}}_{\mathrm{i}}^{\mathrm{o}}$ and ${\mathrm{\Phi}}_{\mathrm{i}}^{\mathrm{c}}$ are observed and calculated data at i=1, 2, … M boundary locations. Note that in order to estimate D and µ_{a} spatially, we expand these quantities, in a fashion similar to Φ, as a finite sum of unknown coefficients multiplied by the locally defined Lagrangian basis functions. In optical imaging, D and µ_{a} are unknown, and Φ is also generally unknown except possibly at a finite number of measurement sites. The basic idea for determining D and µ_{a} is to measure the diffusive light around the perimeter of the problem domain for a set of known optical excitation positions. The image formation task is then to estimate the D and µ_{a} distributions (updated and improved through solution of (4)) required to sustain the measured boundary quantities. Since the matrix ℑ^{T}ℑ is known to be ill-conditioned [5–8], a way to regularize or stabilize its decomposition is needed. We have used a hybrid technique in this study which synthesizes standard Marquardt and Tikhonov regularization schemes.
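The regularized Newton update described above can be sketched in a few lines of Python. This is a minimal illustration of the linear-algebra step, not the paper's finite-element implementation: `forward` and `jacobian` stand in for the FEM forward solver and Jacobian assembly, and a single scalar `lam` is used in place of the paper's hybrid Marquardt/Tikhonov scheme:

```python
import numpy as np

def regularized_newton_update(J, phi_obs, phi_calc, lam):
    """One regularized Newton step: solve
    (J^T J + lam * I) d_chi = J^T (phi_obs - phi_calc).

    J        : (M, P) Jacobian of boundary data w.r.t. the P property unknowns
    phi_obs  : (M,) measured boundary data
    phi_calc : (M,) data computed from the current (D, mu_a) estimate
    lam      : scalar regularization parameter (stabilizes the ill-
               conditioned J^T J)
    """
    JtJ = J.T @ J
    rhs = J.T @ (phi_obs - phi_calc)
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), rhs)

def reconstruct(forward, jacobian, phi_obs, chi0, lam=1e-3, n_iter=5):
    """Iteratively update the property vector chi (the paper used 5 iterations)."""
    chi = chi0.copy()
    for _ in range(n_iter):
        phi_calc = forward(chi)
        chi = chi + regularized_newton_update(jacobian(chi), phi_obs,
                                              phi_calc, lam)
    return chi
```

In the full algorithm, each iteration also requires re-solving the forward problem (2) and re-assembling ℑ on the 3D mesh, which dominates the hours-per-iteration cost reported in Section 3.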

## 3. Reconstruction of 3D absorption and scattering images

Our experimental setup was an automated multi-channel frequency-domain system described in detail elsewhere [15]. Briefly, intensity-modulated light from a 785-nm, 50 mW diode laser was sequentially sent to the phantom through 16 3-mm fiber optic bundles. For each source position, the diffused light was received at 4×16 detector positions along the surface of the cylindrical phantom (see Fig. 1 for the source/detector arrangement). dc, ac and phase-shift signals were obtained using the standard heterodyne technique controlled by FFT LabVIEW routines. For each experimental configuration, a total of 3×1024 measurements (1024 each of dc, ac and phase) were made, which currently takes about 32 minutes. Only the measured dc data were used to reconstruct the absorption and scattering images in this study. We used a 50×50 mm cylindrical solid phantom (Intralipid + India ink + Agar) as the background medium, with absorption coefficient µ_{a}=0.005/mm and reduced scattering coefficient ${\mu}_{\mathrm{s}}^{\prime}$=1.0/mm. An off-centered 10×15 mm cylindrical solid object was embedded in the background medium (see Fig. 1). Two cases with different optical contrast between the object and background were tested. The object in the first case was a pure absorber with µ_{a}=0.025/mm and ${\mu}_{\mathrm{s}}^{\prime}$=1.0/mm; its center was located at (13, 0, 0). Both µ_{a} and ${\mu}_{\mathrm{s}}^{\prime}$ of the object in the second case differed from the background: µ_{a}=0.025/mm and ${\mu}_{\mathrm{s}}^{\prime}$=3.0/mm; its center was located at (13, -4, 0). The 3D finite-element mesh used had 3341 nodes and 16128 tetrahedral elements for both forward and inverse solutions. The initial estimate of the optical properties used for the reconstruction was a homogeneous distribution 20% off the exact background values.
The images reported are the results of 5 iterations, taking about 3 hours per iteration when reconstructing µ_{a} only (${\mu}_{\mathrm{s}}^{\prime}$ of the object was assumed known in the reconstruction for the pure-absorber case) and 10 hours per iteration when reconstructing both µ_{a} and ${\mu}_{\mathrm{s}}^{\prime}$, on a 600 MHz Pentium III PC.

Fig. 2 shows the reconstructed image of the pure absorbing object at different cut-planes (see Fig. 1 for the reference coordinate system). The pink-colored region (at 3 o’clock in Fig. 2(a,b), and at middle right and middle in Fig. 2(c) and (d), respectively) clearly indicates the location and size of the object. Table 1 below provides quantitative information about the center and the full width at half maximum (FWHM) of the object, as well as the recovered average optical properties of the object, for the images shown in Fig. 2 and Figs. 3–4. While the recovered image is quantitatively approximate in terms of the location, size and absorption coefficient of the object, some artifacts clearly appear in the background region. These artifacts generally appear near the sources and detectors, where the measurement sensitivity is highest. We believe that measurement noise, magnified by the inherently ill-conditioned nature of the inverse problem, was the primary cause of these artifacts. They could be minimized by controlling the measurement noise and by using improved regularization schemes such as spatially variant regularization [6]. Simultaneously reconstructed µ_{a} and ${\mu}_{\mathrm{s}}^{\prime}$ images for the second case at multiple cut-planes are shown in Figs. 3 and 4, respectively. Again, these recovered images are quantitatively approximate both with respect to the location and size of the object and in terms of its optical properties (see Table 1). It is interesting to note that there is almost no artifact in the µ_{a} image for this case, while some artifacts clearly appear in the ${\mu}_{\mathrm{s}}^{\prime}$ image (even stronger artifacts, not shown in Fig. 4(c) and (d), appeared at the bottom boundary of the cut-planes x=4 and y=15 mm).
We further note that the recovered ${\mu}_{\mathrm{s}}^{\prime}$ value of the object is far lower than the exact value (about 1.4/mm vs. 3.0/mm). It appears that the quality of the µ_{a} image is improved at the expense of the ${\mu}_{\mathrm{s}}^{\prime}$ image quality. Although the ${\mu}_{\mathrm{s}}^{\prime}$ image quality is not as good as that of the µ_{a} image, we have provided experimental evidence that both µ_{a} and ${\mu}_{\mathrm{s}}^{\prime}$ images can be reconstructed simultaneously using dc data only.

X, Y and Z stand for the x, y and z coordinates of the object centers, respectively. L_{x}, L_{y} and L_{z} are the FWHM of the object along the x, y and z directions, respectively.
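The paper does not state how the FWHM values in Table 1 were extracted; one common approach is linear interpolation at the half-maximum crossings of a 1D profile through the reconstructed object, sketched below (the profile and sampling are hypothetical):

```python
def fwhm(x, y):
    """Full width at half maximum of a single-peaked sampled profile y(x),
    estimated by linear interpolation at the two half-maximum crossings.
    Assumes the profile falls below half maximum on both sides of the peak."""
    y_max = max(y)
    half = y_max / 2.0
    i_peak = y.index(y_max)

    def crossing(i0, i1):
        # linearly interpolate x at which y crosses the half level
        x0, x1, y0, y1 = x[i0], x[i1], y[i0], y[i1]
        return x0 + (half - y0) * (x1 - x0) / (y1 - y0)

    # walk left from the peak to the first sample at or below half maximum
    i = i_peak
    while i > 0 and y[i] > half:
        i -= 1
    left = crossing(i, i + 1)

    # walk right from the peak
    i = i_peak
    while i < len(y) - 1 and y[i] > half:
        i += 1
    right = crossing(i - 1, i)

    return right - left
```

Applied along profiles in the x, y and z directions through the object center, such a routine yields the L_{x}, L_{y} and L_{z} values of the kind tabulated above.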

## 4. Conclusions

In summary, we have experimentally demonstrated that quantitative reconstruction of 3D absorption and scattering images can be achieved using dc data. It is important to note that the 3D images reconstructed in this paper have been realized from absolute measured data without invoking any calibration/reference measurement.

## Acknowledgements

This research was supported in part by grants from the National Institutes of Health (NIH) (CA 78334), the Department of Defense (DOD) (BC 980050), and the Greenville Hospital System/Clemson University Biomedical Cooperative.

## References and links

**1. **A. Yodh and B. Chance, “Spectroscopy and imaging with diffusing light,” Phys. Today **48**, 34–40 (1995).

**2. **M. O’Leary, D. Boas, B. Chance, and A. Yodh, “Experimental images of heterogeneous turbid media by frequency-domain diffusing-photon tomography,” Opt. Lett. **20**, 426–428 (1995).

**3. **W. Cai, S. Gayen, M. Xu, M. Zevallos, M. Alrubaiee, M. Lax, and R. Alfano, “Optical tomographic image reconstruction from ultrafast time-sliced transmission measurements,” Appl. Opt. **38**, 4237–4246 (1999).

**4. **S. Colak, D. Papaioannou, G. ’t Hooft, M. van der Mark, H. Schomberg, J. Paasschens, J. Melissen, and N. van Asten, “Tomographic image reconstruction from optical projections in light-diffusing media,” Appl. Opt. **36**, 180–213 (1997).

**5. **H. Jiang, K. Paulsen, U. Osterberg, B. Pogue, and M. Patterson, “Simultaneous reconstruction of absorption and scattering maps in turbid media from near-infrared frequency-domain data,” Opt. Lett. **20**, 2128–2130 (1995).

**6. **B. Pogue, T. McBride, J. Prewitt, U. Osterberg, and K. Paulsen, “Spatially variant regularization improves diffuse optical tomography,” Appl. Opt. **38**, 2950–2961 (1999).

**7. **J. Hebden, F. Schmidt, M. Fry, M. Schweiger, E. Hillman, D. Delpy, and S. Arridge, “Simultaneous reconstruction of absorption and scattering images by multichannel measurement of purely temporal data,” Opt. Lett. **24**, 534–536 (1999).

**8. **H. Graber, J. Chang, J. Lubowsky, R. Aronson, and R. Barbour, “Near infrared absorption imaging of dense scattering media by steady-state diffusion tomography,” Proc. SPIE **1888**, 372–386 (1993).

**9. **D. Boas, “A fundamental limitation of linearized algorithms for diffuse optical tomography,” Opt. Express **1**, 404–413 (1997).

**10. **H. Jiang, “Three-dimensional optical image reconstruction: Finite element approach,” in Proc. of Advances in Optical Imaging and Photon Migration (Optical Society of America), 156–158 (1998).

**11. **M. Schweiger and S. Arridge, “Comparison of two- and three-dimensional reconstruction methods in optical tomography,” Appl. Opt. **37**, 7419–7428 (1998).

**12. **M. Eppstein, D. Dougherty, D. Hawrysz, and E. Sevick-Muraca, “Three-dimensional optical tomography,” Proc. SPIE **3597**, 97–105 (1999).

**13. **S. Arridge, “A method for three-dimensional time-resolved optical tomography,” Int. J. Imaging Syst. Technol. **11**, 2–11 (2000).

**14. **H. Jiang, K. Paulsen, U. Osterberg, and M. Patterson, “Improved continuous light diffusion imaging in single- and multi-target tissue-like phantoms,” Phys. Med. Biol. **43**, 675–693 (1998).

**15. **N. Iftimia and H. Jiang, Appl. Opt. (in press).