Abstract

Biometric authentication is the recognition of human identity via unique anatomical features. The development of novel methods parallels widespread application in consumer devices, law enforcement, and access control. In particular, methods based on finger veins, as compared to face and fingerprints, obviate privacy concerns and degradation due to wear, age, and obscuration. However, they are two-dimensional (2D) and are fundamentally limited by conventional imaging and tissue-light scattering. In this work, for the first time to the best of our knowledge, we demonstrate a method of three-dimensional (3D) finger vein biometric authentication based on photoacoustic tomography. Using a compact photoacoustic tomography setup and a novel recognition algorithm, the advantages of 3D are demonstrated via biometric authentication of index finger vessels with false acceptance, false rejection, and equal error rates of <1.23%, <9.27%, and <0.13%, respectively, when comparing one finger; a greater than 10× improvement in false acceptance rate when comparing multiple fingers; and <0.7% variation when rotating fingers ±30 deg.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. INTRODUCTION

Biometrics are human anatomical features whose uniqueness is exploited to authenticate human identity. Biometric authentication has shown immense capability in the “Internet of Things” and smart devices because of its high accuracy, user convenience, and security. However, the external visibility of fingerprint and face ID allows an adversary to remotely capture a user’s biometric information and leverage it for spoofing attacks [1,2]. In recent years, finger vessel authentication has gained immense attention as a promising biometric approach in physical access systems, e.g., ATM transactions [3]. The vessel pattern within a user’s finger is unique and invisible from the outside; thus, it can hardly be forged.

The current vessel-based authentication systems capture the vessel structure primarily through either ultrasound or optical sensing [4–6]. Doppler ultrasound has been leveraged to recognize the main vessels based on the blood flow in tissues. However, the system is not adequate for sensing small vascular structures [6]. Lin et al. [7] proposed a novel approach using thermal images of the palm-dorsal vessel patterns for personal verification. However, their system had limited robustness against ambient temperature. Other studies in this domain include the development of a low-cost hand vein scanner [5]: the integrated device processes captured infrared (IR) hand images, and a graph-matching algorithm is then used to verify dorsal vessels. However, existing IR sensors can only detect vessel patterns in a two-dimensional (2D) space. These sensors suffer from (1) limited domain information (i.e., vessel depth) and poor image quality due to the diffusive nature of light, and (2) insufficient robustness against human artifacts (e.g., variations in measuring position), thereby limiting the overall system performance.

To address these issues, in 2019, we proposed a palm-vessel sensing approach using the photoacoustic (PA) effect [8]. Photoacoustic tomography (PAT), in contrast to pure optical modalities, can acquire profoundly detailed images of the vasculature with sufficient depth information. In PAT, a pulsed laser light illuminates the skin surface, and light absorption causes thermoelastic expansion, which is converted into ultrasonic waves. After detecting the pressure waves, a PAT image can be reconstructed based on the acoustic time of arrival. Compared with pure optical imaging technologies, PAT can detect deeper vascular structures at a higher spatial resolution, because acoustic wave scattering is three orders of magnitude less than optical scattering in tissue [9]. Moreover, optical imaging suffers from light diffusion, and ultrasound imaging suffers from low sensitivity to small blood vessels. PAT provides images with a higher signal-to-noise ratio (SNR) because non-absorbing tissue components will not generate any PA signals. In our previous study, the system utilized side optical illumination and the participant’s hand was placed underneath the imaging platform. The system was bulky and inconvenient for both the participant and the operator. For instance, a small error in the subject’s hand placement would lead to misalignment in the light illumination and might significantly deteriorate the image quality. In addition, scanning the whole palm area is neither common nor convenient in portable biometrics devices.

To address these issues, we developed a new system, 3D Finger, for robust imaging of the finger vasculature. Compared to the previous system, we redesigned the light delivery scheme and adjusted the system’s scanning geometry. With the new design, subjects place their finger directly on top of the imaging window. This imaging pose significantly improves the user experience and reduces the experimental preparation time. In addition, by utilizing a high-performance cold mirror as an acoustic-optical combiner, we achieved co-planar light illumination and acoustic recording, thus significantly improving the system’s robustness and imaging depth [10]. Regardless of the finger placement and curvature, the acoustic and light beams are always coaxial to each other, eliminating alignment errors. In this study, we use fingers instead of palms as the scanning target, which is more convenient for future implementation in portable devices. To cover the four fingers (index to little) of a subject, we used a 2.25 MHz center frequency ultrasound transducer with 8.6-cm lateral coverage. The field of view is over two times larger than that in our previous study [8]. We also developed a new vascular matching algorithm, which leverages the fact that users have distinct vessel patterns whose uniqueness is dependent not only on the overall vessel structure but also on its depth inside the finger. The algorithm employs multiple key features to build a robust 3D vascular model that can classify the input as the legitimate or illegitimate user. After testing the system and the algorithm in 36 subjects, we obtained high authentication accuracy and robustness.

2. METHODS

A. Photoacoustic Effect

The sensing of vasculature in our system is based on the PA effect. Upon light illumination, the initial PA pressure (${p_0}$) can be described as follows [11]:

$${p_0}(r) = \Gamma {\mu _a}F (r),$$
where ${\mu _a}$ is the absorption coefficient and $F(r)$ is the local optical fluence. $\Gamma$ represents the Grüneisen parameter, which is determined by the material’s thermal and mechanical properties. Equation (1) indicates that the PA amplitude is proportional to the optical absorption. Because the major absorber in the near-infrared region is hemoglobin, PAT allows for direct sensing of the hemoglobin distribution (blood vasculature [12]). As for image formation, based on the speed of sound in tissue and the time of arrival of PA signals, the reconstruction algorithm back-projects data to possible locations of the acoustic source. After performing the projection for all transducer elements, a 2D or 3D representation of the optical absorber can be formed.
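Equation (1) can be illustrated with a short numerical sketch. The constants below are assumed, order-of-magnitude values for illustration only (they are not measurements from this study); the point is that blood vessels generate much stronger PA signals than the surrounding tissue because of their higher absorption:

```python
# Illustrative sketch of Eq. (1): p0 = Gamma * mu_a * F.
# All constants below are assumed, illustrative values, not study data.
GRUENEISEN = 0.2      # Grueneisen parameter, dimensionless (typical for soft tissue)
MU_A_BLOOD = 0.45     # absorption coefficient of blood at 1064 nm, 1/cm (illustrative)
MU_A_TISSUE = 0.05    # background tissue absorption, 1/cm (illustrative)

def initial_pressure(mu_a, fluence):
    """Initial PA pressure per Eq. (1): p0(r) = Gamma * mu_a * F(r)."""
    return GRUENEISEN * mu_a * fluence

fluence = 25.0  # mJ/cm^2 surface fluence, as in the setup described in the text
p_vessel = initial_pressure(MU_A_BLOOD, fluence)
p_background = initial_pressure(MU_A_TISSUE, fluence)
print(p_vessel / p_background)  # vessels respond ~9x more strongly here
```

With these assumed coefficients the vessel-to-background pressure ratio equals the ratio of absorption coefficients, since $\Gamma$ and $F$ cancel.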

B. 3D Finger System

An end-to-end overview of our 3D Finger system is illustrated in Fig. 1. First, we used the newly designed PAT system to collect the raw PA signals from the fingers. A 3D image of the finger blood vessels was then obtained through reconstruction. Next, the 3D vessel structures were fed to the biometric framework, which consists of three major steps: (1) image preprocessing, (2) vessel segmentation, and (3) pattern analysis.

 

Fig. 1. Flow diagram of the 3D Finger sensing system.


 

Fig. 2. Schematic diagram of the PA sensing hardware.


1. 3D Finger Hardware

Figure 2 shows a schematic drawing of the imaging platform. The light source is an Nd:YAG laser (Continuum, SL III) with 1064 nm wavelength output, 10 Hz pulse repetition frequency, and 10 ns pulse width. The laser pulses were coupled into a fiber bundle (Schott Fostec) with a 1.1-cm-diameter circular input and 5.1-cm-long line outputs. The subject’s fingers are placed on top of a water tank, which is made of transparent acrylic sheets and has an opening at the top sealed with a transparent 50 µm thick plastic film. Inside the water tank are the optical fiber output and the ultrasound transducer. The light delivery and acoustic detection paths are combined by a dichroic cold mirror (TECHSPEC, Edmund Optics Inc.), which allows 97% of the 1064 nm light to pass through. The energy irradiated on the finger surface is approximately $25\;{\rm mJ/cm^2}$, which is well below the ANSI safety limit of $100\;{\rm mJ/cm^2}$ for 1064 nm light [13]. The PA signals generated by the finger vessels were reflected by the cold mirror and received by a 2.25 MHz linear transducer with 128 elements (IMASONIC SAS; 0.67 mm element pitch, 15 mm element height, and 40 mm elevation focus). Due to the acoustic impedance mismatch, the acoustic reflectivity of the cold mirror is above 90% [14]. A 3D-printed holder held the ultrasound transducer, fiber output, and cold mirror together. To scan the entire finger area, we used a 20-cm-stroke translation stage (McMaster-Carr, 6734K2) mounted on the optical table. The laser synchronizes the scanning and data acquisition systems: the stepper motor moves at a speed of 1 mm/s, and the transducer collects data immediately after each laser pulse. The raw data were collected using a Vantage 256 system (Verasonics) and reconstructed using the back-projection algorithm [15,16].
A similar system has been utilized for breast imaging [17], and we have quantified the spatial resolution to be around 1 mm in both lateral and elevation (linear scanning direction of array) directions, around the acoustic focus.

2. Experimental Procedure

Before beginning the experiment, a small amount of ultrasound gel was placed on the imaging window to improve acoustic impedance matching. Next, we asked the subjects to rest their fingers on the imaging window and started the data acquisition. The whole experiment took approximately 35 seconds, covering a motor scanning distance of 3.5 cm. The raw data matrix size was 2048 (A-line length) by 128 (transducer elements) by 350 (scanning steps). To form a 3D image, we first reconstructed the 2D data acquired at each laser pulse and then stacked all the 2D frames based on their acquisition positions. The final 3D data were depth-encoded and maximum intensity projected (MIP).
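The stacking and depth-encoded MIP step can be sketched as follows. This is a minimal illustration on random data, not the authors’ reconstruction code; the frame dimensions other than the 350 scan steps are placeholders:

```python
import numpy as np

# Sketch (not the study's code): stack per-pulse 2D frames into a 3D volume,
# then form a depth-encoded maximum intensity projection (MIP).
rng = np.random.default_rng(0)
n_steps, depth_px, lateral_px = 350, 200, 128   # 350 scan steps as in the text;
                                                # pixel counts are placeholders
frames = [rng.random((depth_px, lateral_px)) for _ in range(n_steps)]

volume = np.stack(frames, axis=0)   # (scan position, depth, lateral) volume
mip = volume.max(axis=1)            # MIP along depth -> 2D amplitude map
depth_idx = volume.argmax(axis=1)   # depth of the strongest absorber per pixel

# Depth-encoding: normalized depth gives the color channel, MIP the brightness
depth_encoded = np.stack([depth_idx / depth_px, mip], axis=-1)
print(mip.shape, depth_encoded.shape)
```

The depth index recovered per pixel is what later allows the matching algorithm to compare features in 3D rather than only in the projection plane.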

3. 3D Finger Imaging Processing Scheme

Figure 3 illustrates the image processing steps. It is critical to remove nonvascular features, such as background tissue signals and electronic noise, to obtain detailed finger vessel patterns. Therefore, we first de-noised and smoothed the original reconstructed image using Gaussian and bilateral filters [16]. Given the high-frequency nature of this noise, the Gaussian filter served as a low-pass blur filter ($\sigma = 0.2$) that attenuates the high-frequency components. The bilateral filter also has a Gaussian kernel (spatial sigma $= 4$) and preserves edges in the vessel information. Specifically, the output image can be described as follows [18]:

$${\vec I^{({t + 1} )}}({\vec x} ) = \frac{{\mathop \sum \nolimits_{i = - S}^{+ S} \mathop \sum \nolimits_{j = - S}^{+ S} {{\vec I}^{(t )}}({{x_1} + i,{x_2} + j}){w^{(t)}}}}{{\mathop \sum \nolimits_{i = - S}^{+ S} \mathop \sum \nolimits_{j = - S}^{+ S} {w^{(t)}}}}$$
with the weights given as follows:
$${w^{(t)}}({\vec x,\vec \xi} ) = {\exp}\!\left({\frac{{- {{\left({\vec \xi - \vec x} \right)}^2}}}{{2\sigma _D^2}}}\right){\exp}\!\left({\frac{{- {{\left({I({\vec \xi} ) - I({\vec x} )} \right)}^2}}}{{2\sigma _R^2}}} \right).$$
Here, $\vec x = ({{x_1},{x_2}})$ and $\vec \xi = ({{\xi _1},{\xi _2}})$ are spatial variables, ${w^{(t)}}$ are the filter weights, and $S$ sets the window size of the filter.
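A minimal implementation of the bilateral filter in Eqs. (2) and (3) might look like the sketch below. The window size and the two sigmas are placeholder values for illustration, not the study’s exact parameters; the demo at the end shows the edge-preserving behavior that motivates using this filter on vessel images:

```python
import numpy as np

def bilateral_filter(img, half_window=2, sigma_d=4.0, sigma_r=0.1):
    """Minimal bilateral filter per Eqs. (2)-(3): each output pixel is a
    weighted mean over a (2S+1)^2 window, combining a spatial Gaussian
    (sigma_D) and a range Gaussian (sigma_R). Illustrative, not optimized."""
    H, W = img.shape
    out = np.zeros_like(img, dtype=float)
    pad = np.pad(img, half_window, mode="reflect")
    ii, jj = np.mgrid[-half_window:half_window + 1, -half_window:half_window + 1]
    spatial = np.exp(-(ii**2 + jj**2) / (2 * sigma_d**2))  # spatial weight term
    for x in range(H):
        for y in range(W):
            patch = pad[x:x + 2 * half_window + 1, y:y + 2 * half_window + 1]
            # range weight term penalizes intensity differences (edge-preserving)
            w = spatial * np.exp(-(patch - img[x, y])**2 / (2 * sigma_r**2))
            out[x, y] = (w * patch).sum() / w.sum()
    return out

# Edge-preservation check: a step edge survives, unlike under a plain Gaussian blur
step = np.zeros((8, 8)); step[:, 4:] = 1.0
smoothed = bilateral_filter(step)
```

Because the range term drives cross-edge weights toward zero, the step edge stays sharp while flat regions are averaged, which is why vessel boundaries survive the denoising step.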
 

Fig. 3. Flow diagram of the 3D Finger image processing steps.



Table 1. Vascular Matching Algorithm

After smoothing, we used a 3D vessel pattern segmentation technique (binarization) to remove residual noise and identify vessel structures: we create a binary image from the 3D PA image by replacing all values above a globally determined threshold with ones and setting all other values to zeros [19]. Traditional optical methods [20] cannot offer a detailed perception of the blood vessels due to inferior imaging resolution, light scattering, and optical blurring. By contrast, the 3D Finger can reveal high-quality vascular patterns through image binarization, which is computationally efficient. To improve continuity in the vessel patterns, we leveraged a nearest-neighbor searching and vascular structure fine-tuning module (skeletonization) to track the vessels; it links disjoint vessels depending on the number of neighboring background points and supports the subsequent speeded-up robust features (SURF)-based feature extraction. We also readjusted the vascular topology based on the vessel distinctions, using a multiscale vessel enhancement algorithm. This algorithm searches for geometrical structures in the binarized PA image that can be considered tubular [21]. The image ${\cal L}(x)$ is convolved with a normalized second-order Gaussian derivative as follows:

$${\cal L}({x,\sigma} ) \buildrel \Delta \over = {\sigma ^2}\frac{{{\partial ^2}G({x,\sigma} )}}{{\partial u\partial v}}\;{\rm{*}}\;{\cal L}(x ),$$
where $\sigma$ represents the scales at which the response is computed. An ideal tubular structure would have eigenvalues ${\lambda _1}$ (along the direction of the vessel) and ${\lambda _2}$ (orthogonal to the vessel) of the Hessian as $| {{\lambda _1}} | \approx 0$ and $| {{\lambda _1}} | \ll | {{\lambda _2}} |$. Finally, a vesselness factor is defined to exclusively enhance the vascular patterns in the PA image using the following equation:
$${\cal V}(\sigma ) = \left\{ {\begin{array}{*{20}{l}}0&{{\rm{if}}\;{\lambda _2} > 0,}\\{\exp \left({- \frac{{{{\left| {{\lambda _1}/{\lambda _2}} \right|}^2}}}{{2{\beta ^2}}}} \right)\left({1 - \exp \left({- \frac{{{{| {{\lambda _1}} |}^2} + {{| {{\lambda _2}} |}^2}}}{{2{c^2}}}} \right)} \right)}&{{\rm{otherwise}},}\end{array}} \right.$$
where $\beta$ and $c$ are weights to regulate the filter’s sensitivity. It should be noted that even though vessel discontinuities are still present, they only limit the number of features detected from deep regions. As will be shown in the results section, the current features are sufficient to ensure a high matching accuracy.
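The Hessian-eigenvalue vesselness measure described above can be sketched for a single scale in 2D as follows. This is an illustrative re-implementation in the spirit of Frangi et al. [21], not the study’s code; the scale and the $\beta$, $c$ weights are placeholder values:

```python
import numpy as np
from scipy import ndimage

def vesselness_2d(img, sigma=2.0, beta=0.5, c=0.5):
    """Single-scale vesselness sketch: scale-normalized Hessian eigenvalues
    with |lambda1| <= |lambda2|; bright tubes require lambda2 < 0."""
    # Scale-normalized second derivatives via Gaussian derivative filters
    Haa = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(2, 0))
    Hbb = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(0, 2))
    Hab = sigma**2 * ndimage.gaussian_filter(img, sigma, order=(1, 1))
    # Closed-form eigenvalues of the 2x2 Hessian, sorted by absolute value
    tmp = np.sqrt((Haa - Hbb)**2 / 4 + Hab**2)
    mu = (Haa + Hbb) / 2
    l_a, l_b = mu + tmp, mu - tmp
    swap = np.abs(l_a) > np.abs(l_b)
    l1 = np.where(swap, l_b, l_a)   # smaller-magnitude eigenvalue (along vessel)
    l2 = np.where(swap, l_a, l_b)   # larger-magnitude eigenvalue (across vessel)
    Rb2 = (l1 / (l2 + 1e-12))**2          # blob-vs-tube ratio |l1/l2|^2
    S2 = l1**2 + l2**2                    # second-order structureness
    v = np.exp(-Rb2 / (2 * beta**2)) * (1 - np.exp(-S2 / (2 * c**2)))
    return np.where(l2 < 0, v, 0.0)       # zero response for bright tubes test failed

# A bright line should score higher than flat background
img = np.zeros((32, 32)); img[16, :] = 1.0
v = vesselness_2d(img)
```

Along a tube, $|\lambda_1| \approx 0$ makes the first factor close to 1, while the structureness term suppresses flat background, so only line-like structures survive.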

After vessel enhancement, we need to select appropriate features that can highlight the distinct characteristics of finger vessel patterns for user authentication. To this end, we employed SURF features [22,23], which are invariant to rotation, scale, blur, and other noise interferences. While the vessel structure lies in the 3D domain, the SURF features are primarily intended for a 2D space. More specifically, the algorithm recovers key points such as bifurcations and endpoints. The feature number varies among users due to their finger position, muscle activity, or body metabolism during the sensing process. To account for these variations, we designed an M-to-N matching algorithm to compute the correlation between the SURF features obtained from the vessel patterns of two users, while considering the vessel depth attribute. Specifically, the sum of squared differences was used to calculate the spatial correlation between the SURF features of two users. The identified feature pairs with high correlation were further tested to examine their depth similarity (represented by the color of the vein patterns). As the finger is weakly compressible, we expect the vessel depth information of the same subject not to vary much under different pressures. A cross-correlation score (CrossCorr) was defined based on the number of similar feature pairs between two users. The score was further used to determine four metrics: accuracy, false acceptance rate (FAR) [24], false rejection rate (FRR), and equal error rate (EER). These metrics are commonly employed in biometric studies to examine the uniqueness of finger vein patterns. Table 1 describes the detailed steps of the vascular matching algorithm.
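The M-to-N matching idea, i.e., SSD comparison of descriptors gated by a depth-similarity check, can be sketched as follows. The descriptor layout, thresholds, and score normalization here are assumptions for illustration; the study’s exact CrossCorr definition follows Table 1:

```python
import numpy as np

def cross_corr_score(feats_a, feats_b, ssd_thresh=0.5, depth_tol=0.1):
    """Illustrative M-to-N matcher (not the authors' exact algorithm).
    Each feature is a row [descriptor..., depth]. A pair matches when the
    sum of squared differences (SSD) of descriptors is low AND the depths
    agree; the score is the fraction of matched features, normalized by
    the smaller feature set."""
    desc_a, depth_a = feats_a[:, :-1], feats_a[:, -1]
    desc_b, depth_b = feats_b[:, :-1], feats_b[:, -1]
    # Pairwise SSD between all M descriptors of A and N descriptors of B
    ssd = ((desc_a[:, None, :] - desc_b[None, :, :])**2).sum(axis=-1)
    # Depth gate: exploit that vessel depth is stable for the same subject
    depth_ok = np.abs(depth_a[:, None] - depth_b[None, :]) < depth_tol
    matched = ((ssd < ssd_thresh) & depth_ok).any(axis=1)
    return matched.sum() / min(len(feats_a), len(feats_b))

# Synthetic check: a noisy copy of the same features scores high,
# unrelated features score low
rng = np.random.default_rng(1)
a = rng.random((10, 5))                                  # 4-dim descriptor + depth
b = a + 0.01 * rng.standard_normal((10, 5))              # "same user", slight noise
c = rng.random((12, 5)) + 5.0                            # "different user"
```

The depth gate is what makes the comparison genuinely 3D: two users with similar 2D projections but different vessel depths would still fail the match.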

 

Fig. 4. 3D-Finger images after each processing step. (a) Original (reconstructed) image without enhancement. (b) Gaussian and bilateral filter processed image of (a). (c) Binarization of image of (b). (d) Skeleton extracted image of (c). (e) Final enhanced image. (f) Biometric features (marked with green circles) extracted by SURF.


 

Fig. 5. CrossCorr scores among vessel patterns from 36 subjects using their (a) left index fingers and (b) right index fingers. The results demonstrate high uniqueness of 3D finger vessels acquired using our 3D Finger system.


To test the potential of our system for biometric applications, we analyzed the uniqueness of 3D finger veins (recorded within a PA image) in 36 subjects (eight fingers imaged per subject). Each subject’s PA image was compared against those of the other 35 subjects. The matching results were used to quantify the accuracy, FAR, FRR, and EER scores.
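The FAR/FRR/EER evaluation can be sketched from genuine and impostor score distributions as follows. The threshold sweep is the standard definition of these metrics; the score values below are hypothetical, not the study’s data:

```python
import numpy as np

def far_frr_eer(genuine, impostor):
    """Sketch of the standard biometric metrics: sweep a decision threshold
    over the match scores. FAR = fraction of impostor scores accepted,
    FRR = fraction of genuine scores rejected; EER is where the curves cross."""
    thresholds = np.unique(np.concatenate([genuine, impostor]))
    far = np.array([(impostor >= t).mean() for t in thresholds])
    frr = np.array([(genuine < t).mean() for t in thresholds])
    i = np.argmin(np.abs(far - frr))       # closest crossing of the two curves
    return far, frr, (far[i] + frr[i]) / 2

# Hypothetical, well-separated CrossCorr distributions
genuine = np.array([0.90, 0.85, 0.95, 0.88])
impostor = np.array([0.10, 0.05, 0.20, 0.15])
far, frr, eer = far_frr_eer(genuine, impostor)
```

With perfectly separated distributions, as in this toy example, the EER is zero; overlapping distributions (e.g., a low-SNR outlier subject) push it up.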

3. RESULTS

The post-processed results are shown in Fig. 4. The SNR of the reconstructed image [Fig. 4(a)] is 9.8, whereas that of the smoothed and denoised image [Fig. 4(b)] is 16.2. The vessel structure after binarization is illustrated in Fig. 4(c). We can see that the vascular image contains intricate depth information; however, the patches of blood flow in between the vessels may still affect the system performance for biometric authentication. The vessel continuity was improved after skeleton extraction, as shown in Fig. 4(d). Finally, the vesselness filter-enhanced image is shown in Fig. 4(e). The vessel background has disappeared in the final image, and the vascular patterns are more prominent. The feature extraction result for Fig. 4(e) is shown in Fig. 4(f), where the extracted features are marked by green circles.
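One common way to compute an image SNR of this kind, assumed here since the text does not state the exact definition used, is the mean signal amplitude in a vessel region divided by the standard deviation of a background region:

```python
import numpy as np

def snr(image, signal_mask):
    """Illustrative SNR estimate (definition assumed, not taken from the text):
    mean amplitude inside the signal mask divided by the standard deviation
    of the background outside the mask."""
    signal = image[signal_mask].mean()
    noise = image[~signal_mask].std()
    return signal / noise

# Synthetic check: a bright patch on unit-variance noise
rng = np.random.default_rng(0)
img = rng.standard_normal((50, 50))
mask = np.zeros((50, 50), bool); mask[20:30, 20:30] = True
img[mask] += 10.0
```

Under this definition, the jump from 9.8 to 16.2 reflects the background noise suppressed by the Gaussian and bilateral filtering.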

 

Fig. 6. False acceptance rate using different numbers of fingers (1–4). (a) Left hand. (b) Right hand.


 

Fig. 7. FAR among 36 subjects with vascular images at 0 deg, rotated at 30 deg and −30 deg. (a) Left index finger. (b) Right index finger.


Figure 5 shows the results of using only one finger (the index finger) for biometric identification. Here, different colors represent different cross-correlation scores. Yellow indicates high cross-correlation between the vascular characteristics of the legitimate subject and those of the predicted subject, while blue indicates low cross-correlation. Results from the left index fingers [Fig. 5(a)] and right index fingers [Fig. 5(b)] indicate that all nonmatching pairs exhibit very low cross-correlation (except the left index finger of Subject 15, which will be discussed later). These results prove that vascular structures can be used to accurately match the legitimate subject and the predicted subject. The accuracy, FAR, FRR, and EER scores are presented in Table 2. The current system has a low FAR (<1.23%). The FRR also remained reasonably low, except for the left index finger, which could be attributed to Subject 15. The EER was 0.13% for the left index fingers and 0% for the right index fingers, showing replicability across trials.


Table 2. Overall System Performance (All 36 Subjects)

To verify whether the matching accuracy increases with more fingers, we also calculated the FAR using different numbers of fingers from the same subject. As expected, the more fingers involved, the smaller the FAR (Fig. 6). This result confirms that subjects can be distinguished more accurately if the number of fingers used for matching is increased. At the same time, the error range of the FAR is also reduced, indicating better stability.

To verify the system’s rotation invariance, we also applied clockwise (30 deg) and counterclockwise (−30 deg) rotations to the index finger images. The results (Fig. 7) indicate that most of the FARs for both the left [Fig. 7(a)] and right [Fig. 7(b)] index fingers are lower than 0.2%. Only two discrete points in the right index finger are higher than 0.5%; this is probably caused by human artifacts, such as body motion or the finger not being in full contact with the imaging window during the sensing process. These artifacts may induce distortion in the PA image, resulting in incorrect vessel depth information. Nevertheless, the overall performance of our system is close to or slightly better than that of the palm vein-sensing approach [8]. Based on these results, we conclude that our system is robust against finger rotation.
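The rotation test workflow, rotating the vessel image by ±30 deg with zero-padding of the expanded canvas before re-running the matcher, can be sketched as follows (an assumed reconstruction of the procedure, using a toy image):

```python
import numpy as np
from scipy import ndimage

# Sketch of the rotation test (assumed workflow, toy data): rotate the vessel
# image by +/-30 deg; reshape=True expands the canvas and the new corners are
# zero-padded (cval=0), which is the boundary effect discussed for Fig. 7.
img = np.zeros((64, 64)); img[30:34, 10:54] = 1.0   # toy "vessel" segment
rot_cw = ndimage.rotate(img, -30, reshape=True, order=1, cval=0.0)
rot_ccw = ndimage.rotate(img, 30, reshape=True, order=1, cval=0.0)
print(img.shape, rot_cw.shape)  # rotated canvas is larger than the original
```

The rotated images would then pass through the same enhancement, SURF extraction, and matching pipeline as the unrotated ones.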

 

Fig. 8. Unenhanced reconstructed PA image of (a) Subject 15 and (b) Subject 1.


4. DISCUSSION

Overall, we obtained a high matching accuracy in most subjects, with one outlier. From Fig. 5, we can see that the left index finger of Subject 15 has a high false acceptance value. This might be caused by the limited information obtained from the blood vessel image. In Fig. 8, we compare the unenhanced reconstructed PA images of Subject 15 [Fig. 8(a)] and Subject 1 [Fig. 8(b)]. It can be seen that the vessel contrast in Fig. 8(b) is much higher than that in Fig. 8(a). We also confirmed that the SNR of Fig. 8(a) is 4.2, while that of Fig. 8(b) is 9.2. We speculate that the low SNR in Subject 15 may be caused by finger movement during the scan. After removing this outlier, the left-finger results (Table 3) are significantly improved: the left FAR and FRR are both reduced by over 50%, and the EER is 0% for both the left and right index fingers. To further improve our model, we can design a threshold to screen out low-quality samples, and our imaging protocol can be refined to ensure high-quality image acquisition. We also aim to explore enhancements to the PA imaging setup to resolve vessel discontinuities in the future.

As for the comparison of our technique with an infrared palm vein scanner, our previous study [25] compared the two setups in terms of SNR and showed that infrared scanners yield blurrier and less dense vessel structures. The current performance (EER < 1.23%) of our system is superior to that of several widely used biometrics (e.g., EER 5.6% for 2D vein infrared scanners [26], EER 7% for gait [27], and EER 2.5% for capacitive hand touchscreens [28]), making it practical for real-world scenarios. In addition, the preprocessing time averaged around 0.4588 s, well within the real-time range. In the future, it is possible to extend the system to include a real-time display function.


Table 3. Overall System Performance (35 Subjects; Subject 15 excluded)

In Fig. 7, we also noticed an increase in the FAR values of some rotated images. In principle, the rotated images should have the same matching accuracy as the original images. We believe that the variation was caused by the zero-padding in our algorithm that accommodates the expanded image area. The padded boundary regions might slightly affect our image enhancement, feature extraction, and matching algorithms, leading to a small change in matching accuracy. However, the variation in the FAR is fairly small (<0.7%) and has no effect on the evaluation metrics (i.e., the accuracy remains above 99%).

Further efforts can be made to reduce the system size. While the whole imaging setup is still relatively large due to the laser and ultrasound systems, our scanning platform is quite compact. To further miniaturize the setup, we can use high-energy light-emitting diodes (LEDs) or laser-diode arrays, which are smaller and more energy efficient than flashlamp-pumped lasers [29,30]. Also, by using LEDs or laser diodes with a high repetition frequency, we can reduce the imaging time to less than 1 second. As for the ultrasound system, we can use multi-channel data acquisition cards or compact ultrasound systems [31] for PA data acquisition. As ultrasound systems have already been realized in smartphones, we believe that PA imaging can be implemented in smart devices as well [32]. Last, the linear array can be replaced by a 2D matrix array, which does not need mechanical scanning [33]. With these improvements, the system has great potential for portable or wearable biometric authentication in real time.

5. CONCLUSION

In summary, we have successfully developed a reliable and robust 3D Finger biometric sensing system. It enables a fast and steady scan by simply placing the fingers on the scanning window. Compared with existing IR-based palm-vessel imaging techniques, our system provides depth information from 3D vascular structure imaging. Compared with our existing 3D PAT palm vessel sensing system, the 3D Finger system offers a more user-friendly approach, and it distinguishes subjects based on the vascular structure of the fingers. After testing 36 subjects’ left and right fingers, we obtained a low average FRR (1.23% for the left index fingers and 0.31% for the right index fingers) and EER (0.13% for the left index fingers and 0% for the right index fingers). Because of the increasing demand for more secure identification and authentication devices, we believe that our system will find broad application in various areas.

Funding

Center for Identification Technology Research and the National Science Foundation (1822190).

Disclosures

Dr. Jun Xia is the founder of Sonioptix, LLC, which, however, did not support this work. All other authors declare no conflicts of interest.

REFERENCES

1. S. Marcel, M. S. Nixon, and S. Z. Li, Handbook of Biometric Anti-spoofing (Springer, 2014), Vol. 1.

2. D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014). [CrossRef]  

3. O. E. Aru and I. Gozie, “Facial verification technology for use in ATM transactions,” Am. J. Eng. Res. 2, 188–193 (2013).

4. J.-C. Lee, “A novel biometric system based on palm vein image,” Pattern Recogn. Lett. 33, 1520–1528 (2012). [CrossRef]  

5. R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

6. A. Iula, A. Savoia, and G. Caliano, “3D ultrasound palm vein pattern for biometric recognition,” in IEEE International Ultrasonics Symposium (2012).

7. C.-L. Lin and K.-C. Fan, “Biometric verification using thermal images of palm-dorsa vein patterns,” IEEE Trans. Circuits Syst. Video Technol. 14, 199–213 (2004). [CrossRef]  

8. Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018). [CrossRef]  

9. L. V. Wang and S. Hu, “Photoacoustic tomography: in vivo imaging from organelles to organs,” Science 335, 1458–1462 (2012). [CrossRef]  

10. Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018). [CrossRef]  

11. M. Xu and L. V. Wang, “Photoacoustic imaging in biomedicine,” Rev. Sci. Instrum. 77, 041101 (2006). [CrossRef]  

12. J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014). [CrossRef]  

13. American National Standards Institute, American National Standard for Safe Use of Lasers (2007).

14. Y. Wang, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018). [CrossRef]  

15. L. V. Wang, “Tutorial on photoacoustic microscopy and computed tomography,” IEEE J. Sel. Top. Quantum Electron. 14, 171–179 (2008). [CrossRef]  

16. M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005). [CrossRef]  

17. N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019). [CrossRef]  

18. D. Barash, “Fundamental relationship between bilateral filtering, adaptive smoothing, and the nonlinear diffusion equation,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 844–847 (2002). [CrossRef]  

19. “imbinarize,” 2020, https://www.mathworks.com/help/images/ref/imbinarize.html#d120e113233.

20. Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011). [CrossRef]  

21. A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

22. H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).

23. D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).

24. M. Sabir, “Sensitivity and specificity analysis of fingerprints based algorithm,” in International Conference on Applied and Engineering Mathematics (ICAEM) (IEEE, 2018).

25. Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

26. K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, “Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

27. J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

28. R. Tartz and T. Gooding, “Hand biometrics using capacitive touchscreens,” in Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (2015).

29. Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018). [CrossRef]  

30. P. K. Upputuri and M. Pramanik, “Performance characterization of low-cost, high-speed, portable pulsed laser diode photoacoustic tomography (PLD-PAT) system,” Biomed. Opt. Express 6, 4118–4129 (2015). [CrossRef]  

31. A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019). [CrossRef]  

32. S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018). [CrossRef]  

33. T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013). [CrossRef]  

  14. Y. Wang, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
    [Crossref]
  15. L. V. Wang, “Tutorial on photoacoustic microscopy and computed tomography,” IEEE J. Sel. Top. Quantum Electron. 14, 171–179 (2008).
    [Crossref]
  16. M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005).
    [Crossref]
  17. N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
    [Crossref]
  18. D. Barash, “Fundamental relationship between bilateral filtering, adaptive smoothing, and the nonlinear diffusion equation,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 844–847 (2002).
    [Crossref]
  19. “imbinarize,” 2020, https://www.mathworks.com/help/images/ref/imbinarize.html#d120e113233 .
  20. Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011).
    [Crossref]
  21. A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).
  22. H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).
  23. D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).
  24. M. Sabir, “Sensitivity and specificity analysis of fingerprints based algorithm,” in International Conference on Applied and Engineering Mathematics (ICAEM) (IEEE, 2018).
  25. Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.
  26. K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).
  27. J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).
  28. R. Tartz and T. Gooding, “Hand biometrics using capacitive touchscreens,” in Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (2015).
  29. Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
    [Crossref]
  30. P. K. Upputuri and M. Pramanik, “Performance characterization of low-cost, high-speed, portable pulsed laser diode photoacoustic tomography (PLD-PAT) system,” Biomed. Opt. Express 6, 4118–4129 (2015).
    [Crossref]
  31. A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
    [Crossref]
  32. S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).
    [Crossref]
  33. T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013).
    [Crossref]

2019 (2)

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

2018 (5)

S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).
[Crossref]

Y. Wang, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

2017 (1)

D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).

2015 (1)

2014 (2)

J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014).
[Crossref]

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).
[Crossref]

2013 (2)

O. E. Aru and I. Gozie, “Facial verification technology for use in ATM transactions,” Am. J. Eng. Res. 2, 188–193 (2013).

T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013).
[Crossref]

2012 (2)

J.-C. Lee, “A novel biometric system based on palm vein image,” Pattern Recogn. Lett. 33, 1520–1528 (2012).
[Crossref]

L. V. Wang and S. Hu, “Photoacoustic tomography: in vivo imaging from organelles to organs,” Science 335, 1458–1462 (2012).
[Crossref]

2011 (1)

Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011).
[Crossref]

2008 (1)

L. V. Wang, “Tutorial on photoacoustic microscopy and computed tomography,” IEEE J. Sel. Top. Quantum Electron. 14, 171–179 (2008).
[Crossref]

2006 (1)

M. Xu and L. V. Wang, “Photoacoustic imaging in biomedicine,” Rev. Sci. Instrum. 77, 041101 (2006).
[Crossref]

2005 (1)

M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005).
[Crossref]

2004 (1)

C.-L. Lin and K.-C. Fan, “Biometric verification using thermal images of palm-dorsa vein patterns,” IEEE Trans. Circuits Syst. Video Technol. 14, 199–213 (2004).
[Crossref]

2002 (1)

D. Barash, “Fundamental relationship between bilateral filtering, adaptive smoothing, and the nonlinear diffusion equation,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 844–847 (2002).
[Crossref]

Agano, T.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Ailisto, H. A.

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

Aru, O. E.

O. E. Aru and I. Gozie, “Facial verification technology for use in ATM transactions,” Am. J. Eng. Res. 2, 188–193 (2013).

Avanaki, K.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Banerjee, A.

D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).

Barash, D.

D. Barash, “Fundamental relationship between bilateral filtering, adaptive smoothing, and the nonlinear diffusion equation,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 844–847 (2002).
[Crossref]

Bay, H.

H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).

Bonaccio, E.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Burton, A. M.

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).
[Crossref]

Caliano, G.

A. Iula, A. Savoia, and G. Caliano, 3D Ultrasound palm vein pattern for biometric recognition,” in IEEE International Ultrasonics Symposium (2012).

Dadashzadeh, N.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Demirci, H.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Eisenbrey, J.

S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).
[Crossref]

Fan, K.-C.

C.-L. Lin and K.-C. Fan, “Biometric verification using thermal images of palm-dorsa vein patterns,” IEEE Trans. Circuits Syst. Video Technol. 14, 199–213 (2004).
[Crossref]

Fan, X. C.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Fatima, A.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Fletcher, R. R.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

Frangi, A. F.

A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

Gandikota, G.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Gooding, T.

R. Tartz and T. Gooding, “Hand biometrics using capacitive touchscreens,” in Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (2015).

Gozie, I.

O. E. Aru and I. Gozie, “Facial verification technology for use in ATM transactions,” Am. J. Eng. Res. 2, 188–193 (2013).

Gummadi, S.

S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).
[Crossref]

Haverkamp, M.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

Hibberd, P. L.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

Hu, S.

L. V. Wang and S. Hu, “Photoacoustic tomography: in vivo imaging from organelles to organs,” Science 335, 1458–1462 (2012).
[Crossref]

Huang, B.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Iula, A.

A. Iula, A. Savoia, and G. Caliano, 3D Ultrasound palm vein pattern for biometric recognition,” in IEEE International Ultrasonics Symposium (2012).

Jenkins, R.

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).
[Crossref]

Jo, J.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Kemp, R. I.

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).
[Crossref]

Khisa, A. S.

K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

Kratkiewicz, K.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Kumar, A.

Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011).
[Crossref]

Lee, J.-C.

J.-C. Lee, “A novel biometric system based on palm vein image,” Pattern Recogn. Lett. 33, 1520–1528 (2012).
[Crossref]

Lewin, P. A.

T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013).
[Crossref]

Li, J.

S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).
[Crossref]

Li, S. Z.

S. Marcel, M. S. Nixon, and S. Z. Li, Handbook of Biometric Anti-spoofing (Springer, 2014), Vol. 1.

Li, Z.

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Lim, R.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Lim, R. S. A.

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Lin, C.-L.

C.-L. Lin and K.-C. Fan, “Biometric verification using thermal images of palm-dorsa vein patterns,” IEEE Trans. Circuits Syst. Video Technol. 14, 199–213 (2004).
[Crossref]

Lindholm, M.

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

Makela, S.-M.

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

Mantyjarvi, J.

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

Manwar, R.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Marcel, S.

S. Marcel, M. S. Nixon, and S. Z. Li, Handbook of Biometric Anti-spoofing (Springer, 2014), Vol. 1.

Mistry, D.

D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).

Niessen, W. J.

A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

Nixon, M. S.

S. Marcel, M. S. Nixon, and S. Z. Li, Handbook of Biometric Anti-spoofing (Springer, 2014), Vol. 1.

Nyayapathi, N.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Oh, K. W.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Pramanik, M.

Raghavan, V.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

Rathore, A. S.

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Sabir, M.

M. Sabir, “Sensitivity and specificity analysis of fingerprints based algorithm,” in International Conference on Applied and Engineering Mathematics (ICAEM) (IEEE, 2018).

Sato, N.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Savoia, A.

A. Iula, A. Savoia, and G. Caliano, 3D Ultrasound palm vein pattern for biometric recognition,” in IEEE International Ultrasonics Symposium (2012).

Shigeta, Y.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Song, C.

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Szabo, T. L.

T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013).
[Crossref]

Takabe, K.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Tartz, R.

R. Tartz and T. Gooding, “Hand biometrics using capacitive touchscreens,” in Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (2015).

Tiao, M.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Tuytelaars, T.

H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).

Upputuri, P. K.

Van Gool, L.

H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).

Viergever, M. A.

A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

Vildjiounaite, E.

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

Vincken, K. L.

A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

Vu, T.

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Wang, K.-Q.

K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

Wang, L. V.

J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014).
[Crossref]

L. V. Wang and S. Hu, “Photoacoustic tomography: in vivo imaging from organelles to organs,” Science 335, 1458–1462 (2012).
[Crossref]

L. V. Wang, “Tutorial on photoacoustic microscopy and computed tomography,” IEEE J. Sel. Top. Quantum Electron. 14, 171–179 (2008).
[Crossref]

M. Xu and L. V. Wang, “Photoacoustic imaging in biomedicine,” Rev. Sci. Instrum. 77, 041101 (2006).
[Crossref]

M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005).
[Crossref]

Wang, X.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Wang, Y.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Y. Wang, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

White, D.

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).
[Crossref]

Wu, X.-Q.

K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

Xia, J.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).
[Crossref]

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Xu, G.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Xu, M.

M. Xu and L. V. Wang, “Photoacoustic imaging in biomedicine,” Rev. Sci. Instrum. 77, 041101 (2006).
[Crossref]

M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005).
[Crossref]

Xu, W.

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).
[Crossref]

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in ACM on Interactive, Mobile, Wearable Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

Yao, J.

J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014).
[Crossref]

Yuan, J.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).
[Crossref]

Zafar, M.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).
[Crossref]

Zha, R.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

Zhang, H.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).
[Crossref]

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).

Zhang, R.

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).

Zhao, Q.-S.

K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, “Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

Zheng, W.

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).

Zhou, Y.

Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011).

Zhu, Y.

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).

Adv. Ultrasound Diagn. Ther. (1)

S. Gummadi, J. Eisenbrey, and J. Li, “Advances in modern clinical ultrasound,” Adv. Ultrasound Diagn. Ther. 2, 51–63 (2018).

Am. J. Eng. Res. (1)

O. E. Aru and I. Gozie, “Facial verification technology for use in ATM transactions,” Am. J. Eng. Res. 2, 188–193 (2013).

Biomed. Opt. Express (1)

Electromagn. Waves (1)

J. Xia, J. Yao, and L. V. Wang, “Photoacoustic tomography: principles and advances,” Electromagn. Waves 147, 1–22 (2014).

GRD J. (1)

D. Mistry and A. Banerjee, “Comparison of feature detection and matching approaches: SIFT and SURF,” GRD J. 2, 7–13 (2017).

IEEE J. Sel. Top. Quantum Electron. (1)

L. V. Wang, “Tutorial on photoacoustic microscopy and computed tomography,” IEEE J. Sel. Top. Quantum Electron. 14, 171–179 (2008).

IEEE Sens. J. (1)

Y. Wang, Z. Li, T. Vu, N. Nyayapathi, K. W. Oh, W. Xu, and J. Xia, “A robust and secure palm vessel biometric sensing system based on photoacoustics,” IEEE Sens. J. 18, 5993–6000 (2018).

IEEE Trans. Biomed. Eng. (1)

N. Nyayapathi, R. Lim, H. Zhang, W. Zheng, Y. Wang, M. Tiao, K. W. Oh, X. C. Fan, E. Bonaccio, K. Takabe, and J. Xia, “Dual scan mammoscope (DSM)—a new portable photoacoustic breast imaging system with scanning in craniocaudal plane,” IEEE Trans. Biomed. Eng. 67, 1321–1327 (2019).

IEEE Trans. Circuits Syst. Video Technol. (1)

C.-L. Lin and K.-C. Fan, “Biometric verification using thermal images of palm-dorsa vein patterns,” IEEE Trans. Circuits Syst. Video Technol. 14, 199–213 (2004).

IEEE Trans. Inf. Forensics Security (1)

Y. Zhou and A. Kumar, “Human identification using palm-vein images,” IEEE Trans. Inf. Forensics Security 6, 1259–1274 (2011).

IEEE Trans. Pattern Anal. Mach. Intell. (1)

D. Barash, “Fundamental relationship between bilateral filtering, adaptive smoothing, and the nonlinear diffusion equation,” IEEE Trans. Pattern Anal. Mach. Intell. 24, 844–847 (2002).

J. Exp. Psychol. Appl. (1)

D. White, A. M. Burton, R. Jenkins, and R. I. Kemp, “Redesigning photo-ID to improve unfamiliar face matching performance,” J. Exp. Psychol. Appl. 20, 166–173 (2014).

J. Ultrasound Med. (1)

T. L. Szabo and P. A. Lewin, “Ultrasound transducer selection in clinical imaging practice,” J. Ultrasound Med. 32, 573–582 (2013).

Pattern Recogn. Lett. (1)

J.-C. Lee, “A novel biometric system based on palm vein image,” Pattern Recogn. Lett. 33, 1520–1528 (2012).

Photoacoustics (1)

A. Fatima, K. Kratkiewicz, R. Manwar, M. Zafar, R. Zhang, B. Huang, N. Dadashzadeh, J. Xia, and K. Avanaki, “Review of cost reduction methods in photoacoustic computed tomography,” Photoacoustics 15, 100137 (2019).

Phys. Rev. E (1)

M. Xu and L. V. Wang, “Universal back-projection algorithm for photoacoustic computed tomography,” Phys. Rev. E 71, 016706 (2005).

Rev. Sci. Instrum. (1)

M. Xu and L. V. Wang, “Photoacoustic imaging in biomedicine,” Rev. Sci. Instrum. 77, 041101 (2006).

Sci. Rep. (2)

Y. Wang, R. S. A. Lim, H. Zhang, N. Nyayapathi, K. W. Oh, and J. Xia, “Optimizing the light delivery of linear-array-based photoacoustic systems by double acoustic reflectors,” Sci. Rep. 8, 13004 (2018).

Y. Zhu, G. Xu, J. Yuan, J. Jo, G. Gandikota, H. Demirci, T. Agano, N. Sato, Y. Shigeta, and X. Wang, “Light emitting diodes based photoacoustic imaging and potential clinical applications,” Sci. Rep. 8, 9885 (2018).

Science (1)

L. V. Wang and S. Hu, “Photoacoustic tomography: in vivo imaging from organelles to organs,” Science 335, 1458–1462 (2012).

Other (12)

S. Marcel, M. S. Nixon, and S. Z. Li, Handbook of Biometric Anti-spoofing (Springer, 2014), Vol. 1.

R. R. Fletcher, V. Raghavan, R. Zha, M. Haverkamp, and P. L. Hibberd, “Development of mobile-based hand vein biometrics for global health patient identification,” in IEEE Global Humanitarian Technology Conference (GHTC) (2014).

A. Iula, A. Savoia, and G. Caliano, “3D ultrasound palm vein pattern for biometric recognition,” in IEEE International Ultrasonics Symposium (2012).

American National Standards Institute, American National Standard for Safe Use of Lasers (2007).

“imbinarize,” 2020, https://www.mathworks.com/help/images/ref/imbinarize.html#d120e113233.

A. F. Frangi, W. J. Niessen, K. L. Vincken, and M. A. Viergever, “Multiscale vessel enhancement filtering,” in International Conference on Medical Image Computing and Computer-Assisted Intervention (Springer, 1998).

H. Bay, T. Tuytelaars, and L. Van Gool, “SURF: speeded up robust features,” in European Conference on Computer Vision (Springer, 2006).

M. Sabir, “Sensitivity and specificity analysis of fingerprints based algorithm,” in International Conference on Applied and Engineering Mathematics (ICAEM) (IEEE, 2018).

Z. Li, Y. Wang, A. S. Rathore, C. Song, N. Nyayapathi, T. Vu, J. Xia, and W. Xu, “PAvessel: practical 3D vessel structure sensing through photoacoustic effects with its applications in palm biometrics,” in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (2018), Vol. 2, pp. 1–24.

K.-Q. Wang, A. S. Khisa, X.-Q. Wu, and Q.-S. Zhao, “Finger vein recognition using LBP variance with global matching,” in International Conference on Wavelet Analysis and Pattern Recognition (IEEE, 2012).

J. Mantyjarvi, M. Lindholm, E. Vildjiounaite, S.-M. Makela, and H. A. Ailisto, “Identifying users of portable devices from gait pattern with accelerometers,” in IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) (IEEE, 2005).

R. Tartz and T. Gooding, “Hand biometrics using capacitive touchscreens,” in Adjunct Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (2015).

Figures (8)

Fig. 1. Flow diagram of the 3D Finger sensing system.
Fig. 2. Schematic diagram of the PA sensing hardware.
Fig. 3. Flow diagram of the 3D Finger image processing steps.
Fig. 4. 3D Finger images after each processing step. (a) Original (reconstructed) image without enhancement. (b) Gaussian- and bilateral-filter-processed image of (a). (c) Binarization of (b). (d) Skeleton extracted from (c). (e) Final enhanced image. (f) Biometric features (marked with green circles) extracted by SURF.
Fig. 5. CrossCorr scores among vessel patterns from 36 subjects using their (a) left index fingers and (b) right index fingers. The results demonstrate the high uniqueness of 3D finger vessels acquired with our 3D Finger system.
Fig. 6. False acceptance rate using different numbers of fingers (1–4). (a) Left hand. (b) Right hand.
Fig. 7. FAR among 36 subjects with vascular images at 0 deg and rotated to +30 deg and −30 deg. (a) Left index finger. (b) Right index finger.
Fig. 8. Unenhanced reconstructed PA images of (a) Subject 15 and (b) Subject 1.
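The binarization step in Fig. 4(c) is performed with MATLAB’s imbinarize (see the reference list), which defaults to Otsu’s global threshold. The NumPy sketch below is illustrative only, not the authors’ code; the synthetic bimodal data, the bin count, and the choice of returning the best bin’s upper edge are assumptions:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Return a global threshold maximizing between-class variance (Otsu)."""
    hist, edges = np.histogram(img, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(hist)                          # pixels in the lower class
    w1 = hist.sum() - w0                          # pixels in the upper class
    m = np.cumsum(hist * centers)
    mu0 = m / np.where(w0 == 0, 1, w0)            # lower-class mean (guard /0)
    mu1 = (m[-1] - m) / np.where(w1 == 0, 1, w1)  # upper-class mean (guard /0)
    between = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance
    return edges[np.argmax(between) + 1]          # upper edge of the best bin

# Synthetic bimodal "image": dark background vs. brighter vessel pixels.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(0.2, 0.02, 500),   # background
                      rng.normal(0.8, 0.02, 500)])  # vessels
t = otsu_threshold(img)
binary = img > t
```

Maximizing the between-class variance is equivalent to minimizing the within-class variance, which is why a single pass over the cumulative histogram suffices.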

Tables (3)

Table 1. Vascular Matching Algorithm
Table 2. Overall System Performance (All 36 Subjects)
Table 3. Overall System Performance (35 Subjects; Subject 15 Excluded)

Equations (5)

$p_0(\mathbf{r}) = \Gamma \mu_a F(\mathbf{r}),$

$I^{(t+1)}(\mathbf{x}) = \frac{\sum_{i=-S}^{+S} \sum_{j=-S}^{+S} I^{(t)}(x_1+i,\, x_2+j)\, w^{(t)}}{\sum_{i=-S}^{+S} \sum_{j=-S}^{+S} w^{(t)}},$

$w^{(t)}(\mathbf{x}, \boldsymbol{\xi}) = \exp\!\left(-\frac{(\boldsymbol{\xi}-\mathbf{x})^2}{2\sigma_D^2}\right) \exp\!\left(-\frac{\left(I(\boldsymbol{\xi})-I(\mathbf{x})\right)^2}{2\sigma_R^2}\right).$

$L_{uv}(\mathbf{x}, \sigma) = \sigma^2\, \frac{\partial^2 G(\mathbf{x}, \sigma)}{\partial u\, \partial v} * L(\mathbf{x}),$

$L_R(\sigma) = \begin{cases} 0, & \lambda_2 > 0, \\ \exp\!\left(-\frac{|\lambda_1/\lambda_2|^2}{2\beta^2}\right) \left(1 - \exp\!\left(-\frac{\lambda_1^2 + \lambda_2^2}{2c^2}\right)\right), & \text{otherwise}. \end{cases}$
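Among the equations above, the bilateral filter update (the window-average equation and its weight kernel) admits a direct NumPy sketch. This is an unoptimized illustration rather than the authors’ implementation; the window half-width S and the two σ values are assumed:

```python
import numpy as np

def bilateral_filter(I, S=2, sigma_d=2.0, sigma_r=0.1):
    """One iteration I^(t) -> I^(t+1): windowed average with weights
    combining spatial proximity (sigma_d) and intensity similarity (sigma_r)."""
    H, W = I.shape
    out = np.zeros((H, W))
    pad = np.pad(I.astype(float), S, mode='edge')
    di, dj = np.mgrid[-S:S + 1, -S:S + 1]           # spatial offsets xi - x
    w_spatial = np.exp(-(di**2 + dj**2) / (2 * sigma_d**2))
    for x1 in range(H):
        for x2 in range(W):
            win = pad[x1:x1 + 2*S + 1, x2:x2 + 2*S + 1]  # neighborhood of x
            w_range = np.exp(-(win - I[x1, x2])**2 / (2 * sigma_r**2))
            w = w_spatial * w_range
            out[x1, x2] = np.sum(win * w) / np.sum(w)
    return out

# A sharp step edge survives the smoothing: the range term suppresses
# weights for pixels on the far side of the edge.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
sm = bilateral_filter(img)
```

Because the range weight collapses for pixels across a strong intensity step, vessel edges are preserved while smoother background noise is averaged away.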
