The infrared (IR) spectra of EDTA whole-blood samples, in the range between 1500 and 750 cm<sup>−1</sup>, obtained from the patient population of a general hospital, were used to compare multivariate calibration techniques for quantitative glucose determination. Ninety-six spectra of whole, undiluted blood samples with glucose concentrations ranging from 44 to 291 mg/dL were used to build calibration models that combine partial least-squares (PLS) regression with an artificial neural network (ANN). The prediction capabilities of these models were evaluated by comparing their standard errors of prediction (SEP) with those of PLS and principal component regression (PCR) calibration models on an independent prediction set of 31 blood samples. The optimal combined PLS-ANN model yielded a smaller SEP (15.6 mg/dL) than either the PLS (21.5 mg/dL) or the PCR (24.0 mg/dL) model. These results indicate that combined PLS-ANN models better approximate deviations from linearity in the relationship between spectral data and concentration than either PLS or PCR models alone.
