In the international standard for measuring the absorptance of optical components (ISO 11551, laser calorimetry), the absorptance is obtained by fitting the temporal behavior of the laser-irradiation-induced temperature rise to a homogeneous temperature model, which assumes infinite thermal conductivity of the sample. In this paper, an accurate temperature model that accounts for both the finite thermal conductivity and the finite size of the sample is developed to fit the experimental temperature data for a more precise determination of the absorptance. The difference and repeatability of the results obtained by fitting the same experimental data with the two theoretical models are compared. The accurate temperature model is also used to analyze the optimum detection position when the homogeneous model is employed in the data-fitting procedure. The results show that the detection location optimized over a wide range of thermal conductivities moves toward the center of the sample as the sample thickness increases and away from the center as the radius and irradiation time increase. However, if the detection position is optimized for an individual sample of known size and thermal conductivity using the accurate temperature model, the influence of the finite thermal conductivity and sample size on the absorptance determination can be fully compensated by fitting the temperature data recorded at the optimum detection position to the homogeneous temperature model.
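To illustrate the fitting step the abstract refers to, the sketch below fits synthetic temperature data to the homogeneous (lumped, infinite-conductivity) model of ISO 11551: a saturating exponential rise during irradiation followed by an exponential decay after the laser is switched off. The numerical values (laser power, heat capacity, irradiation time, heat-loss coefficient) are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Homogeneous (lumped) temperature model assumed in ISO 11551:
#   T(t) = (alpha * P / (gamma * C)) * (1 - exp(-gamma * t))      for 0 <= t <= tau
#   T(t) = T(tau) * exp(-gamma * (t - tau))                        for t > tau
# alpha: absorptance (fitted), gamma: heat-loss coefficient in 1/s (fitted),
# P: laser power in W, C: sample heat capacity in J/K, tau: irradiation time in s.
P = 10.0    # W, illustrative value
C = 2.0     # J/K, illustrative value
tau = 60.0  # s, illustrative value

def homogeneous_model(t, alpha, gamma):
    t = np.asarray(t, dtype=float)
    # Temperature reached by time min(t, tau) during the heating phase
    rise = alpha * P / (gamma * C) * (1.0 - np.exp(-gamma * np.minimum(t, tau)))
    # Exponential cooling after the laser is switched off at t = tau
    decay = np.where(t > tau, np.exp(-gamma * (t - tau)), 1.0)
    return rise * decay

# Synthetic "measurement" with true alpha = 1e-4 and gamma = 0.01 1/s plus noise
rng = np.random.default_rng(0)
t_data = np.linspace(0.0, 180.0, 361)
T_data = homogeneous_model(t_data, 1e-4, 0.01) + rng.normal(0.0, 1e-6, t_data.size)

# Least-squares fit of absorptance and heat-loss coefficient
(alpha_fit, gamma_fit), _ = curve_fit(homogeneous_model, t_data, T_data,
                                      p0=[1e-3, 0.05])
print(alpha_fit, gamma_fit)
```

The paper's point is that when the sample's conductivity is finite, the temperature recorded at an arbitrary position deviates from this lumped model, biasing the fitted absorptance unless the detection position is chosen appropriately.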
© 2011 Optical Society of America