Abstract
Assuming that the measured coordinates of the fringes of an interferogram carry random, Gaussian-distributed errors, the system of normal equations obtained from the least-squares method becomes nonlinear. We present an algorithm that estimates the coefficients of this nonlinear system by applying the Newton–Raphson method, starting the iteration from the classic least-squares solution. The algorithm is applied to a pattern of straight, equally spaced fringes; it yields not only the correct coefficients but also an appropriate selection of the terms to include in the model, in contrast with the results of the classic method.
© 1998 Optical Society of America
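The approach the abstract outlines can be illustrated with a minimal sketch: fit straight, equally spaced fringes when both measured coordinates carry Gaussian errors, start from the ordinary (classic) least-squares solution, and refine it by Newton–Raphson on the resulting nonlinear normal equations. This is not the paper's algorithm; the fringe model (y = a·x + b + d·k for fringe index k), the synthetic data, and the finite-difference derivatives are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical pattern of straight, equally spaced fringes:
# fringe k lies on y = a*x + b + d*k, with Gaussian errors on BOTH coordinates.
a_true, b_true, d_true = 0.5, 1.0, 2.0
x = np.tile(np.linspace(0.0, 10.0, 50), 3)
k = np.repeat(np.arange(3.0), 50)
y = a_true * x + b_true + d_true * k + rng.normal(0.0, 0.05, x.size)
x = x + rng.normal(0.0, 0.05, x.size)          # errors in x make the fit nonlinear

def cost(p):
    """Orthogonal-distance sum of squares; the 1/(1+a^2) factor, which accounts
    for the x-errors, makes the normal equations nonlinear in the slope a."""
    a, b, d = p
    r = y - a * x - b - d * k
    return np.sum(r * r) / (1.0 + a * a)

def gradient(f, p, h=1e-5):
    """Central-difference gradient (an assumption; analytic forms also work)."""
    g = np.zeros_like(p)
    for i in range(p.size):
        e = np.zeros_like(p)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2.0 * h)
    return g

def hessian(f, p, h=1e-4):
    """Central-difference Hessian of f at p."""
    n = p.size
    H = np.zeros((n, n))
    for i in range(n):
        ei = np.zeros(n)
        ei[i] = h
        for j in range(n):
            ej = np.zeros(n)
            ej[j] = h
            H[i, j] = (f(p + ei + ej) - f(p + ei - ej)
                       - f(p - ei + ej) + f(p - ei - ej)) / (4.0 * h * h)
    return H

def newton_raphson(f, p, tol=1e-8, max_iter=50):
    """Solve the nonlinear normal equations grad f = 0 by Newton-Raphson."""
    p = np.asarray(p, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hessian(f, p), gradient(f, p))
        p = p - step
        if np.max(np.abs(step)) < tol:
            break
    return p

# Classic starting point: ordinary least squares, ignoring the x-errors.
A = np.column_stack([x, np.ones_like(x), k])
p0, *_ = np.linalg.lstsq(A, y, rcond=None)
p_hat = newton_raphson(cost, p0)               # refined slope, offset, spacing
```

Starting the iteration from the classic solution keeps Newton–Raphson inside the basin of attraction of the minimum, which is the role the abstract assigns to the standard solution.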