We present an effective way to solve the denoising problem of fringe patterns in optical interferometry. The proposed method is based on the topological analysis of an appropriate cost function. To overcome the blurring drawback of the linear diffusion approach, the linear diffusion coefficient at each edge is perturbed successively. The total variation of a discrete cost function serves as an indicator function for selecting the pixel edges at which the diffusion coefficients are to be perturbed. A filtered image is then obtained by using the selected diffusion coefficients associated with those edges. We demonstrate the performance of the proposed method via application to numerically simulated and experimentally obtained fringe patterns.
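The idea of perturbing per-edge diffusion coefficients to avoid the blurring of linear diffusion can be illustrated with a minimal sketch. The version below is a simplification and not the paper's algorithm: instead of the topological analysis of a cost function, it uses the magnitude of each pixel-edge difference (its contribution to the total variation) as the indicator, switching diffusion off across the strongest edges while smoothing everywhere else. All function and parameter names (`tv_selective_diffusion`, `keep_frac`, `dt`) are illustrative assumptions.

```python
import numpy as np

def tv_selective_diffusion(img, n_iter=50, dt=0.2, keep_frac=0.1):
    """Denoise `img` by explicit diffusion in which the per-edge
    diffusion coefficients are perturbed to zero at the pixel edges
    with the largest total-variation contribution (a simplified
    indicator standing in for the paper's cost-function analysis)."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Differences across horizontal and vertical pixel edges.
        dh = np.diff(u, axis=1)          # shape (H, W-1)
        dv = np.diff(u, axis=0)          # shape (H-1, W)
        # Indicator: |difference| across each edge. Keep coefficient 1
        # on low-variation edges; perturb it to 0 on the top
        # `keep_frac` fraction (likely true fringe edges).
        thr_h = np.quantile(np.abs(dh), 1.0 - keep_frac)
        thr_v = np.quantile(np.abs(dv), 1.0 - keep_frac)
        ch = (np.abs(dh) <= thr_h).astype(float)
        cv = (np.abs(dv) <= thr_v).astype(float)
        # Divergence of the coefficient-weighted gradient (explicit step).
        flux_h = ch * dh
        flux_v = cv * dv
        div = np.zeros_like(u)
        div[:, :-1] += flux_h
        div[:, 1:]  -= flux_h
        div[:-1, :] += flux_v
        div[1:, :]  -= flux_v
        u += dt * div
    return u
```

With the diffusion coefficients set to zero across the selected edges, no flux crosses them, so sharp fringe boundaries survive the smoothing that removes the noise elsewhere; a uniform (linear) coefficient would blur them instead.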
© 2010 Optical Society of America