Abstract
We explore the feasibility of using a local spectral time-domain (LSTD) method to solve Maxwell’s equations that arise in optical and electromagnetic applications. The discrete singular convolution (DSC) algorithm is implemented in the LSTD method for spatial derivatives. Fourier analysis of the dispersive error of the DSC algorithm indicates that its grid density requirement for accurate simulations can be as low as approximately two grid points per wavelength. The analysis is further confirmed by numerical experiments. Our study reveals that the LSTD method has the potential to yield high resolution for solving large-scale electromagnetic problems.
© 2003 Optical Society of America
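The abstract's central claim, that a Fourier (effective-wavenumber) analysis of the DSC stencil predicts accurate derivatives down to roughly two grid points per wavelength, can be illustrated with a small numerical sketch. The code below builds a first-derivative stencil from the regularized Shannon kernel commonly used in DSC, delta(x) = sinc(pi x / h) * exp(-x^2 / (2 sigma^2)), and compares its dispersion error with a second-order central difference. The half-bandwidth `W = 40` and width `sigma = 4h` are illustrative choices of my own, not parameters taken from the paper.

```python
import numpy as np

def dsc_weights(W, h, sigma):
    """Weights c_m, m = 1..W, of the antisymmetric DSC derivative stencil
    u'(x_j) ~ sum_m c_m (u_{j+m} - u_{j-m}).

    For the regularized Shannon kernel these reduce analytically to
    c_m = (-1)^(m+1)/(m h) * exp(-(m h)^2 / (2 sigma^2))."""
    m = np.arange(1, W + 1)
    return (-1.0) ** (m + 1) / (m * h) * np.exp(-(m * h) ** 2 / (2 * sigma ** 2))

def effective_wavenumber(kh, c, h):
    """Return k_eff * h for a plane wave exp(i k x): the stencil maps it to
    i * k_eff * exp(i k x), with k_eff = 2 * sum_m c_m sin(m k h)."""
    m = np.arange(1, c.size + 1)
    return 2.0 * h * np.sum(c * np.sin(m * kh))

h = 1.0
c = dsc_weights(W=40, h=h, sigma=4.0 * h)

for ppw in (8.0, 4.0, 2.5):               # grid points per wavelength
    kh = 2.0 * np.pi / ppw                # exact dimensionless wavenumber
    err_dsc = abs(effective_wavenumber(kh, c, h) - kh) / kh
    err_fd2 = abs(np.sin(kh) - kh) / kh   # 2nd-order central difference
    print(f"PPW={ppw:4.1f}  DSC err={err_dsc:.2e}  FD2 err={err_fd2:.2e}")
```

At coarse sampling the DSC dispersion error stays orders of magnitude below that of the low-order scheme, consistent with the abstract's grid-density claim, though the exact crossover depends on the regularization parameters.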