Abstract
We explore the feasibility of using a local spectral time-domain (LSTD)
method to solve Maxwell’s equations that arise in optical and electromagnetic applications. The discrete singular convolution (DSC) algorithm is implemented in the LSTD method for spatial derivatives. Fourier analysis of the dispersive error of the DSC algorithm indicates that its grid density requirement for accurate simulations can be as low as approximately two grid points per wavelength. The analysis is further confirmed by numerical experiments. Our study reveals that the LSTD method has the potential to yield high resolution for solving large-scale electromagnetic problems.
© 2003 Optical Society of America
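To make the abstract's central idea concrete, here is a minimal sketch of a discrete singular convolution spatial derivative using the regularized Shannon kernel, one common DSC choice in the literature. The parameter values (`sigma = 3.2 * dx`, bandwidth `W = 16`) and the test function are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def dsc_derivative(f, dx, W, sigma):
    """First derivative of uniformly sampled periodic data f via the
    regularized Shannon kernel (a common DSC kernel; illustrative only).

    On a uniform grid the kernel's derivative at offset k*dx reduces to
    (-1)**(k+1) / (k*dx) * exp(-(k*dx)**2 / (2*sigma**2)) for k != 0,
    and vanishes at k = 0.
    """
    df = np.zeros_like(f, dtype=float)
    for k in range(1, W + 1):
        w = (-1.0) ** (k + 1) / (k * dx) * np.exp(-(k * dx) ** 2 / (2.0 * sigma ** 2))
        # The kernel derivative is odd: forward neighbor enters with +w,
        # backward neighbor with -w; np.roll handles the periodic wrap.
        df += w * (np.roll(f, -k) - np.roll(f, k))
    return df

# Demo: differentiate sin(x) on a coarse periodic grid and compare with cos(x).
N = 32
x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
dx = x[1] - x[0]
approx = dsc_derivative(np.sin(x), dx, W=16, sigma=3.2 * dx)
err = np.max(np.abs(approx - np.cos(x)))
print(f"max pointwise error: {err:.2e}")
```

Because the Gaussian-regularized sinc weights decay rapidly, only a local stencil of width `2W + 1` is needed, which is what makes the method "local spectral": near-spectral accuracy for well-resolved wavenumbers at a cost closer to a finite-difference scheme.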
Tables (2)
Equations (4)