Abstract
A general method based on intercalibration methodology and linear regression theory is proposed for correcting out-of-band (OOB) effects and is applied to the Chinese Mapping Satellite-1 (MS-1) to improve the radiometric quality of its multispectral sensors. The correction coefficients for a given broad band are derived by linear regression between simulation-corrected radiances and measured digital numbers. The simulation-corrected radiances are obtained by convolving reference hyperspectral radiances with the spectral response function provided by the instrument vendor. The assumption that the relationship between in-band and OOB radiances is highly linear when in-band radiances are slightly greater than OOB radiances is proposed and then verified experimentally. OOB correction coefficients for the four bands of the MS-1 multispectral sensor are obtained using hyperspectral data from EO-1 Hyperion. Positive results are achieved for the corrected images.
© 2017 Optical Society of America
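The workflow the abstract describes can be illustrated with a minimal sketch: band-averaged "simulation-corrected" radiances are computed by weighting a reference hyperspectral radiance spectrum with a band's spectral response function (SRF), and the correction coefficients then follow from a linear fit of radiance against digital numbers. All function names, spectral shapes, and numeric values below are hypothetical illustrations, not the paper's actual data or implementation.

```python
import numpy as np

def band_average_radiance(radiance, srf):
    """Convolve a hyperspectral radiance spectrum with an SRF
    (SRF-weighted average over a uniform wavelength grid)."""
    return np.sum(radiance * srf) / np.sum(srf)

def fit_correction(dn, radiance):
    """Least-squares linear fit radiance = gain * DN + offset,
    giving the per-band correction coefficients."""
    gain, offset = np.polyfit(dn, radiance, 1)
    return gain, offset

# Hypothetical SRF: an in-band Gaussian plus a small OOB leakage peak.
wl = np.linspace(400.0, 1000.0, 601)                    # wavelength, nm
srf = np.exp(-0.5 * ((wl - 550.0) / 20.0) ** 2)          # in-band response
srf += 0.01 * np.exp(-0.5 * ((wl - 800.0) / 30.0) ** 2)  # OOB contribution

# Synthetic reference spectra standing in for hyperspectral scene radiances.
rng = np.random.default_rng(0)
scenes = [1.0 + 0.5 * rng.random() * (wl / 700.0) ** k for k in range(5)]
radiances = np.array([band_average_radiance(s, srf) for s in scenes])

# Idealized sensor DNs generated from assumed true coefficients.
true_gain, true_offset = 0.02, 5.0
dn = (radiances - true_offset) / true_gain

gain, offset = fit_correction(dn, radiances)
```

With noise-free synthetic data the fit recovers the assumed gain and offset; with real Hyperion spectra and measured DNs the regression residuals would quantify how well the linear OOB-correction model holds.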