We show theoretically that incorporating a frequency-dependent loss mechanism in a semiconductor laser can, in concert with the amplitude-to-phase coupling, lead to major reductions of the fundamental intensity and phase noise. A loss dispersion of the wrong sign, on the other hand, increases the noise and, beyond a certain strength, leads to instability.
© 1990 Optical Society of America