Abstract
An attention mechanism is integrated into neural-network-based equalizers to prune the fully-connected output layer. For a 100-GBd 16-QAM transmission over 20×100 km of SMF, this approach reduces the computational complexity of a CNN+LSTM model by ~15%.
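The abstract describes pruning a dense output layer using attention weights. Below is a minimal NumPy sketch of that idea, not the authors' implementation: it assumes a simple softmax attention over the hidden features (the scoring vector `a`, the layer sizes, and the keep fraction are all illustrative assumptions), then drops the hidden units with the smallest attention weights, removing roughly 15% of the output-layer multiply-accumulates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: hidden features from a CNN+LSTM front end feeding
# a fully-connected output layer (both sizes are illustrative).
n_hidden, n_out = 64, 8
W = rng.normal(size=(n_out, n_hidden))   # dense output-layer weights
h = rng.normal(size=n_hidden)            # one hidden feature vector

# Assumed attention: a learned scoring vector 'a' (random here, for
# illustration only) combined with the features, then a softmax.
a = rng.normal(size=n_hidden)
scores = a * h
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()

# Prune: keep the hidden units with the largest attention weights,
# discarding the corresponding columns of the output-layer matrix.
keep_frac = 0.85
k = int(round(keep_frac * n_hidden))     # 54 of 64 units kept
keep = np.argsort(alpha)[-k:]

W_pruned = W[:, keep]
h_pruned = h[keep]

full_macs = W.size                       # multiply-accumulates, dense layer
pruned_macs = W_pruned.size
print(f"MACs: {full_macs} -> {pruned_macs} "
      f"({100 * (1 - pruned_macs / full_macs):.1f}% reduction)")
```

With these toy dimensions, dropping 10 of 64 hidden units removes about 15.6% of the dense-layer multiplications, on the order of the ~15% complexity reduction reported in the abstract.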
© 2021 The Author(s)