Nonlinear effects are a major obstacle to increasing the data rates and reach of long-haul optical communication systems. Maximum Likelihood Sequence Detection (MLSD) has been proposed to combat the nonlinear effects in optical channels. The main objective is to recover the original signal from the received signal, which is distorted by the nonlinear effects arising in the fiber. MLSD is an optimum detector, as it performs Viterbi detection over a trellis structure. In this paper, the impact of Stimulated Raman Scattering (SRS) on the transmitted signal and its mitigation by MLSD are analyzed. MLSD is implemented for Dense Wavelength Division Multiplexing (DWDM) systems with 4 and 8 channels, and its performance is compared with that of direct-detection receivers.
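To illustrate the Viterbi detection over a trellis that MLSD relies on, the following is a minimal sketch, not the paper's implementation: it assumes a hypothetical binary (±1) alphabet and a toy 2-tap channel with memory, where each trellis state is the previous symbol and the branch metric is the squared Euclidean distance between the received sample and the noiseless channel output.

```python
def mlsd_viterbi(received, h, symbols=(-1.0, 1.0)):
    """Toy MLSD via the Viterbi algorithm over a 2-tap channel.

    Assumptions (illustrative only): channel output is
    h[0]*s_k + h[1]*s_{k-1}, and the trellis starts in the
    state corresponding to symbols[0].
    """
    n_states = len(symbols)          # one state per previous symbol
    INF = float("inf")
    metrics = [0.0 if i == 0 else INF for i in range(n_states)]
    paths = [[] for _ in range(n_states)]

    for r in received:
        new_metrics = [INF] * n_states
        new_paths = [None] * n_states
        for prev in range(n_states):         # previous symbol (state)
            if metrics[prev] == INF:
                continue
            for cur in range(n_states):      # candidate current symbol
                # Noiseless output for this trellis branch
                expected = h[0] * symbols[cur] + h[1] * symbols[prev]
                # Accumulate squared-distance branch metric
                m = metrics[prev] + (r - expected) ** 2
                if m < new_metrics[cur]:     # keep the survivor path
                    new_metrics[cur] = m
                    new_paths[cur] = paths[prev] + [symbols[cur]]
        metrics, paths = new_metrics, new_paths

    # Most likely sequence = survivor with the smallest final metric
    best = min(range(n_states), key=lambda s: metrics[s])
    return paths[best]


# Noiseless samples of the sequence [1, -1, 1, 1] through h = [1, 0.5],
# starting from previous symbol -1: the detector recovers the sequence.
print(mlsd_viterbi([0.5, -0.5, 0.5, 1.5], [1.0, 0.5]))
```

Because the survivor comparison discards all but the best path into each state at every step, the search cost grows linearly with sequence length rather than exponentially, which is what makes MLSD practical as an optimum detector.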
Maximum Likelihood Sequence Detection (MLSD), WDM systems, Stimulated Raman Scattering, Viterbi detector