Dispersion in WDM
Hi all,
I have implemented a WDM system in OptiSystem. When I increase the bit rate (and therefore have to increase the channel spacing), the channels farther from the optical fiber's reference wavelength show worse constellations than the channels near the reference wavelength.
I have attached a screenshot of the WDM spectrum after the fiber link, along with an image showing the constellation diagrams of an eight-channel WDM system (channels one to eight, from left to right), where the fiber reference frequency is set to the central frequency of the simulated band. As you can see, the signal becomes more corrupted the farther the channel is from the fiber's central frequency, even though I have used a DCF in the link.
I would like to know why this degradation is happening. The DCF I used has the same characteristics as the main fiber but with negative dispersion and slope, so once dispersion becomes noticeable in the main fiber, the second fiber's equal-magnitude (but opposite-sign) dispersion and slope should compensate for it. Am I right?
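To show how I am reasoning about the compensation, here is a rough first-order check of the residual dispersion per channel. The fiber parameters, lengths, and channel spacing below are only placeholder textbook values for illustration, not my actual OptiSystem settings:

```python
# Quick first-order sanity check: accumulated dispersion per channel for an
# SMF + DCF link. All numbers are placeholder values, not my simulation settings.

def accumulated_dispersion(D, S, L, d_lambda):
    """Accumulated dispersion (ps/nm) at offset d_lambda (nm) from the fiber
    reference wavelength, using a first-order dispersion-slope expansion."""
    return (D + S * d_lambda) * L

# Transmission fiber (SMF) - assumed typical values
D_smf, S_smf, L_smf = 17.0, 0.075, 50.0      # ps/(nm km), ps/(nm^2 km), km

# DCF as I understand my setup: same magnitude dispersion and slope, negated
D_dcf, S_dcf, L_dcf = -17.0, -0.075, 50.0

spacing_nm = 0.8                             # ~100 GHz grid near 1550 nm
for ch in range(1, 9):                       # eight channels around the reference
    d_lambda = (ch - 4.5) * spacing_nm       # channel offset from reference (nm)
    residual = (accumulated_dispersion(D_smf, S_smf, L_smf, d_lambda)
                + accumulated_dispersion(D_dcf, S_dcf, L_dcf, d_lambda))
    print(f"ch {ch}: offset {d_lambda:+.1f} nm -> residual {residual:+.2f} ps/nm")
```

With a DCF that exactly negates both the dispersion and the slope over the same length, this check gives zero residual at every channel, which is why I expected the outer channels to look as clean as the central ones.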
Thank you all in advance.
Regards,
Alistu