One of the key metrics for long-haul systems, particularly transoceanic submarine systems, is service latency. Fibre latency is trivial to calculate, but the latency added by DWDM transponders (FEC schemes, DSP for impairment correction and, in some cases, cross-connect fabrics) is much harder to estimate. I am interested in how different modulation formats, FEC schemes, DSP impairment correction and cross-connects affect service latency.
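For reference, the "trivial" fibre part can be sketched as follows. The only inputs are the fibre length and the group index of standard single-mode fibre at 1550 nm (roughly 1.468), which gives about 4.9 µs per km one way:

```python
# Fibre propagation latency: t = L * n_g / c
C_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
GROUP_INDEX = 1.468        # typical group index of G.652 SMF at 1550 nm

def fibre_latency_ms(length_km: float) -> float:
    """One-way propagation delay in milliseconds for a given fibre length."""
    return length_km * GROUP_INDEX / C_KM_PER_S * 1e3
```

On a 6,000 km wet segment this works out to roughly 29 ms one way, which is the baseline any transponder/DSP/FEC contribution gets added to.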
Exploiting greater capacity through more powerful DSP and FEC presumably comes at the expense of higher latency(?). A datasheet for the 6500 packet-optical platform states that its "WaveLogic 3-based transceivers can be programmed to quickly respond and adapt to changing requirements for capacity, reach, and latency". Clearly, lowering the bit rate reduces impairments and therefore latency, but how much of a difference does it make? Typical latency figures for a 100 Gbit/s DWDM carrier (transmit-receive pair), broken down by the transmission elements mentioned above, would be both helpful and interesting for estimating route latency.
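To make the question concrete, here is the shape of the per-element budget I am trying to fill in. Every figure below is a hypothetical placeholder, not a vendor or measured number; the point is only the structure (per-element microsecond contributions summed on top of fibre delay):

```python
# Illustrative one-way latency budget for a coherent 100G wave.
# ALL values are placeholder assumptions for illustration only --
# substitute datasheet or measured figures where available.
BUDGET_US = {
    "tx_dsp_and_framing": 5.0,   # assumed: Tx DSP pipeline + OTN framing
    "fec_encode":        10.0,   # assumed: soft-decision FEC encode
    "rx_dsp_cd_comp":    20.0,   # assumed: chromatic-dispersion equaliser
    "fec_decode":        50.0,   # assumed: iterative SD-FEC decode
    "cross_connect":     10.0,   # assumed: OTN/packet fabric transit
}

def transponder_latency_us() -> float:
    """Sum of the (assumed) non-fibre contributions, in microseconds."""
    return sum(BUDGET_US.values())
```

Even if these placeholders were in the right ballpark, the non-fibre total would be on the order of tens to hundreds of microseconds, i.e. small against tens of milliseconds of transoceanic fibre delay, which is why credible per-element numbers matter mainly for latency-sensitive short routes.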