Optoelectronics, Dense Wavelength Division Multiplexing
The cluster of technologies known as dense wavelength division multiplexing (DWDM) emerged during the last decade of the twentieth century. Multiplexing is the sending of several separate signals in one combined optical signal over a single fiber, to increase data capacity. The pioneers of optical fiber communications understood very early that telecommunication fibers had the potential to carry more than one communication channel if the channels were placed on different optical wavelengths ("colors"), but it took an army of researchers many years to realize the necessary technical solutions.
The advantages of DWDM are very clear—the technology offers a relatively straightforward path to exploiting the enormous inherent bandwidth of optical fiber. DWDM involves an array of modulated light sources at discrete wavelengths, which are combined ("multiplexed") onto a single transmission fiber.
The signal may travel many hundreds of kilometers, being amplified every 80 to 100 kilometers by erbium-doped fiber amplifiers (EDFAs), which keep the path optically transparent and require no separation of the channels. Ultimately, the channels are demultiplexed at the terminal equipment, and each channel is individually detected and its modulated data converted to digital electronic format.
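The periodic amplification described above lends itself to a simple back-of-envelope calculation. The sketch below assumes a representative attenuation of about 0.2 dB/km for standard fiber at 1550 nm and the 80 km amplifier spacing mentioned in the text; the 600 km link length is purely illustrative.

```python
# Sketch of a DWDM link budget with periodic EDFA amplification.
# Attenuation (~0.2 dB/km at 1550 nm) and link length are illustrative
# assumptions, not values from any specific deployed system.

ATTENUATION_DB_PER_KM = 0.2   # typical loss of standard fiber at 1550 nm
AMP_SPACING_KM = 80           # one EDFA every 80 km, as in the text
LINK_LENGTH_KM = 600          # total transparent path (assumed)

span_loss_db = ATTENUATION_DB_PER_KM * AMP_SPACING_KM
num_amplifiers = LINK_LENGTH_KM // AMP_SPACING_KM  # one EDFA per full span

print(f"Loss per {AMP_SPACING_KM} km span: {span_loss_db:.0f} dB")
print(f"EDFAs needed over {LINK_LENGTH_KM} km: {num_amplifiers}")
```

Each amplifier must restore roughly 16 dB of span loss, which is why a single EDFA can transparently regenerate the whole multiplexed signal that an electronic repeater would have had to demultiplex channel by channel.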
Until EDFAs emerged at the beginning of the 1990s, displacing electronic regenerators, DWDM was an expensive luxury. Electronic regeneration required the demultiplexing of channels at each node, so DWDM had little or no advantage over the use of a different fiber for each channel. The one-channel-per-fiber approach at least avoided expensive optical multiplexing and allowed the use of any signal laser source, without tight wavelength control, for each channel, thus saving the cost of maintaining a large inventory of expensive spare sources. The EDFA made it more economical to put all channels on the same fiber, saving cost at the nodes at the expense of terminal multiplexing and demultiplexing.
A number of technologies were required to make DWDM successful, and we will touch on each in turn.
- Since the channels are defined by the filter passbands in the optical multiplexers, light sources—semiconductor lasers—had to be stable in wavelength and available on a well-defined grid of wavelengths. They also had to be capable of modulation in a format that would be stable over the transparent links. In particular, they had to have acceptable "chirp" (phase distortion over the pulse) so that dispersion in the optical fiber would not degrade the pulse. The initial solutions were directly modulated diodes with integral grating-based distributed feedback (DFB) to lock the lasing wavelength.
- The multiplexers (which in reverse also served as demultiplexers) were of course key to the entire DWDM approach. Their function was to take a number of channels of different wavelengths, each on a separate fiber, and combine them onto a single output fiber. While this could be done with simple couplers such as fused biconic taper (FBT) couplers, each 2×1 coupler stage discards half the power (a 3 dB loss), so high-channel-count DWDM would require very high laser powers. Since beams of light at different wavelengths can in principle be combined without loss, a more elegant and more scalable approach was to use some kind of resonant coupling structure, for example, grating-assisted couplers or dichroic thin-film filters.
- EDFAs suited for multichannel amplification had to be designed for high power, which required the development of more powerful pump lasers. The uniformity of gain across the wavelength band was also a challenge. While this was initially solved by careful positioning of the signal wavelengths in the flattest part of the erbium gain spectrum, the inevitable demand for more and more channels forced the use of gain-flattening filters (GFFs) in the amplifiers to tailor the gain spectrum. GFFs are optical filters with spectral loss curves engineered to match the erbium gain spectrum. Initially researchers manufactured them from carefully designed multilayer thin-film filters, and later from fiber Bragg gratings (FBGs), once the grating community had developed techniques for fabricating complex chirped gratings.
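The coupler-loss argument in the list above can be made concrete. Combining N channels with a binary tree of 2×1 FBT couplers costs 3 dB per tree stage, while a wavelength-selective (resonant) multiplexer is in principle lossless. The channel counts below follow the generations named later in the text; the calculation itself is a simple sketch.

```python
import math

# Compare power loss of combining N wavelength channels through a tree
# of 2x1 fused biconic taper (FBT) couplers against an ideal lossless
# wavelength-selective multiplexer. 3 dB per stage is the fundamental
# penalty of combining two beams in a wavelength-blind coupler.

def fbt_tree_loss_db(num_channels: int) -> float:
    """Loss through a binary tree of 2x1 couplers, 3 dB per stage."""
    stages = math.ceil(math.log2(num_channels))
    return 3.0 * stages

for n in (4, 16, 96):
    print(f"{n:3d} channels: FBT tree loses {fbt_tree_loss_db(n):.0f} dB; "
          f"a resonant mux is lossless in principle")
```

At 96 channels the coupler tree wastes over 20 dB (a factor of more than 100 in power), which is why grating-assisted couplers and dichroic thin-film filters won out for high channel counts.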
The first DWDM systems were installed in submarine cables in 1990. Terrestrial systems followed in about 1994, initially with just four channels, each operating at 2.5 gigabits per second (Gbps). However, the expectations of long-distance telecommunications carriers quickly exceeded this unprecedented single-fiber bandwidth of 10 Gbps, and an explosion of activity in the next few years pushed channel data rates up to the next level—OC-192, or 10 Gbps—and channel counts from 4 to 16 to 32 to 96, while the separation between adjacent channel wavelengths decreased accordingly.
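The aggregate capacity per fiber implied by these generations is straightforward arithmetic; the pairing of channel counts with per-channel rates below follows the figures in the text.

```python
# Back-of-envelope aggregate capacity per fiber across the DWDM
# generations named in the text (channel count, per-channel Gbps).

generations = [
    (4, 2.5),    # first terrestrial DWDM systems
    (16, 10.0),  # OC-192 era
    (32, 10.0),
    (96, 10.0),  # approaching 1 Tbps per fiber
]

for channels, rate_gbps in generations:
    total = channels * rate_gbps
    print(f"{channels:3d} channels x {rate_gbps:4.1f} Gbps = {total:6.1f} Gbps")
```

The jump from 10 Gbps to nearly 1 Tbps per fiber in a few years is the "revolutionary increase in communication capacity" the next paragraph refers to.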
The capabilities of EDFA pump, multiplexer, and GFF technologies had to be continuously upgraded to support this revolutionary increase in communication capacity. At OC-192 rates, fiber dispersion became a problem, and the new technology of dispersion compensation was introduced, based on dispersion-compensating fibers that had been invented earlier in the decade. Dispersion compensation modules (DCMs) are typically sandwiched between EDFA stages, whose gain makes up for the modules' loss and thus minimizes their impact on signal-to-noise ratio.
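Sizing a DCM amounts to canceling the dispersion accumulated over one span. The coefficients below are representative textbook values, not figures from the text: standard single-mode fiber has roughly +17 ps/(nm·km) of dispersion at 1550 nm, while dispersion-compensating fiber can reach about −85 ps/(nm·km).

```python
# Sketch of sizing a dispersion-compensating fiber (DCF) module.
# Coefficients are representative assumptions, not data from the text.

D_SMF = 17.0    # ps/(nm km), standard single-mode fiber at 1550 nm
D_DCF = -85.0   # ps/(nm km), dispersion-compensating fiber
SPAN_KM = 80.0  # transmission fiber per amplifier span

accumulated = D_SMF * SPAN_KM         # ps/nm of dispersion to cancel
dcf_length_km = -accumulated / D_DCF  # solve D_SMF*L_smf + D_DCF*L_dcf = 0

print(f"Accumulated dispersion per span: {accumulated:.0f} ps/nm")
print(f"DCF length needed: {dcf_length_km:.0f} km")
```

Because the DCF's dispersion is several times stronger and of opposite sign, a compact coil of it inside the amplifier site can undo the dispersion of a full 80 km span.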
Finally, the design of the fiber itself was modified to deliver better performance for the high-density, high data-rate communications on the horizon. By the end of this explosion the potential capacity of new systems was about 1 terabit per second (10^12 bits per second) per fiber, and the rediscovery of Raman amplification was set to extend system reach from 600 to 5000 kilometers.
With this breathtaking success under its belt, the telecommunications engineering community in 2000 confidently anticipated the continued rapid evolution of channel data rates to 40 Gbps, a doubling or more of the useable fiber bandwidth and the imminent fulfillment of the Raman promise. New challenging problems would have to be tackled—optical nonlinearities in the fiber (self-phase modulation (SPM), four-wave mixing (FWM), cross-phase modulation (XPM), etc.) and polarization mode dispersion (PMD) were the next hurdles to be overcome.
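A quick calculation, with an assumed PMD coefficient, shows why PMD loomed so large at 40 Gbps: the mean differential group delay (DGD) between polarization modes grows with the square root of fiber length, while the bit period shrinks in proportion to the data rate.

```python
import math

# Why PMD matters at 40 Gbps: mean DGD scales as sqrt(length), and the
# bit period at 40 Gbps is only 25 ps. The PMD coefficient below is an
# assumed value typical of older installed fiber, not from the text.

PMD_COEFF = 0.5  # ps/sqrt(km), assumed for illustration

for length_km in (100, 600, 5000):
    mean_dgd_ps = PMD_COEFF * math.sqrt(length_km)
    print(f"{length_km:5d} km: mean DGD ~ {mean_dgd_ps:5.1f} ps")

bit_period_40g_ps = 1e12 / 40e9  # 25 ps per bit at 40 Gbps
print(f"Bit period at 40 Gbps: {bit_period_40g_ps:.0f} ps")
```

At Raman-extended reaches of thousands of kilometers, the mean DGD on such fiber exceeds the entire 25 ps bit period, which is why PMD mitigation joined the list of hurdles.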
New signal modulation formats and enhanced forward error-correction schemes would help to overcome these challenges. Unfortunately, the technological revolution had been accompanied by a massive expansion in fiber plant and high-performance system installation, fueled by the same expectation of continually exploding demand for bandwidth and solid revenue streams that had driven the investments in technology.
While bandwidth continued to grow, albeit more slowly than predicted, the revenue did not materialize, and the fiber buildout turned out to have delivered a glut in capacity that devastated the telecommunications market as the bubble burst in the early years of the twenty-first century.