DWDM (Dense Wavelength Division Multiplexing)
A WDM system uses a multiplexer at the transmitter to join the signals together and a demultiplexer at the receiver to split them apart. With the right type of fibre, a single device can do both at once and can function as an optical add-drop multiplexer. The optical filtering devices used are usually etalons: stable, solid-state, single-frequency Fabry–Pérot interferometers.
The first WDM systems combined two signals and appeared around 1985. Modern systems can handle up to 160 signals and can expand a basic 10 Gbit/s fibre system to a theoretical total capacity of 1.6 Tbit/s over a single fibre pair.
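The arithmetic behind that headline figure is simply channel count times per-channel rate. A quick sketch, using the numbers quoted above:

```python
# Aggregate capacity of a fully loaded WDM link.
# Figures from the text: up to 160 channels at 10 Gbit/s each.
channels = 160
per_channel_gbps = 10

total_gbps = channels * per_channel_gbps
print(total_gbps)         # 1600 Gbit/s
print(total_gbps / 1000)  # 1.6 Tbit/s
```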
WDM systems are popular with telecommunications companies because they allow them to expand the capacity of the network without laying more fibre. By using WDM and optical amplifiers, they can accommodate several generations of technology development in their optical infrastructure without having to overhaul the backbone network. Capacity of a given link can be expanded by simply upgrading the multiplexers and demultiplexers at each end.
This is often done by using optical-to-electrical-to-optical (O/E/O) translation at the very edge of the transport network, thus permitting interoperation with existing equipment that has optical interfaces.
Most WDM systems operate on single mode fibre optical cables, which have a core diameter of 9 µm. Certain forms of WDM can also be used in multi-mode fibre cables (also known as premises cables) which have core diameters of 50 or 62.5 µm.
Early WDM systems were expensive and complicated to run. However, recent standardization and better understanding of the dynamics of WDM systems have made WDM much cheaper to deploy.
Optical receivers, in contrast to laser sources, tend to be wideband devices. In a WDM system, the demultiplexer must therefore provide the wavelength selectivity for the receiver.
The introduction of the ITU-T G.694.1 frequency grid in 2002 has made it easier to integrate WDM with older but more standard SONET systems. Today's DWDM systems use 50 GHz or even 25 GHz channel spacing for up to 160-channel operation.
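The G.694.1 grid defines channels as offsets from an anchor frequency of 193.1 THz, spaced by the chosen channel spacing. A minimal sketch of how grid frequencies map to wavelengths (the `dwdm_channel` helper is hypothetical; the 193.1 THz anchor is the one defined in the recommendation):

```python
# Sketch of the ITU-T G.694.1 DWDM frequency grid.
# Channel n sits at 193.1 THz + n * spacing; wavelength follows from c = f * lambda.
C = 299_792_458  # speed of light in vacuum, m/s

def dwdm_channel(n, spacing_ghz=50):
    """Centre frequency (THz) and wavelength (nm) of grid channel n."""
    f_thz = 193.1 + n * spacing_ghz / 1000.0
    wavelength_nm = C / (f_thz * 1e12) * 1e9
    return f_thz, wavelength_nm

f, wl = dwdm_channel(0)
print(f"{f:.2f} THz ~ {wl:.2f} nm")  # 193.10 THz ~ 1552.52 nm
```

Halving the spacing to 25 GHz doubles the number of channels that fit in the same amplifier band, which is how 160-channel operation is reached.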
Recently the ITU has standardized a 20 nm channel spacing grid for use with CWDM (Coarse WDM), using wavelengths between 1270 nm and 1610 nm. Many CWDM wavelengths below 1470 nm are considered "unusable" on older G.652-spec fibres, due to the increased attenuation in the 1310–1470 nm bands. Newer fibres which conform to the G.652.C and G.652.D standards, such as Corning SMF-28e and Samsung Widepass, nearly eliminate the "water peak" attenuation and allow for full operation of all 18 ITU CWDM channels in metropolitan networks.
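Assuming the current revision of the ITU-T G.694.2 grid, which places nominal channel centres from 1271 nm to 1611 nm in 20 nm steps, the full channel set can be enumerated in a few lines:

```python
# Sketch of the ITU-T G.694.2 CWDM wavelength grid (assumption: nominal
# centre wavelengths 1271 nm to 1611 nm, 20 nm spacing).
channels_nm = list(range(1271, 1612, 20))
print(len(channels_nm))                    # 18 channels
print(channels_nm[0], channels_nm[-1])     # 1271 1611
```

On older G.652 fibre, the channels near the 1383 nm water peak would be filtered out of this list in practice, leaving only the upper portion of the grid usable.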
DWDM systems are significantly more expensive than CWDM because the laser transmitters need to be significantly more stable than those needed for CWDM. Precision temperature control of the laser transmitter is required in DWDM systems to prevent "drift" off a very narrow centre wavelength. In addition, DWDM tends to be used at a higher level in the communications hierarchy, for example on the Internet backbone, and is therefore associated with higher modulation rates. This creates a smaller market for DWDM devices with very high performance levels, and correspondingly high prices: in other words, these devices are needed in small numbers, so their development costs cannot be amortized over a large number of transmitters.
Note: The term "lambda" is also used interchangeably with "wavelength" when referring to a specific WDM channel.
From the Wikipedia article "Wavelength-division multiplexing".