
Optoelectronics & Communications

Stressing Out

Successful production of 10-Gigabit-Ethernet components requires stressed-receiver sensitivity testing.

From oemagazine June/July 2005
31 May 2005, SPIE Newsroom. DOI: 10.1117/2.5200506.0010

Driven by growing voice over IP (VoIP) deployment, demand for 10 Gigabit Ethernet (10 GbE) will rise to support the increasing traffic load VoIP places on the network. Initial 10 GbE designs were tested under the jitter-tolerance guidelines of ITU-T Recommendation O.172, originally developed for Synchronous Digital Hierarchy/synchronous optical network (SDH/SONET) jitter and wander testing. Testing under O.172 is insufficient, however; proper testing of 10 GbE systems requires the stressed-receiver sensitivity (SRS) testing approach defined in IEEE standard 802.3ae.1

The noise sources typical of 10 GbE systems are not covered by the O.172 specification, which can lead to misleading test conclusions: some designs pass O.172 yet fail SRS testing. A design that passes SRS testing, on the other hand, will always pass the O.172 jitter-tolerance test.

The SRS testing approach is more stringent in defining a variety of noise-source effects, covering pulse-width shrinkage, power, and simulated channel penalties along with a swept-frequency sinusoidal jitter element. The O.172 jitter-tolerance test specifies only increasing the jitter amplitude at each jitter frequency until bit errors exceeding a defined threshold occur at the output of the receiver test point. The jitter amplitude just before the output crosses this error threshold is defined as the maximum tolerable jitter of the receiver under test.
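The O.172-style sweep described above can be sketched in a few lines. This is a minimal illustration only: the `set_jitter` and `measure_ber` callbacks stand in for vendor-specific instrument-control code and are hypothetical, as are the step sizes and the 1e-12 error threshold.

```python
def max_tolerable_jitter(set_jitter, measure_ber, frequencies_hz,
                         ber_threshold=1e-12, start_ui=0.05,
                         step_ui=0.05, max_ui=10.0):
    """For each jitter frequency, raise the sinusoidal jitter amplitude
    (in unit intervals, peak-to-peak) until the bit-error rate at the
    receiver test point exceeds the threshold; record the last
    amplitude that still passed as the maximum tolerable jitter."""
    results = {}
    for f in frequencies_hz:
        amplitude = start_ui
        last_passing = None
        while amplitude <= max_ui:
            set_jitter(f, amplitude)              # apply sinusoidal jitter
            if measure_ber(f, amplitude) > ber_threshold:
                break                             # error threshold crossed
            last_passing = amplitude
            amplitude += step_ui
        results[f] = last_passing                 # max tolerable jitter at f
    return results
```

A real test set would replace the callbacks with calls into the jitter source and the bit-error-rate tester, but the sweep logic is the same.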

With the SRS testing approach, each jitter type is completely quantifiable, ensuring a repeatable test methodology. This is very important in multi-site design/manufacturing environments or in companies that choose a contract manufacturer for their 10 GbE network element production lines.

The suite of test equipment required to perform the necessary PHY-layer measurements is common to most link-testing scenarios. The two key test points are TP2, defined as the point in the optical path at which the PHY-layer transmitter tests are made, and TP3, defined as the point at which the optical transmission path terminates at the medium-dependent interface of the receiver, where the receiver test is made.

At TP2, the goal is to validate transmitter performance. Standard measurements include transmitted power over the optical link at high amplitude and at low amplitude, with transmission performance typically specified by extinction ratio and average optical power. Particularly for multimode systems, it is important that some minimum length of fiber be included in the test to ensure adequate optical mode fill. For transmitter characterization, the performance and repeatability of the waveform analyzer are the key considerations.
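The two TP2 figures of merit mentioned above follow directly from the measured logic-one and logic-zero power levels. As a sketch (function names and the assumption of equal ones/zeros density are mine, not from the standard text quoted here):

```python
import math

def extinction_ratio_db(p_one_mw, p_zero_mw):
    """Extinction ratio: ratio of the mean logic-one optical power to
    the mean logic-zero optical power, expressed in dB."""
    return 10 * math.log10(p_one_mw / p_zero_mw)

def average_power_dbm(p_one_mw, p_zero_mw):
    """Average optical power, assuming an equal density of ones and
    zeros in the data pattern, expressed in dBm (dB relative to 1 mW)."""
    return 10 * math.log10((p_one_mw + p_zero_mw) / 2.0)
```

For example, a transmitter with 0.9 mW at the one level and 0.1 mW at the zero level has an extinction ratio of about 9.5 dB and an average power of about -3 dBm.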

Receiver measurements take place at TP3. The key is testing the receiver's ability to reconstruct an attenuated signal and provide an adequate representation of the digital data under SRS conditions. The O.172 standard calls for analysis by a bit-error-rate tester alone; given the precision now required in specifying the stressed waveform for SRS testing, oscilloscopes play a critical role in calibrating and monitoring performance as defined by the eye diagram.

Unlike SONET/SDH testing, which placed a premium on the jitter performance of the oscilloscope, SRS testing places a premium on noise measurement at the zero level and is more forgiving of scope jitter. Poor zero-level characterization will produce dramatic errors in the level of stress applied to the device under test; the proper test strategy must therefore consider the use of a noise source to stress the signal. The standard also places stringent requirements on the electrical-to-optical converter: it must provide digital-quality waveforms while passing the stressing impairments through distortion free. In addition to receiver standards, 802.3ae explicitly specifies jitter and amplitude-modulation interference levels, eye closure, and rise-/fall-time degradation.
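The eye-closure element of the stress is commonly quantified as a vertical eye closure penalty (VECP): the ratio, in optical dB, of the unstressed eye amplitude to the vertical opening of the stressed eye. A one-line sketch of that calculation (the function name is mine):

```python
import math

def vertical_eye_closure_penalty_db(unstressed_eye_amplitude,
                                    stressed_eye_opening):
    """Vertical eye closure penalty: how far the stressing impairments
    have closed the eye vertically, in optical dB. Both arguments are
    in the same linear power units (e.g. mW)."""
    return 10 * math.log10(unstressed_eye_amplitude / stressed_eye_opening)
```

Halving the vertical eye opening, for instance, corresponds to a penalty of about 3 dB, which is why the oscilloscope's zero-level noise floor matters so much: it sets the accuracy with which the stressed opening can be measured.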

A successful stressed-receiver test ensures compliance with the 802.3ae eye mask after a review of signal performance, while introducing the vertical-eye closure penalty and stressed sensitivity to the signal itself. Given the interoperability requirements of 10 GbE data communication systems, it is critical that the measurement setup provide the ability to ensure compliance through a number of stressed-signal conditions. oe


1. IEEE Std 802.3ae, http://grouper.ieee.org/groups/802/3/ae

Christopher Loberg
Christopher Loberg is market development manager at Tektronix Inc., Beaverton, OR.

Matthew Adams
Matthew Adams is business unit manager for test instrumentation at JDS Uniphase, San Jose, CA.