

Calming the Jitters

A universal testing technique brings greater accuracy to verification of pattern-dependent jitter.

From oemagazine August 2004
31 August 2004, SPIE Newsroom. DOI: 10.1117/2.5200408.0007

Jitter is one of the biggest concerns for designers of optical networks and of the optical components they contain, such as transmitters, receivers, and transponders. As transmission data rates increase, so do jitter management concerns. Recognizing this, the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T) ratified Recommendation G.783 to specify the maximum jitter for error-free communications on synchronous digital hierarchy (SDH) networks.

Recommendation G.783 specified three required jitter measurements: jitter generation, jitter transfer, and jitter tolerance. Together they ensure error-free connectivity between transmission systems. Jitter generation is the jitter present on the output of a transmitter, which must remain below a predefined amount. Jitter transfer characterizes how jitter propagates through a system as a function of jitter frequency. Jitter tolerance is the maximum amount of jitter a system can withstand without bit error rate (BER) degradation.

The ITU's specification of maximum jitter set off a jitter nightmare for network, system, and device vendors. The industry lacked a universal process for verifying jitter measurement accuracy, so no method existed to resolve inconsistencies when different test instruments reported hugely different values. In addition, the document never identified a reference source with a known amount of jitter. As a result, since the ratification of Recommendation G.783 in 2000, jitter measurements have varied widely depending upon the test instrumentation used.

Jitter Basics

Accurately measuring jitter requires understanding it. We can define jitter as the time or phase difference between the data signal and the ideal clock. The transition points of the data vary over time and the degree of jitter is reported as a fraction of the original signal bit time. For jitter analysis, the bit time of the signal is called its unit interval (UI).
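
To make the definition concrete, here is a minimal Python sketch that converts edge timestamps into jitter values in UI. The OC-192 line rate and the two sample edge times are illustrative assumptions, not measured data.

    # Minimal sketch: jitter as edge deviation from the ideal clock, in UI.
    BIT_RATE = 9.95328e9    # assumed OC-192/STM-64 line rate, bits per second
    UI = 1.0 / BIT_RATE     # unit interval: one bit time, about 100.5 ps

    def jitter_ui(edge_times, t0=0.0):
        """Deviation of each measured edge from the nearest ideal clock edge,
        expressed as a fraction of the unit interval."""
        deviations = []
        for t in edge_times:
            n = round((t - t0) / UI)             # index of nearest ideal edge
            deviations.append((t - (t0 + n * UI)) / UI)
        return deviations

    # Edges displaced by +5 ps and -3 ps report roughly +0.05 and -0.03 UI.
    devs = jitter_ui([1 * UI + 5e-12, 2 * UI - 3e-12])
    print(max(devs) - min(devs))                 # peak-to-peak jitter, in UI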

Jitter is a function of frequency; jitter at frequencies below 10 Hz is called wander. Industry standards usually specify maximum jitter magnitude in relation to jitter frequency. Jitter performance is defined both as the maximum permissible jitter at a given frequency or over a frequency range, and as the maximum tolerable jitter of a given magnitude and frequency at a specified error rate.

The jitter generated in a network can be divided into two types: pattern-dependent jitter and random jitter. Also known as deterministic jitter, pattern-dependent jitter is generated by the non-scrambled bytes present in all framed synchronous optical networking (SONET)/SDH signals. The main cause of this type of jitter is the transmission characteristics of the electrical-to-optical (E/O) converter in the high- and low-frequency ranges. In OC-192/STM-64 frames, for example, the framing bytes A1, A2, and J0/Z0 are unscrambled, and their periodic repetition causes a pattern-dependent jitter component with a center frequency of 3.24 MHz.
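
The scrambling behavior itself is easy to mimic in software. The sketch below implements a frame-synchronous scrambler with the standard SONET/SDH generator polynomial 1 + x^6 + x^7, seeded to all ones at the start of the scrambled region; treat the bit-level details as a simplified assumption rather than a bit-exact reference implementation.

    # Sketch of the SONET/SDH frame-synchronous scrambler (generator
    # polynomial 1 + x^6 + x^7, register seeded to all ones each frame).
    # The A1, A2, and J0/Z0 framing bytes bypass the scrambler, so their
    # fixed pattern repeats every frame and shows up as pattern-dependent
    # jitter; everything after them in the frame is scrambled.
    def scramble(bits):
        state = 0x7F                                 # 7-bit register, all ones
        out = []
        for b in bits:
            ks = (state >> 6) & 1                    # keystream bit
            new = ((state >> 6) ^ (state >> 5)) & 1  # x^7 and x^6 taps
            out.append(b ^ ks)
            state = ((state << 1) | new) & 0x7F
        return out

    # A run of zeros is broken up into a pseudo-random mix; for example,
    # scramble([0] * 8) returns [1, 1, 1, 1, 1, 1, 1, 0] (keystream 0xFE).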

Examining the waveforms of an OC-192/STM-64 signal shows that the unscrambled bytes in the section overhead create significant jitter. This type of waveform evaluation provides a quick and easy way to determine the influence of the unscrambled bytes on the jitter, although it does not take into consideration the jitter frequency.

Single-sideband (SSB) noise from the clock generator generally causes random jitter, which is characterized by a Gaussian distribution. The random jitter generated by each network element is statistically independent, so its effects on transmission quality do not simply add. Especially in the case of OC-192, which specifies a very narrow jitter transfer function (a 120-kHz cutoff frequency), the accumulation of this nonsystematic jitter is greatly reduced.
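
The distinction has a simple numerical consequence: statistically independent Gaussian contributions combine as a root-sum-of-squares, while correlated pattern-dependent contributions accumulate roughly linearly in the worst case. The per-element values in the sketch below are illustrative, not taken from any measurement.

    import math

    # Illustrative jitter accumulation across three cascaded elements.
    rj_rms = [0.10, 0.12, 0.08]    # mUI rms, independent random jitter
    pdj_pp = [30.0, 40.0, 25.0]    # mUI p-p, pattern-dependent jitter

    total_rj = math.sqrt(sum(x * x for x in rj_rms))   # ~0.18 mUI rms (RSS)
    total_pdj = sum(pdj_pp)                            # 95 mUI p-p worst case
    print(f"RJ: {total_rj:.2f} mUI rms, PDJ: {total_pdj:.0f} mUI p-p")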

As a result, verifying pattern-dependent jitter is much more important than verifying random jitter, because pattern-dependent jitter is significantly larger in network elements. In fact, because random jitter is so much smaller than its pattern-dependent cousin, it can be ignored, provided system designers use a low-phase-noise synthesizer as the reference clock source.

Taking Measurements

The ultimate objective in jitter measurements is to determine the effect on the BER and make sure the network BER falls below an established maximum value, typically 10⁻¹². Network and component designers were unable to achieve these goals because measurement results varied greatly depending upon the test instrument used; for example, the three leading jitter analyzers measuring the same signal would report jitter ranging from 80 to 120 mUI.

The roots of the wide discrepancy lay in the methods used to measure the signal and verify the accuracy. All data in the payloads is scrambled, but framing bytes in the overhead consist of fixed data patterns and thus are not scrambled. Some test instruments measured the jitter of the framing bytes correctly while others did not. Since these framing bytes generate a great deal of pattern-dependent jitter, the measurement techniques that excluded them from their analysis reported much lower jitter values.

Recognizing this inconsistency problem, the ITU-T created a universal test procedure stating how a signal should be tested and defining the evaluation process. The methodology was developed around three requirements: it should not depend on existing jitter analyzers, it should use commonly available instruments such as sampling oscilloscopes, and it should report the correct value.

ITU-T also recommended that the jitter verification technique accurately determine pattern-dependent jitter. This recommendation is based on the conclusion that random jitter is inconsequential in the verification method when a pure clock source is used as a reference clock; for example, a 10 Gb/s signal has random jitter of only 0.15 mUI rms, which is approximately 1.5 mUI peak to peak (p-p).


Figure 1. The phase analysis technique to verify pattern-dependent jitter consists of a phase pattern generator, a sampling oscilloscope, and phase analysis software.

The evaluation system recommended for measuring pattern-dependent jitter consists of a programmable pulse pattern generator (PPG) that outputs a SONET/SDH framed signal (see figure 1). An E/O converter produces an optical signal containing the pattern-dependent jitter. A sampling oscilloscope with an optical-to-electrical converter that contains a fourth-order Bessel-Thomson filter monitors the signal. Simultaneously, the PPG clock signal is measured as a jitter-free reference signal. The frame pattern is synchronized to the reference clock using a pattern trigger from the PPG.
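
For readers who want to model the reference receiver, the sketch below constructs a fourth-order Bessel-Thomson low-pass with SciPy. The -3 dB bandwidth of 0.75 times the bit rate is the usual reference-receiver convention and is an assumption here, not a value taken from the article.

    import numpy as np
    from scipy import signal

    BIT_RATE = 9.95328e9          # OC-192/STM-64 line rate
    f3db = 0.75 * BIT_RATE        # assumed reference-receiver -3 dB bandwidth

    # Analog prototype; norm='mag' places the -3 dB point at the cutoff.
    b, a = signal.bessel(4, 2 * np.pi * f3db, btype='low',
                         analog=True, norm='mag')
    w, h = signal.freqs(b, a, worN=np.logspace(8, 11, 500))

    # The Bessel response is chosen for its flat group delay: it band-limits
    # the waveform without displacing edge positions, so the measurement
    # filter itself does not distort the jitter being measured.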


Figure 2. The unscrambled framing bytes (shown in blue, green, and yellow) create most of the pattern-dependent jitter (PDJ) in a SONET/SDH signal. In this example, the PDJ is 110 mUI p-p.

To ensure accuracy, at least eight measurements should be taken, which averages out residual jitter created by the oscilloscope trigger circuit. The user sets the first edge of the A1 pattern and the edge of the clock by adjusting the delay on the sampling oscilloscope to measure the time between the data edge and clock edge (see figure 2). After measuring the times of all edges, the pattern jitter of the transmitter can be calculated using digital signal processing (DSP) filtering techniques.
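
A minimal sketch of this averaging step, assuming a hypothetical array of per-edge phase differences in UI with one row per triggered acquisition:

    import numpy as np

    def pattern_jitter_pp(traces, n_avg=8):
        """Average per-edge phase differences over n_avg acquisitions to
        suppress the trigger circuit's random jitter, then return the
        peak-to-peak pattern jitter in UI."""
        per_edge = np.asarray(traces)[:n_avg].mean(axis=0)
        return per_edge.max() - per_edge.min()

    # A result of 0.110 corresponds to the 110 mUI p-p of figure 2. Band-
    # limiting DSP filters can be applied to per_edge before taking the
    # peak-to-peak value, as the filtering step described above suggests.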


Figure 3. To measure jitter with the phase analysis technique, first adjust the delay relative to the reference clock, using the skew function on the oscilloscope. Next, measure the phase difference between the rising/falling edges of the data signal and that of the reference clock. To eliminate the random jitter of the oscilloscope trigger circuits, set averaging to eight traces.

Using this procedure enables a highly accurate jitter measurement (see figure 3). In the example, a 10 Gb/s signal is measured using a recommended sampling oscilloscope. The phase difference in the screen shot is the jitter. The results clearly indicate the presence of significant pattern-dependent jitter on the transmitter.

While pattern-dependent jitter constitutes the vast majority of jitter, random jitter should also be analyzed for better accuracy. Doing so requires measuring the SSB noise [1]. Using a signal with an unframed 1010 pattern converted from a SONET signal, we calculate the random jitter of this sample as only 0.15 mUI rms in the jitter measurement range of the high-pass and low-pass filters (20 kHz to 80 MHz). This is equivalent to 1.5 mUI p-p.
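
The conversion from SSB noise to random jitter can be sketched as a numerical integration of the phase noise over the 20 kHz to 80 MHz measurement band. The L(f) points below are placeholders, and the factor of 10 between rms and p-p follows the article's own conversion.

    import numpy as np

    # Placeholder SSB phase-noise profile: dBc/Hz versus offset frequency,
    # assumed measured on a clock at the full bit rate (1 UI = 2*pi rad).
    freqs = np.array([2e4, 1e5, 1e6, 1e7, 8e7])        # 20 kHz to 80 MHz
    L_dbc = np.array([-110.0, -120.0, -135.0, -150.0, -155.0])

    # rms phase jitter (radians) = sqrt(2 * integral of 10^(L(f)/10) df),
    # integrated here with a simple trapezoid rule.
    s = 10 ** (L_dbc / 10)
    integral = np.sum((s[1:] + s[:-1]) / 2 * np.diff(freqs))
    phi_rms = np.sqrt(2.0 * integral)

    rj_rms_ui = phi_rms / (2 * np.pi)    # radians to unit intervals
    rj_pp_ui = 10 * rj_rms_ui            # rms-to-p-p factor used in the text
    print(f"{rj_rms_ui * 1e3:.2f} mUI rms ~= {rj_pp_ui * 1e3:.1f} mUI p-p")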

Just for Reference

An additional benefit of the phase analysis technique recommended by the ITU-T is the establishment of a reference signal that can be used as a "golden transmitter" or known jitter value to calibrate jitter analyzers.

This standard can be a true benefit for production engineers who have multiple jitter analyzers on the manufacturing floor. Given the history of varying jitter measurement results, the ability to calibrate all jitter analyzers based on a standard helps ensure repeatable measurements. The result is increased production yield. During its last regular meeting, the ITU-T WG4 agreed to send liaison statements to the U.S. National Institute of Standards and Technology (NIST; Gaithersburg, MD), and NIST is very interested in this method.

Designers of optical components and networks have had a difficult time accurately determining jitter. The importance of jitter measurements, particularly at high bit rates, demanded a universal testing procedure. The phase analysis technique recently recommended in ITU-T draft O.172 brings improved accuracy, traceability, and repeatability to jitter testing. It also recognizes pattern-dependent jitter as the main component of jitter and establishes that the most accurate jitter analysis concentrates on measuring pattern-dependent jitter. oe

References

1. Anritsu Technical Note, "PDH/SDH Jitter & Wander Measurement," rev. October 1996.


Hiroshi Goto

Hiroshi Goto is an optical design engineer at Anritsu Co., Richardson, TX.