Why Serial Links?

Wednesday, May 13, 2020

Author: Donald Telian, SI Guys - Guest Blogger

Over the past couple of decades, every style of high-speed interface has switched to serial transmission. I/O, Graphics, Disk, Backplanes, CPU, Long-Reach, Short-Reach, you name it - everything except memory. Serial technology is reliable, extensible, and will be with us for many years to come. In practice, Serial Links are now the primary focus of Signal Integrity.
Yet serial links brought significant disruption to the industry. New signaling, new route methods, new equalization techniques, new model formats, new standards, new design tools, new terminology, new data rates - so much change I've been able to focus my business on this transition for decades. Indeed, the change has affected not only SI Engineers, but also Layout, Hardware Design, IC Design, and even Software/Firmware Design. Given all this change, why did we switch to serial links? Simply put, three reasons:
  • Gates are Free
  • Signals are Messy
  • Stubs are Bad 


Gates are Free

Silicon integration changed everything from what we design, to the way we design, to the size and efficiency of our products. Examining 30 years of digital electronics history reveals that IC density increased 100,000x while PCB density increased 3x. So it's not surprising that the advancement of "high-speed" capitalized on integration and the excess of available transistors or "gates".

We might say the turning point occurred in 2002 when Jim Pappas of Intel announced (EETimes 2/18/2002 page 92, commenting on PCI's switch from parallel to serial):

"We're now at the point where it's getting cheaper to put more gates behind a fast serial line than to lay down copper traces."

There you have it: gates cost less than IC pins and PCB traces. And this from a guy who helped pioneer the original parallel PCI bus. Indeed, serial links require lots of gates, not just to serialize/de-serialize, but also to align, deskew, encode, unscramble, buffer, equalize, recover, and so on. It's not overstatement to say the simple push-pull digital I/O buffer transformed itself into a digital signal processor (DSP). But this is not a problem in a world where gates are re-usable and readily available. Integration has moved signal integrity inside the chip, while outside the chip signals are increasingly messy.


Signals are Messy

I guess if this wasn't true, I wouldn't be able to write all these articles on Signal Integrity (SI). Faster speeds bring faster edges, which bring higher frequencies, which bring ever-increasing design concerns and challenges. The messier the signal, the harder it is for a receiver to properly detect a correct logic level.
Before the era of serial links, single-ended multi-point DDRx interfaces required processing the dozens of waveform measurements shown in Figure 1 (and on page 7). Complicated, right? And that's just to determine the "quality" of the waveform; timing calculations - along with edge rate derating - happen later. Good thing computers can parse data and perform calculations. But the problem with this form of digital signal processing is that it is performed at design time, provoking design iterations and schedule delays.

Figure 1: Dozens of Measurements Processed to Determine Digital Signal Quality and Timing

In contrast, serial links do the "digital signal processing" real-time while the system is running. Do you see? High-speed signals are messy and must be processed one way or another, so modern ICs - where gates are free - have learned to put the DSP inside the chip. In contrast, those interfaces where gates must be used for capacity (e.g., memory) place the burden on the design team instead of the IC. Look at any DDRx layout or SI Report and you'll see what I mean.
Inconsistent power references also make signals messy. If power levels "bounce" or are inconsistent between ICs, PCBs, or system chassis, the logic level becomes indeterminate as well. To combat this, serial links use differential signaling. In reality, all signals are "differential" in that their logic level is determined by some kind of "difference". But the so-called single-ended signal derives that difference from power-referenced circuits, while the differential receiver compares the difference between the two signals in the pair. Serial links solve the inconsistent power problem by sending a reference along with the signal - a reference that encounters the same inconsistencies along the way.
Serial links further cleaned up signal messiness by using point-to-point connections; each Transmitter (Tx) connects to only one Receiver (Rx). This was imperative, because however you connect a multi-point system you wind up with stubs. And stubs are bad.

Stubs are Bad

Stubs are any branch off the signal path that is not headed to the Rx receiving the signal. Unintended stubs are caused by vias, testpoints, ESD (electro-static discharge) devices, IC packages, surface mount pads, and other interconnect anomalies. Anyone who implements serial links has learned to combat these and other aberrations in the signal path. Use any signal topology other than point-to-point and you invite stubs. But why are stubs so bad?
Actually, stubs aren't bad at lower frequencies. But raise the frequency, and a 30 mil stub will cause your system to fail. In my "7 Steps" series I show how a ¼" stub makes a 12 Gbps eye opening disappear (Step 5, Figure 2). To predict the frequency where stubs cause failure, Eric Bogatin introduced the 3/Gbps Rule-of-Thumb (RoT) for problematic stubs on PCBs. This RoT not only explains why the 12 Gbps signal disappears (0.25"=3/12) but also explains why around 1 Gbps we switched to point-to-point interconnects. Gbps signaling cannot tolerate stubs caused by branches in multi-point interconnects.

Why do stubs cause a signal to disappear? The answer is a ¼ wavelength. A signal in a stub of this length reflects off the end and returns 180° out of phase, cancelling that frequency. Doing the math, assuming PCB signal propagation is 6"/ns, we derive Bogatin's problematic stub in inches to be 3/Gbps (= ¼(2*6"/ns)/bit-rate). Figure 2 plots this Rule-of-Thumb (red dashed line) along with the length of a stub that becomes acceptable (solid green line) for each data rate - generally accepted to be 10x smaller, or 0.3/Gbps.
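The arithmetic above is easy to verify. Here is a minimal Python sketch, assuming 6 in/ns propagation and quarter-wave cancellation at the fundamental frequency (bit-rate/2); the function names are mine, not from any standard:

```python
V_IN_PER_NS = 6.0  # assumed PCB signal propagation velocity, inches/ns

def problematic_stub_in(gbps):
    """Quarter-wavelength stub (inches) at the fundamental, bit-rate/2.

    wavelength = v / (gbps/2) = 2*v/gbps, so lambda/4 = 3/gbps for v = 6 in/ns.
    """
    wavelength = 2 * V_IN_PER_NS / gbps
    return wavelength / 4

def acceptable_stub_in(gbps):
    """Generally accepted to be 10x smaller than the problematic length."""
    return problematic_stub_in(gbps) / 10

print(problematic_stub_in(12))  # 0.25 -> the 1/4" stub that kills a 12 Gbps eye
print(acceptable_stub_in(6))
```

For 12 Gbps this reproduces the ¼" stub from the "7 Steps" example, and for 1 Gbps it gives 3", which is why multi-point branches stopped working around that rate.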

Figure 2: Bad and Acceptable Stub Lengths versus Data Rate

Figure 2 reveals why stubs became relevant in standard 0.062" thick PCBs as we passed through 6 Gbps, and much sooner in 0.25" thick backplanes. As stub lengths move from green to red they become increasingly detectable and problematic in signal performance.

The RoT plotted in Figure 2 presumes 6 in/ns propagation, or a stripline dielectric constant of 3.87. Other values across the range of common dielectrics (Dk=4.5 to 3.0) shift the values shown in Figure 2 by roughly plus or minus 10%.
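That Dk-to-velocity relationship can be checked with the stripline formula v = c/√Dk; a quick sketch (the helper name is mine):

```python
import math

C_IN_PER_NS = 11.8  # speed of light in free space, ~inches/ns

def stripline_velocity(dk):
    """Stripline propagation velocity in inches/ns: v = c / sqrt(Dk)."""
    return C_IN_PER_NS / math.sqrt(dk)

# The RoT's 6 in/ns corresponds to Dk ~= 3.87:
print(stripline_velocity(3.87))  # ~6.0 in/ns

# Stub lengths scale with velocity, so the common Dk range shifts them:
for dk in (4.5, 3.0):
    print(dk, stripline_velocity(dk) / 6.0)  # ~0.93x and ~1.14x
```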

The fact that we cannot have stubs in Gbps signal paths hastened the migration to point-to-point interconnects. Even without stubs, high-speed signals are still messy, yet the availability of excess gates provided a way to effectively transmit and receive reliable data. The reasons for Serial Links make sense, but they have significantly changed the way we do Signal Integrity.


What's New with Serial Links?

Serial links brought numerous concepts, tools, and terminology to the practice of SI. Now that we're decades into the dominance of serial links, the many changes they brought are commonplace. Nevertheless, here is the list of what's new with serial links:

SerDes - The Serializer/Deserializer replaced the simpler digital "I/O Buffer". Each SerDes has a Transmitter (Tx) and Receiver (Rx). Modern SerDes do a lot more than serialize and deserialize, however they are still called a "SerDes".

Channels, Lanes, Links - The interconnect between a Tx and Rx is called a "channel". Adding another channel in the opposite direction forms one "Lane". Binding together one or more lanes forms a "Link".

Loss - Loss has always been present, but at lower frequencies we weren't concerned about it. The combination of higher frequencies, longer interconnects, and lack of characterization of PCB dielectrics caused SI engineers to suddenly care about loss. Over time lower-loss materials were introduced, providing a 10x reduction in dielectric loss.

Equalization - Became another solution for handling loss. Ever-increasing forms of equalization have paved the way to very high-speed serial links. Equalization is a vast topic with its own concepts and terminology.

Frequency Domain - Obviously had always been there, but channel loss and equalization caused SI engineers - who had previously focused on the time domain - to become better acquainted with it.

deciBels - SI inherited the dB as part of the language associated with the frequency domain. SI may have been better served by non-logarithmic ratios, but the dB is here to stay. Commit these values to memory: -3dB ≈ 70%, -6dB ≈ 50%, -12dB ≈ 25%, and -40dB ≈ 1% - the point where we typically no longer care about, or trust, the data.
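Those values fall out of the voltage (20·log10) convention used for channel loss; a one-function sketch:

```python
def db_to_ratio(db):
    """Convert a voltage-convention dB value (20*log10) to a linear ratio."""
    return 10 ** (db / 20)

for db in (-3, -6, -12, -40):
    print(f"{db} dB -> {db_to_ratio(db):.3f}")
```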

S-Parameters - A useful "black box" way to characterize, model, and understand interconnect. Send frequency-rich stimulus into a passive structure, measure what makes it through versus what bounces back ("scatters"), and you have S-Parameters. SI tools added support for S-parameters in the 2000s, encountering challenges similar to adding support for IBIS models in the 1990s - both tasks requiring roughly 10 years to achieve reliable model generation and use.

AMI Models - While S-Parameters handled the nuances of passive structures, IBIS added AMI (Algorithmic Modeling Interface) constructs to handle equalization in active devices. AMI had its own adolescence, which occurred in the years surrounding 2010.

Eye Diagrams - Capture the timing element of the serial bit stream by wrapping the waveform back on itself every bit time or "Unit Interval" (UI), revealing an opening that looks like an "eye". An eye "mask" defines the necessary size of the eye opening to ensure reliable signal transmission.
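The wrapping described above amounts to folding sample times modulo one UI. A minimal Python sketch, with a made-up bit rate and sample times purely for illustration:

```python
BIT_RATE_GBPS = 12.0
UI_PS = 1_000.0 / BIT_RATE_GBPS  # one unit interval, ~83.3 ps

def fold_into_eye(sample_times_ps):
    """Wrap absolute sample times back into a single UI for eye plotting."""
    return [t % UI_PS for t in sample_times_ps]

# Samples spread across several bit times all land inside one UI:
print(fold_into_eye([10.0, 95.0, 170.0, 260.0]))
```

Plotting voltage against these folded times, over many bits, produces the overlaid traces whose opening forms the "eye".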

BER - The introduction of probabilistic Bit Error Ratios brought interconnect architectures that presume some data will not transmit successfully. These ideas were also not "new," yet previously an SI Engineer's system would crash if data was not received correctly. In contrast, serial links, by design, expect and recover from bad data. I fear this attribute has masked some bad SI designs, while giving the illusion the system is "working".

Inter-symbol Interference (ISI) - When one bit (or "symbol") interferes with a neighboring bit you have ISI. Reflections caused by discontinuities cause ISI, and so does pulse spreading due to loss and edge rate degradation. At any moment in time a channel has numerous bits bouncing around inside it; a 12" 28 Gbps channel holds more than 50 bits, and every inch in a 6 Gbps channel holds one bit. A bit's energy that does not make it to the Rx in the correct UI has become ISI to its surrounding bits.
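Those bits-in-flight figures are easy to check, assuming the same 6 in/ns propagation used earlier (the function name is mine):

```python
V_IN_PER_NS = 6.0  # assumed PCB propagation velocity, inches/ns

def bits_in_channel(length_in, gbps):
    """How many bits are simultaneously propagating in the interconnect."""
    flight_time_ns = length_in / V_IN_PER_NS
    return flight_time_ns * gbps

print(bits_in_channel(12, 28))  # 12" at 28 Gbps -> 56 bits, "more than 50"
print(bits_in_channel(1, 6))    # 1" at 6 Gbps -> one bit per inch
```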

Jitter - Much of the time variation of serial data is lumped into this concept, which has also become a science unto itself. The three elements in a channel - Tx, Rx, and the interconnect between them - each have their associated jitter, which is typically budgeted into thirds. Within an eye diagram, in addition to horizontal time jitter some refer to vertical "voltage jitter". Thinking this way, everything outside the eye opening is "jitter" working against you to compress the eye.

Clock Recovery - Whereas source-synchronous systems send the clock next to the data, serial links hide the clock inside the data. As such, many gates are deployed inside the SerDes to "recover" the clock so the data can be latched correctly.

There you have it, more than a dozen concepts that accompanied the widespread adoption of Serial Links.

In Conclusion

High-speed digital electronics is well into the era of Serial Links. Though they introduced significant change to the practice of Signal Integrity, Serial Links significantly extended the life of copper interconnects on PCBs. While point-to-point differential signaling improved the stability of PCB signaling, IC integration and equalization enabled encoding and recovery of very high-frequency signals. Be sure to get acquainted with the new terms, tools, and techniques that provide ever-increasing data rates and system performance. Serial technology is robust and will be with us for years to come.

