The Experts Weigh In on Serial-Data Link Testing

Today I read an interesting article in Electronic Design magazine about the complexity of testing third-generation serial-data technology. With comments from Agilent, Tektronix and LeCroy, the article highlights the problems associated with oscilloscopes, BERTs, AWGs and other “heavy-metal” products, and, not surprisingly, says nothing about the elegant alternative of using embedded instrumentation…

The title of the article was catchy: “Look Third-Generation Serial-Data Link Testing In The ‘Eye’.” It starts off with the basic premise that buses are getting faster, driving up the complexity of functional and compliance test requirements. It rightfully observes that receiver eye-pattern testing has emerged as a critical requirement for serial-data links. And there’s a striking visual of “sleepy” eye diagrams, where high data rates yield an eye diagram with no discernible eye.
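To make the “sleepy eye” idea concrete, here is a minimal sketch (not from the article; all numbers are illustrative) of how an eye diagram is formed: a noisy NRZ waveform is folded so that every unit interval (UI) overlays the same time axis, and the vertical eye opening is the gap between the lowest “1” sample and the highest “0” sample at the UI center. Crank up the noise and that gap shrinks toward zero, which is exactly the closed eye the article pictures.

```python
import numpy as np

# Illustrative sketch: fold a noisy NRZ waveform into unit intervals
# to form an eye diagram. Parameters are made up for demonstration.
rng = np.random.default_rng(0)

samples_per_ui = 32                               # oversampling per bit period
n_bits = 200
bits = rng.integers(0, 2, n_bits) * 2.0 - 1.0     # random +/-1 NRZ symbols

# Ideal waveform: each symbol held for one UI
waveform = np.repeat(bits, samples_per_ui)

# Channel impairment: additive noise (more noise -> more eye closure)
waveform = waveform + rng.normal(0.0, 0.2, waveform.size)

# The "eye": every UI overlaid on the same one-UI time axis
eye = waveform.reshape(n_bits, samples_per_ui)

# Vertical eye opening at the UI center
center = eye[:, samples_per_ui // 2]
opening = center[bits > 0].min() - center[bits < 0].max()
print(f"vertical eye opening at UI center: {opening:.2f}")
```

With the noise standard deviation raised from 0.2 to around 0.6, `opening` goes negative for this seed, i.e. the eye is fully closed even though the transmitted symbols are perfectly clean.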

After the introduction, the article quotes the experts from Tektronix, Agilent and LeCroy on how the test industry is addressing some of these issues. I’ve excerpted some of these comments and provided some insight below each.

“Instrumentation is costly and represents a huge investment for the designers,” said Michael Fleischer-Reumann, Agilent Technologies’ strategist for high-speed test.

This is absolutely true. The cost of validating and testing designs with multi-GHz oscilloscopes and other such gear keeps climbing steeply, with no end in sight. How much does it cost to validate PCIe Gen3 designs versus PCIe Gen2 or Gen1? The purveyors of hardware-based test equipment have little motivation to lower the cost of test for their customers – they’ll only do the minimum needed to remain competitive with each other.

“…eye diagrams and jitter analysis are done at the far (receiver) end of the channel. In reality, they’re still done at the transmitter but you move your reference point through emulation. Then the instrument extrapolates the effect of the channel,” said Chris Busso, product marketing manager for LeCroy’s high-speed serial data solutions.

This is again very true. But what Mr. Busso is really saying is that we want to measure at the receiver itself, but we can’t, so we use S-parameters and simulation to pretend we’re seeing the real signal. Wouldn’t it be great to be able to use embedded instrumentation within the receiver silicon to see the true signal?

“In the past, …you would measure as close as possible to the input to the device to see if the signal you sent has the right amplitude and vertical eye opening,” says Agilent’s Michael Fleischer-Reumann. “You had to take care to measure correctly but it was possible. Today the specification relates to a point inside the ASIC after equalization is done. When you measure input, you have to take the signal through simulation to see if the eye opening is right.”

“The intrinsic noise and jitter of scopes from all vendors is so dominant and significant that the eye opening you see on the scope is much smaller than it really is,” says Fleischer-Reumann.

This sounds like a direct admission that the oscilloscope cannot see the eye correctly because of its own noise and jitter. So simulate out the effect of the channel, and simulate out the ‘scope itself, and you’ve got a simulation of a simulation. Not all that reassuring.

The good news is that with embedded instrumentation, such as Intel IBIST, no simulation is required. ScanWorks HSIO software sees what the silicon sees. And the cost is a fraction of that of hardware-based products. Read more here: