Category: Intel® IBIST

In early 2011, Intel discovered a design issue in their Cougar Point chipset and took an approximately $700 million charge against earnings to repair and replace affected parts and systems. What may have been the root cause, and how might it have been prevented?
We know from empirical evidence that a system’s operating margins are as sensitive to the chips on the board as they are to the board’s design and manufacturing process itself. Why is this so?
In a previous blog, I described how fixed and adaptive equalization techniques are used within chips to ensure signal integrity even in adverse system conditions. Why is it important to tune these parameters within a chip?
Modern high-speed I/O equalization schemes typically include both fixed (programmable) and adaptive components to ensure signal integrity even in adverse system conditions. What tools are available to ensure that these equalization techniques are working properly on a given system?
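To make that concrete, here is a minimal sketch of what programming a fixed equalizer setting and waiting on the adaptive loop might look like. Everything in it is invented for illustration: the register names, offsets, and tap layout are hypothetical, not taken from any real chipset datasheet, and a simulated register array stands in for real memory-mapped I/O.

```c
#include <stdint.h>
#include <stdio.h>

/* Simulated register file standing in for memory-mapped I/O; every
 * name and offset here is hypothetical, not from any real datasheet. */
static uint32_t regs[0x80];

#define EQ_CTRL_REG   0x10    /* hypothetical TX equalizer tap control */
#define EQ_STAT_REG   0x11    /* hypothetical RX adaptation status     */
#define EQ_ADAPT_DONE (1u << 0)

/* Program a fixed (pre-cursor, main, post-cursor) TX tap setting, then
 * poll until the receiver's adaptive loop reports convergence. */
static int tune_equalizer(uint8_t pre, uint8_t main_tap, uint8_t post)
{
    regs[EQ_CTRL_REG] = (uint32_t)pre | ((uint32_t)main_tap << 8)
                      | ((uint32_t)post << 16);
    for (int i = 0; i < 1000; i++)
        if (regs[EQ_STAT_REG] & EQ_ADAPT_DONE)
            return 0;         /* link trained with these fixed taps */
    return -1;                /* adaptation never converged */
}

int main(void)
{
    regs[EQ_STAT_REG] = EQ_ADAPT_DONE;   /* simulate hardware converging */
    printf("tune_equalizer: %d\n", tune_equalizer(2, 40, 6));
    return 0;
}
```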
Ever wonder if a stray cosmic ray or alpha particle might double your bank account, due to an undetected RAM error?
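ECC memory is the usual insurance policy here. As a toy illustration of the principle (not any production memory controller's actual scheme), the Hamming(7,4) code below protects four data bits with three parity bits and corrects any single flipped bit; the SECDED codes in real DRAM controllers scale the same idea to 64-bit words.

```c
#include <stdint.h>
#include <stdio.h>

/* Toy Hamming(7,4) codec: 4 data bits, 3 parity bits, corrects any
 * single flipped bit. Bit i of a codeword is Hamming position i+1,
 * with parity bits at positions 1, 2, and 4. */
static uint8_t ham_encode(uint8_t d)
{
    uint8_t d0 = d & 1, d1 = (d >> 1) & 1, d2 = (d >> 2) & 1, d3 = (d >> 3) & 1;
    uint8_t p1 = d0 ^ d1 ^ d3;      /* covers positions 1,3,5,7 */
    uint8_t p2 = d0 ^ d2 ^ d3;      /* covers positions 2,3,6,7 */
    uint8_t p3 = d1 ^ d2 ^ d3;      /* covers positions 4,5,6,7 */
    return p1 | (p2 << 1) | (d0 << 2) | (p3 << 3)
              | (d1 << 4) | (d2 << 5) | (d3 << 6);
}

/* Locate and fix a single-bit error; return the corrected data bits. */
static uint8_t ham_decode(uint8_t cw)
{
    uint8_t s = 0;
    for (int pos = 1; pos <= 7; pos++)
        if ((cw >> (pos - 1)) & 1)
            s ^= pos;               /* syndrome = XOR of set positions */
    if (s)                          /* nonzero syndrome = error position */
        cw ^= 1u << (s - 1);
    return ((cw >> 2) & 1) | (((cw >> 4) & 1) << 1)
         | (((cw >> 5) & 1) << 2) | (((cw >> 6) & 1) << 3);
}

int main(void)
{
    uint8_t cw = ham_encode(0xB);   /* encode data nibble 1011 */
    cw ^= 1u << 4;                  /* simulate a cosmic-ray bit flip */
    printf("recovered: 0x%X\n", ham_decode(cw));  /* prints 0xB */
    return 0;
}
```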
What’s cheaper, faster, and more powerful than an oscilloscope when it comes to validating high-speed signal integrity? Why, a software application using embedded instruments, of course. How is this possible?
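As a rough sketch of the idea (the hooks below are hypothetical stand-ins, not a real embedded-instrument API), software can step the receiver's sample point in time and voltage, run a built-in pattern checker at each point, and plot the resulting error map as an eye diagram, with no probes required.

```c
#include <stdio.h>

/* Hypothetical embedded-instrument hooks: a real tool would move the
 * receiver's sample point through chip registers, then run a built-in
 * pattern generator/checker and read back an error count. */
static void set_sample_point(int phase, int vref) { (void)phase; (void)vref; }
static unsigned count_bit_errors(int phase, int vref)
{
    /* Stub: pretend the eye is open inside an ellipse so the sketch
     * prints something eye-shaped; real hardware reports miscompares. */
    return (unsigned)(phase * phase / 4 + vref * vref > 64);
}

/* Sweep timing (phase) and voltage (vref) offsets around the nominal
 * sample point and print a crude eye diagram: '#' = errors, '.' = clean. */
int main(void)
{
    for (int v = 8; v >= -8; v--) {
        for (int p = -16; p <= 16; p++) {
            set_sample_point(p, v);
            putchar(count_bit_errors(p, v) ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}
```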
In my last few blogs, I’ve talked about the challenges of testing QPI, PCI Express, SATA 3, and DDR3 memory. These buses are common to many Intel Sandy Bridge and Ivy Bridge motherboard designs. Should test engineers take chances and just not test them?