When Should I Simulate?

Wednesday, September 5, 2018

This question is not asked as much as it should be. Companies that simulate routinely, and those that never simulate at all, can both benefit by asking it. As a consultant, part of my value-add is asking and answering this question for my customers: defining effort and expense commensurate with the task at hand – no more, no less. Learning how to answer this question is one of engineering’s “soft skills” and the subject of this post.

 

To be sure, inventors and purveyors of technology (IC companies, standards bodies, IP suppliers, etc.) have already done a great deal of Signal Integrity (SI) simulation before their products reach your door – unless, of course, they are copycat or “me too” products that leverage the simulation work of the inventors. Either way, significant simulation has already been done to ensure a technology is viable and robust. (Side note to inventors of new technology: YES, you need to simulate!) Indeed, it is very hard to sell electronic devices that carry the caveat “you must simulate this device in your system to ensure it will work correctly”. Say what? Well, I’ll buy from your competitor then. And so IC companies dance on the line of making sure their product operates correctly in its intended application without simulation, while also telling you simulation is “recommended”. Which brings us back to the original question: “When Should I Simulate?”. The answer depends on the following three questions.

 

The Answer Lies in 3 Questions

When looking at a project, learn to step back and ask yourself these questions:

 

1. Is this new technology?

2. How close is this to something I’ve done before?

3. Am I using these devices the way the manufacturers recommend and expect?

 

While there’s some overlap, nuances require that we ask each question separately. Let’s take a deeper look.

 

Question 1: Is this new technology?

While everything builds somewhat on what has been done before, it is not uncommon – especially in electronics – to be confronted with something that is entirely “new”. For example, a decade ago I was asked to help with a project that had an unwieldy amount of “newness”: the data rate had never been achieved, it had to comply with an open standard that was not yet finalized, and it used new transceivers on new components implemented at a new process node – none of which had ever been used before. On top of that, it had to be simulated using a new and untested model format implemented in simulators that may or may not have been interpreting the format’s new industry standard correctly. One equation with six unknowns. Indeed, here was something “new” – so I was brought on board to navigate this conflux of technology. Yes, we needed to simulate, but could we even believe the results?

 

While that example is extreme, any one of those attributes alone would have qualified the project as “new” and warranted simulation. Furthermore, even if none of the items I listed were new, applying proven technologies to a form factor (including very short connections) and/or a transmission medium never used before makes everything “new” again – and we need to simulate. This brings me to the next question.

 

Question 2: How close is this to something I’ve done before?

The answer here is fuzzier because it changes with both the engineer and the company. For example, a company might task a young engineer with simulating a DDRx interface it has implemented many times. In this case, simulation is warranted primarily as a training exercise, requiring the new engineer to wade through analysis setup complexities and learn how to interpret the results – which will no doubt reveal many “false” failures that must be reconciled against the fact that the interface has a very high probability of working acceptably. The situation is similar for the more experienced engineer who is asked to change simulators: tools run “hot” or “cold”, making it important to calibrate simulation results in each new environment. While these are examples of “newness”, they relate primarily to levels of experience.

 

One of the main reasons companies search for signal integrity consultants is that their design requires something they have never done before. As a consultant, I have often handled their scenario many times. In these situations I may say that – while we certainly can simulate – it is likely not necessary as long as you do this, that, and the other (come back for my next post to learn what those items are). Indeed, if the components couldn’t handle the application the customer is attempting, they would not be on the market; the implementation is new to the customer yet not new to the industry as a whole. Familiarity breeds confidence, and so your answer to this second question is found in how many times you have successfully implemented the interface in the given configuration. Which brings me to the next question.

 

Question 3: Am I using these devices the way the manufacturers recommend and expect?

Answering this question can be more challenging. While many devices come with “layout guidelines” that describe the anticipated design and implementation space (aka “the box”), others do not. Assuming you can’t discuss your application in detail with the device’s manufacturer, skill is required when perusing their documentation to determine to what degree your application is inside or outside their “box”. Signal integrity simulation’s primary application and added value over the years has been to help systems companies differentiate their products by intelligently going outside “the box”. Which somewhat brings us back to Question #1: you are attempting something “new” with this device – be it longer lengths, different materials, more loads, etc. – and you need to simulate.

 

However, should you discover your application is well within the manufacturer’s “box” AND the manufacturer can demonstrate that a substantial amount of correlated simulation was performed to derive and validate that “box”, you may decide not to simulate – particularly when buying from vendors known to have significant SI staff and experience. Notice I said “correlated simulation”. While I am a long-time simulation user and have built my career around it, that also means I have discovered its weaknesses many times. Those discoveries came through correlating simulation results with hardware measurements. There have been surprises brought on by higher-frequency effects, manufacturing quirks, model insufficiencies, and other problems we didn’t plan for. And so the best companies and engineers understand the need for hardware correlation, and should be able to demonstrate the same.

 

Hmm, maybe I won’t simulate this time

If you’ve made it through the three questions and you’re entertaining the idea of not simulating, there’s one more thing we need to discuss: NOT simulating implies you are completely confident that you know how to implement the high-speed signals in question correctly. “Implement” means you understand the nuances of how to handle the signals with regard to their layout, construction/materials, topology, lengths, matching, spacing, etc. Recall that what makes a signal “high-speed” is that you can’t simply connect it in an arbitrary fashion; there’s something required in the way the connection is made. And so not simulating is your declaration that you know what that is.

It’s possible your company has adequately captured the net’s unique requirements after a variety of successful implementations of the interface that have withstood the challenges of volume manufacturing. Or, perhaps your silicon vendor has persuaded you their layout guidelines are reliable because they’re based on correlated simulation as well as successful implementations on demo boards and other customer designs. Or maybe you downloaded the applicable layout rules from the internet (yikes!) and are inclined to believe their source. Whatever the case, be advised that layout guidelines often contain folklore and “it worked last time”, while not including the original assumptions they were based on. Lack of confidence in what has been handed down is another great reason to simulate. Simulation removes folklore.

 

As always, successful implementation is the end goal. Figure 1 shows a flowchart to lead you there through the steps we’ve discussed.

Figure 1: A process to help you decide when to simulate
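If it helps to see that flow written out, here is a minimal sketch of the decision logic in Python. It is illustrative only: the function name, inputs, and thresholds are my own assumptions, not a formal methodology – your checklist will reflect your team’s experience and your vendor’s documentation.

```python
# Illustrative sketch of the Figure 1 decision flow. The inputs and their
# names are assumptions for this example, not a formal or vendor-endorsed rule set.

def should_simulate(
    is_new_technology: bool,             # Question 1: anything genuinely "new"?
    successful_implementations: int,     # Question 2: how many times have I done this?
    inside_manufacturer_box: bool,       # Question 3: within the recommended usage?
    vendor_correlated_simulation: bool,  # was the "box" validated against hardware?
    confident_in_layout_rules: bool,     # do I trust the handed-down rules (no folklore)?
) -> bool:
    """Return True when the three questions point toward simulating."""
    # Question 1: new data rates, silicon, media, or form factors warrant simulation.
    if is_new_technology:
        return True
    # Question 2: familiarity breeds confidence; a first-time implementation should be simulated.
    if successful_implementations == 0:
        return True
    # Question 3: outside the manufacturer's "box", or a "box" never derived from
    # correlated simulation, also warrants simulation.
    if not (inside_manufacturer_box and vendor_correlated_simulation):
        return True
    # Final check: skipping simulation is a declaration that you know exactly
    # how to implement the high-speed signals in question.
    return not confident_in_layout_rules


# A familiar interface, inside the vendor's box, backed by correlated simulation
# and trusted layout rules: simulation may reasonably be skipped.
print(should_simulate(False, 3, True, True, True))    # False
print(should_simulate(True, 0, False, False, False))  # True
```

The point is not the code itself but the order of the checks: the earlier a question flags something as “new”, the less the later questions matter.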

 

 

In Conclusion

Yes, there are situations where simulating might not be a good use of time and resources. The goal of this post is to help you identify when that might be true by asking three questions: (1) is this “new”, (2) have I done this before, and (3) am I outside “the box”? Good engineering judgment must be applied on any project, which – based on your answers – may lead you to the conclusion that simulation is necessary. If you do not already have the capability, both SiSoft and SiGuys possess simulation know-how and are experienced at augmenting both large and small design teams. And if you’re in the market for simulation tools, have a look at SiSoft’s products. They have been rigorously correlated to yield accurate results for their targeted applications.

Donald Telian - Guest Blogger, SiGuys 9/5/2018
