I need to vent a little.
I’m a big believer in Formal. I’ve been using it productively since the tools were barely strong enough to fight their way out of a paper bag. And we’ve come a very long way since then.
So I find it incredibly frustrating when I ask colleagues who aren’t using Formal why they're not, and for the umpteenth time I hear “Well we did an evaluation and we found it wasn’t worth the overhead.”
And I never understood it, because I KNOW what Formal can do. How are the design shops missing the boat, and how are the vendors letting it happen?
Recently, I got a close view of an evaluation performed at a colleague’s company.
I was torn between laughing and crying, at the irony of the good effort and intentions juxtaposed with the result. I did some of both, then decided to write about it, in hopes that I might provide a little productivity insight for future evaluations.
Here’s my story.
About three months ago, I was contacted by a friend of a friend whose company was interested in Formal. I was invited to visit by her manager, and prepared to give them a basic, vendor-agnostic understanding of the concepts. Why? Well, let’s face it, Formal is a major paradigm shift, and often a counter-intuitive one. Not every able Verification Engineer is capable of making the shift easily.
I wanted them to know what they were getting into.
When I got there, they weren’t interested in the concepts. This was a confident group of self-starters, who didn’t have time for that. The manager had a list of questions on which of their designs would be appropriate for an evaluation, and I tried to answer as well as I could. What I really should have done was to insist that they understood the concepts better. This was the first warning sign, which I ignored. Mea culpa.
So here is my Observation #1: There’s a bit of a Catch-22 here.
The groups that are brave enough to try Formal are by definition the ones who are self-confident enough not to be too intimidated to try it. But that same self-confidence leads them to underestimate the mental preparation required to be effective at using the technology.
They ended up picking a bridge with an industry standard interface on one side and a proprietary interface on the other. This is a classic design for Formal, and I agreed that it would make an excellent test case. Rather than asking, I just assumed (yes, I know) that their method of attack, and success criteria, would magically spring into existence somewhere between them and the Formal vendor. Mea culpa #2.
Their next step was to contact one of the Formal vendors (no, I’m not going to say which one) and arrange the licensing, as well as FAE support, for the evaluation.
Observation #2: Here’s where we dig a deeper Catch-22 hole.
The vendor’s sales guy has worked very hard to get his foot in the door.
The very last thing he’s going to do is to put off the customer by warning him of challenges in getting Formal to be effective. He’s going to promise the sun, the moon, and the stars, and minimize any fears of being able to use Formal.
If he does have any educational material, it’s all going to be devoted to the features that differentiate his tool from his competitors’, with nothing deep on the concepts of how to use the technology.
So the FAE came, and they spent a few months doing the evaluation. As a professional courtesy, they let me take a look at the Formal environment and properties they had written during this period. It was clear a lot of work went into it, mostly by the Verification Engineer, with some supervision from the FAE.
I would argue that a long evaluation is a good thing. It gives the engineers a chance to get some real education. With Formal there is a definite point of critical mass where enough expertise is acquired to do some really good work with the technology, inspiring confidence and a willingness to do more. However, evaluations are expensive, so they are inevitably time-limited. This tension needs to be resolved.
Observation #3: An FAE’s life is not an easy one.
I should say that I’ve worked with some true supermen in this position, and some who didn’t quite achieve this level. Their primary training, and purpose in life, is to show the customer how to use the tool, and to resolve issues when they get stuck.
Conceptual training for Formal is both beyond the time they’re allotted and possibly beyond the scope of their own training.
They had written dozens of constraints, and dozens of assertions.
They even found a few bugs using those assertions.
All of their properties either passed or timed out, which justifiably concerned them.
Success, right? Wrong.
So I dug into the property files, and what I saw almost knocked me off my chair. Here are some of the things I noticed just on the surface:
Observation #4: Verification Engineer on the ball
I should note that the last thing I’m doing here is faulting the Verification Engineer. I’ve got independent confirmation that she’s very capable, and very sharp. So what went wrong?
In the absence of close, conceptual guidance, she forged ahead with the methods that worked for her in her non-Formal past, and DID achieve results. So why was I bothered by those results?
Let's take a look at why these items are issues for a Formal environment:
So to best exploit Formal, you should start off with NO constraints, then add them only as necessary to deal with false failures (i.e. scenarios which, by definition, can never occur). If you implement a testbench (e.g. a state machine which drives the stimulus to the DUT), then you're reducing your search space (i.e. the scenarios that will be checked) to what simulation would execute, SO YOU MIGHT AS WELL BE RUNNING SIMULATION!
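As a sketch of the difference (all signal and state names here are hypothetical, invented for illustration, not taken from the evaluation itself):

```systemverilog
// Formal-friendly: a single assumption that rules out only the illegal
// stimulus (a request dropped before it is granted) and leaves every
// other input sequence free for the tool to explore.
assume property (@(posedge clk) disable iff (!rst_n)
  req && !gnt |=> req);

// Testbench-style over-constraint: a driver FSM that scripts the stimulus.
// Under Formal this collapses the search space to the one sequence the
// FSM can produce, i.e. roughly what a directed simulation would run.
always_ff @(posedge clk) begin
  if (!rst_n) begin
    req   <= 1'b0;
    state <= IDLE;
  end else case (state)
    IDLE:     begin req <= 1'b1; state <= WAIT_GNT; end
    WAIT_GNT: if (gnt) begin req <= 1'b0; state <= IDLE; end
  endcase
end
```

The first form tells the tool only what can never happen; the second tells it exactly what will happen, which defeats the purpose of exhaustive exploration.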
But there are a number of gotchas with this:
- This can result in much longer run times
- This can result in MASKING REAL BUGS.
‘Forever’ is a really long time. In SVA, the $ in a delay range such as ##[1:$] means the response may arrive any number of cycles later, with no upper bound. In the case of this evaluation, the property was identifying and accepting the response to the next request, while never flagging cases where the current request never received a response at all. Once we replaced the $ with a reasonable finite number, a bug jumped right out.
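A hedged before-and-after sketch of this kind of fix (the req/ack handshake names and the bound of 16 cycles are hypothetical, not from the actual evaluation):

```systemverilog
// As written: the response window is unbounded. This can silently match
// the ack belonging to a later request, and a request that is never
// answered produces no finite failure trace for the tool to report.
property resp_unbounded;
  @(posedge clk) req |-> ##[1:$] ack;
endproperty

// Tightened: demand the ack within a concrete, justified bound. A request
// that starves now fails within a bounded trace the tool can exhibit.
property resp_bounded;
  @(posedge clk) req |-> ##[1:16] ack;
endproperty
assert property (resp_bounded);
```

The bound should come from the design spec (maximum response latency), not from guesswork; an arbitrary bound just trades one blind spot for another.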
As an example, assume we are working on a 64-wide by 32-deep FIFO. As such, there are 2K bits for the Formal tool to handle. To reduce the flops to a manageable number, we could reduce the depth of the FIFO to 4, say, leaving only 256 flops. But at a depth of 4, some of the problem scenarios we're hoping to catch may never be examined. On the other hand, if we can identify that all columns of the FIFO are treated identically, end-to-end, in our DUT (as is often the case), we can cut our width to 1 and leave the FIFO depth at 32. This allows deeper scenarios to be examined, while still only having a FIFO size of 32 bits. Note that this is a gross over-simplification that avoids issues such as sequential depth, but I use it to illustrate the point.
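Assuming a parameterized FIFO module (the fifo module name and its WIDTH/DEPTH parameters are hypothetical), the three configurations above might be instantiated like this:

```systemverilog
// Full-size DUT: 64 x 32 = 2048 state bits for the Formal tool to track.
fifo #(.WIDTH(64), .DEPTH(32)) dut_full    (.*);

// Naive reduction: shrink the depth to 4. Only 64 x 4 = 256 bits, but
// deep scenarios (e.g. behavior near full at depth 32) are never reached.
fifo #(.WIDTH(64), .DEPTH(4))  dut_shallow (.*);

// Symmetry-based reduction: if every data column is treated identically
// end-to-end, cut the width to 1 and keep the full depth. Just
// 1 x 32 = 32 bits, yet all depth-related scenarios stay reachable.
fifo #(.WIDTH(1),  .DEPTH(32)) dut_narrow  (.*);
```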
So back to the story… The customer’s perspective after the evaluation was somewhat lukewarm. They still had hopes of using Formal for their next project, but didn’t quite see it as an indispensable wonder tool. Only time will tell if that’s enough to get it introduced into their process, given all the other resource pressures.
From my perspective, this was a golden opportunity wasted.
After much thought, here are my criteria for a successful Formal evaluation:
- Positive steps in the user’s “HOW to use Formal” education. While absorbing the concepts of Formal is a large “ask,” and I wouldn't expect the user to get it all in just one evaluation, I would hope that the user would have picked up the correct methods for writing SVA assertions for Formal, writing SVA constraints for an interface, etc. This clearly wasn’t done here. Worse, they were taught some bad habits. Almost like teaching BASIC to a fresh programmer.
- Ending up with a clean design. If the design picked for the evaluation is reasonable in character and scope for Formal, I'd expect us to end up with a set of Formal constraints and properties which guarantees that the design is clean. We weren’t even close in this case. It might help if there was agreement beforehand as to the design functionality that should be verified as part of the Formal evaluation.
- Positive steps in the user’s “WHEN to use Formal” education. I’ll admit it’s not reasonable to expect the user, from a short evaluation, to see ALL the benefits of Formal, or to be able to judge EXACTLY when to apply it. These both come with time and experience. But certainly the user should have come away with the answers to these questions as they relate to THIS design. And that didn’t happen either.
- Ensure a rousing success. This was a design that was well suited to Formal. The user should have come away from it with the feeling that Formal is a wonder tool, that it cleaned up their design in a way that simulation could never have done, and that the overhead required to get there was small enough that they could never live without Formal again. But they ended up still on the fence.
Coming back to my question above, everyone involved in the effort was a sincerely motivated professional. So what went wrong?
There’s probably material here for a good “Game Theory” paper, but at the simplest level, everyone went for short-term goals without considering any long-term opportunities. The customer was worried more about the limited time and resources in doing the evaluation than about getting the necessary training for long-term success in Formal. So they didn’t prepare themselves for the mindset intrinsically required for Formal. The vendor was worried more about losing the sale, and his FAE’s time, than about making sure the customer would achieve long-term success in Formal. So they skipped a few steps. And naturally we'll skip talking about my mea culpas ;-)
Even if the sale is made, the odds of the customer continuing to use Formal effectively enough to justify its existence, let alone receiving the full advantages it can offer, are small.
And that’s how these opportunities go by the boards.
Someone in the process, be it the vendor, the customer, or likely both, needs to make sure that evaluations of this tool also address what it takes for the customer to be successful at using Formal long-term, not just running it for a few weeks.
I'm told by one of my astute vendor contacts that while many evaluations historically have been mismanaged, it's beginning to change. The vendors are learning from experience and making sure that evaluations are set up properly for success. They are also engaging in training programs for the entire team. Sadly, this change isn't commonplace [yet].
One answer that seems to have worked for Formal, as well as other new technology introductions, is a healthy level of expert consulting help to get the first project moving in the right direction. This could start at evaluation time and continue through the first project.
The vendor could provide this, or better still, an independent contractor with a deep knowledge of Formal can help train the verification team in the right direction, bridging the learning curve gap while also accelerating the curve itself. Some vendors have tried to employ their own consulting teams to fulfill this purpose, whereas others have formed a consulting partners program with experts that are trained on their tools.
I've re-read the previous paragraph several times, hoping for a way to make it sound less self-serving. Sadly, I wasn't able to. In my defense, I've really seen this work, at customers of mine who started with zero experience in Formal, and today are successfully using it as an integral part of their methodology, at multiple development sites, using their own in-house engineers, with no outside help.
To sum up, Formal can provide a huge benefit if applied correctly. While it is becoming easier to use, some knowledge of how it may be effectively applied turns it from a useful addition to simulation into a wonder tool. Investing in a wise use of time and effort during the evaluation can pay off in spades, providing long-term value and benefit to the customer, as well as to the rest of the eco-system.
Elchanan Rappaport is the founder of Gila Logic, Inc. He has 30 years’ experience in both simulation and formal-based verification, having worked with a number of semiconductor companies, including IBM, TI, National, Microsoft and others. For the last 15 years he has focused on high-end formal verification projects and is a frequent speaker at formal conferences and events. He may be contacted at email@example.com.