Formal Verification Sign-Off...and the First Text Message

Recently, it was the 10th annual Jasper User Group meeting (see my earlier post Jasper User Group 2017 for more background and a summary of the two best papers presented, at least as judged by the audience). There were also two invited papers. The first day was opened by the extensively-named M V Achutha Kiran Kumar presenting Formal Verification Lead—An Interesting Career, and the second day was opened by Vigyan Singhal of Oski Technology presenting The Evolution of Formal Verification Sign-Off. See my post JUG: How to Be a Formal Verification Lead for details on the first day's invited paper.

Vigyan Singhal is the CEO of Oski Technology. Prior to that, he was the founding CEO of Jasper, becoming CTO when Kathryn Kranen joined. Here is your trivia question of the day: when Jasper was founded, it wasn't called Jasper. What was it called? Before founding Jasper, Vigyan was a researcher at Cadence. He has been working on formal verification since 1993, which is about as long as anyone can make that claim.

Hardware Design is Harder Than Software

The productivity of hardware design, in terms of lines of code, is 10X lower than that of software design. Every hardware design is what software engineers would call "massively parallel," since every always-block is a thread. And there is much more to worry about than just functionality: power, silicon area, low-level performance (within a clock cycle), and a lack of high-level abstractions such as synchronization or object-oriented programming. Of course, the worse problem is that you can't just build the chip and see if it works, so verification is critical.

Simulation vs. Formal Verification

Verification requires a different mindset, since you can't verify a design the same way it was designed. Simulation cannot cover all cases, and there is always the possibility of subtle corner-case bugs surviving. Formal is exhaustive and, if done properly, no bug will be left behind. The challenge with formal is that there is a third answer between proving correctness and finding a counterexample: the tool runs out of time or space before getting a definitive result.

Formal is very different from simulation. Simulation is what Vigyan calls zero-plus: you start with nothing and add some vectors, then you add some more. It is hard to reach your verification target, even with a mixture of random and directed tests. Formal, on the other hand, is infinity-minus: you start at 100% by virtue of the technology, but there are false failures and failures to converge, so you add constraints. You can even reach the point where the design is over-verified. A big challenge is that the algorithms are exponential. It may take two hours to verify 36 cycles but four hours to verify 37, and pushing up to 47 cycles may be measured in years.

Formal verification has had 24 years of evolution. It started with Chrysalis doing flops one clock at a time in the 1990s. Then came cones of logic in the 2000s, moving up to blocks in the 2010s, with whole systems starting about now and expected as the technology develops further over the next few years. Algorithms have gotten better and computers have gotten faster, but there has also been the creation of a formal methodology. At the beginning, it was just equivalence checking, but even that was a big step in getting formal technology out of academia and into industry. It became must-have technology as we moved to static signoff after logic synthesis and place & route.
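As an illustration of the underlying idea only (production equivalence checkers do not work by enumeration), here is a minimal Python sketch of combinational equivalence checking: a "golden" behavioral model and a hypothetical gate-level-style reimplementation are compared over every possible input, so the result is either a proof or a concrete counterexample.

    from itertools import product

    # "Golden" reference: a 4-bit adder with carry-out, written the obvious way.
    def golden_adder(a: int, b: int) -> int:
        return (a + b) & 0x1F                       # 5-bit result (sum plus carry-out)

    # "Implementation": the same function restructured as ripple-carry logic,
    # standing in for a synthesized gate-level netlist.
    def impl_adder(a: int, b: int) -> int:
        result, carry = 0, 0
        for i in range(4):
            abit = (a >> i) & 1
            bbit = (b >> i) & 1
            result |= (abit ^ bbit ^ carry) << i    # full-adder sum bit
            carry = (abit & bbit) | (carry & (abit ^ bbit))
        return result | (carry << 4)

    # Miter-style check: the designs are equivalent iff no input makes them differ.
    def check_equivalence() -> None:
        for a, b in product(range(16), repeat=2):   # all 256 input pairs
            if golden_adder(a, b) != impl_adder(a, b):
                print(f"Counterexample: a={a}, b={b}")
                return
        print("Proven: outputs match for all possible inputs")

    check_equivalence()

Real tools reason structurally and with SAT/BDD engines rather than enumerating inputs, which is what lets them scale to millions of gates, but the contract is the same: either a proof of equivalence or a counterexample.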
These tools (such as Cadence's Conformal LEC) have gotten more powerful, have kept pace with the limited sequential optimizations done by synthesis tools, and are still a crucial part of signoff today.

Assertion-Based Verification

The other type of formal is assertion-based verification. Initially, this handled internal RTL assertions (such as one-hot flops in a state register), protocol assertions, and interface assertions. The big challenge in making formal less of a niche technology was moving the methodology up to blocks: functional units such as memory controllers or instruction schedulers, the kind of block a single designer would create. These were too complex for push-button formal and required a methodology.

The four C's of the assertion-based methodology are checkers, constraints, coverage, and complexity. Each of the four requires effort and creativity. Blocks are usually best checked with end-to-end checkers: create a reference model, then verify that the behavior of the block matches it (a toy illustration of this idea appears after the sign-off discussion below). Developing these models is expensive and not trivial for most blocks. The reference model can have its own internal state machines, assertions, and so on. Ideally, you don't look at the RTL to write the reference model (to avoid making the same mistake twice). In practice, you need to look at the RTL to see what might occur in the way of complexity, but the checkers should not be influenced by the RTL.

At the same time, constraints are required to restrict the input space to only legal input sequences. It is possible to over-constrain (a common newbie mistake is to over-constrain so much that all input sequences are illegal and the block is trivially correct). The best approach is to start with an intentionally under-constrained environment and add constraints only when required (false counterexamples are the main way to identify this). This both avoids over-constraint and helps keep proof complexity low.

Another relatively recent but very powerful development is the proof core. In effect, this is like asking "Mr. Checker, which lines of code were needed to verify these 35 cycles?" That makes it possible to see holes in coverage immediately. The proof core is not always exact and precise, but in practice anything reported inside the core almost always belongs there. It does take some time, and the results can be demoralizing, since you can see immediately what you have not done, and thus how much is left to do.

Sign Off

The ultimate question is "Can we sign off?"

Checkers: Do we have enough checkers to cover the complete functionality?
Constraints: Are there unintentional over-constraints in the design?
Coverage: Is it complete enough?
Complexity: Is the achieved proof depth sufficient?

These aren't just yes/no questions; the answers should be tracked throughout the project. It is challenging, since it requires systematic effort, patience, formal expertise, and creativity. But the rewards are that the formal testbench is usable for future design revisions, formal sign-off offers more confidence than simulation sign-off, and quality blocks lead to quality chips.
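To make the checkers and constraints concrete, here is the toy illustration promised above. It is a hypothetical Python sketch, not Oski's or Cadence's methodology: a two-entry FIFO "design" is checked end-to-end against a trivially correct reference queue, constraints rule out illegal stimulus (push when full, pop when empty), and exhaustive enumeration of all legal sequences up to a bounded depth stands in for the formal engine.

    from itertools import product

    FIFO_DEPTH = 2      # toy design parameter
    MAX_STEPS = 6       # bounded "proof depth": cycles explored exhaustively

    # Toy design under test: a fixed-depth FIFO built from explicit slots.
    class FifoRtl:
        def __init__(self):
            self.slots = [None] * FIFO_DEPTH
            self.count = 0
        def push(self, value):
            self.slots[self.count] = value
            self.count += 1
        def pop(self):
            value = self.slots[0]
            self.slots = self.slots[1:] + [None]
            self.count -= 1
            return value

    # Reference model: a plain Python list used as a queue.
    class FifoRef:
        def __init__(self):
            self.q = []
        def push(self, value):
            self.q.append(value)
        def pop(self):
            return self.q.pop(0)

    def legal(op, count):
        # Constraints: never push when full, never pop when empty.
        return count < FIFO_DEPTH if op[0] == "push" else count > 0

    def check():
        ops = [("push", 0), ("push", 1), ("pop", None)]
        # Enumerate every sequence of operations up to MAX_STEPS cycles.
        for seq in product(ops, repeat=MAX_STEPS):
            dut, ref = FifoRtl(), FifoRef()
            for op in seq:
                if not legal(op, dut.count):
                    break                          # constrained-away (illegal) stimulus
                if op[0] == "push":
                    dut.push(op[1])
                    ref.push(op[1])
                elif dut.pop() != ref.pop():       # end-to-end checker
                    print("Counterexample:", seq)
                    return
        print("No mismatch in any legal sequence up to", MAX_STEPS, "cycles")

    check()

A real formal testbench would express the reference model as checkers, the constraints as assumptions on the interface, and let the proof engine explore sequences without an explicit bound; the sketch only shows how checkers, constraints, and the notion of proof depth fit together.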
Systems

Going up a level from blocks to systems has two aspects. There are the blocks that can be verified using the methodology from the previous section. But then there are things like networks-on-chip (NoCs) that require a different approach. I described this in more detail in my post, Decoding Formal Club: Arm and Arteris, which uses the Oski methodology.

Basically, the verification is broken into two parts. First, build architectural models and verify that the system works correctly. Then verify that the implementation of each block is a subset of the architectural block, so that it behaves the same as the architectural block for all valid inputs. Both steps can be time-consuming, but the reward is confidence in critical system-level behavior that you cannot prove with "traditional methods" (which is a formal engineer's name for simulation). You get guaranteed safety, reliability, and security.

Using formal verification throughout the design gives you early checks from designers during bring-up, exhaustive correctness of blocks, architectural validation of the system, and potentially full signoff. The responsibility for quality sits with the right stakeholders at the right time. With no increase in resources, the whole schedule "shifts left" compared to simulation-only verification. But going all the way is hard: getting from 60% to 90% is easy; getting from 99.2% to 99.6% is very difficult.

Vigyan wrapped up with some personal notes. Formal verification is very rewarding for the committed, and you can spend a lifetime going deeper. Most of the engineers in his company were not born when he started doing formal verification, and he still learns things from them. It is a segment that is rich with innovation, not just on the Cadence side with the tools, but on the user side too.

There is a well-known joke about two hikers being approached by a bear. One hiker puts on his running shoes. The other tells him not to be silly, that a person can't outrun a bear. "I don't need to outrun the bear," the first hiker says. "I just need to outrun you." Formal verification doesn't have to be perfect, it just has to be better than simulation.

The Answer

The original name of Jasper Design Automation was Tempus Fugit.

Why "Merry Christmas" Was Significant Yesterday

Nothing to do with formal verification, but yesterday was the 25th anniversary of the first text message. SMS (the GSM Short Message Service) was really intended for internal use by the network, to configure phone settings and display short communications from the network. Given those requirements, mobile phones could only receive text messages, not send them. The first use was on 3rd December 1992, when Neil Papworth (of Sema Group in the UK) used a PC to send the message "Merry Christmas" to Richard Jarvis, who was at a party organized to celebrate the event. Who knew just how significant it would become? It would be another couple of years before Radiolinja (in Finland) became the first network to offer its subscribers the ability to send and receive text messages. On a phone with just a numeric keypad, it was a pain to type a message, but even 25 years ago teenagers had thumbs!

Back in that era, nobody expected text messaging (and its later kindred systems like WeChat, WhatsApp, iMessage, Facebook Messenger, and so on) to become the primary means of communication. Mobile plans were for a certain number of voice minutes, and everything else was thrown in (there was no data networking yet). Networks realized how big a market text messages were becoming, and that they could sell essentially zero bandwidth for a dime a time. In that era, they also discovered they could charge for ring-tones, and that became a billion-dollar business. Now, of course, voice is used so rarely that networks throw it in for free with a data plan. Text messages too.

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.
