About Verification 3.0
Verification 3.0 represents a new plateau in the continuing evolution of semiconductor verification: a clearly observable new balance among the various solutions that make up modern verification environments.
Verification 1.0 was characterized by HDL simulation on workstations, directed tests as the primary testbench methodology, block-level chips integrated at the board level with off-chip processors, and perpetual license business models. Verification 2.0 saw the advent of SystemVerilog, constrained-random test generation, simulation farms, emulation, small Systems-on-Chip (SoCs) with on-board processors running basic software stacks, static techniques such as linting and early formal, the advent of Verification Intellectual Property (VIP), and time-based licensing.
Verification 3.0 rests on four legs, not all of which are fully developed today. They are: the continuum of verification engines, the intelligent testbench, the merging of hardware and software, and the expanding role of verification. Each of these legs has multiple facets, so while some themes may sound familiar, each one of them is seeing significant transformation.
Verification 3.0 has seen the creation of hybrid verification platforms combining the best of simulation and emulation, with formal accelerating components of the flow. FPGA prototyping and real silicon must be brought into the continuum of verification engines, and we will see combined front-end and back-end processing wrapped around the different engines. The manner in which these technologies are delivered must also change. No longer will companies own and maintain all the compute resources necessary to meet peak demand, so distributed and Cloud-based models will play an increasingly important role. On-demand business models will evolve as well.
The Intelligent Testbench
Development of Verification 3.0 testbenches started about 10 years ago with what Gary Smith termed the Intelligent Testbench. Today, we are not only seeing mature tools in this area, but the imminent release of the Accellera Portable Stimulus Standard will enable portability of test intent amongst vendors. For the first time we are witnessing a true executable intent specification driving the entire verification process. Many aspects of the flow, such as debug, will be transformed.
Merging of Hardware and Software
Today, Moore's Law is slowing down such that companies can no longer rely on scaling to pack in more functionality. They must get a lot more creative. A typical chip contains a significant number of deeply embedded processors that rely on firmware to provide their functionality. SoC verification is driving a shift from hardware-driven to software-driven testing, where the tests are built into C code running on the processors. Portable Stimulus has a large role to play here. Verification at this level is not just about hardware execution (e.g. cache coherency), but also software functionality.
Expanding Role of Verification
Verification teams are taking on additional roles in power and performance verification, and increasingly have to address safety and security requirements. Systematic requirements tracking and reliability analysis are central verification tasks for many designs. Formal has an increasing role to play as well: it can offload tasks from dynamic execution engines and prune the state space that tools have to cover. Design debug and profiling, together with multi-run analysis, are moving up in abstraction in an attempt to leverage artificial intelligence on large data sets.
No one company can tackle the entirety of the Verification 3.0 challenge. Small companies have innovative ideas and the ability to turn on a dime in response to market needs and business models. These are the companies most likely to spearhead the direction that the technology takes, and Verification 3.0 will see greater cooperation among them.