IC Verification Steps Up Amid Rising Chip Complexity

Siemens’ Mark Olen explains how modern cloud-based software can bolster chip design and verification engineering.

Integrated circuit (IC) design verification has undergone significant changes in recent years due to the increasing complexity of ICs and the need to ensure their functionality, reliability and security. Traditionally, verification involved manually writing and running test cases, which could be time-consuming, error-prone and often inadequate to ensure complete verification coverage.

As digital technology has progressed, IC verification has become more automated and sophisticated. Advanced verification software tools, such as Siemens’ recently launched Questa Verification IQ, utilize AI, machine learning and cloud computing to enhance the efficiency and effectiveness of the verification process.

These software tools can automatically generate tests, identify coverage gaps and analyze complex data sets, making the verification process faster, more accurate and more comprehensive. In addition, these tools provide designers with real-time feedback on the effectiveness of their verification plans, enabling them to make adjustments and optimize the verification process.

Verification is following the broader trend of digital transformation and becoming more integrated with the IC design process: it is now a critical aspect of design rather than a separate, isolated step. As with other engineering software caught up in digital transformation initiatives, this has led to more collaborative and parallel approaches to IC design, with verification conducted concurrently with design rather than after the design is complete.

Engineering.com spoke with Mark Olen, Siemens EDA’s Director of Product Management for IC Verification, to understand how advanced verification software can further digital transformation in the semiconductor industry. Olen shared his insights on Questa Verification IQ and how verification software can help designers automate development, identify coverage gaps and analyze complex data sets.

The following interview has been edited for brevity and clarity.

Engineering.com: What’s the motivation behind Questa Verification IQ?

Mark Olen: Many years ago, when designs were much simpler, the primary investment in IC design was in the design itself. We’ve been studying this trend very carefully over the last 15 to 20 years, and about five years ago the average number of verification engineers on a chip design project surpassed the number of designers. So instead of 10 designers to 2 verification engineers, it’s now more like 9 designers to 15 verification engineers.

The predominant methodology was to run a bunch of simulations, measure the results (typically referred to as coverage) and figure out whether I was covering all of the functionality in the design. If not, I had to go back, write more tests and run more simulations. I could see that I was only getting 35 percent coverage of my functionality, which is clearly a problem, but there was no way of seeing what was wrong. There was nothing prescriptive.
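
To make that loop concrete, here is a minimal Python sketch of the coverage-closure flow Olen describes: run tests, merge the coverage bins they hit and keep going until a goal is reached or the tests run out. The run_test() function and the 100-bin coverage model are invented stand-ins for a real simulator flow, not Siemens APIs.

    # Minimal sketch of a coverage-closure loop (illustrative only).
    # run_test() is a stand-in for launching a real simulation and
    # parsing its coverage report; it is not a Siemens API.
    import random

    TOTAL_BINS = 100       # functional coverage bins in this toy model
    COVERAGE_GOAL = 0.95   # stop once 95 percent of bins are hit

    def run_test(seed: int) -> set:
        """Pretend to run one constrained-random test and return the bins it hit."""
        rng = random.Random(seed)
        return {f"bin_{i}" for i in rng.sample(range(TOTAL_BINS), rng.randint(5, 20))}

    def coverage_closure(max_tests: int = 500) -> float:
        hit = set()
        for seed in range(max_tests):
            hit |= run_test(seed)
            coverage = len(hit) / TOTAL_BINS
            if coverage >= COVERAGE_GOAL:
                print(f"Goal reached after {seed + 1} tests: {coverage:.0%}")
                return coverage
        # A gap remains: in the traditional flow, an engineer now reads the
        # report and writes directed tests by hand -- nothing prescriptive.
        print(f"Stopped at {len(hit) / TOTAL_BINS:.0%} coverage; gaps remain")
        return len(hit) / TOTAL_BINS

    if __name__ == "__main__":
        coverage_closure()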

So, the intent of the entire project was to provide help in the form of diagnostics, analytics and eventually predictive analysis, hence the name Verification IQ.

How does the software use AI, machine learning and cloud computing?

The software tells you why your coverage is low and what you need to do to fix it. It has the ability to triage and diagnose, and machine learning has been applied to give you predictive analytics.

The user interface is also much more modern and flexible. Most existing verification management systems work on a traditional file-based database management system (DBMS) sitting on a desktop. We have moved to a central repository built on a relational database management system (RDBMS) and transitioned the entire product line to a browser user interface. This also allows us to provide collaboration features such as product lifecycle management and bug tracking.

For instance, there might be an engineer doing block-level design in Ottawa, another doing system-level design in San Antonio and maybe another doing block-level or software verification in the UK. All of that information goes into the central repository, which is accessible by everybody. The software can now give me advice on how to adjust my design based on progress made by another engineer, even if their work was on a related block that happened to impact mine.
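
As an illustration of what such a central repository enables, here is a small Python sketch using SQLite: each site pushes its coverage results into one relational store, and anyone (or a browser front end) can query the current state per block. The schema and values are invented for this example and do not reflect Questa Verification IQ’s actual database design.

    # Illustrative only: a tiny relational store for coverage results from
    # distributed teams. Table and column names are invented for this sketch.
    import sqlite3

    conn = sqlite3.connect("coverage.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS coverage_runs (
            run_id   INTEGER PRIMARY KEY,
            block    TEXT NOT NULL,   -- design block under test
            site     TEXT NOT NULL,   -- e.g. Ottawa, San Antonio, UK
            engine   TEXT NOT NULL,   -- simulation, emulation, formal, ...
            coverage REAL NOT NULL,   -- fraction of bins hit
            run_date TEXT NOT NULL
        )
    """)

    conn.execute(
        "INSERT INTO coverage_runs (block, site, engine, coverage, run_date) "
        "VALUES (?, ?, ?, ?, ?)",
        ("pcie_ctrl", "Ottawa", "simulation", 0.62, "2024-03-01"),
    )
    conn.commit()

    # Every team member sees the same picture: best coverage per block so far.
    for block, best in conn.execute(
        "SELECT block, MAX(coverage) FROM coverage_runs GROUP BY block"
    ):
        print(block, f"{best:.0%}")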

This also leads to a non-technical benefit, as Questa Verification shows trends and compares progress across engineers’ work. I can look at a trend over a period of time and see that one person is making great progress in increasing coverage on their design block while another is making slow progress. However, it also shows who has the bigger or more complex block, and that is part of the analytics for figuring out where the roadblocks are.

It is also able to fulfill the role of multiple tools. Older platforms were designed around one particular coverage generator: they might have a coverage tool that reads simulation coverage, one that reads emulation coverage, or another that reads formal verification assertion coverage. Questa Verification, however, is specifically designed to read in information from all of those different technologies: formal and static analysis, simulation, emulation, even prototyping.
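
A simple way to picture that unification is to merge the coverage bins reported by each engine into one set and compute overall coverage, as in the Python sketch below. The per-engine input format here is a simplification made up for illustration; real flows exchange richer data, such as UCIS coverage databases.

    # Illustrative sketch: merging coverage from heterogeneous engines into
    # a single view. The per-engine bin sets are made up for this example.
    def merge_coverage(per_engine: dict, total_bins: int) -> float:
        """Union the bins hit by every engine and report overall coverage."""
        merged = set()
        for engine, bins in sorted(per_engine.items()):
            print(f"{engine:>12}: {len(bins) / total_bins:.0%}")
            merged |= bins
        overall = len(merged) / total_bins
        print(f"{'merged':>12}: {overall:.0%}")
        return overall

    merge_coverage(
        {
            "simulation": {f"bin_{i}" for i in range(0, 60)},
            "emulation":  {f"bin_{i}" for i in range(40, 80)},
            "formal":     {f"bin_{i}" for i in range(75, 90)},
        },
        total_bins=100,
    )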

You mentioned that Questa now has a browser-based user interface. Does that not raise concerns about security?

Security of the data is a major issue, especially for industries such as aerospace and defense. Questa takes care of this by offering the option of putting up firewalls, so customers still take advantage of data sharing, but only within a specific environment. Some of the original intellectual property (IP) development work will happen in more of a general environment. Once they deliver that IP to a specific program, they stay in a confidential environment, because that development now applies their secret sauce to the standard blocks that came from outside.

Some customers are also moving towards self-managed clouds by setting up their own server farms with their own hardware and sharing them across the company. Sometimes they will run certain processes with cloud providers, but analysis of the data is only done on-site.

Are there concerns among engineers that this technology might make them redundant?

The customers we are working with are not looking at this as a way of cutting costs by reducing the number of verification engineers. They are looking at using this technology to drive higher quality into their products and shorten time to market. One thing we have studied is that, on average, only 30 percent of all designs finish on schedule. This means that the majority of projects are late, which costs money. There is a trade-off between completing verification and getting done on time, but cutting verification short puts quality at risk, and quality is not something that can be sacrificed just to get to market on time.

What we are actually seeing is that the technology is giving engineering, verification and even management teams improved tools to get their jobs done better, which means that verification engineers still have plenty to do.

Additionally, while the software has in theory taken over work that used to be done manually, those engineers often just develop new skills. I would actually say that those with that hands-on experience are probably the best people to run the tools, because they know what a good output looks like.

What are your plans for the future?

One of the things we are still working on is prescriptive debug analytics. We have talked about coverage and verification management, but what happens when one of these many tests fails? What needs to happen next is for the tool to look at that information, triage it, diagnose the issue and ultimately help repair the design. We have not completed this technology yet, but we have plans to.