Let's quickly go through the terminology here:
ASIC and FPGA
An ASIC is an "Application Specific IC": a chip designed for one specific purpose. An FPGA is a "Field Programmable Gate Array". In essence it is an ASIC, but extra wiring and logic allow you to reconfigure it to implement a digital circuit over and over again. Both types of chips need a description of the implemented circuit. This description is written in a language called a "Hardware Description Language" or HDL. It looks like software syntax, but it actually describes hardware.
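To make that last point concrete, here is a minimal SystemVerilog sketch of my own (the module and signal names are illustrative, not from the article). It reads like software, yet it describes a piece of hardware: a single register.

```systemverilog
// A D flip-flop with synchronous reset: software-looking syntax that
// describes a hardware register, not a sequence of instructions.
module d_ff (
  input  logic clk,
  input  logic rst,
  input  logic d,
  output logic q
);
  always_ff @(posedge clk) begin
    if (rst) q <= 1'b0;  // on a clock edge with reset asserted, clear q
    else     q <= d;     // otherwise capture d into the register
  end
endmodule
```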
RTL syntax
RTL code is a subset of any HDL. The syntax specifically describes logic that a tool (software) can automatically translate into cells. Two examples (sketched in the code below):
- Nested "if then else" implements a priority encoder.
- A "case" statement is a non-priority encoder.
We say that RTL code is synthesizable: a synthesis tool takes this RTL code and a library of cells as input and produces a wired-up circuit. But the syntax of any HDL is much broader than RTL code alone. File I/O to read and write files, for example, has no equivalent in hardware; a synthesis tool will always tell you that such code is not synthesizable. Hence the full syntax of the language can be used for the functional verification of the RTL code: a testbench does not need to be translated into logic.
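As an illustration (again a sketch of my own, reusing the d_ff module from the earlier example; the file and signal names are made up), a testbench is free to use constructs that have no hardware equivalent:

```systemverilog
module d_ff_tb;
  logic clk = 0, rst = 1, d = 0, q;
  int   log_fd;

  d_ff dut (.clk(clk), .rst(rst), .d(d), .q(q));

  always #5 clk = ~clk;  // free-running clock with delays: not synthesizable

  initial begin
    log_fd = $fopen("d_ff_tb.log", "w");  // file I/O: not synthesizable
    repeat (2) @(posedge clk);            // hold reset for two clock edges
    rst = 0;
    d   = 1;
    @(posedge clk);                       // wait one more edge
    #1;                                   // then sample the output
    $fdisplay(log_fd, "q = %0b (expected 1)", q);
    $fclose(log_fd);
    $finish;
  end
endmodule
```

A synthesis tool would reject almost every line of this, yet it is perfectly legal HDL for functional verification.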
Verification
Whenever a graduate starts working as a dVLSI (digital VLSI) engineer, the full syntax is part of the training program. That is exactly why any new resource starts as a verification resource: the synthesizable RTL subset changes over time. Every HDL has gone through a few revisions over the past decades, each one extending the language step by step with extra features. But the tools lag the standard, and different vendors do not all support the same subset; some are slower rolling out new features. And unfortunately, if one vendor's synthesis tool supports feature X of language Y, it may well be that feature X is not yet supported by formal verification tool Z of the same vendor.
So, most companies assume that engineers get a feeling for RTL coding by looking at the code they verify. Some nasty old geezers will say "Garbage In, Garbage Out", something that plagues the AI hype as well: if all one sees is the spaghetti that is called RTL, verification engineers who move to design will create the same sort of RTL code. A weakness in the industry; I prefer to frame it that way because I am a positivo after all. 🙂 In contrast, there is much more demand for verification than for any other front-end design task (design, synthesis, DFT, STA).
Twofold conclusion (underlined!)
Above all, times have changed. Three decades ago, a chip was a one-resource project. When I started out, it was a multi-resource project: 2 to 10 people. Fifteen years ago, it was 10 to 30 people. Today it is 150 to 200 people. Hence, the whole front-end design is partitioned into designers, verification engineers, DFT engineers, STA and synthesis engineers, ... The decoupling of those tasks creates a Babel-like project structure. Design no longer knows how to avoid synthesis, DFT and STA issues, creating a back and forth that is neither pleasant nor efficient. But it is clear that one truth still stands: the Pareto principle applies to design versus verification. About 20% of the time is spent in design and 80% in verification. That implies a project needs four times as many verification resources as designers.
Second, verification in the past was the wild west. Everyone had their own method of verifying a "Device Under Test" (DUT) or "Unit Under Test" (UUT). While this sounds bad, it was an opportunity: with the right strategy, a design services company could make this a major game changer versus the competition. First time right was a great differentiator between "bad" methodology and "good" methodology. Today, first time right is still possible, but price is the deciding factor now, not quality. To counter the chaos, EDA companies started promoting a verification methodology. At first there were two main competitors; today only UVM remains.

As a result, UVM-based verification needs people who can build the infrastructure and people who use the UVM testbench structure to verify. The latter are the engineers that actually verify RTL: they take the verification plan (for the seniors reading this, I know) and implement the test cases. The former category deals with classes, inheritance, agents, scoreboards, assertions, ... Unfortunately, they are drifting away from hardware design fast and adopting more software-engineer habits. While some of those habits are good, some are definitely bad. If a person does not care that a test case passes with UVM_VERBOSITY = UVM_HIGH and fails when it is set to UVM_LOW, that is quite troublesome in my book (see the sketch below). But my book is still printed on good old trees and today is the age of the ebook 😉.
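To make that verbosity example concrete, here is a minimal UVM sketch under my own assumptions (the class, field and function names are made up, not from the article). The problem appears when a state update hides inside the message argument of a reporting macro, because the uvm_info macro only evaluates that argument when the configured verbosity is high enough.

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

class my_scoreboard extends uvm_scoreboard;
  `uvm_component_utils(my_scoreboard)

  int unsigned match_count;

  function new(string name = "my_scoreboard", uvm_component parent = null);
    super.new(name, parent);
  endfunction

  // Updates the counter as a side effect of building the message string.
  function string update_and_report();
    match_count++;
    return $sformatf("matches so far: %0d", match_count);
  endfunction

  function void note_match();
    // BAD: at UVM_LOW the macro skips its message argument, so
    // update_and_report() never runs, match_count stays at zero, and an
    // end-of-test check on it gives a different verdict than at UVM_HIGH.
    `uvm_info("SCB", update_and_report(), UVM_HIGH)

    // Safer: update state unconditionally, report separately.
    // match_count++;
    // `uvm_info("SCB", $sformatf("matches so far: %0d", match_count), UVM_HIGH)
  endfunction
endclass
```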
In short, verification is in much higher demand. But due to that high demand, it is fast becoming a commodity. Meaning: cheap, cheaper, cheapest. And decoupling is a thing! I can't stress that enough. It is the root cause that makes ASICs in 7nm go over budget with considerable schedule slip, on budgets of hundreds of millions of USD to tape out a "System-On-Chip" (SoC). In contrast, startups led by silicon experts do not have that kind of budget (think: divide by ten). Still, they tape out silicon that works. Weird, isn't it?
Applause for ourselves!!! 👏👏👏👏
Thank you for reading until the end.
This article was posted on LinkedIn as well.
Feel free to share this article and connect with me on LinkedIn!