Why You Should Care About Portable Stimulus
The Accellera Portable Stimulus Working Group (aka PSWG) has just released a draft of the Portable Stimulus Standard (PSS) for public review and comment. Formally known as the Portable Stimulus Standard Early Adopter Release, the new spec can be downloaded for your perusal. The comment period is open until September 15, 2017. The Accellera website (accellera.org) has a press release describing the details of the release and links to download a copy of the spec and to a forum where you can comment on it.
According to the spec, "The goal is to allow stimulus and tests, including coverage and results checking, to be specified at a high level of abstraction, suitable for tools to interpret and create scenarios and generate implementations in a variety of languages and tool environments, with consistent behavior across multiple implementations."
The PSWG has been working on this standard for about two years. That includes all the time spent on the administrative details of running a standards committee as well as writing the spec itself. Discussion ranged from major topics, such as the scope of the Portable Stimulus Standard, to arcane ones, such as the difference between an action and an activity. It all seems somewhat obscure. Should you care?
To answer that question we have to step back a little and look at the history of EDA tools and design methodologies. Over the last several decades, many of the advances in design and verification methodology have been driven by the ever-shrinking transistor and the number of them that can fit on a single die. More transistors mean more functionality; more functionality means more scale to manage. The primary tool for managing the exponential increase in design scale has been abstraction.
When there were 100 transistors on a die, an engineer could keep them all in his head. As a design methodology this was fine until designs got a little bigger and it became increasingly difficult to hold all the details of a hardware design in one's brain. Gate-level design methodology allowed designers to deal with designs larger than a few hundred transistors. The gate-level abstraction provided a means to abstract functionality and time, as well as electrical characteristics such as voltages and currents. Designers could think in terms of logic and less about transistors; they worked more with ones and zeros and less with volts and amps. As designs scaled up even further and gate-level design became infeasible in turn, RTL, synchronous design methodologies, and synthesizers took over, providing the next abstraction. With the RTL abstraction, engineers could design in terms of clock cycles, registers, and Boolean functions, without worrying so much about gates. This abstraction, and the tools to support it, ushered in the ASIC era.
Now, in the SoC era, much design and verification is done using the transaction-level abstraction. Transactions enable designers to move away from clocks and registers and work in terms of transfers of control and data. However, even in SoCs much design work is still done in RTL using ASIC-era tools. With design scales reaching into the billions of transistors, the pressure to find a new abstraction is mounting.
Another factor in the need for a new abstraction is software. In the ASIC era, software was an afterthought. It was added to a hardware system after the hardware was built, in much the same way that paint is added to a car. You can't ship a car without paint, but the engineers designing the engine or the suspension don't think much about it. In an SoC, the software is part of the complete system and is not separable. An SoC hardware design cannot be created without knowing about the software that will drive it, and vice versa. An analogy is a modern digital SLR camera with interchangeable lenses: the camera body and the lenses that work with it are designed in tandem.
The next abstraction appearing on the horizon is about software processes and resources. I'm not sure this new abstraction has a name yet. Perhaps we can call it process abstraction or simply system-level abstraction. Maybe ESL will work. The elements of this new abstraction are processes, which decompose into actions or activities (I'm not sure which) and which utilize resources. The processes are scheduled on one or more processors and synchronized through resources, and resources are allocated to processes. Hardware resources might be registers or caches; software resources might be data structures or threads. The abstraction doesn't really differentiate between hardware and software resources. They are all resources.
The standard that the PSWG is producing operates at this new process-level abstraction. The elements of the standard include graphs representing actions, resources and resource pools, and semantics for static and dynamic scheduling. These are the elements necessary to form the process-level abstraction. While the PSWG is positioning the PSS as a way of defining stimulus for tests, it's really much more than that. It will be the first publicly available representation of the next design and verification abstraction.
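To make those elements a little more concrete, here is a small sketch in the PSS domain-specific language, in roughly the style shown in the Early Adopter draft. The names (`dma_c`, `channel_s`, `mem2mem_a`, and so on) are invented for illustration, and the exact syntax may differ in the draft and in the final standard:

```
component dma_c {
    // A resource type modeling a DMA channel.
    resource channel_s {}

    // A pool of four channel resources, made available
    // to all actions in this component.
    pool [4] channel_s channels;
    bind channels *;

    // An atomic action that claims exclusive use of one
    // channel resource for its duration.
    action mem2mem_a {
        lock channel_s chan;
    }

    // A compound action whose activity schedules two
    // transfers in parallel; the pool limits how many
    // can hold a channel at once.
    action parallel_xfer_a {
        mem2mem_a xfer1, xfer2;
        activity {
            parallel {
                xfer1;
                xfer2;
            }
        }
    }
}
```

This is where the process-level abstraction shows through: the model says nothing about clocks, registers, or even which engine performs the transfers; it only declares actions, the resources they contend for, and the scheduling relationships among them, leaving a tool free to generate implementations for different target environments.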
The PSS represents the first germ of the next abstraction level available in an industry-wide standard form. Some companies and some EDA vendors have been using or offering tools in this space for a long time, but none of those has been standardized or achieved widespread use. Gary Smith EDA has been predicting the rise of ESL for a long time; perhaps they were simply ahead of their time. In the SoC era, ESL methodologies are no longer optional. Working from the system-level perspective is necessary to create the humongous SoCs that are driving the next generation of applications. The PSS presents a system-level view of a design, with inter-related software and hardware components. It's a way to view complete systems, with everything that entails.
The PSS is not yet fully formed; at its current stage of development, the draft is just a trial balloon to get community feedback. Further, it was conceived as a test medium, not so much as a modeling language. Yet its system-level perspective should be viewed as useful beyond just writing tests.
This is why you should care. At this point it's difficult to predict how the nascent standard will succeed or fail, or how it will evolve. However, a new abstraction and a new approach to design and verification is emerging.
In future blog posts we'll look at the details of the standard and talk a bit more about what it means.