Hardware-software co-design and co-verification
The solution is co-verification. Co-verification has its roots in logic simulation. The HDL logic simulator has long been used as the standard way to execute a representation of the hardware before any chips or boards are fabricated.
As design sizes have increased, logic simulation alone has not provided the necessary performance, and hardware-based execution methods have evolved to fill the gap. Examples of hardware methods include simulation acceleration and emulation. Here we will examine each of these basic execution engines as a method for co-verification. Co-verification borrows from the history of microprocessor design and verification. The microprocessor verification application is not exactly co-verification, since we normally think of the microprocessor as a known good component that is put into an embedded system design; nevertheless, microprocessor verification requires a large amount of software testing for the CPU to be successfully verified.
Microprocessor design companies have done this level of verification for many years. Companies designing microprocessors cannot commit to a design without first running many sequences of instructions ranging from small tests of random instruction sequences to booting an operating system like Windows or UNIX.
This level of verification requires the ability to simulate the hardware design and have methods available to debug the software sequences when problems occur. As we will see, this is a kind of co-verification. I became interested in co-verification after spending many hours in a lab trying to integrate hardware and software.
I think it was just too many days of logic analyzer probes falling off, failed trigger conditions, making educated guesses about what might be happening, and sometimes just plain trial-and-error.
I decided there must be a better way, one that would let me sit in a quiet, air-conditioned cubicle and figure out what was happening. Fortunately for me, there were better ways, and I was fortunate enough to get jobs working on some of them. The first two such products, Mentor Seamless and the tool from Eagle Design Automation Inc., appeared on the market within six months of each other, and both were created in Oregon. The Eagle product was later acquired by Synopsys, became part of Viewlogic, and was finally killed by Synopsys due to lack of sales.
In contrast, Mentor Seamless produced consistent growth and established itself as the leading co-verification product. Others followed that were based on similar principles, but Seamless has been the most successful of the commercial co-verification tools. Russ Klein documented the use of an instruction set simulator (ISS) co-simulating with an event-driven logic simulator. As we will see in this chapter, the paper also detailed an interesting technique of dynamically partitioning the memory data between the ISS and logic simulator to improve performance.
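The partitioning idea can be sketched in a few lines. This is an illustrative model only, with invented names: addresses that map to simulated hardware pay the full cost of a bus cycle in the logic simulator, while "software-only" memory is serviced instantly from an array inside the ISS.

```python
# Sketch of dynamic memory partitioning between an ISS and a logic simulator.
# Names and the cycle cost are illustrative assumptions, not any tool's API.

class CoSimMemory:
    def __init__(self, hw_ranges, logic_sim_access):
        self.hw_ranges = hw_ranges                # [(lo, hi)] mapped to the logic simulator
        self.fast_mem = {}                        # ISS-side memory, no simulator events
        self.logic_sim_access = logic_sim_access  # slow path: runs bus cycles in HDL

    def _is_hw(self, addr):
        return any(lo <= addr <= hi for lo, hi in self.hw_ranges)

    def read(self, addr):
        if self._is_hw(addr):
            return self.logic_sim_access("read", addr, None)  # full bus cycle
        return self.fast_mem.get(addr, 0)                     # zero-time access

    def write(self, addr, data):
        if self._is_hw(addr):
            self.logic_sim_access("write", addr, data)
        else:
            self.fast_mem[addr] = data

def fake_logic_sim(op, addr, data):
    # Stand-in for an event-driven HDL simulation of the bus.
    fake_logic_sim.cycles += 100   # each access costs many simulated cycles
    return 0x5A if op == "read" else None
fake_logic_sim.cycles = 0

# A register window is "hardware"; everything else stays inside the ISS.
mem = CoSimMemory(hw_ranges=[(0xFFFF0000, 0xFFFF00FF)],
                  logic_sim_access=fake_logic_sim)
mem.write(0x1000, 42)              # fast path, no simulator activity
status = mem.read(0xFFFF0000)      # slow path, exercises the HDL model
```

The performance win comes from keeping ordinary code and data accesses out of the event-driven simulator entirely; only the accesses that actually exercise the hardware design incur simulation cost.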
When he first got the idea for a product that combined the ISS, a familiar tool for software engineers, with the logic simulator, a familiar tool for hardware engineers, and used optimization techniques to increase performance from the point of view of the software, the value of such an idea wasn't immediately obvious.
To investigate the idea in more detail, he decided to create a prototype to see how it worked. Testing the prototype required an instruction set simulator for a microprocessor, a logic simulation of a hardware design, and software to run on the system; the system he chose to model was a PC. Of course, none of the source code for the software was available, but Russ was able to extract the data from the ROM and the first couple of tracks of the boot floppy using programs he wrote.
From there he was able to get it into a format that could be loaded into the logic simulator. Working on this home-brew simulation, he performed various experiments to simulate the operation of the PC, and in the end concluded that this was a valid co-simulation technique for testing embedded software running on simulated hardware.
In certain modes of operation, the simulation ran faster than the actual computer! Russ turned his work into an internal Mentor project that would eventually become a commercial EDA product. In parallel, Eagle produced a prototype of a similar tool. While Seamless started with the premise of using the ISS to simulate the microprocessor internals, Eagle started with native-compiled C programs with special function calls inserted for memory accesses into the hardware simulation environment.
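The host-code execution style can be illustrated with a short sketch. The function names here are invented for the example, not Eaglei's real API: the driver algorithm runs natively on the workstation, and only the explicit calls cross into the simulated hardware.

```python
# Hypothetical illustration of host-code execution: every hardware access is
# an explicit call into a bus functional model (BFM) driving the simulator.
# hw_read/hw_write and the register map are assumptions for the sketch.

HW_LOG = []   # stands in for transactions driven into the logic simulator

def hw_write(addr, data):
    HW_LOG.append(("write", addr, data))

def hw_read(addr):
    HW_LOG.append(("read", addr, None))
    return 0x01   # pretend the status register reports "transmitter ready"

UART_BASE = 0xFFFF0000                       # illustrative register map
UART_TX, UART_STATUS = UART_BASE + 0x0, UART_BASE + 0x4

def uart_putc(ch):
    # The control flow runs at native host speed; only the hw_* calls
    # actually touch the simulated hardware design.
    while (hw_read(UART_STATUS) & 0x01) == 0:
        pass                                 # poll until the UART is ready
    hw_write(UART_TX, ord(ch))

uart_putc("A")
```

The appeal of this approach is that no instruction set simulator is needed at all; the cost is that the source code must be modified wherever it touches hardware.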
At the time, this strategy was thought to be good enough for software development and easier to proliferate, since it did not require a full instruction set simulator for each CPU, only a bus functional model. After they pitched the product to Mentor Graphics, Mentor was faced with a build versus buy decision.
Should they continue with the internal development of Seamless or should they stop development and partner or acquire the Eagle product? According to Russ, the decision was not an easy one and went all the way to Mentor CEO Wally Rhines before Mentor finally decided to keep the internal project alive. The other difficult decision was to decide whether to continue the use of instruction set simulation or follow Eagle into host-code execution when Eagle already had a lead in product development.
In the end, Mentor decided to allow Eagle to introduce the first product into the market and confirmed their commitment to instruction set simulation with the purchase of Microtec Research Inc. The decision meant Seamless was introduced six months after Eagle, but Mentor bet that the use of the ISS would be a differentiator that would enable them to win in the marketplace. Another commercial co-verification tool that took a different road to market was V-CPU.
It was engineered by Benny Schnaider, who was working for Cisco as a consultant in design verification, for the purpose of early integration of software running with a simulation of a Cisco router. As V-CPU was being adopted by more and more engineers at Cisco, the company was starting to worry about having a consultant as the single point of failure on a piece of software that was becoming critical to the design verification environment.
Cisco decided to search the marketplace in hope of finding a commercial product that could do the job and be supported by an EDA vendor. At the time there were two possibilities, Mentor Seamless and Eaglei. After some evaluation, Cisco decided that neither was really suitable since Seamless relied on the use of instruction set simulators and Eaglei required software engineers to put special C calls into the code when they wanted to access the hardware simulation.
In contrast, V-CPU used a technique that automatically captured the software accesses to the hardware design and required little or no change to the software. Instead of buying a commercial tool, Cisco turned to a small company in St. Paul, MN, named Simulation Technologies (Simtech) and gave them the rights to the software in exchange for discounts and commercial support. Dave Von Bank and I were the two engineers who worked for Simtech; we worked with Cisco to take the internal tool and turn it into a commercial co-verification tool that was launched at the International Verilog Conference (IVC) in Santa Clara.
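One way to illustrate the idea of implicit capture is shown below. This is not V-CPU's actual mechanism, which trapped accesses transparently at a much lower level; here a memory object simply intercepts ordinary loads and stores, so the driver code contains no special hardware-access calls at all.

```python
# Illustrative only: an address space that captures hardware accesses
# implicitly. Plain indexing replaces explicit hw_read/hw_write calls,
# mirroring how V-CPU required little or no change to the software.

class AddressSpace:
    def __init__(self, hw_lo, hw_hi):
        self.hw_lo, self.hw_hi = hw_lo, hw_hi
        self.ram = {}
        self.captured = []   # accesses forwarded to the simulated hardware

    def __getitem__(self, addr):
        if self.hw_lo <= addr <= self.hw_hi:
            self.captured.append(("read", addr))
            return 0x01      # value the simulated device would return
        return self.ram.get(addr, 0)

    def __setitem__(self, addr, data):
        if self.hw_lo <= addr <= self.hw_hi:
            self.captured.append(("write", addr, data))
        else:
            self.ram[addr] = data

mem = AddressSpace(0xFFFF0000, 0xFFFF00FF)

# Unmodified-looking driver code: plain loads and stores, no special calls.
mem[0x2000] = 7            # ordinary RAM, serviced locally
ready = mem[0xFFFF0004]    # status read, captured and forwarded
mem[0xFFFF0000] = 0x41     # data write, captured and forwarded
```

From the software engineer's point of view the code is identical to what ships in the product, which was exactly the property Cisco valued.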
V-CPU is still in use today at Cisco. Over the years the software has changed hands many times and is now owned by Summit Design. So what exactly is co-verification? It means running the software on the hardware to make sure there are no hardware bugs before the design is committed to fabrication. As we will see here, this goal can be achieved in many different ways, differentiated primarily by the representation of the hardware, the execution engine used, and how the microprocessor is modeled.
But more than this, a true co-verification tool also provides control and visibility for both software and hardware engineers and uses the types of tools they are familiar with, at the level of abstraction they are familiar with.
A working definition is given in Figure 6. Co-verification is often called virtual prototyping, since the simulation of the hardware design behaves like the real hardware but is executed as a software program on a workstation.
This broad definition includes physical prototyping as co-verification, as long as the prototype is not the final fabrication of the system and is available earlier in the design process. A narrower definition of co-verification limits the hardware execution to the context of the logic simulator.

Benefits of Co-Verification
Co-verification provides two primary benefits. It allows software that is dependent on hardware to be tested and debugged before a prototype is available.
It also provides an additional test stimulus for the hardware design. This additional stimulus is useful to augment test benches developed by hardware engineers since it is the true stimulus that will occur in the final product.
In most cases, both hardware and software teams benefit from co-verification. These co-verification benefits address the hardware and software integration problem and translate into a shorter project schedule, a lower cost project, and a higher quality product.
The primary benefits of co-verification are described below.

Project Schedule Savings
For project managers, the primary benefit of co-verification is a shorter project schedule. Traditionally, software engineers suffer because they have no way to execute the software they are developing if it interacts closely with the hardware design.
They develop the software, but cannot run it, so they just sit and wait for the hardware to become available. After a long delay, the hardware is finally ready, and hardware and software meet for the first time in the lab. With co-verification, the software can be run against the simulated hardware well before this point; by getting all the trivial bugs out early, the project schedule improves because the amount of time spent in the lab debugging software is much less.
There is no substitute for being able to run software in a simulated world and see exactly the correlation between hardware and software. We see what is really happening inside the microprocessor in a nonintrusive way, and we see what the hardware design is doing. Not only is this useful for debugging; as we will see in future examples, co-verification is an ideal way to really learn how an embedded system works. Co-verification provides information that can be used to identify such things as performance bottlenecks, using information about bus activity or cache hit rates. It is also a great way to confirm the hardware is programmed correctly and operations are working as expected.
When software engineers get into a lab setting and run code, there is really no way for them to see how the hardware is acting. They usually rely on some print statements to follow execution and assume that if the system does not crash it must be working.

Co-Verification Improves Communication
For some projects, the real benefit of co-verification has nothing to do with early access to hardware, improved hardware stimulus, or even a shorter schedule. Sometimes the real benefit of co-verification is improved communication between hardware and software teams. Too often these teams work in isolation until integration, and when problems surface in the lab each side assumes the other is at fault. This results in negative attitudes and finger pointing. It may sound a bit farfetched, but sometimes the introduction of co-verification enables these teams to work together in a positive way and make a positive improvement in company culture.
A similar term to co-verification is co-simulation. In fact, the first paper published about Seamless used this term in the title. Co-simulation is defined as two or more heterogeneous simulators working together to produce a complete simulation result.
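A minimal sketch of the idea, with both engines reduced to trivial stand-ins, is a lockstep loop that advances each simulator one step at a time and passes bus transactions between them:

```python
# Two heterogeneous "simulators" in lockstep: a toy ISS executes one
# instruction per step, and an event-driven "logic simulator" is advanced
# alongside it, latching any bus transaction the ISS produces. All names
# and the instruction set are invented for this sketch.

def make_iss(program):
    state = {"pc": 0, "acc": 0}
    def step():
        if state["pc"] >= len(program):
            return None                      # program finished, no bus activity
        op, arg = program[state["pc"]]
        state["pc"] += 1
        if op == "add":
            state["acc"] += arg
            return None                      # internal to the CPU model
        if op == "store":
            return ("write", arg, state["acc"])  # transaction for the other simulator
    return state, step

def make_logic_sim():
    regs = {}
    def run_cycle(txn):
        if txn is not None:
            _, addr, data = txn
            regs[addr] = data                # model a peripheral latching the data
    return regs, run_cycle

program = [("add", 2), ("add", 3), ("store", 0x10)]
iss_state, iss_step = make_iss(program)
regs, logic_cycle = make_logic_sim()

for _ in range(len(program)):                # lockstep: one ISS step per cycle
    logic_cycle(iss_step())
```

Real co-simulation backplanes are far more elaborate, synchronizing time between engines and buffering transactions, but the essential structure, two independent simulators exchanging events under a common scheduler, is the same.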
This could be an ISS working with a logic simulator, a Verilog simulator working with a VHDL simulator, or a digital logic simulator working with an analog simulator. Some co-verification techniques involve co-simulation and some do not.

Co-verification versus Codesign
Co-verification is often lumped together with codesign, but they are really two different things. Earlier, verification was defined as the process of determining that something works as intended.
Design is the process of deciding how to implement a required function of a system. In the context of embedded systems, design might involve deciding if a function should be implemented in hardware or software.
For software, design may involve deciding on a set of software layers to form the software architecture.