Simulation provides key to explosive automotive design challenges
October 12, 2009 by John Day
By Darrell A. Teegarden, System Modeling and Analysis Business Unit Director, Mentor Graphics Corporation
Today’s automotive challenges are similar to those the telecommunications industry faced more than ten years ago. New technologies, such as hybrid-electric and fuel-cell vehicles, are hotbeds of research and development activity comparable to what we saw as cell phones evolved into multimedia devices. Just as the telecom industry confronted power and chip-size limitations, automotive designers are being stretched to pack more and more technology into what used to be a basically mechanical device.
Electronic, electrical, mechanical, hardware, and software components, and the networks that connect them, populate vehicle designs to an ever-greater degree. The amount of in-vehicle electronics, currently at 40%, continues to rise, with a growing number of electronic control units (ECUs) distributed across the system to manage increasingly sophisticated and complex applications. ECUs can contain hundreds of software components; multiplexing among systems is proliferating, and communication requirements are escalating.
The system integration phase is a frequent bottleneck. The very nature of distributed systems—wherein data comes from different internal resources, ECUs come from different companies, and algorithms from different ECUs must be synchronized—requires heavy coordination.
Not only has the typical system design grown in overall size to accommodate ever-increasing demands for functionality and performance, but these designs must fluently integrate analog and digital hardware, as well as the software that controls it. Successfully integrating system components and verifying that they work in concert with each other often proves to be costly in terms of time, money and engineering resources. And, at the same time, there is increased pressure to reduce development cycle time.
In order to keep pace with these new realities, new processes and development tools are required. In particular, the development and intelligent use of computer models of these complex systems—once considered a luxury—is becoming critical to the success of the overall development process.
Computer simulation key to design process
Computer simulation (or just “simulation”) is generally recognized as a tool to help automate the design of a particular aspect of a system. Using specialized algorithms to analyze a model of a system, it provides the ability to successively verify new designs from concept to implementation. When employed at the beginning of the design process, simulation provides an environment in which a system can be tuned and optimized, and critical insights gained, before any hardware is built. Simulation is useful for many reasons, but perhaps the most obvious is to “virtually test” the system model in order to reduce the risks of unintended behaviors—or even outright failures—in the actual system.
In order to create a system model, each component in the real system needs to have a corresponding “component model” (although it is often possible to combine the function of multiple components into a single component model). These component models are then connected together (as their physical counterparts would be), to create the overall system model. What lies at the heart of any computer simulation, therefore, are the component models. The “art” of creating the models themselves, and sometimes more importantly, of knowing exactly what to model and why, are the primary keys to successful simulation.
To create an overall system model out of the component models, a developer must generate a graphical symbol for each component model and then connect the pins of the symbols using a schematic capture environment. It is also possible to construct the system model without a schematic capture environment, by creating a netlist or testbench using a text editor. The next step is to parameterize the component models, if necessary (typically via their corresponding symbols). A generalized motor model, for example, may be given explicit parameters so it will behave as a specific type of motor.
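The parameterization step described above can be sketched in code. The following is a minimal, hypothetical illustration (not a real simulator API): a generic DC motor component model whose parameters make it behave as one specific motor, in the same way a symbol's parameters specialize a generalized model in a schematic environment. All parameter values are assumptions for the example.

```python
# Hypothetical sketch of a parameterized component model; the class
# and parameter names are illustrative, not drawn from any real tool.
from dataclasses import dataclass

@dataclass
class DCMotorModel:
    """Generic DC motor: parameters select a specific physical motor."""
    resistance: float   # armature resistance, ohms (assumed value below)
    inductance: float   # armature inductance, henries
    kt: float           # torque constant, N*m/A

    def torque(self, current: float) -> float:
        # Electromagnetic torque produced at a given armature current
        return self.kt * current

# Parameterize the generic model so it behaves as one specific motor
motor = DCMotorModel(resistance=1.2, inductance=0.5e-3, kt=0.05)
print(motor.torque(2.0))  # torque at 2 A of armature current
```

In a real flow the same specialization happens through the symbol's parameter list rather than in code, but the principle is identical: one generic model, many concrete instances.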
With the system model complete, various simulations can be performed to determine overall system performance. In practice, many such simulations are run throughout the design process and beyond to debug the real system. Example results for both time-domain and statistical analyses are given in Figure 1.
Figure 1 – Typical system simulation results
Figure 1 illustrates how simulation-based “virtual testing” can be used to test basic system functions, as shown by the mixed-analog/digital waveforms in the upper-left quadrant of the figure. It also shows how virtual testing can be taken to the next level by performing a Monte Carlo analysis and generating scatter plots and histograms that help correlate performance measures to various component parameters. These are shown in the upper-right and lower-left quadrants of the figure. Finally, the sensitivity of various performance measures to specific components can be determined using Sensitivity Analysis, results of which are shown in the lower-right quadrant of the figure. All of these analyses and more are available once a system model has been developed.
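The Monte Carlo analysis mentioned above can be illustrated with a deliberately simple stand-in system. This sketch applies random component tolerances to a resistive voltage divider and collects the resulting output statistics; the circuit, tolerances, and sample count are all assumptions chosen for clarity, not values from the article.

```python
# Illustrative Monte Carlo tolerance analysis on a voltage divider:
# randomly perturb component values within tolerance and observe the
# statistical spread of a performance measure (the output voltage).
import random
import statistics

random.seed(0)  # reproducible run

def divider_output(vin, r1, r2):
    # Output of a two-resistor voltage divider
    return vin * r2 / (r1 + r2)

NOMINAL_R1, NOMINAL_R2, TOL = 10e3, 10e3, 0.05  # assumed 5% resistors
samples = []
for _ in range(5000):
    r1 = NOMINAL_R1 * random.uniform(1 - TOL, 1 + TOL)
    r2 = NOMINAL_R2 * random.uniform(1 - TOL, 1 + TOL)
    samples.append(divider_output(12.0, r1, r2))

mean = statistics.mean(samples)
spread = max(samples) - min(samples)
print(f"mean = {mean:.3f} V, spread = {spread:.3f} V")
```

A histogram of `samples` corresponds to the lower-left quadrant of Figure 1; plotting output against each perturbed component value gives the scatter plots used to correlate performance to parameters.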
A possible next step is to “upgrade” various component models to account for the eventual physical implementation of the system. Once the topology has been shown to be sound, a choice arises: build the actual design, or continue refining the system model to further reduce risks in the physical system later. In practice, it is often possible to do both concurrently.
Modeling overall system variation helps prevent designers from optimizing a component at the expense of the system. For example, reducing the cost of one component, perhaps by loosening its tolerance, may have a ripple effect that ultimately requires a more expensive modification to another component to compensate for the overall variation. Understanding this impact allows the change to be rejected before it becomes irreversible.
With a system model available, “what-if” tradeoffs can be explored, performance can be determined as a function of component tolerances, and many other analyses can be performed. This is the payoff for the model-development effort: the system model can now be used to anticipate and fix design issues before any actual hardware is built.
Bridging the gap from simulation to physical prototype
Part of the “art” of designing a system is to be able to make the transition from simulation model to physical prototype at some reasonable point in the design process. Some designers jump as quickly as possible into “real hardware” and may even forego the use of simulation as a tool altogether. Others tend to put off building the physical system as long as possible and continue refining the system model. Where should the line be drawn? It really depends on the amount of risk that is acceptable for the specific design. For safety-critical systems and sub-systems, highly-refined system models are often used in conjunction with physical prototypes to minimize risks.
On the other hand, non-critical systems do not need to be so rigorously developed, but they still benefit from the development and simulation of corresponding system models. Such models can help minimize system performance degradation due to part tolerances stemming from manufacturing variability. Part costs can be minimized as well. Additionally, having access to a simulatable system model can help to improve overall system robustness and foster a better understanding of the system in general.
For example, designers can simulate a system operating at excessive voltage or temperature levels, or view current, flux and other state variables internal to a device. Another example is the ability to exercise an embedded controller running in the context of its hardware peripherals (e.g. A/D, D/A, timers, etc.). This is similar to using an in-circuit emulator in the real world, but, in the virtual world, the user can actually stop time at a breakpoint, not just stop the execution of the code.
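The idea of stopping simulated time at a breakpoint can be made concrete with a toy co-simulation sketch. Here a software controller and a simple thermal plant model are stepped in lockstep; at the “breakpoint” both the controller state and the plant state freeze together for inspection, which no physical in-circuit emulator can do. The plant model, controller, and all values are invented for illustration.

```python
# Hedged sketch of "stopping time" in a virtual controller/plant
# co-simulation. Unlike a real in-circuit emulator, pausing here
# freezes the simulated plant as well as the software under test.

def plant_step(temp, heater_on, dt):
    # Assumed first-order thermal plant: relaxes toward a target
    ambient, heater_gain, tau = 25.0, 50.0, 10.0
    target = ambient + (heater_gain if heater_on else 0.0)
    return temp + (target - temp) * dt / tau

def controller(temp, setpoint=40.0):
    # Bang-bang software controller under test
    return temp < setpoint

t, temp, dt = 0.0, 25.0, 0.1
trace = []
while t < 5.0:
    heater_on = controller(temp)
    if abs(t - 2.0) < 1e-9:
        # "Breakpoint" at t = 2.0 s: simulated time halts here, so we
        # can inspect plant state variables alongside software state.
        trace.append((t, temp, heater_on))
    temp = plant_step(temp, heater_on, dt)
    t = round(t + dt, 10)
print(trace)
```

In a real mixed-signal environment the same capability extends to internal device quantities such as current and flux, as noted above.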
The challenge lies in bridging the gap between hardware and software across the stages of design. Designers need a tool capable of virtual modeling and integration that enables true electronic systems design and analysis. Such a tool would not only support functional modeling of electronic systems but also virtual integration and simulation of combined hardware and software to properly verify that the system requirements are met. This tool is available today in SystemVision from Mentor Graphics.
SystemVision is a design and analysis tool suite comprised of three integrated tools: a schematic capture program, a mixed-signal simulator, and a waveform viewer. SystemVision also integrates with other tools used for advanced analytical functions in today’s design flows. It provides an intuitive and easy-to-use virtual modeling, simulation, and system integration environment that supports multiple levels of system abstraction across multiple engineering technology domains. Analog, digital, and mixed-signal circuits can be abstracted and analyzed alongside mechanical, electrical, thermal, and hydraulic systems, as well as continuous and sampled-data control systems (and many other engineering effects), in an environment that allows the fluent, multi-level integration required for true electronic system-level design and analysis.
Using SystemVision, engineers can quickly and easily create and analyze designs that include all levels of abstraction from math-based behavior down to circuit implementations. SystemVision can be used as a numeric analysis engine, providing a very high level of modeling fidelity that helps pinpoint problems through waveform display of signals, amplitudes, and timing. In this environment, high-level behavioral models can be combined with lower-level device and effects models, allowing designers to rapidly examine design tradeoffs.
The SystemVision core includes a single-kernel engine that ensures accurate mixed-signal simulation results. The simulator core accepts multi-language system and circuit descriptions and generates viewable waveforms in a technology-aware format. Block diagrams and transfer function blocks assist designers with high-level concept verification. Hierarchical schematics and circuit elements assist them in verifying electronic and mechatronic system designs.
Figure 2: SystemVision provides powerful design, modeling, analysis, and viewing tools around a core simulation engine
SystemVision technology gives designers powerful tools for managing mechanics, electronics, software, and controls all in one system with the capability to integrate the significant intersections between them. Figure 3 illustrates how certain design parts inhabit the intersections of the multiple design domains that SystemVision can address. For instance, sensors and actuators inhabit the intersection of mechanics and electronics, control circuits inhabit the intersection of controls and electronics, and microcontrollers inhabit the intersection of electronics, controls, and software. Using SystemVision, designers enjoy a modeling environment that can handle all of these disciplines simultaneously.
During the verification phase of the design, simulation technologies can again be employed to verify intended system operation. It is a common mistake to completely design a system and then attempt to use simulation to verify whether or not it will work correctly. Simulation should be considered an integral part of the entire design phase and continue well into the manufacturing phase.
Virtual system-level integration and verification, with ties back to the original executable specification, is primarily an exercise in providing a rich tool integration environment. System integration can begin before physical hardware is available by creating a system model that incorporates a combination of the various technologies. This may include mechanical, magnetic, hydraulic, or thermal effects, or any other technology that can be described using algebraic or differential equations. Though these benefits are obviously of high value, they will only accrue for automotive system design when a large number of designers, working at the system and component levels and across the spectrum of engineering disciplines, begin to use system modeling techniques.
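The kind of multi-technology model described above can be sketched with a classic example: a DC motor, whose differential equations couple the electrical domain (armature circuit) to the mechanical domain (rotor inertia). A simple forward-Euler integration loop stands in for a real simulator kernel, and all parameter values are assumptions chosen for the illustration.

```python
# Illustrative multi-domain system model: electrical and mechanical
# equations of a DC motor solved together, as a simulator kernel would.
#   electrical:  L di/dt = V - R*i - Ke*w
#   mechanical:  J dw/dt = Kt*i - B*w
R, L, KT, KE = 1.0, 0.5e-3, 0.05, 0.05  # ohms, H, N*m/A, V*s/rad (assumed)
J, B = 1e-4, 1e-5                        # kg*m^2, N*m*s/rad (assumed)
V_SUPPLY = 12.0                          # volts

i, w, dt = 0.0, 0.0, 1e-5                # current, speed, time step
for _ in range(100_000):                 # simulate 1 second
    di = (V_SUPPLY - R * i - KE * w) / L  # electrical domain
    dw = (KT * i - B * w) / J             # mechanical domain
    i += di * dt                          # forward-Euler update
    w += dw * dt

print(f"steady state: current = {i:.3f} A, speed = {w:.1f} rad/s")
```

Because both domains are just simultaneous differential equations, a single solver handles them together, which is exactly the property that lets a system model absorb thermal, hydraulic, or magnetic effects in the same way.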