With the rapid and continuous growth of the semiconductor industry, tools that assist engineers by simplifying and automating the design, development and integration process have propelled the development and manufacture of electronics and embedded systems over the last 35 years. The use of Electronic Design Automation (EDA) tools has been a key enabler of growing chip complexity and functionality, helping electronics-based industries to meet the ever-increasing demands of the consumer and industrial markets.
EDA is used in the majority of modern commercial and industrial electronics design. EDA tools help engineers to push the boundaries by using the latest features and functionality of the assembled components, while modelling power consumption and system performance, both of which affect the commercial viability of the design. Because electronic systems carry significant production costs, considerable effort is put into modelling and simulating the system in an EDA tool prior to going into production. Given the complexity of modern electronics, the ability to test and re-test changes to the electronic design quickly is critical.
In parallel with increasing product performance, software quality has become a major responsibility. Software Design tools perform an analogous function to EDA tools, allowing engineers to quickly, easily and visually define the static architecture and dynamic behaviour of embedded system software and applications.
The Unified Modelling Language (UML) is an example of a software design specification language (see Figure 1). EDA tools help designers to specify, verify and simulate how IC devices will operate together. In a similar way, engineers must verify that the output from the UML is viable for use through complete and regular testing using both static and dynamic approaches.
One of the benefits of using a software design specification language such as UML is the availability of tools that enable automatic generation of source code from the design language specification; however, this code still requires thorough testing. Engineers and developers need tools that give them visibility of testing completeness and that can auto-generate test cases for code snippets not covered by existing tests. Integration is a key part of the design and development process, so developers need to be able to run integration tests as easily as they run unit tests. Whilst using software design tools helps to build code faster, it does not guarantee bug-free code. Additionally, in today’s development environments, changes in the form of bug fixes may have to be integrated directly into a device’s code base from support operations. The subsequent update often needs to be deployed immediately to resolve quality and reliability issues, creating the need for a continuous integration approach.
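The coverage gap described above can be sketched in a few lines of Python. This is a hand-rolled line tracer for illustration only; the `covered_lines` helper and the toy `classify` routine (standing in for auto-generated code) are assumptions, and real coverage tools are far more capable:

```python
import sys

def covered_lines(func, *args):
    """Record which body lines of `func` execute for the given inputs
    (offsets relative to the `def` line). A minimal coverage tracer."""
    hits = set()
    code = func.__code__

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            hits.add(frame.f_lineno - code.co_firstlineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return hits

# Stand-in for generated code: one branch per sign of the input.
def classify(x):
    if x < 0:
        return "negative"
    return "non-negative"

# Suppose the only existing test calls classify(5): the negative
# branch (offset 2) never runs, flagging it as a candidate for a
# generated test case.
missing = set(range(1, 4)) - covered_lines(classify, 5)
```

Here `missing` contains the untested branch; a tool with this visibility can then auto-generate an input (e.g. a negative number) that exercises it.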
Methods 1 and 2, as shown in Figure 2, do not address the challenges of updating software with bug fixes or incremental feature enhancements. To better address this we can look at DevOps (“development” and “operations”), a term describing a collaborative method of developing and deploying software. It is an extension of Agile Test-Driven Development methodologies that adds continuous integration, continuous delivery, and rapid deployment practices. The goal of this methodology is to enable better collaboration and improve communication. A key requirement of DevOps is confidence that code changes will function as expected, so automated testing is an essential component in ensuring overall quality. Following a DevOps process can reduce time to market and the prevalence of technical debt. Key benefits delivered through DevOps are:
- Improved Quality
- Faster Time to Market
- Optimised Infrastructure
- Minimised Risk
- Shortened Release Cycles
- Lowered Costs
The majority of problems with an end product are caused by inefficient and incomplete software testing. When using automatically generated code, a view of code coverage is vital. To keep ahead in the time-to-market race, testing tools that utilise automatic test case generation (ATG) can save days of effort and, when applied to ‘fuzz testing’, mitigate the risk of a public software disaster. Some modelling tools allow users to generate test vectors by analysing previously defined requirements, which is a great first step towards ensuring quality. However, without visibility into the underlying code, this approach may miss edge cases introduced in the translation of requirements or model into source code, leaving software vulnerabilities exposed and undiscovered.
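A minimal sketch of what ATG-style fuzz testing looks like at the unit level. All names here are illustrative: `parse_id` stands in for a snippet of generated code with a documented failure mode, and `parse_id_unchecked` for a variant whose edge case a requirements-level model could miss:

```python
import random
import string

def parse_id(s):
    """Stand-in for generated code: parse a 'name:number' identifier."""
    name, _, num = s.partition(":")
    if not name or not num.isdigit():
        raise ValueError("malformed id")     # documented rejection
    return name, int(num)

def parse_id_unchecked(s):
    # Buggy variant: raises IndexError when ':' is absent from the input.
    return s.split(":")[0], int(s.split(":")[1])

def fuzz(fn, trials=1000, seed=42):
    """Feed random strings to fn; collect any input that raises
    something other than the documented ValueError."""
    rng = random.Random(seed)
    alphabet = string.ascii_letters + string.digits + ":;, "
    faults = []
    for _ in range(trials):
        s = "".join(rng.choice(alphabet)
                    for _ in range(rng.randint(0, 12)))
        try:
            fn(s)
        except ValueError:
            pass                             # expected for bad input
        except Exception as exc:
            faults.append((s, exc))          # genuine edge-case fault
    return faults
```

Running `fuzz(parse_id_unchecked)` quickly surfaces inputs without a colon, while `fuzz(parse_id)` finds nothing unexpected; the difference is exactly the kind of code-level edge case the surrounding text describes.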
Before committing the complete hardware and software design to production, engineers need to be able to run tests on a simulation of the device to verify the system’s behaviour and its compliance with the specified requirements.
Furthermore, as products develop over their life cycle with upgrades or the availability of new features, changes will need to be made. These changes have code implications, and engineers need to understand where source code changes will have knock-on effects and be able to create new tests for the new code. Using a designated software testing platform allows engineers to overlay intelligence, for example identifying the smallest set of tests that must be re-run after a change to the source code, also known as Change-Based Testing. Furthermore, if the floating-point instructions are modified in the hardware design, engineers need to be able to re-run all the tests associated with that function, and any impacted firmware that relies on this capability will need to be re-verified.
Due to the speed of development in today’s markets, synchronisation is needed between the use of Software Design tools (e.g. UML) and software testing tools to release a product that is truly market-ready. This process can be simplified by using a platform where integrations with Software Design and Continuous Integration tools are readily available.