Putting design and production tests in their place
Recent high-profile product recalls and failures have shown that even the most experienced and well-funded companies can run into the limitations of product testing when testing embedded hardware.
A product goes through two fundamental types of test: testing of the design itself, and production testing of each unit built. These two categories have different aims, and this article sets out to explore the differences.
Increasing pressure on timescales and the relentless pace of new technology introductions can pressurise companies into reducing design testing to a minimum. Conversely, production testing can be over-engineered to no great effect, increasing production costs and giving a false sense of security: the design ends up being tested in production, rather than the quality of repetition of that design.
Within the realm of testing embedded hardware, the aim of design testing is primarily to check that the design meets the requirements specification. Typically an engineer will check each feature of the design one at a time, finishing with some form of system testing. The software team will then take over the design and report back any bugs that they find whilst they develop their code. Frequently large portions of the hardware can't be tested without software support, so the bring-up of the hardware is a combined effort. However, because the engineer might have only a matter of weeks to test a design that may be in service for years, and manufactured in large quantities, there are aspects of design testing that need to be done to build confidence, e.g.:
- Temperature and humidity range tests.
- Voltage input range and quality: Does the device work across the full voltage range it can be exposed to?
- EMC (ESD, immunity, emissions): This is a requirement for CE marking but is more than just box-ticking. It can reveal design weaknesses, particularly with regard to immunity and ESD, which could cause the type of random field errors that engineers hate.
- Mechanical fit: As well as the first PCB fitting in the case, will the production versions all fit when tolerances of case and PCB are taken into account?
- Signal integrity: Searching a board for signals that look wrong may seem like a scattergun approach, but it can reveal mid-level voltages, or dodgy clock waveforms that mostly work, but will cause problems under stress.
Thorough design testing should yield a design that will be robust across the full range of component tolerances it can be built with, and in environments that the device is expected to inhabit. This design can then be put into production, where a production test regime is required.
Testing embedded hardware – production testing
Production testing aims to test that each produced board is the same as the original “gold” board. It is different from design testing in that the assumption is that the design is good. Ideal production testing is fast with high coverage. There are a variety of tools in the production test tool-box which do different things.
|Method|What it tests|Advantages|Disadvantages|
|---|---|---|---|
|Automatic Optical Inspection (AOI) / X-ray|Joint quality, component placement|Quick. Can pick up problems that may degrade over time, or that other methods miss, e.g. dry joints|Limited to what is visible|
|JTAG|Connectivity, memory tests|Quick. Thorough where coverage is good|Many designs don't support much JTAG test coverage|
|Bed of nails|Connectivity, including analogue|Quick. Thorough. Requires little operator expertise|Big impact on the PCB layout if high coverage is to be achieved, which renders it unsuitable for high-speed designs|
|Flying probe|Connectivity, including analogue|Thorough. Doesn't require the PCB to be highly tailored for it|Slow. Expensive|
|Functional|The function of the product|Cheap. Most similar to real operation of the product|Coverage can be low. May require operator expertise|
|Burn-in|Early-life failures|Potentially picks up early-life failures that might escape the other tests; based on the bathtub curve of component reliability, which shows that a significant proportion of field failures happen early on, it attempts to weed these out|Expensive in terms of factory real-estate. The number of faults picked up is very limited in most cases. Can reduce product lifetime. Frequently of dubious value|
Some of these test methods complement each other and are designed for different purposes. AOI, for example, is rarely used as the sole test, but can be combined with another method; burn-in is only ever an add-on, not a replacement for any other test method. Some form of functional test is normally carried out on the assembled product, even if every component has already been tested in another way.
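A functional test, at its core, runs a series of pass/fail checks against the assembled board and rejects any unit that fails one. The sketch below illustrates that shape only; the check names, limits, and measurements are hypothetical stand-ins (stubbed values rather than a real board interface, which would involve serial links, ADCs, or test fixtures):

```python
# Minimal sketch of a functional test runner. The checks are hypothetical:
# real ones would read measurements from the board under test.

def check_power_rail():
    # Placeholder: in practice, read an ADC and compare against limits.
    measured_mv = 3310                   # stubbed measurement
    return 3135 <= measured_mv <= 3465   # e.g. 3.3 V rail, +/- 5 % limits

def check_comms():
    # Placeholder: in practice, send a command over a serial link
    # and verify the reply from the firmware.
    reply = "OK"                         # stubbed reply
    return reply == "OK"

def run_functional_test(checks):
    """Run every named check; the board passes only if all checks pass."""
    results = {name: fn() for name, fn in checks}
    return all(results.values()), results

CHECKS = [("power_rail", check_power_rail), ("comms", check_comms)]

if __name__ == "__main__":
    passed, results = run_functional_test(CHECKS)
    print("PASS" if passed else "FAIL", results)
```

In a production setting the runner would also log per-board results against a serial number, so that yield and recurring failure modes can be tracked over time.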
Production testing can be slowed by retesting components that have already been 100% tested by the supplier. If the manufacturer has confidence in the supplier, these tests are effectively redundant, provided that shipping damage can be ruled out. Likewise, production testing should not seek to test whether the design is good, e.g. by testing all the limits of voltage and temperature; that should have been covered by design testing. Such extra testing may unduly stress the board and decrease product lifetime, as well as slowing the testing down.
A matrix should be drawn up examining what is tested by each method to minimise overlaps and thus reduce test time, whilst ensuring that coverage is high.
So, when testing embedded hardware is brought under close scrutiny, it is fair to conclude that production and design testing are both required but have different objectives. Design testing needs to be thorough enough to ensure that all produced boards work reliably, whilst production testing needs to ensure that each board is an exact copy of the original.
Notes to Editors
About ByteSnap Design (www.bytesnap.co.uk)
ByteSnap Design is a specialist in innovative embedded systems development encompassing hardware and software design. ByteSnap Design has an international client list and was awarded Consultancy of the Year in The British Engineering Excellence Awards 2013 and Design Team of the Year in the Elektra awards 2011.
The consultancy also has experience of electronic circuit design, microcontroller design, Linux and embedded software development, designing hardware products ranging from PDAs to smart meters, alongside software projects ranging from Windows CE BSPs to signal processing applications.
For more media information, including images, contact Jaspal Sahota at Vitis PR on 0121 242 8048 or at email@example.com