Anyone developing in the automotive sector knows the stamps on every specification: QM, ASIL-A, ASIL-B, ASIL-C, ASIL-D. The acronym stands for "Automotive Safety Integrity Level" and comes from ISO 26262, the standard for functional safety in road vehicles. Between QM and ASIL-D lies a universe of process overhead, documentation volume, and development cost.
In practice, ASIL-B is the most common level for production ECUs. Engine controls, brake electronics, camera preprocessing, battery management: many of these fall under ASIL-B. And that's exactly where things get concrete for the C developer.
This article describes what ASIL-B has actually meant in projects I've worked on, beyond quoting the standard: not theory, but the stumbling blocks and what matters in an audit.
What the standard requires for ASIL-B
ISO 26262-6 (Product Development at the Software Level) defines tables with "highly recommended", "recommended", and "no recommendation" per activity and ASIL level. For ASIL-B, this means in short:
- Coding guidelines: MISRA-C is practically mandatory (Table 1 of the standard: "highly recommended")
- Static code analysis: highly recommended
- Code review: highly recommended, with appropriately qualified reviewers
- Unit tests with branch coverage: highly recommended (statement coverage alone is not enough)
- Integration tests: highly recommended
- Error injection: recommended
- Defensive programming: highly recommended
And, importantly, every tool used that can influence the code (compiler, code generator, test framework) must be qualified or classified according to ISO 26262-8. More on that later.
MISRA-C: The underestimated gatekeeper
MISRA-C is not the C developer's enemy, even if the approximately 170 rules and directives of the current MISRA-C:2012 feel overwhelming at first. Most rules are simply reasonable: no implicit conversions, no backward gotos, no variable name shadowing, no dynamic memory.
What makes things difficult in practice is something else: MISRA distinguishes between Mandatory, Required, and Advisory rules. Mandatory rules permit no deviation at all, and every deviation from the others needs a formal deviation record: a documented, reviewed exception signed off by the safety manager.
"In one project, we had about 400 deviations at closure. Every single one needed a justification, an analysis of the safety impact, and a counter-signature. That's not paper-pushing; that's real work."
Typical traps:
- Rule 10.x (implicit conversions): Almost unavoidable in bit manipulation. Every line like `uint8_t a = b & 0x0F;` gets critically reviewed.
- Rule 11.x (pointer conversions): Hardware access almost always requires pointer casting. Solvable, but requires documentation.
- Rule 21.3 (no dynamic memory): Often surprising. Even static structures with pointers are sometimes flagged when their lifecycle isn't cleanly documented.
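The Rule 10.x friction can usually be resolved with an explicit cast instead of a deviation record. A minimal sketch (the helper name is invented for illustration, not from a real project): the `&` operands are promoted to `int`, and the result is implicitly narrowed back to `uint8_t`; making that narrowing explicit documents the intent and satisfies most checkers.

```c
#include <stdint.h>

/* Hypothetical helper: MISRA-C:2012 Rule 10.3 flags the implicit
 * narrowing in `uint8_t a = b & 0x0F;`. The explicit cast makes the
 * intended truncation visible to the reviewer and to the checker. */
static uint8_t low_nibble(uint16_t b)
{
    return (uint8_t)(b & 0x0FU); /* explicit, documented narrowing */
}
```

Depending on the checker's interpretation of the essential type model, additional intermediate casts may still be required; this only sketches the general pattern.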
Practical recommendation
Anyone planning ASIL-B code from scratch should think MISRA from the beginning. Making an existing codebase MISRA-compliant retroactively is three to five times more expensive than doing it right the first time. Tools like Polyspace, LDRA, or Helix QAC check automatically, but no rule set is 100% automatically checkable: some rules require human judgment.
Static analysis is more than a green checkmark
The standard requires static analysis, and this is often misunderstood as "run the MISRA checker". In fact, ISO 26262 means significantly more:
- Control flow analysis: Find all unreachable code paths.
- Data flow analysis: Find variables that are used before being initialized.
- Abstract interpretation: Find potential array overruns, divisions by zero, integer overflows.
- Semantic analysis: Find race conditions in interrupts, data dependencies across module boundaries.
Tools like Polyspace Code Prover make the first three points formally provable, not just heuristically: using abstract interpretation, the tool considers every possible execution path and colors each operation green (proven safe), red (proven error), or orange (unproven). Orange is the enemy: every single such spot must be manually analyzed.
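To make "orange" concrete, here is a minimal sketch (function and names are mine, not from a real project): viewed in isolation, a prover cannot know that callers guarantee `count > 0`, so the division is flagged as a possible division by zero. A local guard turns the finding into something provable.

```c
#include <stdint.h>

/* Illustrative sketch of a typical "orange" finding: without the guard,
 * `sum / count` is unproven because count might be zero from the
 * prover's point of view. The guard makes the precondition local. */
static uint32_t average(const uint32_t values[], uint32_t count)
{
    uint32_t sum = 0U;
    for (uint32_t i = 0U; i < count; i++) {
        sum += values[i];
    }
    if (count == 0U) {
        return 0U; /* documented fallback; no division by zero possible */
    }
    return sum / count; /* provably safe here: count > 0 */
}
```

Whether the guard or a documented caller contract is the right fix is exactly the judgment call each orange spot demands.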
From practice
A Polyspace run over a medium-sized engine control component can easily take several hours. The analysis typically identifies 50 to 200 orange spots per 10,000 lines of code. Each must be assessed: is this a real bug, or does the tool simply lack context?
Most orange spots turn out to be false positives. But every run also finds two or three real bugs that neither code review nor unit testing would have caught.
Test coverage: statement coverage is not enough
ASIL-B requires, per ISO 26262-6 Table 12, at least branch coverage at the unit level. For ASIL-C and ASIL-D, MC/DC coverage (Modified Condition/Decision Coverage) is added.
The difference matters:
| Coverage type | What is checked | ASIL level |
|---|---|---|
| Statement coverage | Each statement is executed at least once | QM, A |
| Branch coverage | Each branch (if/else) is taken with both outcomes | B |
| MC/DC | Each sub-condition is shown to independently influence the decision outcome | C, D |
A small example to illustrate:
if (sensor_ok && (rpm > 1000)) {
    trigger_action();
}
For statement coverage, one test case is enough where both conditions are true and trigger_action() is called.
For branch coverage, you additionally need a test case where the overall condition is false, regardless of which of the two parts tips the balance.
For MC/DC, you need to show that both sub-conditions influence the result independently. In this case, three test cases are required (the minimum is N+1 for N conditions). For combinations with five or six sub-conditions, constructing input vectors that isolate each condition becomes genuinely laborious.
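The three MC/DC vectors for this decision can be written down directly. A sketch, with the decision factored into a hypothetical predicate (the function name is mine) so each vector is testable:

```c
#include <stdbool.h>
#include <stdint.h>

/* The decision from the example above, factored out for testing. */
static bool action_required(bool sensor_ok, uint16_t rpm)
{
    return sensor_ok && (rpm > 1000U);
}

/* Minimal MC/DC set, three vectors for two conditions:
 *   (true,  1500) -> true    baseline: both sub-conditions true
 *   (false, 1500) -> false   only sensor_ok changed: it flips the result
 *   (true,   500) -> false   only rpm changed: it flips the result     */
```

The pairs (1,2) and (1,3) each differ in exactly one sub-condition while the outcome flips, which is precisely the independence MC/DC demands.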
Practical consequence
ASIL-B unit tests are extensive but manageable. Tools like Cantata, VectorCAST, or Testwell CTC++ instrument the code and measure the achieved coverage during test execution. You typically need 3 to 5 test cases per function.
The jump to ASIL-D (lane keeping assist, airbag deployment) is where it gets painful: 8 to 20 test cases per function, many with carefully constructed input values to close MC/DC gaps.
Defensive programming: what does that mean concretely?
ISO 26262 requires defensive programming without saying exactly what it is. In automotive practice, several patterns have become established:
Check input parameters
void set_rpm(uint16_t rpm)
{
    if (rpm > MAX_RPM) {
        rpm = MAX_RPM; /* clamp instead of failing */
        /* Optional: fault memory entry */
    }
    /* ... */
}
Don't trigger an assertion failure: an assert that halts the software can bring the vehicle to a halt with it. In ASIL-B code, faulty input is usually clamped or masked and optionally logged.
Runtime plausibility checks
uint32_t read_temperature_sensor(void)
{
    uint32_t t_raw = adc_read(TEMP_CHANNEL);

    if ((t_raw < TEMP_MIN_RAW) || (t_raw > TEMP_MAX_RAW)) {
        set_fault(FAULT_TEMP_SENSOR);
        return TEMP_DEFAULT; /* fallback value */
    }
    return t_raw;
}
No infinite loops
Every polling loop (except the main task scheduler's while(1)) needs a timeout counter. Every do..while needs a guaranteed exit condition.
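A minimal sketch of the timeout-counter pattern. All names are invented, and the hardware side is stubbed out so the pattern is self-contained; in a real driver, `spi_transfer_done()` would query a status register.

```c
#include <stdbool.h>
#include <stdint.h>

#define SPI_TIMEOUT_CYCLES 1000U /* assumption: budget sized for the bus */
#define FAULT_SPI_TIMEOUT  7U    /* illustrative fault code */

/* Test stubs standing in for the real HAL (not a real API). */
static uint32_t g_polls_until_done; /* 0 = transfer never completes */
static uint32_t g_last_fault;

static bool spi_transfer_done(void)
{
    if (g_polls_until_done > 0U) {
        g_polls_until_done--;
        return (g_polls_until_done == 0U);
    }
    return false;
}

static void set_fault(uint32_t code) { g_last_fault = code; }

/* The pattern itself: a bounded wait instead of an open-ended loop. */
static bool wait_for_transfer(void)
{
    uint32_t retries = 0U;
    while (!spi_transfer_done()) {
        retries++;
        if (retries >= SPI_TIMEOUT_CYCLES) {
            set_fault(FAULT_SPI_TIMEOUT);
            return false; /* clean exit path instead of hanging */
        }
    }
    return true;
}

/* Helper for exercising the pattern against the stub. */
static bool simulate(uint32_t polls_until_done)
{
    g_polls_until_done = polls_until_done;
    return wait_for_transfer();
}
```

The key property: every exit from the loop is explicit, and the failure path lands in the fault memory instead of in an endless spin.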
Watchdog not just at start and end
A typical beginner mistake is triggering the watchdog only before and after the main task. Better: trigger it inside long operations, with multiple trigger points, so that a stuck task is actually detected.
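A sketch of the multi-trigger-point pattern (all names invented; the trigger window must be sized to the concrete watchdog period, which the `16U` here only stands in for):

```c
#include <stdint.h>

/* Stubs standing in for the real watchdog and worker (not a real API). */
static uint32_t g_wdg_triggers;

static void wdg_trigger(void)         { g_wdg_triggers++; }
static void process_block(uint32_t i) { (void)i; /* real work here */ }

/* The pattern: the watchdog is served *during* the long operation,
 * so a hang inside the loop is caught within one watchdog period. */
static void process_all_blocks(uint32_t n_blocks)
{
    for (uint32_t i = 0U; i < n_blocks; i++) {
        process_block(i);
        if ((i % 16U) == 0U) { /* assumption: window fits the WDG period */
            wdg_trigger();
        }
    }
}

/* Helper for exercising the pattern against the stub. */
static uint32_t triggers_for(uint32_t n_blocks)
{
    g_wdg_triggers = 0U;
    process_all_blocks(n_blocks);
    return g_wdg_triggers;
}
```

If the loop instead hangs inside `process_block()`, the watchdog expires within one period rather than only after the whole task misses its deadline.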
Tool qualification: the often overlooked effort
Every tool whose output flows into the product must be assessed per ISO 26262-8. This concerns:
- Compilers (GCC, IAR, Green Hills)
- Code generators (Simulink Embedded Coder, TargetLink)
- Assemblers, linkers, debuggers
- Static analysis tools
- Test frameworks
- Version control (Git, PTC Integrity)
Classification is done via TCL (Tool Confidence Level) 1, 2, or 3. A compiler typically reaches TCL3, the highest level, and must either be qualified (by the vendor, with a safety manual) or secured through compensating measures (reverse compilation, disassembly comparison).
Cost driver number one
Underestimating tool qualification is the most common cost driver in ASIL projects. Discovering mid-development that the chosen compiler has no qualified build means either switching (million-euro effort) or introducing massive compensating measures.
Practical recommendation: Before the first line of code, list the toolchain and request qualification documents from the vendor. That's not a formality β it's project-critical.
From a real project
I remember an automotive supplier, traditionally a mechanics house, that had decided to develop the control software for its park-lock actuator in-house. The software team was based in Detroit, USA. Development proceeded on schedule for months, until shortly before the milestone delivery to the European customer, the question of the compiler safety manual suddenly came up.
The answer: the C compiler used in Detroit was not qualified for ISO 26262. No TCL rating from the vendor, no safety manual, no documented list of known compiler defects relevant to safety requirements. The panic was enormous. The park lock is not a secondary feature; it is safety-critical, because unintended release of the park lock in a vehicle standing on a slope can lead to unwanted vehicle movement.
What was eventually done was the classic compensating-measures package: disassembly comparison of selected key functions, additional integration tests with increased fault injection, documented tool-usage analysis. All because at the start of the project nobody had asked the simple question: "Is our compiler actually qualified for the ASIL we need to deliver?"
The lesson: If you're a mechanical house developing software for the first time, don't treat tool qualification as a detail. And if you distribute development across multiple geographic locations, make sure the end customer's process requirements are known and practiced at every site.
What concretely distinguishes ASIL-B from QM
Many beginners ask: "What's the difference between QM code and ASIL-B code? The same C code does the same thing." Technically, that's true. Organizationally, it's night and day:
| Activity | QM | ASIL-B |
|---|---|---|
| Requirements traceability | recommended | mandatory (tool-supported) |
| Code review | recommended | mandatory, documented, defined reviewer qualification |
| MISRA-C compliance | nice-to-have | mandatory, formal deviations |
| Unit test coverage | statement | branch |
| Tool qualification | not required | mandatory |
| Hardware-level fault injection | no | recommended |
| Documentation per LOC | 1× | 5–10× |
The factor in the last row is no exaggeration: the documentation produced around the actual code in an ASIL-B project exceeds the code volume many times over. Safety plan, safety case, HAZOP analysis, FMEA, validation plan, verification plan, integration plan, test reports: each of these documents is not a fill-in form but a substantial piece of work in its own right.
What I recommend to newcomers
- Read the standard. Not cover to cover, but ISO 26262-6 (software development) and -8 (supporting processes) are basic equipment. If you only ever read your project specification, you won't understand why it looks the way it does.
- Introduce MISRA-C early. On day one of the project, not after the first integration test.
- Fix the toolchain at project start. Compiler version, build server, static analysis tool: these are decisions that can barely be reversed later.
- Tests are mandatory, not optional. Those who save testing "for the end of the project" won't deliver the project. Tests are written together with the code.
- Establish a review culture. Reviews are not a formality. A safety auditor examines review protocols carefully: were real questions asked, or just checkmarks set?
- Requirements are the anchor. Without clean, linkable requirements, there is no safety case. DOORS or Polarion are the standard here.
Conclusion
ASIL-B is not hard; it's extensive. Those who don't see the added value perceive the process as harassment. Those who have once experienced how static analysis finds a real race condition that would have caused an accident in the field understand why the standard is so detailed.
In 35 years of automotive development, I've learned: quality doesn't come from testing at the end, but from process, tools, and discipline throughout development. ISO 26262 is less an obstacle than a checklist of things I would expect of good automotive code even without the standard.
The biggest leap for beginners is not the code β it's understanding that ASIL-B is quality, and quality has a price: in hours, in documentation, and in the commitment of every single step. In return, you get code you can rely on. And in the automotive world, that's worth the price.
Planning an ASIL-B project or looking for support?
Whether embedded C development, MISRA-C introduction, static analysis, or test automation: I support your team on safety-critical automotive projects. Initial consultation free of charge.