DO-178C Best Practices for Avionics Development

DO-178C Best Practices – For Engineers and Managers

Practice: we’ve all engaged in it: piano, math, golf, flying … Usually practice involves a modicum of coaching, self-help, and repetition. In avionics development, however, there is little time for “practice”; instead, everything counts. And the result has little margin for error: schedules, budgets, and particularly safety are all on the line. How, then, can “practice” be reconciled with “avionics development”? The best answer is to understand the breadth of worldwide development and glean the best knowledge and solutions from the aviation ecosystem. Welcome to DO-178C Best Practices.

In flying, there are tradeoffs between payload, range, speed, and cost. The vast breadth of aircraft types for sale today reflects the simple fact that different buyers prioritize these tradeoffs differently. However, the variations in avionics software development practices are much more constrained: everyone wants to minimize the following attributes:

  • Cost
  • Schedule
  • Risk
  • Defects
  • Re-use Difficulty
  • Certification Roadblocks

The following pages provide the DO-178C Best Practices which can minimize all six of these important attributes in your development.

Certain good avionics software development practices are self-evident.  As with improving human health, where educated people know that improved diet, exercise, sleep, and stress relief are all “best practices,” the obvious good software practices include defect prevention, experienced developers, automated testing, and fewer changes.  This paper isn’t about the obvious, as it is assumed the reader is educated by virtue of making it to this page.  Instead, the DO-178C Best Practices identified herein are subtler and considerably “less practiced”.

The following figure summarizes the Top 10 not-always-obvious DO-178C Best Practices:

Improved LLR Detail

“Requirements are the foundation of good engineering.  Detailed requirements are the foundation of great engineering.”

Smarter researchers than this author long ago showed that most software defects are due to weak requirements.  In The Mythical Man-Month, Brooks opined that assumptions were a leading cause of software defects.  DO-178C was intentionally strengthened over its predecessor DO-178B to help ensure acceptable requirements, including via DO-178C’s mandate to trace structural coverage analysis to requirements-based tests (RBT).  Remember:  DO-178C doesn’t provide strict requirements standards, but for DAL A, B, and C, the developer must.  Those standards should define the scope and detail associated with High-Level Requirements (HLRs) and Low-Level Requirements (LLRs).  Ideally the Requirements Standard will include examples of HLRs versus LLRs.  Requirements review checklists should likewise contain ample criteria for evaluating the level of detail within low-level requirements.
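To make the point concrete, here is a minimal sketch of what a detailed LLR enables.  The requirement ID, the pitch-rate bound, and the function below are all hypothetical, invented only for illustration: a detailed LLR states the exact range, units, and invalid-input behavior, so the implementation and its tests follow directly.

```python
import math

# Hypothetical LLR (illustrative only, not from any real project):
# "LLR-PR-01: The pitch-rate limiter shall clamp the commanded pitch rate
#  to the range [-12.0, +12.0] deg/s; NaN inputs shall yield 0.0 deg/s."
PITCH_RATE_LIMIT_DEG_S = 12.0  # bound stated explicitly in the LLR

def limit_pitch_rate(cmd_deg_s: float) -> float:
    """Clamp a commanded pitch rate per the hypothetical LLR-PR-01."""
    if math.isnan(cmd_deg_s):
        return 0.0  # invalid-input behavior is defined by the LLR, not guessed
    return max(-PITCH_RATE_LIMIT_DEG_S, min(PITCH_RATE_LIMIT_DEG_S, cmd_deg_s))
```

A vaguer LLR (“the pitch rate shall be limited”) would leave the bound, the units, and the NaN case to developer assumption, exactly the defect source Brooks warned about.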

Parallel Test Case Definition

If a Tester cannot unambiguously understand the meaning of a software requirement, how could the developer?

DO-178C is agnostic regarding cost and schedule:  the developer is freely allowed to be behind schedule and over budget.  While transition criteria must be explicitly defined for all software engineering phases, most companies still define their test cases after the software is written.  Great companies, however, define test cases before code is written.  Why?  Because it’s better to prevent errors than to detect them during testing.  If a Tester cannot unambiguously understand the meaning of a software requirement, how could the developer?  Good companies verify requirements independently by having the software tester define test cases as part of the requirements review, before any code is written.  Requirements ambiguities or incompleteness are corrected earlier, yielding fewer software defects and expedited testing.

Automated Regression & CBT

In the non-critical software world, testing is a “great idea”.  Purchasing an exotic car or yacht can also seem like a great idea at acquisition time, as this author has personally experienced.  However, unlike luxury cars and yachts, software testing should not be considered a luxury but rather a necessity. DO-178C requires a variety of necessary testing, with increased rigor per increased criticality.  The basic types of testing are depicted below:

DO-178C requires regression analysis, whereby software updates are assessed for potential impact to previously tested software, with mandatory retest wherever potential impact exists.  Over the project life, and certainly over the product life, more time will be spent on testing than on development.  Many consider software testing to be the largest line-item expense in DO-178C.  Devoting upfront time to developing a test automation framework can therefore provide the single largest cost reduction.  And continuous-based testing (CBT), which automatically retests changes as they occur, is the best means to meet DO-178C’s regression objectives.  Why?  By continuously retesting all software, the regression analysis is greatly simplified:  just rerun all the tests at the press of a button.  Voilà.
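The “press a button” idea can be sketched in a few lines: a harness that discovers every test and reruns all of them on each change.  The two placeholder tests below are invented stand-ins for a real requirements-based suite; production projects would use an established framework rather than this toy runner.

```python
import sys

# Two trivial placeholder tests standing in for a full requirements-based suite.
def test_limiter_nominal():
    assert min(12.0, 5.0) == 5.0

def test_limiter_saturates():
    assert min(12.0, 20.0) == 12.0

def run_full_regression():
    """Rerun every test_* function in this module and collect the results."""
    module = sys.modules[__name__]
    results = {}
    for name in sorted(dir(module)):
        if name.startswith("test_"):
            try:
                getattr(module, name)()      # execute the test function
                results[name] = "PASS"
            except AssertionError:
                results[name] = "FAIL"
    return results
```

Because the full suite reruns on every change, the regression-analysis argument reduces to “all tests were re-executed,” rather than a manual impact assessment for each modification.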

Implement Testing Standards

Requirements Standard.  Design Standard.  Coding Standard.  Testing Standard … Wait, there ISN’T a Testing Standard?!?

DO-178C explicitly requires standards for DAL A, B, and C.  Which standards?  Requirements, Design, and Code.  Why doesn’t DO-178C require a verification or testing standard?  Supposedly there should be less variation within testing, compared to the preceding lifecycle phases, which are admittedly more variable between companies and projects.  No one has ever accused DO-178C of requiring too few documents; given its traditional waterfall basis (inherited from DO-178A decades prior), ample documents are required already.  However, efficient companies recognize that verification is an expensive and somewhat subjective activity best managed via a Software Test Standard.  Since it is not formally required, it would not have to be approved or even submitted.  What would such a hypothetical Software Test Standard cover?  At a minimum, the following should be included, as excerpted from AFuzion’s basic 2-day DO-178C training class provided to over 23,500 engineers worldwide:

  • Description of requirements-based testing (RBT) to obtain structural coverage;
  • Details regarding traceability granularity for test procedures and test cases;
  • Explanations of structural coverage assessment per applicable DAL(s);
  • Definition of robustness testing, as applied to requirements and code (per applicable DALs);
  • If DAL A, explanations of applicable MC/DC and source-to-binary correlation;
  • Coupling analysis practices, including the role of code and design reviews;
  • Performance-based testing criteria.
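For the robustness item above, a Software Test Standard could specify the probe values every numeric input must receive.  The helper below is a hypothetical sketch of such a criterion (the epsilon value and probe set are assumptions, not anything DO-178C itself prescribes):

```python
import math

def robustness_inputs(lo: float, hi: float, epsilon: float = 0.001):
    """Generate robustness probes for a numeric input with valid range [lo, hi]."""
    return [
        lo - epsilon,           # just below the valid range
        lo,                     # lower boundary
        hi,                     # upper boundary
        hi + epsilon,           # just above the valid range
        math.inf, -math.inf,    # extreme values
        math.nan,               # invalid value
    ]
```

Codifying the probe set in the standard makes robustness testing repeatable across testers and projects, instead of depending on each tester’s intuition about which abnormal inputs matter.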

DO-178C Optimal Engineering Route

While the DO-178C engineering path is seemingly vague, experience from hundreds of avionics DO-178C software projects affirms that the following software lifecycle is optimal for DO-178C:

Figure:  Optimal DO-178C Engineering Route per AFuzion

Parallel Traceability/Transition Audits

“Why do it right the first time when it’s fun to keep doing it over and over…”  – Anonymous

Where the amateur athlete focuses on the end result, the professional instead focuses on optimizing the technique, since the end result depends upon that technique.  Amateur and professional DO-178C software engineers both know that minimizing defects is a goal, but the professional knows that technique matters: in avionics, that technique is best summarized via DO-178C’s traceability and transition criteria.  While the amateur avionics team assesses traceability and transition criteria at the end, e.g. at DO-178C’s SOI-4, the experienced team instead deploys proactive SQA and tools to monitor bi-directional traceability continuously.  Emphasize audits of transition criteria early, fix process shortfalls, and record the audit results.  Remember, each type of DO-178C artifact review constitutes a “transition”:  engineers must follow the defined transition criteria, and QA must audit to assess process conformance.  For information on DO-178C audits and auditing companies, see here:  http://certegic.com/certification-resources/auditing/
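Continuous bi-directional traceability monitoring amounts to two simple checks: every requirement must trace to at least one test, and every test must trace back to a requirement.  The sketch below uses invented requirement and test-case IDs; real projects would pull this data from their requirements-management tool.

```python
# Hypothetical trace data; all requirement and test-case IDs are invented.
REQ_TO_TESTS = {
    "HLR-001": ["TC-101", "TC-102"],
    "HLR-002": ["TC-103"],
    "HLR-003": [],            # untested requirement -> audit finding
}
ALL_TESTS = {"TC-101", "TC-102", "TC-103", "TC-999"}  # TC-999 traces nowhere

def audit_traceability(req_to_tests, all_tests):
    """Return (untested requirements, orphan tests) as two sorted lists."""
    traced = {t for tests in req_to_tests.values() for t in tests}
    untested_reqs = sorted(r for r, tests in req_to_tests.items() if not tests)
    orphan_tests = sorted(all_tests - traced)
    return untested_reqs, orphan_tests
```

Run on every check-in, a check like this surfaces gaps the day they appear, rather than at SOI-4 when fixing them is most expensive.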

An example of a DO-178C Software Code Review transition is depicted below: ensure all inputs and outputs are properly utilized, under configuration management (CM), and referenced in the DO-178C review results:

Figure:  Optimal DO-178C Code Review Transitions per AFuzion

DO-178C Requirements Decomposition

DO-178C requires sequential requirements decomposition.  The figure below depicts the relationship of DO-178C software requirements to ARP4761A Safety Requirements, ARP4754A Aircraft/System Requirements, and DO-254 Hardware Requirements:

Figure:  Optimal DO-178C Requirements Decomposition per AFuzion

These top 10 DO-178C Best Practices will typically yield a 15-20% cost reduction (download the rest of this free AFuzion DO-178C Best Practices paper to read further).

To download the remaining 9 pages of this technical AFuzion DO-178C Best Practices whitepaper, please use the link below:

Free: Download Remaining 10+ Page Paper Here