MOSA, certification, and security challenges driving avionics software designs


April 26, 2022

John McHale

Editorial Director

Military Embedded Systems


In this issue, Military Embedded Systems hosts an online roundtable with avionics software experts, discussing military avionics technology, design trends, and challenges. The panelists discuss the complexities of multicore architectures, safety-certification challenges, securing flight-control systems, and how modular open system approach (MOSA) initiatives like The Open Group’s Future Airborne Capability Environment (FACE) Technical Standard impact military manned and unmanned avionics.

Our panelists are Dr. Benjamin Brosgol, member of the senior technical staff at AdaCore; Gary Gilliland, technical marketing manager, DDC-I; Will Keegan, chief technical officer, Lynx Software Technologies; and Alex Wilson, director of A&D solutions, Wind River.

MIL-EMBEDDED: What are the critical design challenges facing embedded software suppliers in defense avionics applications today?

BROSGOL: Computing technology has advanced dramatically over the years (faster and cheaper hardware, new programming languages, powerful integrated development environments), but the underlying design challenges for suppliers of embedded software have basically remained the same: how to build reliable, safe, secure, and efficient software that can be maintained/enhanced and ported to new platforms as requirements evolve. Exacerbating these challenges, requirements such as safety and security are system-level properties. You have to show not only that the individual components have the necessary assurance properties, but also that the component interactions do not create safety hazards or security vulnerabilities.

These would be daunting enough issues for any kind of software, but embedded avionics applications face an added challenge: they need to meet hard real-time deadlines (on the order of milliseconds), running on hardware that typically has severe storage constraints. Furthermore, a technology that helps meet one quality objective might interfere with others. A multicore architecture can improve application throughput, but demonstrating that concurrent threads of control do not incur data races, deadlocks, or other anomalous behavior requires advanced verification techniques.
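To make the data-race point concrete, here is a minimal editorial sketch (not code from any panelist's product) using POSIX threads: two threads increment a shared counter, and only the mutex-protected variant gives a deterministic result.

    /* Minimal data-race illustration using POSIX threads.
     * With USE_MUTEX set to 1 the result is always 2,000,000; with it set
     * to 0 the unsynchronized increments race and the total is typically
     * lower and varies from run to run.
     * Build: gcc -O2 -pthread race.c -o race
     */
    #include <pthread.h>
    #include <stdio.h>

    #define ITERATIONS 1000000
    #define USE_MUTEX 1          /* set to 0 to observe the race */

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    static void *worker(void *arg)
    {
        (void)arg;
        for (long i = 0; i < ITERATIONS; i++) {
    #if USE_MUTEX
            pthread_mutex_lock(&lock);
            counter++;
            pthread_mutex_unlock(&lock);
    #else
            counter++;           /* unsynchronized read-modify-write: data race */
    #endif
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t t1, t2;
        pthread_create(&t1, NULL, worker, NULL);
        pthread_create(&t2, NULL, worker, NULL);
        pthread_join(t1, NULL);
        pthread_join(t2, NULL);
        printf("counter = %ld (expected %d)\n", counter, 2 * ITERATIONS);
        return 0;
    }

Compiling the unprotected variant with a race detector such as -fsanitize=thread flags the conflicting accesses; in avionics practice this class of verification is performed with qualified static and dynamic analysis tooling.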

And last but definitely not least, avionics software developers need to do all this while realizing management goals (minimizing costs, meeting delivery schedules) and demonstrating conformance with relevant software quality standard(s) such as MIL-HDBK-516C or DO-178C.

GILLILAND: Providing safety-critical avionics based on multicore platforms presents some challenges. While these new CPUs provide increased performance and high integration of I/O capabilities, they are also very complex and share resources such as cache and memory subsystems across all cores. This sharing can cause applications running on different cores to contend for the same resources. The interference caused by this contention can lengthen an application’s execution time to the point that timing deadlines are missed, resulting in unsafe failure conditions. The system integrator must understand these interference patterns and deploy methodologies to mitigate the interference using whatever means are available (hardware features, RTOS [real-time operating system] features, integration techniques, etc.).
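To illustrate the interference effect described above, the following editorial sketch times a fixed memory-reading workload on one core while "aggressor" threads on other cores stream through large buffers. It is a Linux-hosted illustration only (it relies on the GNU pthread_setaffinity_np extension and assumes at least four cores), not a qualification method; on a certified multicore RTOS the equivalent characterization would use the platform's own partitioning and timing facilities.

    /* Rough illustration of cross-core interference: time a fixed workload
     * with and without other cores thrashing memory, then compare.
     * Linux-specific editorial sketch.
     * Build: gcc -O2 -pthread interference.c -o interference
     */
    #define _GNU_SOURCE                     /* for pthread_setaffinity_np */
    #include <pthread.h>
    #include <sched.h>
    #include <stdatomic.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define AGGRESSORS 3                    /* assumes at least 4 cores */
    #define BUF_WORDS (4 * 1024 * 1024)     /* 32 MB per aggressor (8-byte longs) */

    static atomic_int stop = 0;
    static volatile long work[1 << 20];     /* victim working set, ~8 MB */

    static void pin_to_cpu(int cpu)
    {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(cpu, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *aggressor(void *arg)
    {
        pin_to_cpu((int)(long)arg);
        long *buf = calloc(BUF_WORDS, sizeof(long));
        if (!buf)
            return NULL;
        while (!atomic_load(&stop))         /* stream through the buffer to load shared cache/DRAM */
            for (long i = 0; i < BUF_WORDS; i++)
                buf[i]++;
        free(buf);
        return NULL;
    }

    static double timed_workload(void)
    {
        struct timespec t0, t1;
        long sum = 0;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int rep = 0; rep < 50; rep++)
            for (long i = 0; i < (1L << 20); i++)
                sum += work[i];             /* volatile reads: not optimized away */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        if (sum != 0)                       /* never true; defensive only */
            puts("");
        return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    }

    int main(void)
    {
        pthread_t t[AGGRESSORS];

        pin_to_cpu(0);
        printf("baseline:         %.3f s\n", timed_workload());

        for (long i = 0; i < AGGRESSORS; i++)
            pthread_create(&t[i], NULL, aggressor, (void *)(i + 1));
        printf("under contention: %.3f s\n", timed_workload());

        atomic_store(&stop, 1);
        for (int i = 0; i < AGGRESSORS; i++)
            pthread_join(t[i], NULL);
        return 0;
    }

The gap between the two reported times is a rough proxy for the execution-time inflation that the integrator must bound and mitigate on a shared-resource multicore platform.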

Security concerns are driving avionics manufacturers to make their systems more secure. Left unchecked, cybersecurity attacks could leave the avionics infrastructure vulnerable, including systems deployed in aircraft. A generic solution to any security problem does not exist, nor would it be practical. Rather, a security solution for a given system is the product of an airworthiness and security-risk process. This process establishes the security risks, determines whether they are acceptable, defines how to mitigate those that are not, and then validates that the mitigations are effective.

KEEGAN: [The] complexity of embedded systems is increasing at rates that assurance activities cannot match. Assurance validation methods are glossing over buried hazards in COTS [commercial off-the-shelf] black boxes, and quality-assurance budgets are insufficient for unpacking the complexity. Multicore interference in real-time systems is a good example of this crisis.

Virtualization provides a recognized path for offering strong isolation between processes, but it uses a large amount of memory and can make systems more challenging to manage and update. Containers offer some benefit via more efficient memory usage and work well in IT infrastructure on (largely) homogeneous platforms, but the embedded industry is not comfortable with the security of containers, and their use across highly heterogeneous embedded platforms is challenging.

Designing for an unpredictable supply chain [is also a challenge regarding]:

  • Processor architecture choice (various factors going into picking between Intel/Arm/PowerPC/RISC-V and resulting hardware features, firmware certification approach, cost)
  • Stand-alone RTOS or mixed criticality using a hypervisor
  • Application language choice (C++, Ada, Rust, etc.)
  • 32-bit versus 64-bit (harnessing certified codebases from 32-bit world while providing a path to embracing benefits of increased memory capabilities and processing performance)

WILSON: There is always the challenge of how new technology and innovation can be applied to a highly regulated market, where safety is a primary concern. Some of the more challenging areas include:

  • Multicore processors: Although various guidance now exists regarding the path to safety certification, this is still an area of huge complexity when it comes to system-level design. The flexibility of a multicore device allows for a lot of innovation, but that must be balanced with the need to meet the guidance in A(M)C 20-193 and A(M)C 20-152A.
  • AI/ML [artificial intelligence/machine learning] are helping to drive incredible innovation, but AI, by its nature, does not currently produce code that is deterministic, and so would be challenging to certify under current regulations. AI makes use of cloud technology to scale both the data available and the processing power. This presents a further challenge when implementing AI on intelligent edge devices for avionics.
  • Many companies are going through a digital transformation process, which affects all areas of operations, including software development, deployment, operations, and lifecycle upgrades. This means that avionics software development teams are moving from older software processes to a modern agile or DevSecOps [development, security, and operations] approach. This also allows them to bring new talent into the software teams, including new graduates who have been trained on the latest processes at university. These techniques allow for greater automation and testing but will still have to fit into the regulatory requirements for safety and security.

MIL-EMBEDDED: How have modular open system approach (MOSA) initiatives like the Future Airborne Capability Environment (FACE) Technical Standard and others impacted avionics applications? What challenges remain?

BROSGOL: The FACE effort is focused on reducing the costs for new avionics systems through reuse of software components and avoidance of vendor lock-in. Conforming with the FACE Technical Standard affects a project in various ways:

  • The software needs to fit in with the reference architecture defined by the FACE Technical Standard, which partitions the system into layers (“segments”) that separate the portable components from the platform-specific components.
  • The software needs to use open standard APIs or standard programming language syntax (for C, C++, Ada, and Java) to access operating system services (a brief code sketch of this point follows the list).
  • The software needs to use FACE defined interfaces for intersegment communication, I/O, and related functionality.
  • The software needs to supply a data model for data interchanged with other components, to ensure that both sides of the communication share a consistent view.
  • The software needs to pass a conformance verification test suite, to show that it does not use APIs or language features outside the targeted subset (General Purpose, Safety Extended, Safety Base, Security).
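As a minimal illustration of the standard-API bullet above, the sketch below shows a component whose only operating-system dependencies are standard POSIX calls. The function name application_step and the 50 Hz period are invented for the example, and which POSIX (or ARINC 653) services are actually permitted depends on the FACE profile being targeted, so the specific calls are illustrative only.

    /* Editorial sketch of the "standard APIs" idea: the portable logic
     * below touches the operating system only through POSIX calls
     * (clock_gettime, nanosleep), so it carries no vendor-specific RTOS
     * dependency. Not a conformance statement.
     * Build on a POSIX host: gcc -O2 periodic.c -o periodic
     */
    #include <stdio.h>
    #include <time.h>

    #define PERIOD_NS 20000000L            /* notional 50 Hz frame */

    /* Hypothetical portable application step; stands in for real logic. */
    static void application_step(int frame)
    {
        printf("frame %d\n", frame);
    }

    int main(void)
    {
        struct timespec period = { 0, PERIOD_NS };
        struct timespec start, end;

        clock_gettime(CLOCK_MONOTONIC, &start);
        for (int frame = 0; frame < 5; frame++) {
            application_step(frame);
            nanosleep(&period, NULL);      /* standard call, no vendor extension */
        }
        clock_gettime(CLOCK_MONOTONIC, &end);

        printf("elapsed %.3f s\n",
               (end.tv_sec - start.tv_sec) + (end.tv_nsec - start.tv_nsec) / 1e9);
        return 0;
    }

Conformance verification then checks that such a component stays within the API subset of the targeted profile.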

Some challenges:

  • The portability that is demonstrated by FACE conformance comes from the use of standard APIs or language features. However, full source-code portability requires additional care (for example, not using language features that have implementation dependencies).
  • An organization intending to achieve FACE conformance for an existing software component may need to do significant refactoring, depending on the original design.
  • Defining a data model involves a variety of standards, including Open Universal Domain Description Language (Open UDDL), Object Constraint Language (OCL), and others. Avionics software developers are not necessarily familiar with these standards, which are more generally used for enterprise applications.
  • The FACE conformance verification procedures are oriented around software written in C or C++ and are not easily used for Ada or Java.

These challenges can be addressed through source-code analysis tools; training resources are available for the FACE data-modeling requirements. Work is [also] in progress to extend the FACE conformance procedures to better support Ada and Java.

GILLILAND: I believe that these initiatives have forced competitors, partners, and customers in our industry to work more closely together to come up with common solutions that benefit everyone. Ultimately, the customers will benefit the most from having a common framework to use to develop new products.

From a software perspective, in order to get everyone to work together and provide common interfaces, abstraction layers have been created. These abstractions are new to many system integrators, which creates learning curves that increase cost when adopting this new approach. These costs should be short-term, and over time the value of these initiatives should reduce the cost of fielding new technology.

KEEGAN: [They’ve] standardized system config specifications, separate BSP [board-support package] dependencies for RTOSs, and standardized low-level stack component integration.

The main benefits of FACE are in the abstraction of apps and decoupling them from a specific RTOS (as intended). As to how much it has actually influenced vendor lock-in, [that] is still in play.

The core tenet of MOSA, which is to allow designers to coalesce loosely coupled modules from various vendors, sounds great. The challenges are in the hardware/OS interface, as there are no canned solutions for this.

Certification cost is still an issue.

WILSON: The open architecture approach will always give benefits to the end user over a proprietary approach, by opening the design to competitive solutions while allowing for updates, upgrades, and lowering cost of ownership. These allow primes to choose best-in-class suppliers and be assured of relatively easy integration. Interoperability demonstrations, such as the FACE TIM [Technical Interchange Meeting], allow for industry to integrate and demonstrate capability for their customers.

Most of these standards solve the problem by having an architecture framework and defined APIs for interoperability between architecture layers. While this does enable interoperability, it does not address the challenges of performance, safety, and security certification. This means there is still a burden on the systems integrator to analyze and tune performance, and make sure that the relevant safety and security requirements are met.

In many ways, initiatives like MOSA and standards such as FACE promote collaboration across the industry and enhance the ability of the aerospace and defense ecosystem to tackle increasingly complex technical challenges and deliver more capable solutions.

MIL-EMBEDDED: What technology or standard will have the most impact on the avionics world in the next five to 10 years and why?

BROSGOL: No one standard or technology stands out; rather it is likely to be a combination.

  • Model-based engineering will play an increasing role, since it allows expressing flight-control algorithms through an application-oriented graphical representation. Using a trusted code generator (for example, one that has been qualified under DO-178C) in a model-based engineering tool can save considerable effort, since verification performed at the model level can replace or reduce verification work needed for the generated source code.
  • Static analysis tools will become more important in avionics development, since they are essential in helping to verify critical software properties (absence of references to uninitialized data, adherence to code standards, etc.). Tools based on formal analysis show particular promise, especially when security properties need to be verified. With the maturation of proof engine technology and the availability of programming languages such as SPARK, formal methods-based approaches are moving into the mainstream for critical software development and verification.
  • Dynamic analysis tools (code coverage analysis, test case generation, fuzzing) will likewise see increasing usage, since they complement static analysis for verification of high-assurance software. Fuzzing in particular is becoming a method of choice for detecting security vulnerabilities (a minimal harness sketch follows this list).
  • The FACE approach will become more widespread, as defense procurements are more frequently requiring FACE conformance. Evidence of successful component reuse will encourage increased adoption.
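As a hedged illustration of the fuzzing point above, here is a libFuzzer-style harness wrapped around a small, invented message parser (parse_message and its field layout are hypothetical). The parser trusts a length field taken from the input, which is exactly the kind of defect static analysis can flag without executing the code and fuzzing can demonstrate with a concrete crashing input.

    /* Editorial sketch of a libFuzzer harness around an invented parser.
     * Build: clang -g -O1 -fsanitize=fuzzer,address harness.c -o harness
     * Run:   ./harness
     */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Hypothetical parser: byte 0 is a message type, byte 1 a payload
     * length, remaining bytes the payload. */
    static int parse_message(const uint8_t *data, size_t size)
    {
        uint8_t payload[16];

        if (size < 2)
            return -1;

        uint8_t type = data[0];
        uint8_t len  = data[1];

        if (size < (size_t)len + 2)
            return -1;

        /* Defect: len can exceed sizeof(payload); AddressSanitizer reports
         * the stack-buffer-overflow once the fuzzer finds such an input. */
        memcpy(payload, data + 2, len);

        return (type == 0x42 && len > 0 && payload[0] == 0x01) ? 1 : 0;
    }

    /* libFuzzer entry point: called repeatedly with generated inputs. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        parse_message(data, size);
        return 0;
    }

A static analyzer can flag the unchecked memcpy bound without running anything; the fuzzer finds a concrete input (a length byte larger than the local buffer) that triggers it at runtime.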

GILLILAND: I believe that UAS [unmanned aircraft systems] will have the biggest impact on our future. Currently, [they] are used extensively in the military arena but are not widely used in civilian airspace except by special permission. As a result, they are not typically developed to the rigor required by certification authorities. Although development to high design assurance significantly increases cost, it also protects the aircraft and anything it flies over. In the long run, replacing crewed aircraft with UAS for jobs like border patrol, traffic control, surveillance, or even air taxis will be lower cost to operate and safer.

KEEGAN: AADL [Architecture Analysis & Design Language]. The AADL software architecture standard with the behavior and error model annexes can serve as a common model to support various important development and assurance activities. Examples include code and parameter generation, requirements traceability, verification of security policy, verification of timing properties, [and] hazard analysis.

We are in the nascent stages of hypervisor-related consolidation, so that has the runway to significantly influence avionics designs in the next five to 10 years. [Lastly,] the adoption of open source technology into some elements of mission-critical systems.

WILSON: AI/ML are technologies that work effectively using the scalable compute resources of the cloud. High-performance computing allows these to operate closer to the edge, such as onboard an unmanned vehicle or within manned aircraft. As these compute resources become more powerful, AI/ML software improves, and regulatory guidance for safety and security is put in place, even greater functionality can be deployed on avionics systems.

At its heart, AI requires three elements: data, a model for generating predictions, and an inference engine to apply the model to the data and reach conclusions. In an increasingly intelligent world, it is key to work with the right foundation, so that customers who have the data and the model can begin building their AI solutions.
