EASA, Daedalean continue collaboration on concepts for AI certification

The Swiss artificial intelligence (AI) startup Daedalean is continuing its collaboration with the European Union Aviation Safety Agency (EASA) to develop guidance for the use of AI and machine learning in aviation.

[Image] EASA and Daedalean developed this W-shaped learning assurance life cycle for machine learning applications during their first IPC collaboration. (Image: EASA)

In March of this year, EASA published the results of its first innovation partnership contract (IPC) with Daedalean. Called “Concepts of Design Assurance for Neural Networks,” the report, among other things, outlines a learning assurance life cycle that could form the basis for future certification of AI systems.

Machine learning algorithms arrive at their results in ways that aren't always obvious or explainable, even to their developers, which poses considerable certification challenges compared with traditional, deterministic software. The report proposes performance and safety assessments that could provide assurance, at appropriate levels of criticality, that such algorithms are safe for use in aviation.

While many of the report’s concepts apply to machine learning algorithms broadly, it focuses specifically on deep neural networks for computer vision systems. These types of algorithms form the basis for the computer vision-based autopilot Daedalean is developing for commercial and general aviation and future eVTOL aircraft.

Now, EASA and Daedalean are taking the next step in drafting guidance for safety-critical machine learning applications. Under a new IPC signed on July 1, EASA and Daedalean will work together to refine the learning assurance concepts presented in their first report and develop usable guidance for level 1 AI/machine learning applications.

As defined by EASA’s “AI Roadmap,” level 1 applications would provide assistance and augmentation for a human crew. In the future, applications could advance to level 2 — human-machine collaboration, with the human retaining full responsibility — and level 3, essentially full autonomy.

EASA and Daedalean held a kick-off meeting for their second IPC collaboration in late July. By early next year, they aim to publish another report containing proposed high-level guidelines for neural network-based systems, including the use of safety assessments for neural networks and a practical concept of what it means for them to be “explainable.”

In a press release, Daedalean CEO and founder Luuk van Dijk said the company was pleased to continue its partnership with EASA.

“In the first project, EASA has shown a firm commitment to moving this topic forward in the interest of all of the aviation industry,” he stated. “And for the coming IPC, the agency stepped up with an increased team of strong specialists. We will be taking a concrete in-flight system through a certification trajectory to find the open questions, with the intent to provide concrete usable answers.”
