1. Introduction
Software for space applications must meet unique and formidable requirements. Hard real-time deadlines, a constrained target execution environment with limited storage capacity, and distributed functionality between ground and on-board systems are some of the challenges, with little margin for error. The software needs to work correctly from the outset, without safety or security defects, and the source code needs to be amenable to maintenance over the system’s lifetime (which may extend over decades) as requirements evolve.
To provide a common approach to addressing these challenges, the European Cooperation for Space Standardization (ECSS) was formed in the mid-1990s in a joint effort conducted by the European Space Agency (ESA), individual national space organizations, and industrial partners. As stated in [1]:
The European Cooperation for Space Standardization (ECSS) is an initiative established to develop a coherent, single set of user-friendly standards for use in all European space activities.
The resulting set of standards, available from the ECSS web portal [2], addresses space activities as a whole and complements the relevant country-specific standards.
The ECSS standards specify requirements that must be satisfied (although project-specific tailoring is allowed) and fall into three categories:
Space engineering (the "–E" series),
Space product assurance (the "–Q" series), and
Space project management (the "–M" series).
This document focuses on two specific standards:
ECSS‑E‑ST‑40C Rev. 1 (Space engineering / Software) [3], and
ECSS‑Q‑ST‑80C Rev. 2 (Space product assurance / Software product assurance) [4]
and shows how the Ada and SPARK languages, together with AdaCore's product and services offerings, can help space software suppliers comply with these standards. Unless noted otherwise, all references to ECSS‑E‑ST‑40C and ECSS‑Q‑ST‑80C in this document relate to these cited editions of the standards.
AdaCore has a long and successful history supporting developers of space software, and the company has proven experience and expertise in qualification under ECSS‑E‑ST‑40C and ECSS‑Q‑ST‑80C. Examples include:
The ZFP (Zero Footprint) minimal run-time library for Ada on LEON2 ELF, qualified at criticality category B, for the aerospace company AVIO [5].
The Ravenscar SFP (Small Footprint) QUAL run-time library for Ada on LEON2 and LEON3 boards, prequalified at criticality category B, for ESA [6].
An important update in the 2025 versions of ECSS‑E‑ST‑40C and ECSS‑Q‑ST‑80C is the explicit attention paid to security issues. Memory-safe languages like Ada and SPARK, and formal analysis tools such as SPARK Pro, help reduce the effort in demonstrating security properties in space software.
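As an illustration of this point (a minimal sketch, not drawn from the standards; the package and subprogram names are hypothetical), a SPARK subprogram can carry contracts that a tool such as SPARK Pro attempts to prove statically, demonstrating properties such as the absence of integer overflow or out-of-range access:

```ada
--  Hypothetical example: a saturating counter whose contracts can be
--  discharged by SPARK analysis, guaranteeing absence of overflow.
package Counters with SPARK_Mode is

   Max_Count : constant := 1000;
   subtype Count_Type is Natural range 0 .. Max_Count;

   procedure Increment (C : in out Count_Type) with
     Post => (if C'Old < Max_Count then C = C'Old + 1 else C = Max_Count);

end Counters;

package body Counters with SPARK_Mode is

   procedure Increment (C : in out Count_Type) is
   begin
      if C < Max_Count then
         C := C + 1;  --  Provably cannot overflow: C < Max_Count here
      end if;
   end Increment;

end Counters;
```

A successful proof of such contracts provides evidence that can feed directly into the security and dependability arguments the standards require.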
The remainder of this chapter summarizes the ECSS‑E‑ST‑40C and ECSS‑Q‑ST‑80C standards, and the subsequent chapters have the following content:
Programming Languages for Space Software describes the Ada and SPARK programming languages and relates their software engineering support to the relevant sections / requirements in the two standards.
Analogously, Tools for Space Software Development presents AdaCore's various software development and verification toolsuites and relates their functionality to the relevant sections / requirements in the two standards.
In the other direction, Compliance with ECSS-E-ST-40C surveys the individual requirements in ECSS‑E‑ST‑40C and shows how a large number of them can be met by a software supplier through Ada, SPARK, and/or specific AdaCore products.
Compliance with ECSS-Q-ST-80C does likewise for the requirements in ECSS‑Q‑ST‑80C.
For ease of reference, the Abbreviations chapter contains a table of acronyms and initialisms used in this document, and the Bibliography chapter lists the various resources cited.
Although this document is focused on specific ECSS standards, the Programming Languages for Space Software and Tools for Space Software Development chapters explain how the Ada and SPARK languages / technologies and AdaCore's products benefit software development in general for large-scale safety-critical systems. These chapters may thus be applicable to software that has to comply with regulatory standards in other domains.
1.1. ECSS-E-ST-40C: Space engineering / Software
As stated in ECSS‑E‑ST‑40C ([3], p. 11):
"This Standard covers all aspects of space software engineering including requirements definition, design, production, verification and validation, transfer, operations and maintenance."
"It defines the scope of the space software engineering processes and its interfaces with management and product assurance, which are addressed in the Management (–M) and Product assurance (–Q) branches of the ECSS System, and explains how they apply in the software engineering processes."
ECSS‑E‑ST‑40C defines the following space system software engineering processes:
Software-related systems requirements process (§ 4.2.2)
This process links the system and software levels and "establishes the functional and the performance requirements baseline (including the interface requirement specification) (RB) of the software development" ([3], p. 27).
Software management process (§ 4.2.3)
This process "tailors the M standards for software-specific issues" and produces "a software development plan including the life cycle description, activities description, milestones and outputs, the techniques to be used, and the risks identification" ([3], pp. 27, 28). It covers the joint review process, interface management, and technical budget and margin management.
Software requirements and architecture engineering process (§ 4.2.4)
This process comprises software requirements analysis (based on system requirements) and a resulting software architecture design. Activities associated with the latter include selection of a design method, selection of a computational model for real-time software, description of software behavior, development and documentation of the software interfaces, and definition of methods and tools for software intended for reuse.
Software design and implementation engineering process (§ 4.2.5)
This process covers the detailed design of the software items (including an analysis of the dynamic model showing how issues such as storage leakage and corrupted shared data are avoided), coding, testing, and integration.
Software validation process (§ 4.2.6)
Software validation entails "software product testing against both the technical specification and the requirements baseline" and "confirm[ing] that the technical specification and the requirements baseline functions and performances are correctly and completely implemented in the final product" ([3], p. 29).
Software delivery and acceptance process (§ 4.2.7)
This process "prepares the software product for delivery and testing in its operational environment" ([3], p. 29).
Software verification process (§ 4.2.8)
Software verification "confirm[s] that adequate specifications and inputs exist for every activity and that the outputs of the activities are correct and consistent with the specifications and inputs. This process is concurrent with all the previous processes." ([3], p. 30)
Software operation process (§ 4.2.9)
This process involves the activities needed to ensure that the software remains operational for its users; these include "mainly the helpdesk and the link between the users, the developers or maintainers, and the customer." ([3], p. 30)
Software maintenance process (§ 4.2.10)
This process "covers software product modification to code or associated documentation for correcting an error, a problem or implementing an improvement or adaptation." ([3], p. 31)
Software security process (§ 4.2.11)
This process "is supported by a software security analysis that is systematically maintained at different points in the lifecycle of the software.... The software security analysis is used to ensure that security risks are properly addressed.... It is also used to assess and drive the design, implementation and operation of secure software." ([3], p. 32)
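As an example of the design-level concerns raised in § 4.2.5, avoidance of corrupted shared data can be demonstrated in Ada by encapsulating the data in a protected object, for which the language itself enforces mutually exclusive access. The following sketch uses hypothetical names and is only illustrative:

```ada
--  Hypothetical sketch: shared telemetry data protected against race
--  conditions by an Ada protected object. Mutual exclusion is enforced
--  by the language; with no entries, the object is also compatible
--  with the Ravenscar tasking profile.
package Telemetry is

   type Reading is record
      Time_Stamp : Natural := 0;
      Value      : Integer := 0;
   end record;

   protected Store is
      procedure Put (R : Reading);   --  Writers get exclusive access
      function  Get return Reading;  --  Readers see a consistent record
   private
      Current : Reading;
   end Store;

end Telemetry;

package body Telemetry is

   protected body Store is
      procedure Put (R : Reading) is
      begin
         Current := R;
      end Put;

      function Get return Reading is
      begin
         return Current;
      end Get;
   end Store;

end Telemetry;
```

Because tasks can never read a half-updated `Reading`, an analysis of the dynamic model can cite the protected object as the mechanism preventing data corruption.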
The standard specifies the requirements associated with each of these processes and defines the expected output for each requirement. The expected output identifies three entities:
the relevant destination file,
the DRL (Document Requirements List) item(s) within that file where the requirement is addressed, and
the review that will assess whether the requirement is met.
The files in question are the RB (Requirements Baseline), TS (Technical Specification), DDF (Design Definition File), DJF (Design Justification File), MGT (Management File), MF (Maintenance File), OP (Operational Plan), and PAF (Product Assurance File).
The reviews are the SRR (System Requirements Review), PDR (Preliminary Design Review), CDR (Critical Design Review), QR (Qualification Review), AR (Acceptance Review), and ORR (Operational Readiness Review).
The table below, derived from Table A-1 in Annex A of ECSS‑E‑ST‑40C, shows the association between files, DRL items, and reviews. Cells with "E" indicate requirements from ECSS‑E‑ST‑40C, and cells with "Q" are the contributions from ECSS‑Q‑ST‑80C.

| File | DRL Item | SRR | PDR | CDR | QR | AR | ORR |
|---|---|---|---|---|---|---|---|
| RB | Software system specification (SSS) | E | | | | | |
| RB | Interface requirements document (IRD) | E | | | | | |
| RB | Safety and dependability analysis results for lower level suppliers | E Q | | | | | |
| TS | Software requirements specification (SRS) | | E | | | | |
| TS | Software interface control document (ICD) | | E | E | | | |
| DDF | Software design document (SDD) | | E | E | | | |
| DDF | Software configuration file (SCF) | | E | E | E Q | E | E Q |
| DDF | Software release document (SRelD) | | | | E | E | |
| DDF | Software user manual (SUM) | | | E | E | E | |
| DDF | Software source code and media labels | | | E | | | |
| DDF | Software product and media labels | | | | E | E | E |
| DDF | Training material | | | | E | | |
| DJF | Software verification plan (SVerP) | | E | | | | |
| DJF | Software validation plan (SValP) | | E | | | | |
| DJF | Independent software verification and validation plan | E Q | E | | | | |
| DJF | Software integration test plan (SUITP) | | E | E | | | |
| DJF | Software unit test plan (SUITP) | | | E | | | |
| DJF | Software validation specification (SVS) with respect to TS | | | E | | | |
| DJF | Software validation specification (SVS) with respect to RB | | | | E | E | |
| DJF | Acceptance test plan | | | | E | E | |
| DJF | Software unit test report | | | E | | | |
| DJF | Software integration test report | | | E | | | |
| DJF | Software validation report with respect to TS | | | E | | | |
| DJF | Software validation report with respect to RB | | | | E | E | |
| DJF | Acceptance test report | | | | | E | |
| DJF | Installation report | | | | | E | |
| DJF | Software verification report (SVR) | E | E | E | E | E | E Q |
| DJF | Independent software verification and validation report | | E Q | E Q | E Q | E Q | E |
| DJF | Software reuse file (SRF) | E Q | E | E | | | |
| DJF | Software problems reports and nonconformance reports | E Q | E Q | E Q | E Q | E Q | E Q |
| DJF | Joint review reports | E | E | E | E | E | |
| DJF | Justification of selection of operational ground equipment and support services | E Q | E Q | | | | |
| MGT | Software development plan (SDP) | E | E | | | | |
| MGT | Software review plan (SRevP) | E | E | | | | |
| MGT | Software configuration management plan | E | E | | | | |
| MGT | Training plan | E Q | | | | | |
| MGT | Interface management procedures | E | | | | | |
| MGT | Identification of NRB SW and members | E Q | | | | | |
| MGT | Procurement data | E Q | E Q | | | | |
| MF | Maintenance plan | | | E | E | E | |
| MF | Maintenance records | Q | Q | Q | E Q | E Q | E Q |
| MF | SPR and NCR, Modification analysis report, Problem analysis report, Modification documentation, Baseline for change, Joint review reports | | | | | | |
| MF | Migration plan and notification | | | | | | |
| MF | Retirement plan and notification | | | | | | |
| OP | Software operation support plan | | | | | | E |
| OP | Operational testing results | | | | | | E |
| OP | SPR and NCR, User's request record, Post operation review report | | | | | | E |
| PAF | Software product assurance plan (SPAP) | E Q | E Q | E Q | E Q | E Q | E Q |
| PAF | Software product assurance requirements for suppliers | E Q | Q | Q | Q | Q | Q |
| PAF | Audit plan and schedule | E Q | Q | Q | Q | Q | Q |
| PAF | Review and inspection plans or procedures | Q | Q | Q | Q | Q | Q |
| PAF | Procedures and standards | Q | E Q | Q | Q | Q | Q |
| PAF | Modelling and design standards | E Q | E Q | Q | Q | Q | Q |
| PAF | Coding standards and description of tools | Q | E Q | Q | Q | Q | Q |
| PAF | Software problem reporting procedure | Q | E Q | Q | Q | Q | Q |
| PAF | Software dependability and safety analysis report, Criticality classification of software components | Q | E Q | E Q | E Q | E Q | Q |
| PAF | Software product assurance report | Q | Q | Q | Q | Q | Q |
| PAF | Software product assurance milestone report (SPAMR) | E Q | E Q | E Q | E Q | E Q | E Q |
| PAF | Statement of compliance with test plans and procedures | Q | Q | E Q | E Q | E Q | E Q |
| PAF | Records of training and experience | Q | Q | Q | Q | Q | Q |
| PAF | (Preliminary) alert information | Q | Q | Q | Q | Q | Q |
| PAF | Results of preaward audits and assessments, and of procurement sources | Q | Q | Q | Q | Q | Q |
| PAF | Software process assessment plan | Q | Q | Q | Q | Q | Q |
| PAF | Software process assessment records | Q | Q | Q | Q | Q | Q |
| PAF | Review and inspection reports | Q | Q | Q | Q | Q | Q |
| PAF | Receiving inspection report | E Q | E Q | E Q | E Q | Q | Q |
| PAF | Input to product assurance plan for systems operation | Q | Q | Q | Q | Q | E Q |
1.2. ECSS-Q-ST-80C: Space product assurance / Software product assurance
The ECSS‑Q‑ST‑80C standard defines software product assurance requirements for the development and maintenance of space software systems, including non-deliverable software that affects the quality of the deliverable product. As stated in [4], p. 20:
The objectives of software product assurance are to provide adequate confidence to the customer and to the supplier that the developed or procured/reused software satisfies its requirements throughout the system's lifetime. In particular, that the software is developed to perform properly, securely, and safely in its operational environment, meeting the project's agreed quality objectives.
The requirements apply throughout the software lifecycle and cover a range of activities, including organizational responsibilities, process assessment, development environment selection, and product verification. The specific set of requirements that need to be met can be tailored based on several factors:
Dependability and safety aspects, as determined by the software criticality category,
Software development constraints, for example the type of development (database vs. real-time), or
Product quality / business objectives as specified by the customer.
ECSS‑Q‑ST‑80C defines requirements in the following areas:
Software product assurance programme implementation
This set of activities includes organizational aspects, product assurance management, risk management and critical item control, supplier selection and control, procurement, tools and supporting environment selection, and assessment and improvement process.
Software process assurance
These activities comprise software life cycle management; requirements applicable to all software engineering processes (e.g., documentation, safety analysis, handling of critical software, configuration management, metrics, verification, reuse, and automatic code generation); and requirements applicable to individual software engineering processes or activities (e.g., requirements analysis, architecture and design, coding, testing and validation, delivery and acceptance, operations, and maintenance).
Software product quality assurance
These activities comprise product quality objectives and metrication; product quality requirements; software intended for reuse; standard ground hardware and services for operational system; and firmware.
As with ECSS‑E‑ST‑40C, the expected output for each requirement identifies the destination file, the DRL items within that file, and the review(s) that assess compliance with the requirement. The table above includes this information for the requirements in ECSS‑Q‑ST‑80C.
The ECSS standards recognize that software systems (and different components of the same software system) may vary in their effects on system safety. The standards accordingly define several criticality categories, denoted A (most critical) to D, which correspond closely to the software levels in the airborne standard DO-178C/ED-12C.
1.3. ECSS Handbooks
Supplementing the normative standards in the –E, –Q, and –M series, ECSS has published a set of handbooks offering additional support, guidance and practical discussion about the standards and their requirements. They indicate how a customer (the organization acquiring the space software or system) will likely interpret the standards and thus how they will expect the supplier to comply.
Several handbooks complement ECSS‑E‑ST‑40C, including:
ECSS‑E‑HB‑40A (Software engineering handbook) [7]
This document provides guidance, explanations, and examples on how to satisfy the ECSS‑E‑ST‑40C requirements in practice.
ECSS‑E‑HB‑40‑01A (Agile software development handbook) [8]
This handbook shows how to reconcile agile development practices with the formal ECSS space software engineering processes.
ECSS‑E‑HB‑40‑02A (Machine learning handbook) [9]
This handbook provides guidelines on how to create reliable machine learning functions and how to perform verification and validation in light of the specifics of machine learning development practices.
Several handbooks complement ECSS-Q-ST-80C, including:
ECSS‑Q‑HB‑80‑01A (Reuse of existing software) [10]
This handbook offers guidance on software reuse (including software tools) and also presents a Tool Qualification Level (TQL) concept based on DO-178C/ED-12C [11], DO-330/ED-215 [12], and ISO 26262 [13].
ECSS‑Q‑HB‑80‑03A Rev. 1 (Software dependability and safety) [14]
This handbook focuses on analysis techniques such as Failure Mode and Effects Analysis (FMEA) and their application to software; i.e., how to analyze what happens in the case of a failure due to software. It covers topics such as defensive programming and prevention of failure propagation.
ECSS‑Q‑HB‑80‑04A (Software metrication program definition and implementation) [15]
This handbook offers recommendations on organizing and implementing a metrication program for space software projects.