III. MEETING CHALLENGES FOR A SUCCESSFUL FCAS
Given the large number of domains it encompasses, the technological leaps to be made, the duration of its development and its nature as a project conducted in international cooperation, the FCAS programme is a challenge both for public authorities and for the manufacturers responsible for bringing it to completion.
A. GETTING THE PROJECT RIGHT
All project stakeholders strongly agree: the FCAS is not a combat aircraft project but a "system of systems" project, of which the aircraft is just one part. While it is a central part, it is not the most innovative: the innovation lies less in the platforms themselves than in what connects them and drives them toward collaborative combat (the combat cloud, artificial intelligence, sensors, etc.). Moreover, while many countries have combat aircraft programmes, very few have programmes for an air combat system of systems.
In any case, it is important to keep this "system of systems" nature in mind at each step of the project since its added value will mainly come from its ability to embody the concept of collaborative combat in a series of innovative platforms and technologies. Furthermore, it will be necessary to look well beyond 2040 to 2080: the FCAS must not be obsolete as soon as it is commissioned.
1. Putting artificial intelligence and autonomy capabilities at the heart of the FCAS's development
The HLCORD, the single document that expresses all of the FCAS's operational requirements, stipulates that the NGF (Next Generation Fighter) may have a pilot on board or be "optionally" piloted.
For the moment, as already discussed, the role of drones and remote carriers, while important, is designed to be subordinate to the NGF, which will, in principle, be manned. For the most advanced drones, the model used is the "Loyal Wingman", i.e. a drone that accompanies or precedes manned combat aircraft to carry out a variety of tasks: strikes, surveillance, electronic attack, decoying and battle damage assessment.
Russia (the Sukhoi S-70 Okhotnik-B), the United States (the Kratos XQ-58A Valkyrie, under the "Low-Cost Attritable Strike Unmanned Aerial System Demonstration" programme launched in July 2016, and Boeing's "Loyal Wingman" programme developed in partnership with the Royal Australian Air Force) and the United Kingdom (the "Lightweight Affordable Novel Combat Aircraft" project, which began with the award of three initial design contracts to Blue Bear Systems Research, Boeing Defence UK and Callen-Lenz) are all developing such "loyal wingman" programmes.
Whether used as a remote carrier or as a sensor, the "loyal wingman" must remain under the control of manned aircraft.
Flying a drone on its own, without support from a manned aircraft, runs up against the fragility of the satellite data link, which can be hacked or jammed in contested areas. 19 ( * ) The drone would then become uncontrollable. By remaining integrated within a formation directed by the manned aircraft, the drone can use a local network that, while it too can be jammed, is much more resilient.
However, even in this configuration, artificial intelligence is essential to offload the pilot's simplest tasks, aid decision-making and avoid the loss of drones should the data link be severed.
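To illustrate the kind of logic at stake, here is a minimal sketch, in Python, of a lost-link fallback policy for a drone flying within a patrol. All names and behaviours are hypothetical illustrations, not part of the FCAS design: the point is simply that on total link loss the drone executes a pre-authorised recovery behaviour instead of continuing, or inventing, a mission.

    from dataclasses import dataclass
    from enum import Enum, auto

    class Behaviour(Enum):
        FOLLOW_LEADER = auto()    # nominal: directed by the manned aircraft
        DEGRADED_FOLLOW = auto()  # satellite backup only: reduced task set
        RETURN_TO_RALLY = auto()  # all links lost: fly to pre-briefed point

    @dataclass
    class DataLinks:
        local_ok: bool       # intra-patrol network to the manned aircraft
        satellite_ok: bool   # long-range backup link

    def select_behaviour(links: DataLinks) -> Behaviour:
        """Choose a behaviour from current link health; on total link
        loss, fall back to a recovery behaviour authorised by a human
        before take-off, so the drone is neither lost nor uncontrolled."""
        if links.local_ok:
            return Behaviour.FOLLOW_LEADER
        if links.satellite_ok:
            return Behaviour.DEGRADED_FOLLOW
        return Behaviour.RETURN_TO_RALLY

    # Both links jammed: the drone recovers instead of becoming uncontrollable.
    assert select_behaviour(DataLinks(False, False)) is Behaviour.RETURN_TO_RALLY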
The United States is quickly developing this use of artificial intelligence to support manned combat aircraft. AI is installed on a loyal wingman, in a combat aircraft transformed into a drone or directly in the cockpit of the manned aircraft. The Skyborg programme is studying the pairing of a manned combat aircraft (which, in this case, could be an F-35 or the new, modernised F-15EX) with a "wingman" drone equipped with artificial intelligence, which could be an XQ-58 Valkyrie.
Furthermore, one way to overcome the difficulties of a long-distance data link is to consider a fully autonomous drone that does not depend on it. However, this option raises two issues:
- an ethical and legal issue (see box below).
The issues raised by Lethal Autonomous Weapon (LAW) systems
On current armed drones, targeting and firing are always performed by one or more human operators. It is this idea of a "human in the loop" that justifies drones falling within the same legal framework as other weapons systems.
Conversely, "Lethal Autonomous Weapon" (LAW) systems, which do not exist yet, but which are the subject of scientific and military research, raise legal and ethical issues of another magnitude.
Some fear that the risk of armed conflict and the use of military violence will increase with the deployment of truly autonomous systems: LAWs would eliminate the psychological barriers to using lethal force, which is not the case for drones piloted by a human being (hence the post-traumatic stress disorder sometimes seen in drone pilots).
There are also doubts about LAWs' ability to respect the principles of international humanitarian law (or the law of armed conflict). Given these concerns, a European Parliament resolution recommends prohibiting the development of LAWs.
Article 36 of Additional Protocol I to the Geneva Conventions stipulates that a new weapon may be studied, developed, acquired or adopted only after it has been determined whether its use would be contrary to the Protocol or to any other rule of international law.
More specifically, respecting the key principles of international humanitarian law (IHL) (distinguishing between combatants and civilians, proportionality and minimising collateral damage, precaution) requires an exercise of judgment that, for now, remains exclusive to human beings. In certain environments, distinguishing between civilians and soldiers can be very difficult: it may be necessary to analyse a person's behaviour to decide whether it is in some way "good" or "bad", and it seems unlikely that algorithms will be able to make such a judgment. Conversely, certain legal experts highlight the risk that human soldiers might violate the principles of IHL under stress or fear, emotions to which LAWs are not subject. However, considering that the current rules suffice because robots are better able to respect them than humans is tantamount to claiming that it is ethically equivalent whether a human or a robot does the killing. On the contrary, the development of autonomous systems can be seen as a paradigm shift that requires new rules, since IHL was conceived to be applied by human beings.
Furthermore, since we do not fear (or at least fear less) for a robot's life, we can imagine that robots will ultimately be subject to stricter rules for the use of force than humans: for example, requiring that a person display a weapon or behave unequivocally aggressively before being considered a combatant and targeted, or requiring that the robot incapacitate its human target rather than kill them.
In 2014, the first informal meeting of experts on LAWs was held as part of the Convention on Certain Conventional Weapons (CCW) at the UN in Geneva, with France as chair. The third edition was held in April 2016 with 95 countries, the ICRC, numerous NGOs and experts attending. At these meetings, France's representatives committed to not developing or using LAWs unless "these systems demonstrated perfect compliance with international law". However, they also considered that any preventive prohibition on developing LAWs would be premature. Since the debate focuses on the "significant human control" to which LAWs must be subject, the expression "appropriate human involvement", somewhat vague but acceptable to all the participants, was adopted at the German delegation's initiative. Finally, some questioned the coherence of the very concept of LAWs: for armed forces, doesn't total autonomy, with no link to a human operator, run counter to military command's overriding need for operational control?
In any case, these discussions in a multilateral framework led to the creation of a group of governmental experts, whose work could lead to a code of good conduct and best practices for LAWs. According to certain experts, this code could include the following (an illustrative sketch of how some of these safeguards might translate into software follows the list):
- limiting the use of LAWs to objectives that are military in nature (and not by location, destination, or use), to certain contexts (non-urban environments and those with few inhabitants) and only in cases where a person cannot make the decision themselves (subsidiarity),
- ensuring that the autonomous mode can be reversed,
- programming the "benefit of the doubt" into the LAW,
- recording the LAWs' actions,
- training LAW operators in IHL.
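Purely as an illustration, the sketch below (in Python) encodes three of these proposed safeguards: restricting engagement to objectives that are military by nature in non-urban contexts, programming the "benefit of the doubt" as a default refusal under uncertainty, and recording every action. Everything here is hypothetical and schematic, not a description or proposal of an actual weapon system.

    from dataclasses import dataclass, field

    @dataclass
    class Target:
        kind: str          # e.g. "military_by_nature"
        environment: str   # e.g. "non_urban" or "urban"
        confidence: float  # certainty of the classification, 0.0 to 1.0

    @dataclass
    class SafeguardedSystem:
        autonomous_mode: bool = True
        log: list = field(default_factory=list)  # recording of all actions

        def revert_to_human_control(self) -> None:
            # Safeguard: the autonomous mode must always be reversible.
            self.autonomous_mode = False
            self.log.append("autonomy reverted to human control")

        def may_engage(self, t: Target) -> bool:
            # Safeguards: military-by-nature objectives only, non-urban
            # contexts only, and the "benefit of the doubt": residual
            # uncertainty defaults to no engagement.
            ok = (self.autonomous_mode
                  and t.kind == "military_by_nature"
                  and t.environment == "non_urban"
                  and t.confidence >= 0.99)
            self.log.append((t.kind, t.environment, t.confidence, ok))
            return ok

    system = SafeguardedSystem()
    assert not system.may_engage(Target("military_by_nature", "urban", 1.0))
    assert not system.may_engage(Target("military_by_nature", "non_urban", 0.9))
    system.revert_to_human_control()
    assert not system.may_engage(Target("military_by_nature", "non_urban", 1.0))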
On 5 April 2019 at the DATAIA institute in Saclay, Armed Forces Minister Florence Parly presented the new strategy on artificial intelligence and defence. During this presentation, she discussed the ethical and legal aspect, stating that "France refuses to entrust the decision of life or death to a machine that could act fully autonomously and beyond any human control. Such systems are fundamentally contrary to all our principles. They have no operational interest for a country whose armies respect international law, and we will not deploy them". 20 ( * ) The minister added, "we will develop artificial intelligence for defence according to three main principles: respect for international law, the maintenance of sufficient human control, and the permanence of command responsibility". 21 ( * )
However, we should note that one of the minister's arguments is that artificial intelligence could contribute to a better application of international humanitarian law: "I will cite, for example, the proportionality of the response, distinguishing between combatants and non-combatants, and minimising collateral damage. Artificial intelligence will not change any of these guidelines. On the contrary, artificial intelligence will help us to continue to respect them in future conflicts."
Furthermore, the Armed Forces Ministry has set up a Defence Ethics Committee, which the minister has tasked with proposing, by summer 2020, initial guidelines for applying artificial intelligence to weapons systems.
The ethical and legal issues continue to be the subject of international discussions which, however, do not appear to be yielding significant results for the moment.
- the issue of tactical efficacy. Some think that AI cannot be more effective than humans in an environment heavily contested by sophisticated access-denial systems or, more generally, in "tactically fluid" situations where there are many choices and decisions to make.
In her speech, the Armed Forces Minister warned of AI's potential fragility: "The handling of learning data, the cognitive biases that people pass on to algorithms, systems that are disoriented and disabled by simple pieces of tape, systems that can be hacked remotely: the risk factors that we must evaluate and control from the design stage are immense."
However, these difficulties, while very real, could largely be overcome by 2040. Remember that, in 2016, the experienced US Air Force instructor Gene Lee could not win a single engagement in an air combat simulation against the artificial intelligence "Alpha", running on a low-cost, low-powered computer. In the same vein, a project of the Air Force Research Laboratory (AFRL) seeks to pit an AI-equipped drone (which may initially be an F-16) against a manned fighter by July 2021. This project echoes a statement by Elon Musk, CEO of Tesla, that a fighter equipped with AI could defeat a manned fighter without difficulty. 22 ( * )
FCAS project participants are very aware that one of the challenges they must meet is to integrate with one another systems that are 1) piloted by a person on board the aircraft, 2) remotely piloted and 3) autonomous. This is one of the main issues of the FCAS and one of the main topics of research for the project's partners, and it must be possible, to a certain extent, to vary the proportion of these three elements in the "finished product" according to the needs that will exist from 2040 onwards and in the following decades.
The choice regarding AI is not whether or not it is present; it is a question of degree. When a missile is approaching an aircraft at Mach 4, the pilot does not have time to make a decision. The reaction is necessarily automated, somewhat like ABS taking control of a car's brakes when the driver brakes hard before an obstacle. In such cases, there is no point in having a person "in the loop". The position defended by the Armed Forces Ministry and shared by the mission is that people remain in the loop overall: a machine can be autonomous, but it cannot create or change a mission without asking permission from a human being. Humans must retain command responsibility and be able to ensure respect for international humanitarian law. Many tasks of self-defence, automatic targeting or overall trajectory calculation can be automated without infringing on these three principles, which, according to the Armed Forces Ministry, should not be seen as self-imposed operational limitations.
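As a purely illustrative sketch of this "question of degree", the following Python fragment separates time-critical reflex actions, which may run automatically (the ABS analogy), from mission-level changes, which always require human authorisation. The action categories are hypothetical illustrations, not a description of the FCAS architecture.

    from enum import Enum, auto

    class ActionClass(Enum):
        REFLEX_DEFENCE = auto()     # e.g. decoys against an incoming missile
        TRAJECTORY_ADJUST = auto()  # local route recalculation
        MISSION_CHANGE = auto()     # new objective, new target, new rules

    # Reflex and trajectory tasks may be delegated; mission changes never are.
    AUTONOMOUS_OK = {ActionClass.REFLEX_DEFENCE, ActionClass.TRAJECTORY_ADJUST}

    def authorise(action: ActionClass, human_approval: bool) -> bool:
        """Allow automation where reaction time rules out a human decision,
        while creating or changing a mission always requires a human."""
        if action in AUTONOMOUS_OK:
            return True
        return human_approval

    # A missile warning is handled automatically; a retasking is not.
    assert authorise(ActionClass.REFLEX_DEFENCE, human_approval=False)
    assert not authorise(ActionClass.MISSION_CHANGE, human_approval=False)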
In any case, AI will have a prominent role within the FCAS, at the very least to support pilots within the system formed by the NGWS. It therefore seems necessary to continue to invest massively in artificial intelligence, since the FCAS will necessarily make extensive use of it, even if its exact uses cannot be predicted today. In addition to the Armed Forces Ministry's artificial intelligence strategy already discussed, 23 ( * ) we should welcome the Armed Forces Minister's statement in her above-cited speech that "obviously the French Armed Forces are investing and will invest in artificial intelligence", together with her announcement of an investment of €100 million a year from 2019 to 2025 for AI. The minister also mentioned six priority areas for investment in the matter, including collaborative combat.
Given our adversaries' accelerated development of this technology, we must be ready to respond in the future to countries that do not always respect the ethical and legal standards that France and its allies respect and intend to continue to respect. Without this preparation, the French armed forces could find themselves facing enemies in the same situation as Gene Lee, or as the world's best chess players who, by all accounts, can no longer win a single game against an artificial intelligence. Meanwhile, we must continue international discussions to arrive at a clear legal framework on these issues that is consistent with our ethics and the principles of international humanitarian law.
Proposal: Consider artificial intelligence as a "transversal pillar" of the FCAS that must be developed with the broadest possible scope of application.
Resume international discussions on lethal autonomous weapons (LAW) to arrive at a clear legal framework that is consistent with ethics and the principles of international humanitarian law.
2. The crucial importance of data links and of the "combat cloud" and "sensors" pillars
The data links, whether high-speed intra-patrol links, high-speed satellite links or optical links, as well as their security and resilience to cyberattacks and jamming, will be essential. The information superiority provided by the cloud will allow for decisional superiority.
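As an illustration of what security means at the level of an individual message, the sketch below uses Python's standard hmac module to authenticate a tactical message so that an injected or tampered message is rejected. This is a generic, textbook mechanism shown under invented names, not a description of the actual FCAS data links.

    import hmac
    import hashlib

    # Shared secret distributed before the mission (key handling is
    # hypothetical and omitted here).
    PATROL_KEY = b"pre-mission-shared-secret"

    def sign(message: bytes, key: bytes = PATROL_KEY) -> bytes:
        """Compute an authentication tag for a tactical message."""
        return hmac.new(key, message, hashlib.sha256).digest()

    def verify(message: bytes, tag: bytes, key: bytes = PATROL_KEY) -> bool:
        """Accept a message only if its tag matches (constant-time compare)."""
        return hmac.compare_digest(sign(message, key), tag)

    msg = b"track:hostile;bearing:045"
    tag = sign(msg)
    assert verify(msg, tag)                    # genuine message accepted
    assert not verify(b"track:friendly", tag)  # forged message rejected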
Additionally, it is imperative that the cloud's scope be as broad as possible, thus encompassing land and naval forces. For example, close air support will need to be connected with land and naval artillery. This involves addressing the integration of the FCAS's tactical cloud and the new SCORPION Command Information System (SCIS), a regiment-to-regiment battlefield command and information system that automatically exchanges data and warnings down to the dismounted group leader level and optimises fire support requests.
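To make this interoperability requirement concrete, here is a minimal sketch of what a cross-domain fire-support request might look like as a structured message that both the FCAS combat cloud and a land system could consume. All field names and values are hypothetical illustrations; they do not reflect any actual SCORPION or FCAS message format.

    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class FireSupportRequest:
        """Illustrative cross-domain message (hypothetical fields only)."""
        requester: str     # e.g. a dismounted group leader via the SCIS
        target_grid: str   # target location, here an MGRS-style string
        effect: str        # requested effect: "suppress", "neutralise", ...
        deadline_s: int    # acceptable delay in seconds
        approved_by: str   # the human authority who validated the request

    request = FireSupportRequest(
        requester="scis/3rd-company/group-2",
        target_grid="31TCJ1234567890",
        effect="suppress",
        deadline_s=120,
        approved_by="battalion-fire-support-officer",
    )

    # Serialised once, the same request could be answered by close air
    # support from the combat cloud or by land or naval artillery.
    print(json.dumps(asdict(request), indent=2))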
All told, the FCAS's added value likely lies as much, if not more, in the combat cloud, connectivity and the interoperability architecture as in the combat aircraft and its engine. A comparison can be drawn with the possible evolution of cars: should autonomous vehicles continue to develop, the software, the connections and the cloud will probably carry more added value than the car itself. This is why we must pay the greatest attention to the "combat cloud" pillar as well as to the future "sensors" pillar, directed respectively by Airbus and Indra. In particular, the "combat cloud" pillar must allow Thales and all its defence electronics subcontractors to contribute to the very core of the FCAS.
Proposal: Give the "combat cloud" pillar the same priority as the aircraft and the engine.
Start preparing the integration of the FCAS combat cloud with the SCORPION Command Information System (SCIS) immediately.
3. Which engine for the demonstrator?
The demonstrator of the new engine will not be available before 2027, whereas the aircraft demonstrator should fly in 2025 or 2026. It is therefore expected that the demonstrator will be equipped with an improved version of the M88 until it can be replaced by a demonstration version of the new engine.
However, even this improved version may be insufficient to power a full-scale demonstrator. A demonstrator at a scale of 0.8:1, for example, could alleviate this problem. If this option is not chosen, the demonstrator could use an engine already on the market. However, this solution could pose a risk to Safran's participation in the rest of the programme: the Eurofighter's EJ200, which could be selected, is made by a consortium including Rolls-Royce (a potential competitor through the Tempest), Avio, ITP and MTU Aero Engines. When asked about this issue, Eric Trappier, CEO of Dassault Aviation, specified that an improved version of the M88 remained the main option under consideration. This solution, which complies with the initial industrial agreement, is also the mission's preferred solution.
Proposal: Equip the demonstrator planned for 2026 with the M88 engine (the Rafale engine), or a new version of it, and make the necessary investments.
4. The environmental aspect
Protecting the environment is not necessarily what first comes to mind when thinking about combat aviation, a very high-performance field that often goes hand in hand with maximum energy consumption. The FCAS's first objective is to outdo potential adversaries through superior performance. Additionally, the NGF's size and weight will very likely be greater than the Rafale's, which implies greater fuel consumption. However, the comparison is not entirely valid: we should rather compare the consumption of a current formation of Rafales with that of an NGWS formation, which will consist of as many or more platforms (taking the remote carriers into account) but, no doubt, fewer combat aircraft.
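A deliberately hypothetical back-of-the-envelope calculation makes the structure of that comparison clear; every figure below is invented for illustration and none comes from the programme.

    # Hypothetical consumption figures, in arbitrary units per platform.
    rafale_burn = 1.0          # baseline combat aircraft
    ngf_burn = 1.5             # heavier aircraft, higher consumption
    remote_carrier_burn = 0.2  # small accompanying platforms

    rafale_formation = 4 * rafale_burn                       # 4 Rafales
    ngws_formation = 2 * ngf_burn + 6 * remote_carrier_burn  # 2 NGF + 6 RCs

    print(rafale_formation)  # 4.0
    print(ngws_formation)    # 4.2 -> comparable despite a heavier NGF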
However, looking ahead to the period after 2040 and towards 2080 requires us to consider, for example, the possibility that energy may be less abundant, the need to improve energy independence, and the extension to military aviation of certain standards developed for civil aviation.
The Armed Forces Ministry has already taken this concern into consideration. Emmanuel Chiva, Director of the Agence Innovation Défense, stated 24 ( * ) that "the issues of energy and the environment are topics of research in their own right. Research work specifically on hydrogen is under way, with a project for a hydrogen station for drones...AID does not ignore the challenges to the climate and is involved in the same way as the ministry as a whole".
Furthermore, on 3 July 2020 the Armed Forces Minister presented the ministry's energy strategy, which lays out efforts to save energy in all fields in order to reduce the armies' energy bills, with the additional goal of reducing their dependency on fuel supplies, which sometimes rely on uncertain sea routes.
Finally, in the field of aviation, studies are already under way on the use of biofuel. Airbus, Air France, Safran, Total and Suez Environnement signed a green energy commitment with the government on aviation biofuels in December 2017. The goal is to introduce a proportion of biofuel alongside kerosene. These biofuels should be able to meet the requirements of military aviation. 25 ( * ) Work is also being done to reduce the electric power that aircraft require.
As with other defence programmes, it seems necessary to take this aspect into account right from the start of the FCAS project's implementation.
Proposal: Include environmental concerns from the start of the FCAS programme while seeking the best performance possible.
* 19 Unlike uncontested areas such as the Sahel and Sahara, where MALE drones can fly over the theatre of operations without really being threatened.
* 20 Ministry for the Armed Forces, "Speech by Florence Parly, Armed Forces Minister: Artificial Intelligence and Defence", April 2019
https://www.defense.gouv.fr/salle-de-presse/discours/discours-de-florence-parly/discours-de-florence-parly-ministre-des-armees_intelligence-artificielle-et-defense
* 21 Note that in late 2019, Airbus and the Fraunhofer Institute for Communication, Information Processing and Ergonomics (FKIE, Bonn, Germany) set up an independent group of experts whose mission is to define the responsible use of new technologies and propose "ethical and international legal safeguards" within the FCAS.
* 22 Russian President Vladimir Putin also stated about AI in 2017 that "whoever becomes the leader in this field will be the master of the world", while the company Kalashnikov announced that it had developed several lethal autonomous weapons (LAW). There are similar projects in China.
* 23 Artificial Intelligence in Support of Defence, Report of the AI Task Force, September 2019.
* 24 Interviewed by Michel Cabirol, La Tribune, 11/09/2019.
* 25 However, it seems that we must exclude the idea of electric aircraft, both civilian and military. The power needed would require batteries whose weight would practically equal the weight of the aircraft itself.