Figure 3 – Illustration of the evaluation of strategies in relation to scenarios. It emerges that, in the short term, the national level is the most relevant for responding to the urgent problem posed by Russia, by strengthening partnerships in a bilateral framework; this option also makes it possible to promote a common destiny based on democratic values. On the other hand, an EU-wide approach appears more relevant for confronting the information war amplified by AI, because it requires heavy investment. This convergence of efforts will be easier to establish because it can build on an existing network of cooperation. In view of the initiatives under way within the EU, this option also aims to make information technologies as intelligible as possible. The chosen strategy therefore aims to develop a French response in line with a European ambition to combat foreign digital interference, now and by 2035.
First, the adopted strategy, “Transforming the digital sword into a democratic shield”: the fight against foreign digital interference is a complex and pressing challenge, which requires a response that is both targeted and comprehensive. With this in mind, France proposes a multidimensional and coordinated strategy, in synergy with a broader European approach. This strategy aims to protect both citizens and democratic institutions against information threats, while preserving freedom of expression and the circulation of reliable information in the digital space. The ideal would obviously be to start with a strategy supported by the EU, but it is clear that there is still no consensus on this subject. At this stage, France is one of the main targets of information interference, and even if not all EU countries are affected by this problem today, they undoubtedly will be tomorrow. Experience with the development of cybersecurity at the European level illustrates the value of testing a national model before credibly promoting, with partners, a common vision within the Union. The combination of the two strategies thus takes this political reality into account, but it assumes that populism stagnates, or even declines, so that a European consensus becomes possible; otherwise (the “Erasure of borders” scenario), France would have an autonomous but resource-limited system and would need to review its alliances to gain access to the full range of digital technologies.
To translate this strategy into recommendations, the following principles are retained:
- digital technology (and its use) is the guiding principle;
- the fight against information interference requires an interministerial, multidimensional, multi-sector approach coordinated with the EU;
- the EU must develop complementary and synergistic responses supported by several nations to facilitate their deployment; and
- the involvement of French and European citizens is essential.
To deploy such a strategy, the recommendations are structured around two axes:
- Axis 1: “The Development of a French response. . .” based on three recommendations:
- Strengthen monitoring, analysis and response by relying on a fully interministerial state system, increased collaboration with the private sector and ARCOM;
- Communicate and raise awareness by informing citizens, journalists and information professionals about the extent of information interference; and
- Develop the commitment of civil society by educating on the uses of digital technology and involving citizens in the defense of the information sphere.
- Axis 2: “… combined with a European ambition” based on three recommendations:
- Cooperate and share information within the EU by promoting a collective response (this contributes to building consensus on the importance of the fight against information interference);
- Develop research and innovation to support the development of new tools and technologies to combat disinformation; and
- Carry out proactive external action in the service of strengthened international cooperation, by securing Member States’ commitment to placing the fight against information interference on the EU’s political agenda.
This strategy will be detailed in the following chapter according to the six recommendations listed above. Its consistency with the Alliance’s approach is ensured by initiatives already undertaken. Indeed, the EU is one of NATO’s main partners, and NATO recognizes the importance of cooperation to counter disinformation. NATO and the EU are intensifying their collaboration, focusing in particular on strategic communication, improved situational awareness and joint exercises. The European Centre of Excellence for Countering Hybrid Threats in Helsinki aims to strengthen the civil-military capabilities of participating countries, increasing their resilience and preparing them to face information interference. In conjunction with the EU and NATO, this centre of expertise will also have the mission of raising awareness among leaders and public opinion in Western countries; it embodies this desire for cooperation.
The implementation of this strategy also involves renunciations. Given the level of public debt, major national investments are not envisaged, particularly in the field of R&D. The mastery of new technologies for the fight against information manipulation (LMI) relies on a European strategy. The active promotion of the French model is also not retained: soft power and counter-narratives remain the bases of the French response to its competitors. As for the resilience of the population (no “Finnish” model), it relies on citizens’ discernment in their uses of digital technology and does not develop the political dimension of the response, in particular societal projects aimed at the population’s well-being that could act as a “vaccine” against disinformation. Finally, this strategy does not plan to deal with the supporters (“useful idiots”) of our adversaries. It does not call into question freedom of expression, and it is limited to actions involving, directly or indirectly, a foreign actor, whether state or not.
Finally, to assess the relevance of this response, the proposed strategy was compared with the DISinformation Analysis & Risk Management (DISARM) Blue matrix, which is used to design a response to an information attack. As a reminder, this matrix was created in 2018 to combat information manipulation campaigns, and the DISARM model uses the structure of the ATT&CK matrix used in the field of cybersecurity. There is a DISARM Red model to characterize an attack and a DISARM Blue model to design the response. The “Transforming the digital sword into a democratic shield” strategy covers the main areas of the DISARM Blue model, except for the measurement of effectiveness. Furthermore, this exercise shows that the DISARM model is perfectly suited to the tactical level, but not to the strategic level addressed by this study.
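To illustrate the kind of coverage check performed against the DISARM Blue matrix, the sketch below maps the six recommendations (A to F) to a set of response areas and lists the areas left uncovered. The area names and the mapping are illustrative placeholders chosen for this example, not the official DISARM Blue taxonomy; in this simplified mapping, measuring effectiveness is the uncovered area, consistent with the observation above.

```python
# Illustrative coverage check (placeholder names, not the official DISARM Blue taxonomy).

# Hypothetical response areas standing in for DISARM Blue counter-tactics.
BLUE_AREAS = {
    "detection", "attribution", "public_awareness", "platform_cooperation",
    "international_coordination", "research_and_tools", "measure_effectiveness",
}

# Assumed mapping of the six recommendations (A-F) to the areas they address.
RECOMMENDATIONS = {
    "A_monitor_analyse_respond": {"detection", "attribution", "platform_cooperation"},
    "B_communicate_raise_awareness": {"public_awareness"},
    "C_civil_society_engagement": {"public_awareness"},
    "D_cooperate_share_information": {"international_coordination", "platform_cooperation"},
    "E_research_and_innovation": {"research_and_tools"},
    "F_proactive_external_action": {"international_coordination"},
}

covered = set().union(*RECOMMENDATIONS.values())
gaps = BLUE_AREAS - covered
print("Covered areas:", sorted(covered))
print("Uncovered areas:", sorted(gaps))  # here: measure_effectiveness
```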
Part III: Operational Recommendations
A. Preamble
Combating informational interference is a complex challenge. The proposed strategy aims to develop a credible, multidimensional, and coordinated French response, in synergy with a global European approach, to protect citizens and democratic institutions against informational threats, guaranteeing freedom of expression and the circulation of reliable information in the digital space.
B. List of recommendations “Developing a French response . . .”
1. Recommendation A “Strengthen Monitoring, Analysis and Response”
(i) Specific objective: strengthen monitoring, analysis and response to foreign digital interference, with a state response based on interministerial consolidation, supported by increased collaboration with private players (media, social network platforms) and a strengthened regulatory authority (ARCOM).
(ii) Description of the existing system: with VIGINUM under the aegis of the SGDSN (detection, characterization), the involvement of the intelligence services in imputation and attribution, and a response and influence strategy supported by the MEAE (STRATCOM) and the MINARM (protection of our military operations), France has an organizational arsenal envied by its international partners. The implementation of the DSA, by a group of European regulators under the authority of the Commission, aims to ensure that platforms comply with the obligation to reduce the distribution of illegal content and to increase the transparency of online platform algorithms. Finally, it involves monitoring the implementation of remedies designed to limit the artificial amplification and de-hierarchization of information.
In legal terms, the law of July 29, 1881 on freedom of the press remains central. The year 2018 saw an important addition with the law of December 22, known as the “fake news law,” relating to the fight against information manipulation. It has since been supplemented by a law against foreign interference, inspired by the American Foreign Agents Registration Act (FARA) and by the United Kingdom’s National Security Act 2023, which introduces a Foreign Influence Registration Scheme (FIRS).
This Act of July 25, 2024 to prevent foreign interference in France provides for the creation of a digital register of foreign-influenced activities to be kept by the French High Authority for Transparency in Public Life (HATVP). It will be separate from the register on interest representatives (lobbies) created by the so-called “Sapin 2” law of 2016.
This register will list, after declaration to the High Authority, the activities of people acting on behalf of a “foreign principal”: foreign powers or entities or foreign political parties or groups outside the European Union. The purpose of these activities must be to influence public decision-making (particularly law-making) or the conduct of public policy, including France’s European or foreign policy. They may involve:
- communicating with elected representatives or public decision-makers (declared candidates in national or European elections, leaders of political parties, ministers, ministerial advisors or advisors to the Head of State, members of parliament, regional and departmental executives, mayors of towns with more than 20,000 inhabitants, former presidents of the Republic or ministers for five years after leaving office, etc.);
- and/or carrying out communication campaigns; and
- and/or collecting or paying money without consideration.
Foreign entities or political parties acting to promote their interests or those of a foreign state will also have to declare their activities.
Thus, any individual or legal entity, regardless of nationality, promoting the interests of a foreign power to the French public authorities or the general public, will be subject to a reporting obligation. But diplomatic and consular staff posted in France and officials of foreign states will be exempt.
This directory of foreign-influenced activities will be public and will be shared by the HATVP, the National Assembly and the Senate.
The Act also provides for sanctions: “Individuals refusing to provide the HATVP with information (on their identity, their influence, the people they approach, etc.) risk three years’ imprisonment and a 45,000 euro fine. The penalties for legal entities are more severe: a fine of 225,000 euros, a ban on receiving public aid . . .”.
The scheme will come into force no later than July 1, 2025, and an implementing decree is planned.
The text has been supplemented to require think tanks and institutes to declare foreign (non-EU) donations and payments to the HATVP.
In addition, former ministers, local executives and members of independent authorities will be subject to stricter control over their career transition to the private sector. This control, currently carried out by the HATVP with regard to conflicts of interest, is extended to cover the risk of foreign influence over a five-year period.
It also reinforces penal provisions. A new aggravating circumstance has been created in the penal code when an attack on property or persons is committed on behalf of a foreign or foreign-controlled power or entity. The penalties incurred will therefore be more severe. In such cases, special investigative techniques (wiretapping, etc.) may be used.
The law is applicable overseas, notably in New Caledonia and French Polynesia.
Regarding algorithmic techniques and asset freezing, the law authorizes the intelligence services, on an experimental basis until June 30, 2028, to use algorithmic techniques to detect connections likely to reveal foreign interference or threats to national defense (e.g., cyber attacks). The use of this technique was previously permitted only to detect terrorist threats, and it has been authorized on a permanent basis since the law of July 30, 2021 on the prevention of terrorist acts and intelligence.
The government will have to submit an interim report and a final report evaluating this extension of the algorithmic technique.
In addition, the procedure for freezing financial assets, authorized in the case of terrorism, has been extended to cases of foreign interference. Persons engaging in, inciting or financing such acts will thus be able to have their funds and resources frozen in France. An act of interference is defined in the Monetary and Financial Code as “an act committed directly or indirectly at the request or on behalf of a foreign power and having the object or effect, by any means, including the communication of false or inaccurate information, of undermining the fundamental interests of the Nation, the functioning or integrity of its essential infrastructures or the regular functioning of its democratic institutions.”
Better information for Parliament is also provided for: before July 1, 2025, and every two years thereafter, the law requires the government to submit a report to Parliament on the state of threats to national security, particularly regarding foreign interference. This report may be debated in Parliament.
(iii) Means of execution:
- A position of national coordinator for the fight against the manipulation of information will be created under the supervision of the Prime Minister to define the national doctrine (to be implemented in public policy) and coordinate actions to combat digital interference. The “education”, “engagement”, “awareness,” and “communication” components included in the state response will be addressed in recommendations B and C.
- ARCOM, equipped with new resources and missions, will be reinforced by a civilian college collecting initiatives against disinformation while respecting ethics and democracy, particularly in support of recommendation B. A coordination body with platforms and telecom operators will be created to protect the high points of democratic life and international collaboration (elections, international summits, etc.), thus perpetuating the initiatives carried out for the 2024 European elections. A body for monitoring the financing of disinformation, under the joint supervision of ARCOM and the Professional Advertising Regulation Authority (ARPP), would be created for the main players in online marketing and advertising (platforms, advertisers, marketing agencies, influencers).
- The protection of economic assets against the manipulation of information, in particular Operators of Vital Importance (OIV) and Essential Service Operators (OSE) listed on the stock exchange, will be reinforced via a dedicated coordination body involving the Strategic Information and Economic Security Service (SISSE) and the Financial Markets Authority (AMF), with awareness-raising missions on the subject among business leaders.
- On the technical side of the media, public LMI policy must encourage all technologies and practices that contribute to the reliability of information (e.g., NewsGuard) or that follow the principles of the Japanese media-technology initiative “Originator Profile (OP).” In light of these initiatives, the DSA will have to be re-evaluated to determine whether it should be further developed. In the medium term, a public algorithm service will be created to regulate uses and avoid confinement in information bubbles (an illustrative sketch of this principle follows at the end of this subsection).
- The adoption of a French FARA is now effective, continuing the work already undertaken at the level of the assemblies. A new Senate report entitled “Combating malicious foreign influence: mobilizing the whole nation in the face of the neo-Cold War” was presented on July 25, 2024. It is the result of the work of the commission of inquiry into public policy in the face of foreign influence operations, in which the author of this article sat as vice-chairman and some of whose hearings can be viewed online.
This organization is structured around a contribution from three thirds (state, private sector and citizens), which meet periodically under the aegis of the coordinator and form a crisis unit in the event of an impactful event. The full implementation of these actions, starting in 2027, will stimulate diplomacy in favor of preserving a democratic and transparent information space aimed at combating information interference (see recommendation F).
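As announced above, the following sketch illustrates one possible reading of the “public algorithm service” item: recommendation logic that limits confinement in information bubbles by capping the share of any single source in a ranked feed. It is a minimal, illustrative toy under assumed data (item, source, relevance triples), not a design for the actual service.

```python
from collections import Counter

def diversify_feed(candidates, max_share=0.3):
    """Greedy rerank: keep relevance order, but cap the share of any single
    source in the final feed to limit information-bubble effects.

    candidates: list of (item_id, source, relevance), sorted by relevance desc.
    """
    selected, deferred, per_source = [], [], Counter()
    for item in candidates:
        _, source, _ = item
        limit = max(1, int(max_share * (len(selected) + 1)))
        if per_source[source] < limit:
            selected.append(item)
            per_source[source] += 1
        else:
            deferred.append(item)
    return selected + deferred  # over-represented items are pushed back, not removed

feed = [("a1", "site_x", 0.97), ("a2", "site_x", 0.95),
        ("a3", "site_y", 0.90), ("a4", "site_x", 0.88),
        ("a5", "site_z", 0.85)]
print(diversify_feed(feed))  # site_x no longer dominates the top of the feed
```

Other diversification criteria (viewpoint, topic, recency) could be substituted for the source field without changing the structure of the sketch.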
(iv) Actors concerned: ministries, SGDSN, media, associations, in coordination, at European level, with the European Committee for Public Media Services (CESM), an independent body bringing together regulatory authorities, including ARCOM and the ARPP.
(v) Implementation schedule:
- 2024: appointment of a national LMI coordinator;
- 2025: publication of a public LMI policy mainly targeting informational interference (in line with the EU approach);
- 2026: strengthening of ARCOM’s missions and resources;
- 2027: creation of a public algorithm service; and
- between 2026 and 2030: strengthening of administrations contributing to the LMI in line with the policy published in 2025.
(vi) Cost estimate:
- Human resources (HR), for the state organization: 40 full-time equivalents (FTE);
- Budget: €10 million/year for financing independent actions (via an audiovisual tax on major digital platforms and/or European funds in addition to what is provided for by the Media Freedom Act).
(vii) Limits and risks of implementation: all actions must be carried out in concert (rule of three thirds: state, private sector and citizens) to effectively combat foreign digital interference.
2. Recommendation B “Communicate and Raise Awareness”
(i) Specific objective: inform citizens about the issues and the reality of informational interference, strengthen their vigilance regarding the mechanisms of online disinformation, and promote, particularly financially, best journalistic practices in the fight against disinformation.
(ii) Description of the existing situation: The National Commission for Information Technology and Liberties (CNIL) and ARCOM have created an educational kit on digital citizenship for trainers and parents. Due to a lack of visibility, this type of initiative remains insufficiently known to citizens. Furthermore, the Government Information Service (SIG) informs the general public of government actions, coordinates the communication of the ministries and monitors media and social networks in order to analyze opinion. A major communication player and one of the leading advertisers in France, the SIG must speak to all audiences and could further inspire communication professionals, including in the private sector, in order to amplify its impact. Some private actors are also mobilized, such as the Descartes Foundation, which identifies and maps the many actors in France, Europe and the world involved in the fight against disinformation and in the promotion of a public debate based on sincere information.
In addition, a set of practices and tools allows journalists to verify the authenticity or origin of information, without their readers necessarily knowing. Designed in 2019, the Journalism Trust Initiative (JTI) of Reporters Without Borders (RSF) is a tool for certifying transparency and compliance with the ethical and professional standards of the news media. After an online training course on combating climate disinformation, available since April 2023, Agence France-Presse (AFP) launched, in November 2023, a new course for journalists and journalism students to face the growing challenge of disinformation surrounding elections. In January 2024, the Cultural Affairs Committee of the National Assembly launched a flash mission on foreign interference in the media, with a view to examining the various avenues through which foreign powers attempt to disseminate their messages to French public opinion, as well as the actions carried out by French and European public authorities to try to contain this interference. Independent and quality media play a crucial role in the dissemination of reliable and verified information, thus guaranteeing the democratic health of society.
(iii) Means of execution:
- Create a unit dedicated to LMI awareness within the SIG to identify the initiatives of the various ministries and private actors under the leadership of the national coordinator for the fight against the manipulation of information, to organize awareness and information campaigns for each age group, and to promote the public version of VIGINUM’s database of foreign interference actions;
- Financially support independent and quality media, in particular by providing subsidies or tax incentives, in order to preserve their editorial autonomy and their ability to exercise rigorous and impartial journalism: this system will be financed via an audiovisual tax on major digital platforms and/or European funds, in addition to what is provided for by the Media Freedom Act; and
- Study with the French Federation of Press Agencies (FFAP) how to improve the functioning of the media in the context of new digital uses, and promote good practices, based on the example of the JTI.
(iv) Stakeholders concerned: the SIG, the ministerial and private stakeholders already involved, as well as journalists and media of all types and formats.
(v) Implementation schedule:
- 2025: creation of an LMI unit in the SIG;
- from 2025: citizen awareness campaigns on LMI issues; and
- from 2026: strengthening of financial support for the independence of the press and media.
(vi) Cost estimate: communication budget of €5M/year. Creation of 5 FTEs for the LMI unit of the SIG. The fiscal impact is made neutral by redirecting current measures according to new criteria.
(vii) Limits and risks of implementation: the scope of the LMI unit of the SIG must be clearly defined in order to avoid any perception of state “propaganda”: the transparency of the system must contribute to greater confidence in institutions. Freedom of expression is not called into question. The priority of communication campaigns will be foreign digital interference.
3. Recommendation C “Develop Civil Society Engagement”
(i) Specific objective: develop the engagement of civil society along two axes—education in the uses of digital technology and support for citizens and legal entities who wish to mobilize against informational interference.
(ii) Description of the existing situation: Since 1983, National Education has had a Media and Information Education Liaison Center (CLEMI), whose mission is “to teach students a citizen practice of the media [by relying] on dynamic partnerships between teachers and information professionals.” Its mission is in particular to produce and disseminate resources with a view to supporting actions with students from kindergarten to high school. This learning mission is carried out unevenly across the territory and the manipulation of information is not specifically addressed as a massive, deliberate phenomenon involving a foreign actor aimed at harming the fundamental interests of the Nation. Furthermore, several education stakeholders interviewed believe that although the majority of students are aware of the principle of “fake news,” this does not encourage them to diversify their sources of information.
The Ministry of Culture is also mobilized in favor of media and information education (estimated annual budget of €8M) via support for educational tools, the mobilization of public media, public libraries and the organization of annual Digital Culture Meetings aimed at promoting these actions.
There are also numerous associations raising awareness of critical thinking and of the risks linked to online disinformation (e.g., APEM, Fakeoff), which seem unevenly active across the country. Furthermore, the risk linked to informational interference is rarely specifically mentioned in their scope of action (which is often broader, covering disinformation, critical thinking, cyber-harassment or other topics).
Academic research in France and abroad also provides valuable studies on citizens’ behavior in the face of the media and disinformation (e.g., the Medialab at Sciences Po), on the threat of informational interference in certain contexts (e.g., the Strategic Research Institute of the Military School, IRSEM), and on the psychological mechanisms of information manipulation (e.g., the University of Cambridge).
Finally, certain more recent devices can be mentioned, given their link with the specific objectives described above:
- the VIGINUM ethical and scientific committee, made up of eight independent personalities and responsible for monitoring the activity of the service and making recommendations on it via a public annual report; and
- the operational cyber defense reserve of MINARM (COMCYBER) and MININT.
(iii) Means of execution:
- Involve National Education (with the support of VIGINUM) along the following priority axes: systematize and strengthen media and information education (EMI) for all students, including a component specific to LMI; generalize the media education week for all students (as in Finland); and train all teachers in LMI;
- Create units involved in LMI in the National Guard to support the implementation of public policy dedicated to informational interference (see recommendation A);
- Mobilize the Economic, Social and Environmental Council (CESE) via its Education, Culture and Communication Commission to contribute to the drafting of LMI public policy (see recommendation A);
- Extend the scope of responsibility and the resources of the VIGINUM ethics and scientific committee to the entire state system aimed at combating informational interference to help strengthen citizen confidence in the state response;
- Include in corporate social responsibility (CSR) the actions of private structures engaged in approaches that discourage the financing of disinformation and informational interference, and sanction the actors who contribute to it. Implementation would be monitored by the ARCOM-ARPP body described in recommendation A.
(iv) Stakeholders concerned: the list of impacted stakeholders includes National Education (all schools, middle schools and high schools), as well as VIGINUM, the SNU, the reserve, the media, social media platforms, advertisers, the CESE and citizens as a whole.
(v) Implementation timetable: by mid-2027, the implementation of the following measures is proposed:
- 2024: launch of initiatives within National Education, with the objective of finalizing VIGINUM-CLEMI content in the first half of 2025, and implementation of the “media week for all” and teacher training from the start of the 2025 school year (with a system fully operational in 2030);
- 2025: creation of the LMI ethics and scientific committee, concomitantly with the development of public doctrine in this area (see recommendation A);
- 2025: involvement of the CESE in the definition of LMI public policy (see recommendation A);
- 2025: launch of initiatives linked to the reserve;
- 2026: proposal/draft law on CSR to be initiated from 2025 for adoption in 2026.
(vi) Cost estimation: within National Education, the CESE and CSR stakeholders, the reallocation of existing resources will be prioritized. New resources will be mobilized for the reserve and the SNU, with a target of 30 dedicated FTEs (i.e., 300–350 reservists based on an average of about twenty days per year). The ethics and scientific committee will be strengthened via the allocation of a team of four permanent FTEs and an annual study budget (i.e., €1M/year in total).
(vii) Limitations and risks of implementation: the involvement of National Education is the main challenge, given the resources required (time, training, etc.). Media and information education must be done properly to avoid fostering excessive skepticism. Driven by overly politicized action, it could even give the impression of a ministry of propaganda.
C. List of Recommendations “ . . . Combined with a European Ambition”
1. Recommendation D “Cooperate and Share Information”
(i) Specific objective: develop cooperation and actively participate in sharing information relating to foreign information manipulation and interference (FIMI) at EU level in order, on the one hand, to improve the effectiveness of the French response and, on the other hand, to support recommendation F, aimed at building a consensual political project among Member States.
(ii) Description of the existing situation: Several European organizations are mobilized against FIMI, notably since the detailed action plan against disinformation of 2018. The EEAS implements the CFSP under the direction of the EU High Representative for Foreign Affairs and Security Policy, in close coordination with the EU Foreign Affairs Council (FAC), where the foreign ministers of the member countries sit. The EEAS is the central coordinating body in the fight against FIMI between Member States and at the international level (e.g., G7, NATO), via its Strategic Communication division and its various Task Forces (South, East, Western Balkans). This division also manages the Russian-threat awareness and documentation site www.euvsdisinfo.eu, as well as the EU rapid response system (SRA) activated in the event of a disinformation campaign targeting several Member States. The resources devoted to these measures are limited, depend in part on the secondment of national experts, and remain distant from political decision-makers. Teams must coordinate with other EU bodies such as DG Connect. Furthermore, the diversity of approaches and of specific political wills (e.g., Poland, Hungary) makes consensus on FIMI difficult.
There are, in addition, several European organizations:
- European Digital Media Observatory (EDMO), created in 2018, aims to support and lead the independent European community which fights against disinformation (organization of fact-checking and university research in particular), to provide media education resources, and to assist public decision-makers on the subjects of disinformation. EDMO is managed by a consortium of universities and has a governance structure that is completely independent of public authorities;
- The European Center of Excellence against Hybrid Threats is an autonomous international organization based in Helsinki, created in 2017, whose mission is to strengthen the capacities of its participating States to prevent and counter hybrid threats, through the informal sharing of good practices, issuing recommendations, new ideas and approaches. With 35 member countries, the center has an annual budget of €4M.
Finally, events such as the European News Media Forum (ENMF), organized by DG Connect, allow for annual discussions on news media issues.
(iii) Means of execution:
- Promote, in partnership with European countries proactive on FIMI, the increase in resources granted to EDMO, the extension of its mission to informational interference and the diversification of its governance beyond the academic world (adding, for example, press groups, public broadcasting groups, press agencies, NGOs such as RSF, journalism schools, etc.). Provide EDMO with a fund to finance initiatives from civil society;
- Accelerate the decision-making of the SRA by creating an entity mandated by the FAC to react to any detection of a proven action. Ideally, this response would be coordinated with the Member State(s) concerned, without excluding an autonomous response on behalf of the EU (e.g., denunciation, imputation, or even economic sanctions);
- Amplify the impact of the annual ENMF coupled with the European Media Education Week, by encouraging the French media to join the event (notably public audiovisual groups) and by communicating on the event through media awareness and education initiatives (see recommendations B and C).
(iv) Actors concerned:
- national authorities responsible for guiding European decisions (MEAE, parliamentarians), ministries and state bodies with cooperation at EU level;
- the European authorities, and primarily the European Commission and the EEAS, as well as France’s European partner countries (for example Sweden or Finland - see recommendation F);
- actors from civil society: the media (traditional and new, public and private), journalism schools and NGOs.
(v) Implementation schedule:
- from 2024: promotion and development of bilateral cooperation in the field of FIMI;
- from 2025: promotion by the MEAE, within the FAC, of cooperation in matters of FIMI and of the strengthening of the EEAS STRATCOM, and creation of an entity delegated by the FAC to react diplomatically; provision of additional national experts in FIMI detection to the EEAS (e.g., experts from VIGINUM);
- from 2026: promotion to DG Connect of the evolution of EDMO’s objectives, means and governance and promotion of the European News Media Forum;
- from 2027: development of cooperation actions to promote the development of a FIMI system modeled on that of cybersecurity (consistent with recommendation F).
(vi) Estimated costs: at EU level, the costs of combating disinformation were €10–15M per year between 2018 and 2020. By analogy with the world of cybersecurity in the broad sense, ENISA had a budget of about €23M in 2021, versus about €9M in 2009, five years after its creation. Finally, in France, VIGINUM had a budget of €12M when it was created in 2021. On the implementation horizon (2027/2028), an additional cost of about €20M/year can therefore be estimated for the strengthening of existing resources (+50% over four years) and the creation of a dedicated agency (excluding specific financing programs). At the French level, within the MEAE, additional dedicated staff of 5 FTEs, or €1M/year, would make it possible to implement the measures described above as well as recommendation F. Within VIGINUM, the contribution to the EEAS will be made possible by one additional FTE.
(vii) Limits and risks of implementation: any international initiative is by definition based on the voluntarism and capacity for action of different countries: it will therefore be critical for France to carry out these initiatives in a partnership approach with countries ready to engage on this subject and not seek to impose a French vision.
2. Recommendation E “Develop Research and Innovation”
(i) Specific objective: develop research and innovation at EU level to support the development of new tools and technologies to combat disinformation.
(ii) Description of the current situation: Consideration of LMI in European research programs has emerged with the recognition of hybrid threats: for example, the CNRS is a partner in the European AI4Trust project, aimed at developing a platform of AI tools capable of helping public decision-makers and fact-checkers verify the reliability and origin of certain information. The involvement of the human and social sciences (SHS) in LMI research programs is recommended in order to analyze past disinformation campaigns. This will help protect the integrity of electoral processes and better understand the impact of the algorithmic recommendations of social networks.
The field of LMI illustrates the dual use of AI, for malicious purposes (e.g., the automatic design of disinformation campaigns) or for defensive purposes (e.g., rapidly detecting fake news or the artificial amplification of the spread of information). However, the development of new AI-based LMI solutions must not create new vulnerabilities or increase the attack surface.
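As an illustration of the defensive use mentioned above, the sketch below flags possible artificial amplification when many distinct accounts post near-identical content within a short time window. The thresholds, data layout and example data are illustrative assumptions, not an operational detector.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def flag_amplification(posts, window=timedelta(minutes=10), min_accounts=20):
    """Flag texts posted by many distinct accounts within a short window,
    a crude signal of coordinated (possibly artificial) amplification.

    posts: iterable of (timestamp, account_id, text) tuples.
    Returns a list of (text, distinct_account_count) deemed suspicious.
    """
    by_text = defaultdict(list)
    for ts, account, text in posts:
        by_text[text.strip().lower()].append((ts, account))

    suspicious = []
    for text, events in by_text.items():
        events.sort(key=lambda e: e[0])
        for start_ts, _ in events:
            accounts = {a for ts, a in events if start_ts <= ts <= start_ts + window}
            if len(accounts) >= min_accounts:
                suspicious.append((text, len(accounts)))
                break
    return suspicious

# Synthetic example: 25 accounts relaying the same text within about two minutes.
base = datetime(2025, 1, 1, 12, 0)
posts = [(base + timedelta(seconds=5 * i), f"acct_{i}", "Same claim, copy-pasted")
         for i in range(25)]
print(flag_amplification(posts))  # [('same claim, copy-pasted', 25)]
```

In practice, such heuristics would be combined with network features and language models, and calibrated to limit false positives on legitimately viral content.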
(iii) Means of implementation:
- The French Ministry of Higher Education and Research is encouraging multidisciplinary research within the EU involving the SHS alongside experts in cybersecurity, AI and LMI, with the creation of LMI-focused research posts. European research programs could also be opened up to partner countries and regions such as India, Taiwan, Great Britain and West Africa. This approach would be accompanied by the transfer of these innovations to industry, for example via start-ups with highly agile development models. This would enable the emergence of a range of sovereign AI cybersecurity solutions to anticipate growing threats such as adversarial attacks.
- To encourage platforms to share their data, it is proposed to initiate a dialogue between researchers and platforms. This would make it possible to better analyze and document the modes of action of online disinformation and to measure the impact of user interfaces. Researchers could propose indicators of the impact of disinformation campaigns, of prevention, and of the possible censorship that content removal by platforms could produce as a side effect.
- It is also proposed to work closely with the major digital platforms to identify their potential subcontracting needs, so as to create new markets and foster the rise of new LMI start-ups. A complementary solution is to increase the responsiveness of funding programs for collaborative R&D projects by promoting cascade-funding approaches, with European funding channeled to National Coordination Centers (NCCs) within the Member States, working as a network. These centers would prioritize needs at the European level, while selecting SMEs and allocating funding at the national level. In addition, it is proposed to focus the themes of the European calls for projects on the development of:
- automated detection tools for disinformation campaigns and deepfakes;
- information labeling technologies based on the C2PA model (an illustrative sketch of the underlying principle follows this list);
- technologies for estimating the degree of specialization of information presented to a user; and
- technologies for cyber protection and assessing the robustness of AI to known and emerging threats.
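To make the provenance-labeling theme referenced above more concrete, the sketch below shows the general principle behind C2PA-style labeling: content is shipped with a signed manifest binding it to its originator, and consumers recompute the content hash to detect tampering. It is a simplified stand-in under assumed key handling (an HMAC secret in place of certificate-based signatures) and does not reproduce the actual C2PA manifest format.

```python
import hashlib, hmac, json

# Hypothetical signing credential standing in for the originator's certificate.
ORIGINATOR_KEY = b"newsroom-signing-key"

def sign_manifest(content: bytes, originator: str) -> dict:
    """Build a simplified provenance manifest binding content to its originator."""
    digest = hashlib.sha256(content).hexdigest()
    manifest = {"originator": originator, "content_sha256": digest}
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(ORIGINATOR_KEY, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_manifest(content: bytes, manifest: dict) -> bool:
    """Check that the content still matches the manifest and that the manifest is signed."""
    payload = json.dumps(
        {k: v for k, v in manifest.items() if k != "signature"},
        sort_keys=True).encode()
    expected = hmac.new(ORIGINATOR_KEY, payload, hashlib.sha256).hexdigest()
    untampered = hashlib.sha256(content).hexdigest() == manifest["content_sha256"]
    return untampered and hmac.compare_digest(expected, manifest["signature"])

article = b"Original article body"
m = sign_manifest(article, "Agence X")
print(verify_manifest(article, m))                 # True
print(verify_manifest(b"Edited article body", m))  # False: content no longer matches
```

A real deployment would also need to handle manifest embedding in the media file and trust decisions about the signer, which are out of scope for this sketch.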
(iv) Stakeholders: European players in academic and applied research in the fields of SHS, cyber and AI, digital platforms, start-ups and major European manufacturers.
(v) Implementation timetable:
- in 2024: definition of the French position concerning the integration of the FIMI topic into European DIGITAL EU research programs (ensuring consistency with research programs more specifically oriented towards homeland security and defense);
- in 2025: creation of LMI-focused research posts; French proposal to revise the governance, operating and evaluation procedures for European R&D programs, to make them more agile and better adapted to dealing with new threats;
- in 2026: adoption of French proposals to revise governance, operating and evaluation procedures for R&D programs;
- in 2028: introduction of the first tools developed by the EU to combat informational interference; and
- from 2029 onwards: research and innovation in support of the LMI achieve consensus within the EU.
(vi) Estimated costs: it is proposed to fund collaborative projects from the budgets of research programs already planned, by including the LMI field in the agenda of these programs and encouraging funding of at least seventy percent from the budgets of private-sector players.
(vii) Limits and risks of implementation: financing European collaborative projects is not envisaged at the French level, except in the case of rare bilateral calls for projects (AAPs), such as Franco-German ANR and BPI schemes.
3. Recommendation F “Take Proactive External Action to Strengthen International Cooperation.”
(i) Specific objective: specify the involvement of French diplomacy in support of our strategy to combat informational interference. The strengthening of the French system is only envisaged through coordination at EU level and increased bilateral cooperation, requiring the reinforcement of the European legislative arsenal.
(ii) Description of the existing situation:
- In Estonia, the STRATCOM team attached to the Prime Minister and the think tank International Centre for Defense and Security (ICDS) appear to be effective organizations in the field of LMI.
- NATO follows a two-pronged approach, focusing on understanding the information environment and proactively communicating with the public and partners through its Public Diplomacy Division (STRATCOM). NATO and the EU are intensifying their collaboration and holding joint exercises. NATO also cooperates closely with Ukraine, with the establishment of a specific platform since 2016. It collaborates with several Centers of Excellence, including the Helsinki Center for Hybrid Threat Management, the Riga Center for Strategic Communication and the Tallinn Center for Cyber Defense.
- Since 2019, the G7 has had a “Rapid Response Mechanism,” an initiative led by Canada aimed at coordinating member countries to better identify and respond to the diverse and evolving threats to democracies, particularly in terms of disinformation.
- It should also be noted that the United States, Great Britain and Canada are communicating on interesting joint initiatives, such as the development of a common framework for analyzing information manipulation.
(iii) Means of implementation: for French diplomacy, the aim is to:
- Propose at the European level a FARA-type regulation to oblige agents representing the interests of foreign powers to declare their activities, as distinct from a freedom-restricting “Russian law”-type mechanism;
- Develop bilateral partnerships with countries with effective LMI systems (Sweden, Japan);
- Pursue, with the OECD, the work begun with the creation of the DIS/MIS Resources Hub (DISinformation/MISinformation) and, more generally, define actions to strengthen the resilience of third countries (notably in the EU’s neighborhood) to disinformation threats emanating from outside.
(iv) Stakeholders involved:
- national authorities (governments, led by MEAE, parliamentarians, etc.);
- the EU’s international partners: NATO, the G7, ENISA, EDMO, Europol, and the European Electoral Cooperation Network (ECNE); and
- independent media, journalists and researchers for fact-checking work, and civil society as a whole, including economic players, notably in third countries.
(v) Implementation timetable: while bilateral cooperation, as well as work within international organizations, must be ongoing, implementation can be sequenced as follows:
- 2024–2026: promotion of a European FARA based on the example of national implementation (see recommendation A), to be adopted by 2028 at the latest;
- 2026–2028: promote the structuring of international cooperative ventures, with a view to close, mandatory European coordination of actions to combat FIMI; support the creation by the EU of a European agency dedicated to the fight against FIMI (detection, awareness-raising) ahead of the 2029 European elections; and
- 2028–2035: promotion of international regulatory cooperation to avoid fragmentation and prevent regulatory arbitrage.
(vi) Estimating costs: it is difficult to estimate the costs of such recommendations. The aim will be to direct European funds towards cooperation and towards setting up the early warning service. For example, the cost of an annual European information alert and response exercise is estimated at €500/year.
(vii) Limits and risks of implementation: The main difficulty lies in the need to build a European consensus on these subjects of interference and manipulation of information, which could be complicated by different cultural approaches, but also by the reluctance of certain States to favor transparency on certain links of interest with commercial partners in third countries. This also requires an end to the assumption that there can be no interference within the EU between member states or on the part of allied states.
Ideally, a stronger European mechanism should be put in place directly, based on a tried-and-tested national mechanism and solid bilateral cooperation, to lend credibility to France’s ability to drive this approach at European and even international level.
D. Implementation
The fight against informational interference calls for a coordinated response between France and the EU, with an emphasis on the digital dimension. Involving several ministries, this multidimensional approach aims to counter immediate (Russia) and systemic (China) threats, while involving citizens. Planned expenditure includes institution-building, funding for journalism, awareness-raising campaigns and operational tools. However, this will require certain renunciations, such as forgoing the promotion of the French model outside the EU and heavy national investment, in a context of prudent public debt management to maintain France’s credibility at the European level.
An initial estimate of the additional costs arising from these recommendations is as follows (excluding inflation):
- estimated total additional annual cost for France from 2028: €25M; and
- additional cost borne by the EU (excluding structuring initiatives such as the EU social network): €20M.