
The International Lawyer

The International Lawyer, Volume 58, Number 2, 2025

How Could France Develop by 2035 a Public Policy That Embodies Its Determination to Combat Informational Interference Alongside a European Ambition?

Vanina Paoli-Gagin

Summary

  • It is clear that informational warfare is playing a growing role in the new conflictual triptych of “competition, contestation, confrontation” characterizing international relations.
  • It is the expression of the power strategies of a world in transition, driving increasingly aggressive behavior, playing on thresholds and multiplying threats.
  • With this in mind, control of the information field has become a major strategic challenge, given the increasing digitalization of human activities.
  • The creation of a strategic “influence” function within the Revue nationale stratégique is therefore a sign of France’s determination to develop an ambitious strategy to control and reduce foreign informational interference in cyberspace.

“Winning the war before the war” by acting on the mind is not a new slogan; it has always been applied in modern conflicts. But with the development of the Internet and its applications in the cognitive layer (search engines, social networks, etc.), the battle of ideas and ideologies is taking on a whole new dimension. In October 2018, Isabelle Falque-Pierrotin, then President of the French Data Protection Authority (CNIL), declared: “Personal data has moved beyond the realm of protection to become a real issue of power, influence and even manipulation, at the very heart of our democratic systems. The deployment of artificial intelligence (AI) in many countries is pushing questions about personal autonomy or national sovereignty to the extreme, issues that are taking on new strategic importance, both nationally and internationally.” The Revue nationale stratégique (2022) points out that “Influence, in all its dimensions—diplomatic, military, economic, cultural, sporting, linguistic, informational—is an area of contestation, requiring a coordinated response. It is a new strategic function in its own right.” The war in Ukraine is indicative of this component of offensive action in the cognitive layer. Since 2018 at least, France has been building a system to better control discourse on the web: COMCYBER’s integration of cyber influence operations (lutte informatique d’influence, LII), the creation of VIGINUM, and the sub-directorate for strategy within the MEAE. The European Union has issued regulations (notably the DSA), even if this architecture is undoubtedly not yet complete, at both national and alliance levels (the EU and NATO in particular).

Against this backdrop, and keeping in mind that influence through propaganda and the manipulation of information has its roots in “peacetime,” what strategy do you recommend for France and Europe, up to 2035, to reduce and better control, according to the rules of democracy, the actions of digital influence that affect national defense and security?

Executive Summary

The international context is marked by the extension of conflict to an ever-increasing number of environments and fields. It is clear that informational warfare is playing a growing role in the new conflictual triptych of “competition, contestation, confrontation” characterizing international relations. It is the expression of the power strategies of a world in transition, driving increasingly aggressive behavior, playing on thresholds and multiplying threats. Influence, manipulation, disinformation and propaganda are all expressions of informational warfare. While these modes of action are not new, in an age when everything and everyone is connected and news channels run around the clock, they are part of a permanent, undeclared information war. These operations carried out by foreign powers in “grey” zones are all the more difficult for democracies such as France to grasp, as they play on democratic values and intrude into the intimate spheres of individuals via the new digital uses of citizen-consumers. With this in mind, control of the information field has become a major strategic challenge, given the increasing digitalization of human activities. The creation of a strategic “influence” function within the Revue nationale stratégique is therefore a sign of France’s determination to develop an ambitious strategy to control and reduce foreign informational interference in cyberspace.

Prospective analysis has identified three key trends in dealing with informational interference.

Firstly, the digitalization and personalization of information access is booming thanks to emerging technologies such as the Cloud, 5G and AI. This trend is driving widespread digital adoption. Secondly, geopolitical tensions between Western countries on the one hand, and Russia and China on the other, are becoming increasingly acute. The latter are seeking to regain their international standing, challenging the influence of the United States and its allies. The celebration of the People’s Republic of China’s centenary in 2049, and the economic, demographic and geopolitical challenges facing Russia, underline the long-term significance of these tensions. Thirdly, cyberspace and information space have become major fields of confrontation, enabling state and non-state actors to act discreetly. Finally, strategic analysis has highlighted two variables that could have a significant impact on the fight against interference in information space: the determination of France’s partner countries to preserve an information space governed by democratic values, and the intelligibility of information technologies.

To cope with the uncertainties of a world where informational warfare has become permanent and is on the threshold of kinetic confrontation, the aim could be to set up by 2035 a public policy that embodies France’s determination to combat informational interference, by reinforcing the current system with an integrated, inter-ministerial approach and relying, on the one hand, on partnerships with digital players and, on the other hand, on an aware and mobilized civil society. To deploy such a strategy, the recommendations are structured around two axes:

  1. “Developing a French response . . .” based on three recommendations:
    1. Strengthen monitoring, analysis and response;
    2. Communicate and raise awareness;
    3. Develop the commitment of civil society;
  2. “. . . combined with a European ambition” based on three recommendations:
    1. Cooperate and share information;
    2. Develop research and innovation;
    3. Take proactive external action to strengthen international cooperation.

Introduction

Information manipulation and propaganda are permanent features in the history of conflict. Carrying out an informational destabilization campaign against a rival has always made it possible to divert its attention, significantly weaken it before confrontation, or even annihilate its will to fight, so as to win without waging war in the physical field. As defined by VIGINUM, “foreign digital interference poses a real and serious threat to the democratic functioning of societies. Integrated into so-called hybrid strategies, they are defined by the intention to undermine the fundamental interests of the Nation, the propagation of manifestly inaccurate or misleading content, inauthentic dissemination (artificial or automated, massive and deliberate) designed to amplify the visibility or virality of this content on digital platforms, and the direct or indirect involvement of a foreign actor, whether state or non-state.” This is the phenomenon addressed here, even if the notions of information manipulation and disinformation campaign are used interchangeably in what follows.
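To make the cumulative nature of these criteria concrete, a minimal sketch is given below of how an analyst team might encode them as a simple rule-based check. The data structure and function names are hypothetical illustrations for this article, not a VIGINUM tool or methodology.

```python
from dataclasses import dataclass

@dataclass
class CampaignAssessment:
    """Hypothetical analyst assessment of a suspected campaign (illustration only)."""
    content_misleading: bool         # manifestly inaccurate or misleading content
    dissemination_inauthentic: bool  # artificial or automated dissemination
    dissemination_massive: bool      # massive and deliberate amplification
    foreign_involvement: bool        # direct or indirect foreign actor involvement

def meets_interference_criteria(a: CampaignAssessment) -> bool:
    """True only if every criterion of the (cumulative) definition is satisfied."""
    return (a.content_misleading
            and a.dissemination_inauthentic
            and a.dissemination_massive
            and a.foreign_involvement)

# A massive, automated spread of misleading content with no demonstrated foreign
# link would not qualify as foreign digital interference under this reading.
print(meets_interference_criteria(
    CampaignAssessment(True, True, True, foreign_involvement=False)))  # False
```

The point of the sketch is simply that the criteria are conjunctive: removing any one of them changes the legal and operational qualification of the campaign.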

First and foremost, the context of this new form of warfare is changing. It is characterized by an accelerated erosion of borders, the fragmentation of Western societies, and the calling into question of historical divisions (war/peace, civil/military, expert/novice, dictatorship/democracy, etc.), all in a new transnational and opaque digital information environment. Conflictuality now extends to the following environments and fields: land, sea, air, outer space, cyberspace, the electromagnetic spectrum and information. These developments call for a new approach. The “peace-crisis-war” continuum has been used to characterize the state of rivalries between political models, in particular since the end of the Cold War.

This new generalized war obeys a new grammar and takes place in a new geography. The use of the triptych “competition, contestation, confrontation” gives a more accurate account of the complexity of conflictuality in an increasingly threatening world, where the concept of alliance has given way to that of partnership. The combination of these evolutions in an increasingly digitized world has seen the emergence of a permanent and undeclared “information war”, waged in particular by uninhibited and more often than not authoritarian powers.

These grey-zone actions, which could be described as the “new normal,” are all the more difficult for France and democracies in general to grasp, because they:

  • divert the principles underpinning fundamental freedoms in our democratic society (open, free, ultra-digitized and legalistic), turning them into a form of reflexive trap;
  • play on the non-alignment of aggression thresholds between different stakeholders; and
  • flourish thanks to a rapid and powerful transformation of the digital uses of citizen-consumers, directly or indirectly in the hands of private players, notably American ones.

The growing use of digital technology by citizens to obtain information goes hand in hand with a growing distrust of the media. In 2023, the French spent an average of 56 minutes a day on social networks and personal messaging, and 2 hours 24 minutes in the 15-24 age group, a figure that is constantly rising. This growing consumption is taking place in a climate of mistrust: at the end of 2023, 57% said they distrusted what the media had to say about major current affairs, and 67% did not trust social media. A study published by the Autorité de régulation de la communication audiovisuelle et numérique (ARCOM) in March 2024 shows that 94% of French people are interested in information, that they are well aware of the risks associated with algorithmic media, although they are not always able to protect themselves against them, and that some express the need to diversify their sources. These figures also illustrate the calling into question of the press and media in Western countries, which represents a major challenge for democracy and press freedom. To overcome the crisis of editorial independence and the fragility of the independent press in France, a combination of structural reform, collaboration between players in the sector and public education is needed to preserve press freedom and the integrity and vitality of journalism in a constantly evolving digital world.

Western states and digital players became aware of what had become a major trend and began to react to informational attacks targeting political and health events (regulations, organizations dedicated to detecting attacks and countering their effects). Then, the fight against information manipulation and influence was gradually taken on board, at both national and European level, as a new pillar of French and EU defense and security policy.

In France, the creation of the “influence” strategic function, involving the Ministry of the Armed Forces (MINARM) and the Ministry of Europe and Foreign Affairs (MEAE), bears witness to the determination to develop and implement engagement and response capabilities adapted to the information field. Two distinct systems have been set up: the Comité Opérationnel de Lutte contre les Manipulations de l’Information (COLMI), to protect France, and the Task Force Interministérielle Informationnelle (TF2I), to respond to external threats.

The COLMI, which reports to the General Secretariat for National Defense and Security (SGDSN), is responsible for formulating working guidelines in the fight against information manipulation, as well as response proposals in the event of the detection of any blatant foreign digital interference. This body brings together key government bodies such as the MEAE, MINARM and the Ministry of the Interior (MININT). Added to this is VIGINUM, created on July 13, 2021 (under the aegis of SGDSN), whose major challenge is to preserve public debate from the manipulation of information originating from abroad on digital platforms. At the same time, the TF2I, under the responsibility of the MEAE, is in charge of coordinating interministerial action, both proactive and retaliatory, in the field of information. Although the French government has commissioned a number of reports, it is clear that it has yet to deliver an operational document to clarify its doctrine and fully mobilize its apparatus.

With the Strategic Compass, the European Union (EU) also wishes to react firmly to information manipulation and interference activities carried out from abroad, as in the decisive and coordinated action against the disinformation campaign carried out by Russia as part of its military aggression against Ukraine (with the activation of the ERCHT group: Enhancing Resilience and Countering Hybrid Threats). This response is consistent with the internal policies of member states, establishing a common understanding of the threat and continuing to develop a series of instruments for effectively detecting, analyzing and combating this threat, as well as imposing sanctions on those who engage in it. To enhance the resilience of societies, the EU and NATO have supported the creation of the European Centre of Excellence for Combating Hybrid Threats, based in Helsinki.

The EU’s toolbox for combating information manipulation and interference activities carried out from abroad is beginning to expand, including in the context of Common Security and Defense Policy (CSDP) missions and operations. This will enhance response options, resilience capabilities and cooperation, both within the EU (on a voluntary basis) and with partner countries, and improve situational awareness, thanks to the early warning system. The joint operational mechanism on electoral processes and the possible designation of electoral infrastructures as critical infrastructures will also be a step forward. Cooperation with like-minded partners such as NATO and the G7, without prejudice to the reinforcement of efforts undertaken within the framework of the United Nations, is a key area of focus, as are actions with civil society and the private sector. This voluntarism of the Strategic Compass does not reflect the diversity of positions and exposure to the threat within member countries. France, Finland and Sweden are the most advanced countries on this subject. This is how, little by little, we are building a political response to attacks targeting the EU, notably from Russia, China, Iran, and Turkey.

It is also worth recalling the importance of securing digital space in the study of information warfare. The digital revolution, which is overturning our lifestyles, economies and social practices, is profoundly transforming our relationship with information by confronting us with a mass of constantly available information and a generalized competition of opinions. Opinions are expressed unfiltered, according to a hierarchy and logic that are difficult for web and social network users to understand. Recent years have seen the advent of artificial intelligence (AI). Although it has proved its worth in practice, the use of AI in digital technologies also presents risks for the safety, dignity and rights of citizens. Indeed, AI algorithms can be seen as mere pieces of software and data that are potentially vulnerable to attacks of physical or cyber origin, the impacts of which are still poorly assessed. For example, the PRC is developing highly sophisticated influence activities, notably through experimentation with generative AI (TikTok accounts may have targeted candidates running in the 2022 US midterm elections), and Russia may step up its use of AI to interfere in the 2024 American election in order to serve its interests in the war against Ukraine by weakening Western support. Russia also uses AI to create deepfakes to mislead experts and to target local populations in conflict zones through malicious influence. Confidence in AI is an active area of research, and progress in this field is essential to enable its use in certain critical applications.

We are also looking to ensure that training data is of good quality, representative and diversified, even in the face of attack, in order to reduce the risk of poisoning attacks and improve model robustness.
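As a purely illustrative sketch of this idea, the snippet below applies one generic hygiene step, screening out training samples whose features are extreme statistical outliers, which is sometimes used as a crude first filter against gross data poisoning. The function and threshold are assumptions made for the example; real defenses combine provenance checks, robust training methods and regular audits.

```python
import numpy as np

def filter_outliers(X: np.ndarray, z_threshold: float = 3.0) -> np.ndarray:
    """Keep only rows whose features all lie within z_threshold standard
    deviations of the column mean: a crude screen for grossly anomalous
    (possibly poisoned) training samples."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0) + 1e-9           # avoid division by zero
    z_scores = np.abs((X - mu) / sigma)
    keep = (z_scores < z_threshold).all(axis=1)
    return X[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    clean = rng.normal(0.0, 1.0, size=(1000, 5))    # nominal training data
    poisoned = rng.normal(12.0, 1.0, size=(10, 5))  # injected anomalous points
    data = np.vstack([clean, poisoned])
    print(data.shape, "->", filter_outliers(data).shape)  # injected rows are dropped
```

Such a filter is only a first line of defense: subtler poisoning that mimics the clean distribution would pass it, which is why the holistic, lifecycle-wide approach discussed below remains necessary.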

Disinformation is defined in the Action Plan for European Democracy as “false or misleading content disseminated with the intention of misleading or for profit or political gain, and likely to cause public harm.” This public harm can, in particular, be characterized by threats to political processes and democratic policymaking, as well as to the protection of the health of Union citizens, the environment or security. The Commission’s definition of disinformation “excludes misleading advertising, misquotation, satire, parody, as well as clearly identified partisan information and comment.” Unlike “hate speech or terrorist content, false or misleading information is not in itself illegal.”

Thus conceived, the European concept of misinformation also differs from that of disinformation, in particular through the criterion of intentionality: misinformation refers to “false or misleading content transmitted without intention to harm, even though its effects may nevertheless be harmful; this is notably the case when people share false information in good faith with friends or family members.” The European institutions are not seeking to regulate misinformation, even if its virality can make it problematic, but to limit it through “proactive communication, the provision of reliable information and raising awareness of the need to critically evaluate content and sources.”

However, no single solution capable of guaranteeing complete protection against all types of attack exists, and new challenges and trade-offs may arise as AI models become more complex and ubiquitous. Consequently, it is crucial to adopt a holistic and systemic approach that takes into account the security aspects of the entire model lifecycle, from design to deployment, and involves cross-training AI and cybersecurity teams, establishing security standards and best practices, and carrying out regular audits.

We also need to embrace the legal environment of the subject, which constitutes both a response to the challenges of information warfare and a risk to fundamental rights. The French law of December 22, 2018, on combating the manipulation of information defines false information as any “allegation or imputation of a fact lacking verifiable elements of such a nature as to make it likely.” The Constitutional Council has issued a reservation of interpretation on the notion of false information: it can only refer to “inaccurate or misleading allegations or imputations of a fact of such a nature as to alter the fairness of the forthcoming ballot.” These allegations do not include opinions, parodies, partial inaccuracies or simple exaggerations; they are those whose falsity can be objectively demonstrated. Thus, “only the dissemination of such allegations or imputations meeting three cumulative conditions can be called into question: they must be artificial or automated, massive and deliberate.”

At the European level, there are soft-law rules supplemented by the Digital Services Act. Legally, qualifying the influence of false information is not straightforward. The very purpose of advertising is to influence people’s behavior and choices, whether in commercial or political matters. The search for influence is not in itself prohibited. The question is when influence becomes a form of manipulation. The scale of commercial and political maneuvers to distort reality, create illusions and exploit cognitive biases in order to influence individuals without their knowledge raises the question of the protection that the law can offer.

In this context, France’s action must take the form of a comprehensive strategy: a French strategy, acceptable to Europe, which must enable us, by 2035, to reduce and better control the actions of digital influence that affect national defense and security. To be effective, this institutional response implies intervention in the informational field of citizens themselves. The envisaged strategy must therefore be understood, accepted, relayed and even adopted by the population as a means of defense, which must neither jeopardize fundamental freedoms nor be perceived as a tool for internal control.

Dealing with this issue requires a strategy that could take the following considerations into account:

  1. Focus on the digital dimension: France’s response focuses on the specific challenges posed by the digital environment, recognizing the growing importance of this space in the dissemination of information and attempts at interference;
  2. Interministerial approach and coordination with the EU: the fight against informational interference requires an interministerial, multidimensional and coordinated approach, involving different sectors and government institutions;
  3. Involvement of French and European citizens: citizens play a crucial role in the fight against informational interference. Their awareness and active participation are essential to strengthening society’s resilience in the face of these threats; and
  4. Dealing with immediate and major threats: the French strategy aims to deal with both immediate threats, such as those from Russia, and major threats, such as those from China.

Part I: Forward-Looking Analysis

A. Approach

Before drawing up a proposed strategy, the approach adopted consists in identifying unforeseen or underestimated situations, as well as possible opportunities in a prospective analysis. The aim is to avoid thinking of the future as a continuity or extrapolation of the present, by focusing solely on strong trends. Foresight is not forecasting, but rather an analysis designed to describe borderline situations, in order to test the choices we make and the choices we abandon in our strategy.

In light of the situation described in the introduction, the state of play can be formalized as follows, using the Strengths/Weaknesses/Opportunities/Threats (SWOT) approach:

Strengths of France in the context of foreign cyber influence

  1. Democratic system favoring transparency and reliable information.
  2. Defense and security organizations in place to combat foreign interference in cyberspace (intelligence services, COMCYBER, VIGINUM), backed by recognized military power.
  3. Diplomatic network, French-speaking community, international media.
  4. Content production ecosystem and tools (cinema, press, animation, games, etc.).
  5. International recognition.

France’s weaknesses in the context of foreign cyber influence

  1. Public debt is a constraint on our public policies.
  2. Fragile social cohesion.
  3. Public naivety about our competitors’ ways of doing things.
  4. Decline in critical thinking.
  5. Strong dependence on foreign technologies and digital solutions, and lack of knowledge of how these technologies work (e.g. algorithms, AI or platforms).
  6. Insufficient coherence of the interministerial response.

Opportunities for action against foreign cyber influence

  1. European strategy for defense (strategic compass with a digital influence component) and regulation in the digital field, with an activatable budget.
  2. Allies with a model resistant to foreign interference (Sweden and Finland) to drive an EU dynamic.
  3. Rebuilding alliances (India, Brazil, UK).
  4. The right time to act in this field (neither too early nor too late), as the international framework is not set in stone.
  5. Leverage the digital technologies of our partners.
  6. Capitalize on civil society players in the EU and US, e.g., universities, think tanks, whistle-blowers.

Threats posed by foreign cyber influence

  1. Asymmetry in the means employed in the information battle: exploitation by our competitors of an open society based on democratic values.
  2. Risk of failure of European cooperation on major issues, such as the cloud.
  3. Accumulation of “existential” risks for citizens, which could be exploited by our competitors and lead everyone to de-prioritize the subject of information manipulation (e.g., climate change, humanitarian crisis).
  4. Technologies making it possible to saturate the information space and paralyze the cognitive field of targeted audiences, with easy and inexpensive access to these technologies.
  5. Decline in France’s relative weight at international level (less representative, less visible).

Given that our main adversaries and competitors in the digital space are involved in informational interference, our prospective analysis to 2035 focuses on the intentions of state and technological players, and on the perception of this new conflictuality by players in Western societies in an increasingly digitized, fragmented and hostile world.

B. Players

On the battlefield of information warfare, there are many different players. Some confront each other mercilessly, while others simply take part in games of influence or try to untangle discourse and manipulation.

1. Western Partners

In the fight against informational interference, France can find partners in the Western democracies with which it is traditionally allied, notably within the Atlantic Alliance, and with which it shares a set of values derived from the Enlightenment. Naturally, the United States, with its economic, military, and digital clout, is a major player. The nations of Europe, led by Great Britain, a pioneer in information warfare, and Germany, are also partners of choice. Among our European allies, Sweden and Finland deserve particular attention, as their geographical position facing east has led them to develop specific approaches to resilience and the fight against information manipulation. Current events should also prompt us to take a closer look at what Israel is doing in this field.

2. Competitors

The Western camp is increasingly challenged by an alliance of opportunity between a few major state competitors. Russia, heir to a long tradition of propaganda and “active measures” stemming from the Cold War, and China, driven by an agenda of global domination in the run-up to 2049 and the PRC’s 100th anniversary, are vying for the position of adversary-in-chief. In their wake, regional powers driven by ideological and religious ambitions such as Iran and Turkey want to promote their own model. Extremist and terrorist organizations are also increasingly present in the information field. What all these players have in common is their contestation, or even detestation, of democratic values, first and foremost freedom of expression, which represents a threat to their very existence.

3. Civil Society

Civil society refers to all groups, organizations and individuals within a given population and territory. In the context of the fight against informational interference, this civil society is chiefly visible in the form of NGOs and university researchers. While this fight remains essentially a sovereign state competence, with collaboration between states limited to certain very specific subjects, these NGOs operate on a multinational and multidisciplinary collaborative model. In this respect, their role is essential, as they collect information, quantify it and document the various forms of foreign interference. They produce reports, propose normative frameworks to facilitate the exchange of information, and suggest approaches for the coordinated implementation of hybrid, technological and human and social science (SHS) innovations as tools in the fight against informational interference. But this international “civil society” only seems to interact with states or state organizations (EU, NATO), and a link with national civil societies remains to be forged and strengthened. In France, civil society remains highly fragmented and little mobilized on the issue of informational interference. The situation can be compared to the difficult and slow awareness of “cyber” risk.

4. Digital Players

The French digital ecosystem is essentially made up of content producers, publishers and distributors, such as online media, whether subscription-based or free-access, as well as the pages run by individual influencers. Added to this category are social engineering and/or digital marketing companies, offering analysis, research and monitoring services, which, depending on the application and purpose, can be likened to online influence or counter-influence. But this French ecosystem remains highly dependent, financially and technically, on the major digital platforms and social networks, all of which are non-European. These major platforms impose, without acknowledging it, a form of editorialization that influencers know how to manipulate, or even bias, but which remains difficult to regulate. Private French and European counter-influence players are still emerging, but could grow very rapidly, with a potential national and European market involving the media, online content publishers and institutions. The European regulatory framework, if complemented by a French one, could encourage the development of this market in France. It is likely, however, that this economic sector cannot do without collaboration with civil society, for greater social acceptability, and with academia, for better transfer of innovation and exploitation of knowledge from SHS research.

C. The Variables

In view of the state of play described in the introduction and summarized in the SWOT matrix, the three key trends that are particularly structuring for our problem are:

  • Digitization and personalization of access to information: new technologies (Cloud, 5G, AI, etc.) and IT and communications equipment deployed on a massive scale are enabling widespread adoption of digital uses in all sectors of activity (private and public). The ultra-personalization of services is also a key area of development for capturing the attention of individuals and making the investments of major digital players economically viable. R&D efforts in quantum computing have been stepped up: the first concrete applications could arrive sooner than expected, as early as 2030, offering new perspectives for digital uses;
  • Russia and China are challenging Western countries: Moscow and Beijing share many common interests, motivated by a desire to change the international order governed by the United States and its allies. The year 2049 corresponds to the centenary of the People’s Republic of China. In 2050, Russia will be faced with key challenges relating to economic diversification, demographic stagnation, the preservation of its zone of influence, tensions with the West and, ultimately, its own identity.
  • Cyberspace and the information field as areas of confrontation: the intertwining of globalization, global issues and power strategies means that we need to act “in the shadow of war.” In a digitized world, confrontation in informational space means challenging one’s adversary below the threshold of kinetic action. This conflict enables authoritarian states, on the one hand, to control their populations and, on the other, to act, discreetly and directly, on Western societies by using the rules of openness and freedom that prevail there.

On the other hand, the following data constitute variables likely to have an impact on interference in the information space (for the organization of the variables, see Figure 1):

  • V1: the adoption of democratic values in Western countries;
  • V2: the moral strengths of Western societies;
  • V3: the intelligibility of information technologies;
  • V4: regulation of digital activities outside the EU;
  • V5: technological innovation;
  • V6: permissiveness of cyberspace;
  • V7: the level of organization of states (public policies, alliances, partnerships);
  • V8: the cohesion of Western countries in the face of contestation from Russia and China.

Figure 1 - Variable organization.

In line with the model used to characterize digital information warfare (adversary intentions, modes of action adapted to cyberspace and infotarget), the two pivotal variables selected relate to power strategies and execution methods:

  • Politics / International relations: “Voluntarism of France’s partner countries to preserve an information space governed by democratic values”:
    • (v. max): complete disinterest on the part of partner countries in preserving informational spaces governed by democratic values (e.g. following the rise of populism);
    • (v. min): consensus among partner countries to preserve information spaces governed by democratic values;
  • Technology: “Intelligibility of information technologies for users”:
    • (v. max): innovation leading to a loss of control by individuals of their informational sphere, e.g., cognitive science combined with generative AI and implants;
    • (v. min): innovation enabling enlightened, emancipated use of information technologies (e.g. via the emergence of models).

D. Scenario Building

The construction of scenarios, by cross-referencing the extreme values of the variables selected (see Fig. 2), enables a descriptive analysis of borderline cases to test the choices and renunciations of the strategy.

Figure 2 - Positioning of prospective scenarios in relation to pivotal variables.

From the matrix of prospective scenarios (see Figure 2), two scenarios serve as a framework for testing the chosen strategy. Based on the analysis of the players (see Part I.B), the most likely scenario seems to be the “GAMAM-X Style” scenario: the remaining democratic states face up to a digital economy backed by Chinese ambition. The associated scenario is “Blurring Borders”: under this scenario, nationalist populism gradually paralyzes the EU, and the defense of the information sphere now relies on democratized technology.

The “Children’s Island” scenario is a low-impact situation in which to test a strategy, as the informational sphere is preserved and technological innovations make it possible to limit inauthentic actions. The “Game Over” scenario, on the other hand, poses other problems: the total loss of control over digital uses and the absence of democratic consensus among Western partners render many of the levers of a strategy dedicated to information warfare obsolete. Testing a strategy in this scenario would risk degrading the quality of our analysis of the various actions.

The “GAMAM-X Style” and “Blurring Borders” scenarios are therefore intellectually propitious environments in which to test the chosen strategy below.
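As a side note, the crossing of the two pivotal variables can be made explicit with a short sketch. The mapping of quadrants to scenario names is inferred here from the scenario descriptions above, since Figure 2 is not reproduced; it is an illustrative reading rather than part of the original analysis.

```python
from itertools import product

# Extreme values of the two pivotal variables (see Part I.C).
politics = {
    "min": "consensus among partners to preserve a democratic information space",
    "max": "partners' disinterest, e.g. following the rise of populism",
}
technology = {
    "min": "enlightened, emancipated use of information technologies",
    "max": "loss of individual control over the informational sphere",
}

# Quadrant-to-scenario mapping inferred from the scenario descriptions (assumption).
scenarios = {
    ("min", "max"): "GAMAM-X Style",
    ("max", "min"): "Blurring Borders",
    ("min", "min"): "Children's Island",
    ("max", "max"): "Game Over",
}

for pol, tech in product(("min", "max"), repeat=2):
    print(f"{scenarios[(pol, tech)]:18} | politics: {politics[pol]} | technology: {technology[tech]}")
```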

1. “GAMAM-X Style” Scenario

In 2035, the world has become bipolar, with, on the one hand, a dominant China that has concluded agreements with the American digital giants known as GAMAM-X (i.e., Google, Apple, Meta, Amazon and X) and, on the other hand, a broadly democratic Western bloc that has endorsed digital technology solely in the economic sphere. The majority of Westerners supported a policy in favor of pluralistic media following a strict code of ethics. But part of the population has tipped over into the virtual worlds of GAMAM-X.

Consuming digital activities like powerful painkillers, these enthusiasts have withdrawn from social life to survive in connected addiction. This new equilibrium remains in place as long as nuclear deterrence remains credible.

This scenario is based on the following chain of events:

  • An awakening of Western liberal democracies in 2024, triggered by foreign disinformation campaigns and cyber-attacks that sought to turn the Paris Olympics into a fiasco, and by numerous acts of interference in the European and US election campaigns;
  • Following these events, Europe and the United States sought to reinforce their control of information space by imposing strict rules on GAMAM-X. The latter rebelled, denouncing an attack on freedom of expression, and created a parallel Internet - Internet4Freedom - operational in 2026, thanks to their stranglehold on telecommunications infrastructures (i.e., undersea cables, satellite networks, data centers) and an agreement with China for their supply of semi-conductors, of which China controls most of the raw materials. Internet4Freedom is becoming a formidable channel of economic influence for GAMAM-X, based on a business model similar to today’s, but free from the restrictive laws and regulations that saw the light of day in the early 2020s.

China also uses Internet4Freedom as a vehicle for effective soft power. Nevertheless, a digital peace has been established between GAMAM-X and Western democracies, due to the latter’s dependence on GAMAM-X’s digital services and economic power:

  • From 2027 onwards, part of the Western population is under the sway of Internet4Freedom, gradually deserting democratic life. Another part of the population, more politically engaged, has turned away from digital life outside the labeled media guaranteeing free, quality information. Democratic regimes have remained in power despite record abstention rates, as their political room for maneuver has become minimal; and
  • By 2030, China has asserted its global dominance, having neutralized its great American rival: unable to mobilize a large part of its population, which is under the sway of Chinese soft power, the United States has become over-dependent, even in its sovereign prerogatives, on the digital giants, themselves indirectly controlled by China through their supply of semiconductors.

2. “Blurring Borders” Scenario

In 2035, the EU is paralyzed by nationalist populism, with France’s major historical partners focused on promoting their national narratives by any means necessary. The United States is waging repressive campaigns against journalists critical of the government, while at the same time claiming to value freedom of expression for its citizens. The federal government regularly carries out informational interference operations to destabilize European countries not aligned with US foreign policy, which is focused on maintaining a balance with China. The emergence of “infox-proof” social networks, resilient to disinformation, enables a minority of citizens to preserve an informational space governed by critical thinking.

This scenario is based on the following sequence of events:

  • Political and media context from 2025: following the elections of 2024, populist parties have gained influence in Europe; the United States, grappling with a deep, quasi-insurrectional internal crisis, turns in on itself, developing an increasingly authoritarian regime; China and Russia multiply their unabashed informational attacks on democracies;
  • Emergence of alternative, decentralized social networks: the Lumo initiative was launched as a European social network (founded by former GAMAM executives) based on a blockchain and allowing users to control their choices of access to information. Lumo has teamed up with Bluesky, a similar project run by former Twitter employees, to develop content verification and authentication technologies;
  • Mobilizing civil society to defend democratic values: Spiritoj, an international community of committed citizens, has called for the formation of a union to preserve free and open societies. Spiritoj is confronted by neo-fascist groups such as 64Chan, who defend freedom of expression without moderation. Another organization, Cyber Snipers for Democratic Values (LISCVD), has formed to carry out informational interference operations against authoritarian regimes;
  • Technological evolution and impact on information consumption: numerous technological innovations have been developed to give people greater control over their access to information, such as the Fakewall One, a box that blocks fake news, and the anti-doomscrolling patch available in pharmacies, which reduces the effects of dopamine linked to social networks; and
  • Fragmentation of the global Internet and geopolitical alliances: tensions between the major powers on the global Internet have increased, resulting in the fragmentation of the Internet into compartmentalized regional zones. Economic and cultural partnerships have been forged between China, Russia and Africa, while regional states such as California and Scotland have empowered themselves to defend democratic values.

Part II: Strategic Analysis

A. Approach

Prospective analysis allows the description of possible “futures” by comparing trends with unforeseen or underestimated events. Strategic analysis aims to develop a response along three lines: “what can I do,” “what will I do,” and “how will I do it?”

B. The Strategic Objective

1. Definition

The objective is to put in place, by 2035, a public policy embodying France’s determination to combat informational interference, by strengthening the current system and adopting a fully interministerial approach, based on partnerships with digital players and a mobilized civil society.

2. Compliance Criteria (“OSCAR 2” Format)

A SINGLE OBJECTIVE: to counter the adversary’s actions by acting in “end-to-end” mode on all stages of a digital interference action. It will be broken down into sectoral objectives (education, diplomacy, State/citizen links, etc.), without calling into question either the gains brought about by digital technology, or the fundamental democratic values of freedom of expression and freedom of the press.

SIMPLE: this strategy is simple because it is easy to understand and concerns the everyday lives of the French.

At a time when the French seem to want to be more involved in public life, our approach aims to give them a sense of responsibility in their lives as citizens, and to help define a unifying model for digital society.

A CHOICE ASSUMED WITH PRIORITIES: it is clear that the State has become aware of the defense and security dimension of information warfare. As a result, we need to build on France’s and the EU’s strengths (administration, legislation, allies whose societal models are seasoned against these modes of action, societal involvement of certain companies, etc.) to develop human and technical mechanisms that make foreign interference inoperative or less invasive. The priorities of this strategy are therefore a clear public policy and strong involvement of citizens and digital players. This strategy does not aim to create a new administrative structure at French State level, nor to create specific international alliances on this subject.

AMBITIOUS: this strategy is based on the conviction that our open, democratic model of society is a strength. Embracing it by involving the French in an approach that promotes critical thinking is an ambitious strategy that aims to make French society more resilient in the face of foreign interference. It also has the advantage of revitalizing democratic life by putting the debate of ideas back at the heart of the “cité.”

REALISTIC: in a tight budgetary context, the aim is to optimize the existing situation by leveraging the societal and regulatory aspects to develop technical and intellectual “obstacles.” The coordination of inter-ministerial actions is also central to this study. Lastly, the idea is to draw inspiration from the models of partner countries, to benefit from their unique experiences and find a community of interest for a European project.

AN ANSWER TO THE QUESTION ASKED: although digital influence is at the heart of the problem, the strategy does not focus solely on its technical aspect. By integrating the main principles of digital usage, it provides a comprehensive response to an information war (propaganda and manipulation of information) being waged from abroad in the French and even European information sphere, and which is spreading to the general public. Considering that such interference is fully integrated into the new phases of conflictuality (competition, contestation and confrontation), this strategy is applied at all times to protect our country, limit opposing initiatives and contribute to reinforcing the nation’s moral values.

C. Defining Alternative Strategies

As mentioned above, information warfare is at the heart of authoritarian regimes’ strategies to oppose Western powers and control their populations. On the other hand, Western societies, most of which are democratic, are natively open and therefore exposed to digital interference. This rivalry is characterized by its seriousness and unprecedented nature (scale, speed of the phenomenon, secrecy, etc.), the ultimate aim of which is to change behavior in favor of the adversary. To determine the spirit of the two alternative strategies, we took the following parameters into account:

  • Current dynamics:
    • France: ongoing debate on the Nation’s “moral forces,” VIGINUM and coordination groups, growing importance of the reserve, Universal National Service (SNU), etc.;
    • Europe: Strategic Compass, involvement of the EEAS, conceptualization of the problem (Foreign Information Manipulation and Interference, FIMI), a history, a dynamic NGO ecosystem, etc.;
  • Uncertainty about the permanence and strength of democratic values in the EU; and
  • Threats: (i) Russia, to be contained immediately; (ii) Turkey, Iran, Azerbaijan, etc., to be dealt with on an ad hoc basis; (iii) China, to be contained over the long term.

The strategy to be adopted can therefore be based on two approaches:

  • a national approach based on moral forces; and
  • a systemic approach at European level based on reason.

1. “Helsinki-sur-Seine” Strategy

France, like many other countries, is faced with the growing threat of informational interference. Such interference can take many forms, from the dissemination of false information to the manipulation of public opinion and can have a significant impact on French society. The Finnish model is recognized as one of the most effective in the fight against informational interference. Finland has developed a holistic approach based on collaboration between government, media, citizens and civil society organizations. As this model is the most successful in Europe, it was chosen for the construction of this strategy.

This global approach implies the definition of a clear national identity, a single national narrative and a shared spirit of defense. It therefore requires strong support from political parties and intermediary bodies. If this strategy is to succeed, we also need to forge closer links with Finland, to understand its model and draw inspiration from it. It can be envisaged in two phases: phase 1, “the time of political consensus,” 2024/2027, and phase 2, “the time of implementation,” 2027/2035. France’s strategy for combating informational interference would focus on the following five areas:

  1. Strengthen the resilience of French society by addressing the following themes:

Strengthening the resilience of French society requires a multi-faceted approach, combining education, culture and institutional support. Here are a few key areas to tackle this crucial issue:

  • developing educational programs from an early age, instilling in pupils the skills needed to critically analyze media content;
  • building a strong, inclusive national narrative that embodies common values, shared histories and ideals that transcend divides; and
  • financially supporting independent, quality media in order to preserve their editorial autonomy and their ability to practice rigorous journalism.
  2. Improving detection and warning capabilities

Improving our ability to detect and warn of informational interference is imperative to guarantee the integrity of our democratic processes and the security of our institutions. Here are a few essential measures to strengthen this capacity:

  • strengthen the current monitoring system by making it more interministerial, for better integration and effective coordination between the various government players involved;
  • collaborate with digital platforms to quickly identify and remove manipulated content and fake accounts (cf. the strengthening of European rules); and
  • raise citizens’ awareness of the challenges of the fight against information manipulation and arm them against misinformation, particularly during major national events such as elections or major sporting fixtures.
  3. Organizing interministerial coordination and response

The fight against informational interference requires effective coordination and a unified response at interministerial level. Here are two key areas for strengthening this coordination and improving the response to this complex challenge:

  • to ensure coherent and effective action, define a clear and precise national doctrine for combating information interference (guiding principles, strategic objectives and intervention guidelines for all ministries and agencies concerned); and
  • cooperate with other countries and international organizations to share information, analyses and best practices in the fight against disinformation and information manipulation.
  4. Involvement of civil society

The commitment of civil society is an essential pillar in the fight against informational interference, requiring the collective mobilization and active participation of citizens. Here are three key areas for strengthening this commitment:

  • mobilize social and environmental responsibility (SER) in the fight against informational interference, by including a specific component on “accountability in the informational sphere” in the obligations imposed on companies;
  • organize open and transparent dialogues to explain how the state works and the institutional response; and
  • involve the National Guard, the SNU and the voluntary sector. Every citizen has a constitutional duty to contribute to the defense of the nation.
  5. Research and innovation

Research and innovation play a crucial role in the fight against information manipulation and involve ongoing investment and close collaboration between different players. Here are two key areas for promoting research and innovation in this field:

  • developing innovative detection technologies; and
  • encouraging interdisciplinary research, bringing together experts in IT, social sciences, communications and other relevant fields.

2. “The Century of LEDs” Strategy

Europe, like the rest of the world, is faced with the rise of informational interference. Such interference threatens the very foundations of European democracy, undermining confidence in institutions and manipulating public opinion. The Enlightenment, with its spirit of rationalism, criticism, and freedom of expression, offers a valuable source of inspiration for combating this phenomenon. This strategy requires a strong commitment from all players, institutions, citizens, and digital platforms.

Putting this strategy into practice will require a great deal of diplomatic involvement on the part of France. Drawing inspiration from the lessons learned (RETEX) from cyberdefense, a timeframe could be envisaged: phase one, “reinforcement of the French mechanism,” 2024/2027; phase two, “promotion of a French initiative in the FIMI field,” 2027/2029; and phase three, “adoption and implementation,” during the EU mandate 2029/2034. The European strategy to combat informational interference would be structured around the following five axes:

a. Education and Promotion of Critical Thinking

In the current context of online information proliferation, it has become essential to equip European citizens with the skills needed to discern truth from misinformation. For the EU, the following two approaches are possible:

  • roll out a continent-wide media and information literacy program aimed at equipping citizens with the skills and tools they need to critically assess online content and identify reliable sources of information; and
  • actively support research and innovation initiatives in the field of media and information literacy to help identify best pedagogical practices.

b. Strengthening the European Media Landscape

Strengthening Europe’s media landscape is crucial to preserving democracy and promoting an informed, pluralistic society. With this in mind, the EU is implementing measures to support and protect independent, quality media, an essential pillar of democracy. More specifically, these measures include:

  • providing financial and structural support for independent media, to ensure their economic viability and editorial independence;
  • avoiding media concentration, which can compromise the diversity of voices and freedom of expression; and
  • putting in place a robust legal framework to guarantee media freedom and accountability, and ensuring that media players comply with ethical and professional standards.

c. Digital Platforms and Transparency

The regulation of digital platforms and the promotion of transparency are crucial issues in the current context of massive online dissemination of information. The EU is committed to implementing strict regulations to guarantee the transparency of algorithms and protect users’ personal data. With this in mind, the EU should:

  • apply European regulations, in particular the Digital Services Act (DSA) and the Digital Markets Act (DMA); and
  • promote a proactive approach to reliable and responsible online information by encouraging the adoption of a “European Digital Information Charter.”

d. European Cooperation and Coordination

European cooperation and coordination are crucial to countering these threats and ensuring the continent’s resilience in the face of these informational attacks. More specifically, we recommend the following:

  • create a European agency dedicated to the detection and characterization of informational interference: this agency would be responsible for monitoring suspicious informational activities, identifying sources of disinformation, and carrying out in-depth investigations to counter these threats on behalf of the EEAS (in close liaison with the European Center of Excellence for Combating Hybrid Threats, located in Helsinki);
  • strengthen the Early Warning System (EWS) and the Integrated Policy Response Capability (IPCR) to enable more rapid reporting of attempted interference to a permanent network involving all member states on a mandatory basis; and
  • boost cooperation between member countries to share information and best practices.

e. Dialogue and Citizen Engagement

Dialogue and citizen engagement are essential to strengthen the resilience of European society to informational interference and to promote a healthy and democratic media environment. The EU is committed to taking action in this direction through a number of initiatives:

  • organizing regular awareness-raising campaigns to inform citizens about the dangers of informational interference, particularly in the run-up to major events, encouraging critical behavior when consulting online information;
  • actively encouraging dialogue and citizen participation in reflection on the future of information in Europe; and
  • providing financial and institutional support for civil society initiatives that contribute to the fight against informational interference.

D. Choice of the Final Strategy

The two strategies (“Helsinki-sur-Seine” and “The Century of LEDs”) were compared with the two scenarios selected in the prospective analysis according to five criteria (see Figure 3).

In both scenarios, the European-oriented strategy offers the best long-term effectiveness in the face of the threat described above, but, because it rests on a European political consensus, its construction involves many uncertainties (particularly in the “Erasure of Borders” scenario, which sees an EU paralyzed by nationalist populisms).

The strategy centered on a strong national identity is obviously more adaptable, because it relies solely on national choices (albeit with limited means). But it is risky in the long term, because it does not guarantee access to the technologies necessary, on the one hand, for economic development and, on the other, for understanding the cognitive warfare being prepared by our competitors.

  • “Helsinki-sur-Seine”: in the “GAMAM-X Style” scenario: Finances ••••, Feasibility ••••, Social acceptability •••, Effectiveness •••, Flexibility •; in the “Erasure of Borders” scenario: Finances •••••, Feasibility •••••, Social acceptability ••••, Effectiveness ••••, Flexibility •••.
  • “The Century of LEDs”: in the “GAMAM-X Style” scenario: Finances ••, Feasibility •••, Social acceptability ••, Effectiveness •, Flexibility •••; in the “Erasure of Borders” scenario: Finances •••, Feasibility ••••, Social acceptability •••, Effectiveness •••, Flexibility •••••.

Legend:

  • Finances: • cheap  ••••• very expensive
  • Feasibility: • easy to implement  ••••• difficult to implement
  • Social acceptability: • easy  ••••• difficult
  • Effectiveness: • very effective (regarding the threat)  ••••• not very effective
  • Flexibility: • very scalable  ••••• not very scalable

Figure 3 – Illustration of the evaluation of strategies in relation to scenarios.

It emerges that, in the short term, the national level is the most relevant for responding to the problem (the emergency posed by Russia) while strengthening partnerships in a bilateral framework: this option also makes it possible to promote a common destiny based on democratic values. On the other hand, an EU-wide approach seems more relevant for confronting an information war amplified tenfold by AI, because it involves heavy investments. This convergence of efforts will be all the easier to set up because it will rest on a network of already existing cooperation. In view of the initiatives underway within the EU, this option also aims to make information technologies as intelligible as possible. The chosen strategy therefore aims to develop a French response in line with a European ambition to combat foreign digital interference, now and by 2035.

First, the adopted strategy is “Transforming the digital sword into a democratic shield.” The fight against foreign digital interference represents a complex and pressing challenge, which requires a response that is both targeted and comprehensive. With this in mind, France is proposing a multidimensional and coordinated strategy, in synergy with a broader European approach. This strategy aims to protect both citizens and democratic institutions against informational threats, while preserving freedom of expression and the circulation of reliable information in the digital space. The ideal would obviously be to start with a strategy supported by the EU. But it is clear that there is still no consensus on this subject. At this stage, France is one of the main targets of informational interference. And if not all EU countries are affected by this problem today, they undoubtedly will be tomorrow. The experience gained from the development of cybersecurity at the European level illustrates the relevance of testing a national model before considering, with partners, the credible promotion of a common vision within the Union. The combination of our two strategies thus takes this political reality into account, but it presupposes a stagnation, or even a decline, of populism in order to allow a European consensus to emerge: otherwise (the “Erasure of Borders” scenario), France would have an autonomous combat system, albeit one limited in resources, and would need to review its alliances in order to gain access to the full range of digital technologies.

To translate this strategy into recommendations, the following principles are retained:

  • digital technology (and its use) is the guiding principle;
  • the fight against information interference requires an interministerial, multidimensional, multi-sector approach coordinated with the EU;
  • the EU must develop complementary and synergistic responses supported by several nations to facilitate their deployment; and
  • the involvement of French and European citizens is essential.

To deploy such a strategy, the recommendations are structured around two axes:

  1. Axis 1: “The Development of a French response. . .” based on three recommendations:
    1. Strengthen monitoring, analysis and response by relying on a fully interministerial state system, increased collaboration with the private sector, and a strengthened ARCOM;
    2. Communicate and raise awareness by informing citizens, journalists and information professionals about the extent of information interference; and
    3. Develop the commitment of civil society by educating on the uses of digital technology and involving citizens in the defense of the information sphere.
  2. Axis 2: “… combined with a European ambition” based on three recommendations:
    1. Cooperate and share information within the EU by promoting a collective response (contributes to the creation of a consensus on the importance of the fight against information interference);
    2. Develop research and innovation to support the development of new tools and technologies to combat disinformation; and
    3. Carry out proactive external action in the service of strengthened international cooperation, by securing Member States’ support for placing the fight against information interference on the EU political agenda.

This strategy will be detailed according to the six recommendations listed above in the following chapter. The consistency of this strategy with the Alliance’s approach is provided by the initiatives already undertaken. Indeed, the EU is one of the main partners chosen by NATO, which recognizes the importance of cooperation to counter disinformation. NATO and the EU are intensifying their collaboration, focusing in particular on strategic communication, improving situational awareness and holding joint exercises. The European Centre of Excellence for Countering Hybrid Threats in Helsinki aims to strengthen the civil-military capabilities of participating countries, increasing their resilience and preparing them for information interference. In conjunction with the EU and NATO, this centre of expertise will also have the mission of raising awareness among leaders and public opinion in Western countries. It embodies this desire for cooperation.

The implementation of this strategy also entails certain renunciations. Given the level of public debt, major national investments are not envisaged, particularly in the field of R&D. The mastery of new technologies for the LMI relies on a European strategy. The active promotion of the French model is also not retained: soft power and counter-narrative remain the bases of the French response to its competitors. As for the resilience of the population (no “Finnish” model), it relies on citizens’ discernment in their uses of digital technology and does not develop the political dimension of the response, in particular societal projects aimed at the population’s well-being that could be seen as a “vaccine” against disinformation. Finally, this strategy does not plan to deal with our adversaries’ supporters (“useful idiots”). It does not call freedom of expression into question, and it is limited to actions involving a foreign actor, state or not, directly or indirectly.

Finally, to assess the relevance of this response, the proposed strategy was compared with the DISinformation Analysis & Risk Management (DISARM) Blue matrix, which is used to design a response to an information attack. As a reminder, this matrix was created in 2018 to combat information manipulation campaigns, and the DISARM model uses the structure of the ATT&CK matrix used in the field of cybersecurity. There is a DISARM Red model to characterize an attack and a DISARM Blue model to design the response. The “Transforming the digital sword into a democratic shield” strategy covers the main areas of the DISARM Blue model except for the subject of measuring effectiveness. Furthermore, this exercise shows that the DISARM model is well suited to the tactical level, but not to the strategic level addressed by this study.
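
To make this coverage check more concrete, the minimal Python sketch below shows one way of recording which of the six recommendations (A to F) address which broad response stages. The stage names are simplified stand-ins chosen for this illustration, not the official DISARM Blue taxonomy, and the mapping merely restates the gap on effectiveness measurement noted above; a real exercise would use the framework’s actual counter categories.

```python
# Illustrative coverage check of the strategy's six recommendations against
# broad response stages. The stage names below are simplified stand-ins,
# NOT the official DISARM Blue taxonomy; substitute the framework's actual
# counter categories when performing a real mapping.

RECOMMENDATIONS = {
    "A": "Strengthen monitoring, analysis and response",
    "B": "Communicate and raise awareness",
    "C": "Develop civil society engagement",
    "D": "Cooperate and share information within the EU",
    "E": "Develop research and innovation",
    "F": "Proactive external action / international cooperation",
}

# Hypothetical mapping of recommendations to simplified response stages.
COVERAGE = {
    "detection_and_attribution": ["A", "D", "E"],
    "protection_and_resilience": ["B", "C"],
    "response_and_deterrence": ["A", "F"],
    "effectiveness_measurement": [],  # the gap identified in this study
}

def report_gaps(coverage: dict[str, list[str]]) -> list[str]:
    """Return the stages that no recommendation currently covers."""
    return [stage for stage, recs in coverage.items() if not recs]

if __name__ == "__main__":
    for stage, recs in COVERAGE.items():
        print(f"{stage}: {', '.join(recs) if recs else 'NOT COVERED'}")
    print("Gaps:", report_gaps(COVERAGE))
```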

Part III: Operational Recommendations

A. Preamble

Combating informational interference is a complex challenge. The proposed strategy aims to develop a credible, multidimensional, and coordinated French response, in synergy with a global European approach, to protect citizens and democratic institutions against informational threats, guaranteeing freedom of expression and the circulation of reliable information in the digital space.

B. List of recommendations “Developing a French response . . .”

1. Recommendation A “Strengthen Monitoring, Analysis and Response”

(i) Specific objective: strengthen monitoring, analysis and response to foreign digital interference, with a state response via interministerial consolidation, supported by increased collaboration with private players (media, social network platforms) and a strengthening of the regulatory authority (ARCOM).

(ii) Description of the existing system: with VIGINUM under the aegis of the SGDSN (detection, characterization), the involvement of intelligence services in imputation and attribution actions, and a response and influence strategy supported by the MEAE (STRATCOM) and the MINARM (protection of our military operations), France has an organizational arsenal envied by its international partners. The implementation of the DSA, by a group of European regulators under the authority of the Commission, aims to ensure that platforms comply with the imperative to reduce the distribution of illegal content, and to increase the transparency of online platform algorithms. Finally, it involves monitoring the implementation of remedies designed to limit the artificial amplification and de-hierarchization of information.

In legal terms, the July 29, 1881 law on freedom of the press remains central. The year 2018 saw an important addition with the law of December 22, known as the “fake news law,” relating to the fight against information manipulation. This has since been supplemented by a law against foreign interference, inspired by the American Foreign Agents Registration Act (FARA) and by the United Kingdom’s National Security Act 2023, which introduces a Foreign Influence Registration Scheme (FIRS).

This Act of July 25, 2024 to prevent foreign interference in France provides for the creation of a digital register of foreign-influenced activities to be kept by the French High Authority for Transparency in Public Life (HATVP). It will be separate from the register on interest representatives (lobbies) created by the so-called “Sapin 2” law of 2016.

This register will list, after declaration to the High Authority, the activities of people acting on behalf of a “foreign principal”: foreign powers or entities or foreign political parties or groups outside the European Union. The purpose of these activities must be to influence public decision-making (particularly law-making) or the conduct of public policy, including France’s European or foreign policy. They may involve:

  • communicating with elected representatives or public decision-makers (declared candidates in national or European elections, leaders of political parties, ministers, ministerial advisors or advisors to the Head of State, members of parliament, regional and departmental executives, mayors of towns with more than 20,000 inhabitants, former presidents of the Republic or ministers for five years after leaving office, etc.);
  • and/or carrying out communication campaigns; and
  • and/or collecting or paying out money without consideration.

Foreign entities or political parties acting to promote their interests or those of a foreign state will also have to declare their activities.

Thus, any individual or legal entity, regardless of nationality, promoting the interests of a foreign power to the French public authorities or the general public, will be subject to a reporting obligation. But diplomatic and consular staff posted in France and officials of foreign states will be exempt.

This directory of foreign-influenced activities will be public and will be shared by the HATVP, the National Assembly and the Senate.

The Act also provides for sanctions: “Individuals refusing to provide the HATVP with information (on their identity, their influence, the people they approach, etc.) risk three years’ imprisonment and a 45,000 euro fine. The penalties for legal entities are more severe: a fine of 225,000 euros, a ban on receiving public aid . . .”.

The scheme will come into force no later than July 1, 2025, and an implementing decree is planned.

The text has been supplemented to require think tanks and institutes to declare foreign (non-EU) donations and payments to the HATVP.

In addition, former ministers, local executives and members of independent authorities will be subject to stricter control over their career transition to the private sector. This control, currently carried out by the HATVP with regard to conflicts of interest, is extended to cover the risk of foreign influence over a five-year period.

It also reinforces penal provisions. A new aggravating circumstance has been created in the penal code when an attack on property or persons is committed on behalf of a foreign or foreign-controlled power or entity. The penalties incurred will therefore be more severe. In such cases, special investigative techniques (wiretapping, etc.) may be used.

The law is applicable overseas, notably in New Caledonia and French Polynesia.

As regards the algorithmic technique and asset freezes, the law authorizes intelligence services, on an experimental basis until June 30, 2028, to use algorithmic techniques to detect connections likely to reveal foreign interference or threats to national defense (e.g., cyber attacks). Until now, the use of this technique was permitted only for detecting terrorist threats, having been authorized on a permanent basis by the law of July 30, 2021 on the prevention of terrorist acts and intelligence.

The government will have to submit an interim report and a final report evaluating this extension of algorithmic technique.

In addition, the procedure for freezing financial assets, authorized in the case of terrorism, has been extended to cases of foreign interference. Persons engaging in, inciting or financing such acts will thus be able to have their funds and resources frozen in France. An act of interference is defined in the Monetary and Financial Code as “an act committed directly or indirectly at the request or on behalf of a foreign power and having the object or effect, by any means, including the communication of false or inaccurate information, of undermining the fundamental interests of the Nation, the functioning or integrity of its essential infrastructures or the regular functioning of its democratic institutions.”

Better information for Parliament is also provided for: before July 1, 2025, and every two years thereafter, the government must submit a report to Parliament on the state of threats to national security, particularly with regard to foreign interference. This report may be debated in Parliament.

(iii) Means of execution:

  1. A position of national coordinator for the fight against the manipulation of information will be created under the supervision of the Prime Minister to define the national doctrine (to be implemented in public policy) and coordinate actions to combat digital interference. The “education”, “engagement”, “awareness,” and “communication” components included in the state response will be addressed in recommendations B and C.
  2. ARCOM, equipped with new resources and missions, will be reinforced by a civilian college collecting initiatives against disinformation while respecting ethics and democracy, particularly in the service of recommendation B. A coordination body with platforms and telecom operators will be created to protect the key moments of democratic life and international collaboration (elections, international summits, etc.), thus perpetuating the initiatives carried out for the 2024 European elections. A body for monitoring the financing of disinformation, under the joint supervision of ARCOM and the Professional Advertising Regulation Authority (ARPP), would be created for the main players in online marketing and advertising (platforms, advertisers, marketing agencies, influencers).
  3. The protection of economic assets, in particular listed Operators of Vital Importance (OIV) or Essential Service Operators (OSE), against the manipulation of information will be reinforced via a dedicated coordination body, involving the Strategic Information and Economic Security Service (SISSE) and the Financial Markets Authority (AMF), with awareness-raising missions on the subject among business leaders.
  4. On the technical side, LMI public policy must encourage all technologies and practices that contribute to the reliability of information (e.g., NewsGuard) or that follow the principles of the Japanese “Originator Profile” (OP) media technology initiative (see the illustrative sketch after this list). In light of these initiatives, the DSA will have to be re-evaluated in order to determine whether it should be further developed. In the medium term, a public algorithm service will be created to regulate uses and avoid confinement in information bubbles.
  5. The adoption of a French FARA is now effective, continuing the work already undertaken at the level of the parliamentary assemblies. A new Senate report entitled “Combating malicious foreign influence: mobilizing the whole nation in the face of the neo-Cold War” was presented on July 25, 2024. It is the result of the work of the commission of inquiry into public policy in the face of foreign influence operations, in which the author of this article sat as vice-chair; some of its hearings can be viewed online.
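
As a deliberately simplified illustration of the general idea behind originator and provenance labeling (signed metadata that lets a reader verify who published a piece of content and that it has not been altered), the sketch below uses only Python’s standard library. It is a toy example built on a shared secret for brevity; it does not implement the actual Originator Profile or C2PA specifications, which rely on public-key certificates and standardized manifests.

```python
# Toy illustration of originator labeling: a publisher signs a content hash
# together with its identifier, and a verifier checks the tag before trusting
# the attribution. This is NOT the Originator Profile or C2PA specification;
# real schemes use asymmetric keys and standardized signed manifests.
import hashlib
import hmac
import json

def label_content(content: bytes, originator_id: str, secret_key: bytes) -> dict:
    """Produce a signed originator tag for a piece of content."""
    digest = hashlib.sha256(content).hexdigest()
    payload = json.dumps({"originator": originator_id, "sha256": digest}, sort_keys=True)
    signature = hmac.new(secret_key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": signature}

def verify_label(content: bytes, tag: dict, secret_key: bytes) -> bool:
    """Check that the tag matches the content and was produced with the key."""
    expected = hmac.new(secret_key, tag["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, tag["signature"]):
        return False
    claimed = json.loads(tag["payload"])
    return claimed["sha256"] == hashlib.sha256(content).hexdigest()

if __name__ == "__main__":
    key = b"shared-demo-key"  # illustrative only; real schemes avoid shared secrets
    article = b"Example article body"
    tag = label_content(article, "example-newsroom.fr", key)
    print(verify_label(article, tag, key))           # True: content and originator match
    print(verify_label(b"tampered body", tag, key))  # False: content was altered
```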

This organization is structured around contributions from three thirds (State, private sector and individuals), which meet periodically under the aegis of the coordinator and form a crisis unit in the event of an impactful event. The full implementation of these actions, starting in 2027, will stimulate diplomacy in favor of preserving a democratic and transparent information space by combating informational interference (see recommendation F).

(iv) Actors concerned: ministries, SGDSN, media, associations, in coordination, at European level, with the European Committee for Public Media Services (CESM), an independent body bringing together regulatory authorities, including ARCOM and the ARPP.

(v) Implementation schedule:

  • 2024: appointment of a national LMI coordinator;
  • 2025: publication of a public LMI policy mainly targeting informational interference (in line with the EU approach);
  • 2026: strengthening of ARCOM’s missions and resources;
  • 2027: creation of a public algorithm service; and
  • between 2026 and 2030: strengthening of administrations contributing to the LMI in line with the policy published in 2025.

(vi) Cost estimate:

  • Human resources (HR), for the state organization: 40 full-time equivalents (FTEs);
  • Budget: €10 million/year for financing independent actions (via an audiovisual tax on major digital platforms and/or European funds in addition to what is provided for by the Media Freedom Act).

(vii) Limits and risks of implementation: all actions must be carried out in concert (rule of three thirds: state, private sector and citizens) to effectively combat foreign digital interference.

2. Recommendation B “Communicate and Raise Awareness”

(i) Specific objective: inform citizens about the issues and the reality of informational interference, strengthen their vigilance regarding the mechanisms of online disinformation, and promote, particularly financially, best journalistic practices in the fight against disinformation.

(ii) Description of the existing situation: The National Commission for Information Technology and Liberties (CNIL) and ARCOM have created a digital-citizenship educational kit for trainers and parents. Due to a lack of visibility, this type of initiative is not well known enough by citizens. Furthermore, the Government Information Service (SIG) notably informs the general public of government actions, coordinates the communication of ministries and monitors media and social networks in order to analyze opinion. A major communications player and one of the leading advertisers in France, the SIG must speak to all audiences and could further inspire communication professionals, including in the private sector, in order to amplify its impact. Some private actors are also mobilized, such as the Descartes Foundation, which identifies and maps the many actors in France, Europe and the world involved in the fight against disinformation and in the promotion of a public debate based on sincere information.

In addition, a set of practices and tools allows journalists to verify the authenticity or origin of information, without their readers necessarily being aware of it. Designed in 2019, the Journalism Trust Initiative (JTI) of Reporters Without Borders (RSF) is a tool for certifying transparency and compliance with the ethical and professional standards of the news media. After an online training course to combat climate disinformation, available since April 2023, Agence France-Presse (AFP) launched, in November 2023, a new course for journalists and journalism students to face the challenges of growing disinformation surrounding elections. In January 2024, the Cultural Affairs Committee of the National Assembly launched a flash mission on foreign interference in the media, with a view to examining the different avenues through which foreign powers attempt to disseminate their messages to French public opinion, as well as the actions carried out by French and European public authorities to try to contain this interference. Independent and quality media play a crucial role in the dissemination of reliable and verified information, thus guaranteeing the democratic health of society.

(iii) Means of execution:

  1. Create a unit dedicated to LMI awareness within the SIG to identify the initiatives of the various ministries and private actors under the leadership of the national coordinator for the fight against the manipulation of information; to organize awareness and information campaigns for each age group; and to promote the public version of VIGINUM’s database of foreign interference actions;
  2. Financially support independent and quality media, in particular by providing subsidies or tax incentives, in order to preserve their editorial autonomy and their ability to exercise rigorous and impartial journalism: this system will be financed via an audiovisual tax on major digital platforms and/or European funds, in addition to what is provided for by the Media Freedom Act; and
  3. Study with the French Federation of Press Agencies (FFAP) how to improve the functioning of the media in the context of new digital uses, and promote good practices, based on the example of the JTI.

(iv) Stakeholders concerned: the SIG, the ministerial and private stakeholders already involved, as well as journalists and media of all types and on all platforms.

(v) Implementation schedule:

  • 2025: creation of an LMI unit in the SIG;
  • from 2025: citizen awareness campaigns on LMI issues; and
  • from 2026: strengthening of financial support for the independence of the press and media.

(vi) Cost estimate: communication budget of €5M/year. Creation of 5 FTEs for the LMI unit of the SIG. The fiscal impact is made neutral by redirecting current measures according to new criteria.

(vii) Limits and risks of implementation: the scope of the LMI unit of the SIG must be clearly defined in order to prevent any perception of state “propaganda”: the transparency of the system must contribute to greater confidence in institutions. Freedom of expression is not called into question. The priority of communication campaigns will be foreign digital interference.

3. Recommendation C “Develop Civil Society Engagement”

(i) Specific objective: develop the engagement of civil society along two axes—education in the uses of digital technology and support for citizens and legal entities who wish to mobilize against informational interference.

(ii) Description of the existing situation: Since 1983, National Education has had a Media and Information Education Liaison Center (CLEMI), whose mission is “to teach students a citizen practice of the media [by relying] on dynamic partnerships between teachers and information professionals.” Its mission is in particular to produce and disseminate resources with a view to supporting actions with students from kindergarten to high school. This learning mission is carried out unevenly across the territory, and the manipulation of information is not specifically addressed as a massive, deliberate phenomenon involving a foreign actor and aimed at harming the fundamental interests of the Nation. Furthermore, several education stakeholders interviewed believe that although the majority of students are aware of the principle of “fake news,” this does not encourage them to diversify their sources of information.

The Ministry of Culture is also mobilized in favor of media and information education (estimated annual budget of €8M) via support for educational tools, the mobilization of public media, public libraries and the organization of annual Digital Culture Meetings aimed at promoting these actions.

There are also numerous associations raising awareness of critical thinking and the risks linked to online disinformation (e.g., APEM, Fakeoff), which appear to be unevenly active across the country. Furthermore, the risk linked to informational interference is rarely specifically mentioned in their scope of action (which is often broader, covering disinformation, critical thinking, cyber-harassment and other issues).

Academic research in France and abroad also provides valuable studies on citizens’ behavior in the face of the media and disinformation (e.g., Sciences Po’s Medialab), on the threat of informational interference in certain contexts (e.g., the Institute for Strategic Research at the École Militaire (IRSEM)), as well as on the psychological mechanisms of information manipulation (e.g., the University of Cambridge).

Finally, certain more recent devices can be mentioned, given their link with the specific objectives described above:

  • the VIGINUM ethical and scientific committee, made up of eight independent personalities and responsible for monitoring the activity of the service and making recommendations on it via a public annual report; and
  • the operational cyber defense reserve of MINARM (COMCYBER) and MININT.

(iii) Means of execution:

  1. Involve National Education (with the support of VIGINUM) along the following priority axes: systematize and strengthen media and information education (EMI) for all students, and include a component specific to LMI; generalize the media education week for all students (as in Finland); and train all teachers in EMI (including a section specific to LMI);
  2. Create units involved in LMI in the National Guard to support the implementation of public policy dedicated to informational interference (see recommendation A);
  3. Mobilize the Economic, Social and Environmental Council (CESE) via its Education, Culture and Communication Commission to contribute to the drafting of LMI public policy (see recommendation A);
  4. Extend the scope of responsibility and the resources of the VIGINUM ethics and scientific committee to the entire state system aimed at combating informational interference to help strengthen citizen confidence in the state response;
  5. Include in corporate social responsibility (CSR) the actions of private structures engaged in approaches that discourage the financing of disinformation and informational interference, and sanction the actors who contribute to it. Implementation would be monitored by the ARCOM-ARPP body described in recommendation A.

(iv) Stakeholders concerned: the list of stakeholders concerned includes National Education (primary schools, middle schools, high schools), as well as VIGINUM, the SNU, the reserve, the media, social media platforms, advertisers, the CESE, and citizens as a whole.

(v) Implementation timetable: by mid-2027, the implementation of the following measures is proposed:

  1. 2024: launch of initiatives within National Education, with the objective of finalizing VIGINUM-CLEMI content in the first half of 2025, and implementation of the “media week for all” and teacher training from the start of the 2025 school year (with a system fully operational in 2030);
  2. 2025: creation of the LMI ethics and scientific committee, concomitantly with the development of public doctrine in this area (see recommendation A);
  3. 2025: involvement of the CESE in the definition of LMI public policy (see recommendation A);
  4. 2025: launch of initiatives linked to the reserve;
  5. 2026: proposal/draft law on CSR to be initiated from 2025 for adoption in 2026.

(vi) Cost estimation: within National Education, the CESE and CSR stakeholders, the reallocation of existing resources will be prioritized. New resources will be mobilized for the reserve and the SNU, with a target of 30 dedicated FTEs (i.e., 300–350 reservists based on an average of about twenty days per year). The ethics and scientific committee will be strengthened via the allocation of a team of four permanent FTEs and an annual study budget (i.e., €1M/year in total).

(vii) Limitations and risks of implementation: the involvement of National Education is the main challenge, given the resources required (time, training, etc.). Media and information education must be conducted properly to avoid fostering excessive skepticism. If steered by overly politicized action, it could even give the impression of a ministry of propaganda at work.

C. List of Recommendations “ . . . Combined with a European Ambition”

1. Recommendation D “Cooperate and Share Information”

(i) Specific objective: develop cooperation and actively participate in sharing information relating to FIMI at EU level in order, on the one hand, to improve the effectiveness of the French response and, on the other hand, to support recommendation F, which aims at building a consensual political project between member states.

(ii) Description of the existing situation: Several European organizations are mobilized against FIMI, notably since the detailed action plan against disinformation of 2018. The EEAS implements the CFSP under the direction of the High Representative of the EU for Foreign Affairs and Security Policy, in close coordination with the EU Foreign Affairs Council (FAC), where the foreign ministers of member countries sit. The EEAS is the central coordinating body in the fight against FIMI between Member States and at the international level (e.g., G7, NATO) via its Strategic Communication division and its various Task Forces (South, East, Western Balkans). This division also manages the Russian threat awareness and documentation site www.euvsdisinfo.eu as well as the EU Rapid Alert System (RAS) activated in the event of a disinformation campaign targeting several Member States. The resources devoted to these measures are limited and depend in part on the secondment of national experts. They remain distant from political decision-makers. The teams must coordinate with other EU bodies such as DG Connect. Furthermore, the diversity of approaches and specific political wills (e.g., Poland, Hungary) makes consensus difficult in matters of FIMI.

There are, in addition, several European organizations:

  • The European Digital Media Observatory (EDMO), launched in 2020, aims to support and lead the independent European community fighting disinformation (notably by organizing fact-checking and academic research), to provide media education resources, and to assist public decision-makers on disinformation issues. EDMO is managed by a consortium of universities and has a governance structure that is completely independent of public authorities;
  • The European Centre of Excellence for Countering Hybrid Threats is an autonomous international organization based in Helsinki, created in 2017, whose mission is to strengthen the capacity of its participating States to prevent and counter hybrid threats through the informal sharing of good practices and the issuing of recommendations, new ideas and approaches. With 35 member countries, the center has an annual budget of €4M.

Finally, events such as the European News Media Forum (ENMF), organized by DG Connect, allow for annual discussions on news media issues.

(iii) Means of execution:

  1. Promote, in partnership with European countries that are proactive on FIMI, an increase in the resources granted to EDMO, the extension of its mission to informational interference, and the diversification of its governance beyond the academic world (adding, for example, press groups, public broadcasting groups, press agencies, NGOs such as RSF, and journalism schools). Provide EDMO with a fund to finance initiatives from civil society;
  2. Accelerate the decision-making of the RAS by creating an entity mandated by the FAC to react to any detection of a proven action. Ideally this response will be coordinated with the Member State(s) concerned, without excluding an autonomous response on behalf of the EU (e.g., denunciation, imputation, or even economic sanctions);
  3. Amplify the impact of the annual ENMF coupled with the European Media Education Week, by encouraging the French media to join the event (notably public audiovisual groups) and by communicating on the event through media awareness and education initiatives (see recommendations B and C).

(iv) Actors concerned:

  • national authorities responsible for guiding European decisions (MEAE, parliamentarians), ministries and state bodies with cooperation at EU level;
  • the European authorities, and primarily the European Commission and the EEAS, as well as France’s European partner countries (for example Sweden or Finland - see recommendation F);
  • actors from civil society: the media (traditional and new, public and private), journalism schools and NGOs.

(v) Implementation schedule:

  • from 2024: promotion and development of bilateral cooperation in the field of FIMI;
  • from 2025: promotion by the MEAE within the FAC of cooperation in matters of FIMI and of the strengthening of the EEAS’s STRATCOM, and the creation of an entity delegated by the FAC to react diplomatically; provision of additional national experts in FIMI detection to the EEAS (e.g., experts from VIGINUM);
  • from 2026: promotion to DG Connect of the evolution of EDMO’s objectives, means and governance and promotion of the European News Media Forum;
  • from 2027: development of cooperation actions to promote the development of a FIMI system modeled on that of cybersecurity (consistent with recommendation F).

(vi) Estimated costs: at EU level, the costs of combating disinformation were €10–15M per year between 2018 and 2020. By analogy with the world of cybersecurity in the broad sense, ENISA had a budget of about €23M in 2021, versus about €9M in 2009, five years after its creation. Finally, in France, VIGINUM had a budget of €12M when it was created in 2021. Over the implementation horizon (2027/2028), we can therefore estimate an additional cost of about €20M/year for the strengthening of existing resources (+50% in four years) and the creation of a dedicated agency (excluding specific financing programs). At the French level, within the MEAE, additional dedicated staff of 5 FTEs, or €1M/year, would make it possible to implement the measures described above as well as recommendation F. Within VIGINUM, the contribution to the EEAS would be made possible by one additional FTE.

(vii) Limits and risks of implementation: any international initiative is by definition based on the willingness and capacity for action of the different countries: it will therefore be critical for France to carry out these initiatives in a partnership approach with countries ready to engage on this subject, and not to seek to impose a French vision.

2. Recommendation E “Develop Research and Innovation”

(i) Specific objective: develop research and innovation at EU level to support the development of new tools and technologies to combat disinformation.

(ii) Description of the current situation: Consideration of LMI in European research programs has emerged with the recognition of hybrid threats: for example, the CNRS is a partner in the European AI4Trust project, aimed at developing a platform of AI tools capable of helping public decision-makers and fact-checkers verify the reliability and origin of certain information. The involvement of the humanities and social sciences (SHS) in LMI research programs is recommended in order to analyze past disinformation campaigns. This will help to protect the integrity of electoral processes and to better understand the impact of the algorithmic recommendations of social networks.

The field of LMI illustrates the dual use of AI, for malicious purposes (e.g., the automatic design of disinformation campaigns) or for defensive purposes (e.g., rapidly detecting fake news or the artificial amplification of the spread of information). However, the development of new AI-based LMI solutions must not create new vulnerabilities, nor increase the attack surface.

(iii) Means of implementation:

  1. The French Ministry of Higher Education and Research encourages multidisciplinary research within the EU involving SHS alongside experts in cybersecurity, AI and LMI, with the creation of LMI-focused research posts. European research programs could also be opened up to partner countries and regions such as India, Taiwan, Great Britain and West Africa. This approach would be accompanied by a transfer of these innovations to industry, via start-ups, for example, with highly agile development models. This would enable the emergence of a range of sovereign AI cybersecurity solutions to anticipate growing threats such as adversarial attacks.
  2. To encourage platforms to share their data, it is proposed to initiate a dialogue between researchers and platforms. This would make it possible to better analyze and document the modes of action of online disinformation and to measure the impact of user interfaces. Researchers would then be able to propose indicators of the impact of disinformation campaigns, of prevention measures, and of the possible censorship that can be a side-effect of content removal by platforms.
  3. It is also proposed to work closely with the major digital platforms to identify their potential subcontracting needs, so as to create new markets and foster the rise of new LMI start-ups. A complementary solution is to increase the responsiveness of funding programs for collaborative R&D projects by promoting cascade-funding approaches, with European funding channeled to National Coordination Centers (NCCs) within the States, working as a network. These centers would prioritize needs at European level, while selecting SMEs and allocating funding at national level. In addition, it is proposed to focus the themes of the European calls for projects on the development of:
  • automated detection tools for disinformation campaigns and deepfakes (see the illustrative sketch after this list);
  • information labeling technologies based on the C2PA model;
  • technologies for estimating the degree of specialization of information presented to a user; and
  • technologies for cyber protection and assessing the robustness of AI to known and emerging threats.
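
To make the first of these themes slightly more concrete, the sketch below is a minimal, assumption-laden illustration of one building block of automated campaign detection: flagging clusters of accounts that post near-identical messages within a short burst, a crude proxy for coordinated inauthentic amplification. It is a toy heuristic written for this article, not a description of any tool actually funded or deployed; real detection systems combine many more signals (network structure, account metadata, media forensics).

```python
# Toy heuristic for spotting one signal of coordinated amplification:
# many distinct accounts posting near-identical text within a short window.
# Purely illustrative; real tools use far richer signals and data access.
from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text: str) -> str:
    """Collapse case and whitespace so trivially edited copies match."""
    return " ".join(text.lower().split())

def flag_bursts(posts: list[dict], window: timedelta = timedelta(minutes=10),
                min_accounts: int = 5) -> list[dict]:
    """Group posts by normalized text and flag groups where at least
    `min_accounts` distinct accounts posted within `window`."""
    groups = defaultdict(list)
    for post in posts:
        groups[normalize(post["text"])].append(post)

    flagged = []
    for text, items in groups.items():
        items.sort(key=lambda p: p["time"])
        for anchor in items:
            in_window = [p for p in items
                         if anchor["time"] <= p["time"] <= anchor["time"] + window]
            accounts = {p["account"] for p in in_window}
            if len(accounts) >= min_accounts:
                flagged.append({"text": text, "accounts": sorted(accounts)})
                break
    return flagged

if __name__ == "__main__":
    t0 = datetime(2025, 1, 1, 12, 0)
    sample = [{"account": f"user{i}", "text": "Vote is RIGGED!!",
               "time": t0 + timedelta(minutes=i)} for i in range(6)]
    print(flag_bursts(sample))  # one flagged cluster of six accounts
```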

(iv) Stakeholders: European players in academic and applied research in the fields of SHS, cyber and AI, digital platforms, start-ups and major European manufacturers.

(v) Implementation timetable:

  • in 2024: definition of the French position concerning the integration of the FIMI topic into European DIGITAL EU research programs (with consistency with research programs more specifically oriented towards homeland security and defense);
  • in 2025: creation of LMI-focused research posts; French proposal to revise the governance, operating and evaluation procedures for European R&D programs, to make them more agile and better adapted to dealing with new threats;
  • in 2026: adoption of French proposals to revise governance, operating and evaluation procedures for R&D programs;
  • in 2028: introduction of the first tools developed by the EU to combat informational interference; and
  • from 2029 onwards: research and innovation in support of the LMI achieve consensus within the EU.

(vi) Estimated costs: it is proposed to fund collaborative projects from the budgets of research programs already planned, by including the LMI field in the agenda of these programs and encouraging funding of at least seventy percent from the budgets of private-sector players.

(vii) Limits and risks of implementation: financing European collaborative projects is not envisaged at French level, except in the case of rare bilateral calls for projects (e.g., Franco-German ANR and BPI calls).

3. Recommendation F “Take Proactive External Action to Strengthen International Cooperation.”

(i) Specific objective: specify the involvement of French diplomacy in support of our strategy to combat informational interference. The strengthening of the French system is only envisaged through coordination at EU level and increased bilateral cooperation, requiring the reinforcement of the European legislative arsenal.

(ii) Description of the existing situation:

  • In Estonia, the STRATCOM team attached to the Prime Minister and the think tank International Centre for Defence and Security (ICDS) appear to be effective organizations in the field of LMI.
  • NATO follows a two-pronged approach, focusing on understanding the information environment and proactively communicating with the public and partners through its Public Diplomacy Division (STRATCOM). NATO and the EU are intensifying their collaboration and holding joint exercises. NATO also cooperates closely with Ukraine, through a dedicated platform established in 2016. It collaborates with several Centres of Excellence, including the European Centre of Excellence for Countering Hybrid Threats in Helsinki, the NATO Strategic Communications Centre of Excellence in Riga, and the NATO Cooperative Cyber Defence Centre of Excellence in Tallinn.
  • Since 2019, the G7 has had a “Rapid Response Mechanism,” an initiative led by Canada aimed at coordinating member countries to better identify and respond to the diverse and evolving threats to democracies, particularly in terms of disinformation.
  • It should also be noted that the United States, Great Britain and Canada are communicating on interesting joint initiatives, such as the development of a common framework for analyzing information manipulation.

(iii) Means of implementation: for French diplomacy, this involves:

  1. Propose at European level a FARA-type regulation obliging agents representing the interests of foreign powers to declare their activities, as distinct from a freedom-restricting “Russian-style law” mechanism;
  2. Develop bilateral partnerships with countries with effective LMI systems (Sweden, Japan);
  3. Pursue, with the OECD, the work carried out with the creation of the DIS/MIS Resources Hub (DISinformation/MISinformation) and, more generally, define actions to strengthen resilience in third countries (notably in the EU’s neighborhood) to disinformation threats emanating from outside;

(iv) Stakeholders involved:

  • national authorities (governments, led by MEAE, parliamentarians, etc.);
  • the EU’s international partners: NATO, the G7, ENISA, EDMO, Europol, and the European Cooperation Network on Elections (ECNE); and
  • independent media, journalists and researchers for fact-checking work, and civil society as a whole, including economic players, in third countries as well.

(v) Implementation timetable: if bilateral cooperation work, as well as within international organizations, is to be ongoing, implementation can be sequenced as follows:

  • 2024–2026: promotion of a European FARA based on the example of national implementation (see recommendation A), to be adopted by 2028 at the latest;
  • 2026–2028: promote the structuring of international cooperative ventures, with a view to close, mandatory European coordination of actions to combat FIMI; support the creation by the EU of a European agency dedicated to the fight against FIMI (detection, awareness-raising) ahead of the 2029 European elections; and
  • 2028–2035: promotion of international regulatory cooperation to avoid fragmentation and prevent regulatory arbitrage.

(vi) Estimating costs: it is difficult to estimate the costs of such recommendations. It will be a matter of directing European funds towards cooperation and setting up the early warning service. For example, the cost of an annual European information alert and response exercise is estimated at €500/year.

(vii) Limits and risks of implementation: The main difficulty lies in the need to build a European consensus on these subjects of interference and manipulation of information, which could be complicated by different cultural approaches, but also by the reluctance of certain States to favor transparency on certain links of interest with commercial partners in third countries. This also requires an end to the assumption that there can be no interference within the EU between member states or on the part of allied states.

Ideally, a stronger European mechanism should be put in place directly, based on a tried-and-tested national mechanism and solid bilateral cooperation, to lend credibility to France’s ability to drive this approach at European and even international level.

D. Implementation

The fight against informational interference calls for a coordinated response between France and the EU, with an emphasis on the digital dimension. Involving several ministries, this multidimensional approach aims to counter immediate (Russia) and systemic (China) threats, while involving citizens. Planned expenditure includes institution-building, funding for journalism, awareness-raising campaigns and operational tools. However, this will require certain renunciations, such as forgoing the active promotion of the French model outside the EU and forgoing heavy investments, in a context of prudent public debt management intended to maintain France’s credibility at European level.

An initial estimate of the additional costs arising from these recommendations is as follows (excluding inflation):

  • estimated total additional annual cost for France from 2028: €25M; and
  • additional cost borne by the EU (excluding structuring initiatives such as the EU social network): €20M.

The analyses and opinions contained in this article are the sole responsibility of their author.
