Intersubjective reality shaken by digital technology
Since the dawn of recorded history, human society has been built on belief systems and narratives shared among people. Yuval Noah Harari, the historian known for his book Sapiens: A Brief History of Humankind, calls this shared layer of belief "intersubjective reality" and points out that it encompasses religions, laws, currencies, and even political systems such as democracy and autocracy.
But what if these very shared beliefs and narratives are being gradually manipulated without our realizing it? This is neither science fiction nor conspiracy theory. In modern society, information technologies such as the internet, smartphones, and AI function as crucial mechanisms shaping reality, penetrating ever deeper into the foundations of our cognition and judgment. Information technology is no longer simply a means of communication or data processing; it now influences our very judgments about what to trust and which statements count as socially acceptable.
In an era when the design of algorithms and information infrastructure shapes how people perceive the world, reality itself can be technologically reconstructed. The arenas of conflict and competition are no longer limited to territory and military power: a struggle for influence over cognition and discourse has emerged as a new domain of international competition.
Until now, within the international order based on sovereign states, often referred to as Westphalian sovereignty, strategic interference and intelligence activities were understood as the work of states or their proxies. Today, however, the actors interfering in perceptions of reality are no longer limited to states. Global platform companies and even ordinary individual users shape societal perceptions by producing and spreading information. This asymmetric, pluralistic structure makes our intersubjective reality manipulable, with profound implications for democratic processes and the international order.
Given these developments, we need to redefine information security beyond the narrow framework of information defense. Information security in cyberspace is not only about protecting existing assets from a safety-assurance perspective but also about maintaining the soundness of the information infrastructure itself. In recent years, automatic content generation by generative AI, personalization algorithms built on tracking technologies such as targeted advertising, and automated posting programs (bots) on social media have increasingly been exploited for malicious purposes.
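To make the bot problem concrete, here is a minimal sketch of the kind of behavioral heuristics a platform might use to flag automated accounts. The fields, thresholds, and weights are hypothetical illustrations, not any platform's actual detection logic, which combines far richer signals.

```python
from dataclasses import dataclass

@dataclass
class Account:
    """Toy model of a social media account's observable behavior."""
    posts_per_day: float        # average posting frequency
    median_gap_seconds: float   # median time between consecutive posts
    repost_ratio: float         # share of activity that is reposts
    account_age_days: int       # days since the account was created

def bot_likelihood(acct: Account) -> float:
    """Crude 0-to-1 score from hand-picked heuristics (illustrative only)."""
    score = 0.0
    if acct.posts_per_day > 100:       # humans rarely sustain this cadence
        score += 0.4
    if acct.median_gap_seconds < 5:    # machine-like posting regularity
        score += 0.3
    if acct.repost_ratio > 0.9:        # pure amplification, no original content
        score += 0.2
    if acct.account_age_days < 30:     # freshly created throwaway account
        score += 0.1
    return min(score, 1.0)

suspect = Account(posts_per_day=400, median_gap_seconds=2.0,
                  repost_ratio=0.95, account_age_days=7)
print(f"bot likelihood: {bot_likelihood(suspect):.2f}")  # -> 1.00
```

Real detection systems also weigh network structure, content similarity, and coordination across accounts; the point here is simply that automated amplification leaves measurable behavioral traces.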
These applications of information technology also constitute the technical foundation for digital influence operations, which can interfere with elections and fragment societies. Digital influence operations are activities in which states or organizations use digital technology to try to change the decision-making or behavior of targeted audiences. In other words, an information war is being waged in cyberspace over our very perceptions of reality.
In the next section, we will take a closer look at Russia's information manipulation techniques as a concrete example of such digital influence operations.
What is the aim of the narratives that fuel fear about the influx of immigrants?
A textbook example of a foreign influence operation is Russian interference in the 2016 U.S. presidential election. In a joint assessment published in 2017, multiple U.S. government agencies (ODNI, CIA, FBI, NSA) concluded that Russian President Vladimir Putin had ordered an influence campaign targeting the election. It was a symbolic incident in which state-sponsored digital influence operations led to real political outcomes.
In that operation, Russia first carried out phishing attacks against the Democratic National Committee (DNC) and the Clinton campaign to steal emails. It then released the stolen emails online and ran a disinformation campaign on social media. This is believed to have fueled distrust in Democratic candidate Hillary Clinton and boosted support for Republican candidate Donald Trump, who had initially been considered the underdog. As a result, the operation came to be widely regarded as a success.
This series of activities is known as Project Lakhta. While only the Russian government knows the exact details of its objectives, Trump’s victory and the deepening polarization within American society aligned with the Kremlin’s geopolitical interests. Specifically, Russia appears to have sought to enhance its influence by creating divisions within the United States (between conservatives and liberals) and by undermining trust in the electoral system.
This strategy was also evident in the full-scale invasion of Ukraine that began in 2022. The Russian government used social media and other platforms to disseminate large volumes of false and misleading information, aiming to undermine the morale of the Ukrainian people and divide the international community. The impact reached Japan as well: narratives claiming that supporting Ukraine would increase Japan's security risks circulated online and influenced public opinion.
In addition, the Russian government has reportedly used generative AI to mass-produce and disseminate pro-Russian messages. A symbolic example is a deepfake video of Ukrainian President Volodymyr Zelenskyy appearing to call on his soldiers to lay down their arms, which spread online. Information manipulation backed by realistic synthetic video and audio makes distinguishing truth from falsehood increasingly difficult.
Furthermore, a digital influence operation known as Operation Doppelganger deployed localized propaganda campaigns, primarily in NATO member states. In countries such as Germany, France, and the United Kingdom, at least 17 fake sites mimicking major media outlets were identified, publishing pro-Russian stories and articles designed to undermine support for Ukraine. Content from these sites was then amplified automatically on social media by bots.
Since the Soviet era, Russia has conducted information operations, known as Active Measures, against European countries and the United States. Today, many NATO countries supporting Ukraine face domestic instability stemming from issues such as immigration and economic disparity. According to experts, the Russian government exploits these conditions, spreading anxiety-inducing narratives (for example, claims that immigration or support for Ukraine puts the nation at risk) in order to deepen social divisions and lower the priority given to supporting Ukraine.
That is, the government’s aim is not merely to spread disinformation. Its true objective is widely believed to be undermining the political system from within by destabilizing shared cognition, or intersubjective reality, thereby deepening societal division and advancing Russia’s strategic interests.
Critical thinking to counter influence operations
How should we confront these influence operations?
Entities conducting digital influence operations analyze our cognitive tendencies and tailor their strategies accordingly, which makes prevention difficult at the individual level. However, simply understanding that these operations exist and how they work offers some measure of protection. It is, so to speak, a form of psychological inoculation.
First, we should recognize the fact that the online space itself, including the social media platforms we use daily, has become a battleground for information warfare. On platforms like X (formerly Twitter), information spreads instantly through hashtags and retweets. In such a highly responsive environment, a single post or image can sway public opinion within a brief time.
Moreover, information on social media spreads according to proprietary ranking algorithms, which inherently contain structural biases. Because many platforms are designed to maximize user attention, content that provokes anger or expresses extreme opinions spreads more easily, as the sketch below illustrates. We must not underestimate the risk that certain narratives or propaganda can propagate rapidly via influencers and bots.
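The bias is easiest to see in a toy ranking function. The following sketch is purely illustrative: the signal names and weights are assumptions, not any platform's actual formula. It shows how a feed that scores posts by predicted engagement can surface outrage over calm analysis.

```python
def engagement_score(post: dict) -> float:
    """Toy feed-ranking score. Signals that predict further interaction get
    higher weights; anger reactions and shares are weighted heavily here,
    so outrage-inducing content tends to rise to the top of the feed."""
    weights = {"likes": 1.0, "comments": 3.0, "shares": 4.0, "angry_reactions": 5.0}
    return sum(w * post.get(signal, 0) for signal, w in weights.items())

posts = [
    {"id": "calm_analysis", "likes": 120, "comments": 15, "shares": 10, "angry_reactions": 2},
    {"id": "outrage_bait",  "likes": 40,  "comments": 60, "shares": 80, "angry_reactions": 90},
]

# Rank the feed by predicted engagement: the outrage post wins despite fewer likes.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])  # ['outrage_bait', 'calm_analysis']
```

No individual decision in such a pipeline is malicious; the structural bias emerges from optimizing for attention itself.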
In addition, the advanced targeting technology behind social media advertising has dramatically increased the precision and effectiveness of information delivery. We must not forget that what we see can be filtered according to our browsing history and inferred interests, steering us, without our noticing, toward particular ideologies or emotions. This technology of cognitive optimization has developed hand in hand with the growth of the online advertising industry.
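To illustrate the basic mechanism, here is a minimal sketch of interest-based audience selection. The user profiles, interest labels, and overlap rule are hypothetical; real ad platforms infer interests from far richer behavioral data, but the selection principle is similar: only users whose inferred profile matches the buyer's targets ever see the message.

```python
# Hypothetical interest profiles a platform might infer from browsing behavior.
user_profiles = {
    "user_a": {"immigration", "crime", "national_security"},
    "user_b": {"gardening", "cooking"},
    "user_c": {"immigration", "economy"},
}

# Interests an advertiser (or an influence campaign) pays to target.
campaign_targets = {"immigration", "national_security"}

def in_audience(interests: set, targets: set, min_overlap: int = 1) -> bool:
    """A user joins the audience if enough inferred interests match the targets."""
    return len(interests & targets) >= min_overlap

audience = [user for user, interests in user_profiles.items()
            if in_audience(interests, campaign_targets)]
print(audience)  # ['user_a', 'user_c']: only receptive users ever see the message
```

Because each audience segment receives a different message invisibly, the same campaign can tell contradictory stories to different groups without either group noticing.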
In this environment, every one of us needs critical thinking. In fact, much of what we take for granted rests on perceptions shared by society. For example, it may be natural for Japanese people to think that liberal democracy is the global standard and the most rational system of governance. Yet, looking around the world, we find authoritarian countries like Russia and nations with political systems rooted in religious principles.
We tend to take liberalism and democracy for granted, as if water will always flow when we turn on the tap. But these, too, rest upon intersubjective reality. When that foundation shakes, social cohesion crumbles easily, and narratives from both within and without begin to divide our minds.
The development of digital platforms has enabled anyone to voice opinions and exert influence. While this progress can deepen democracy, it also carries the risk of weakening the very foundations of society. How to balance freedom and security in the flow of information is the challenge now confronting us all.
From a cybersecurity perspective, society must continually update itself. If we remain defenseless against the mass circulation of narratives, we cannot maintain the health of liberal democracy. Just as we apply patches to fix vulnerable program code, we, as information users, must update ourselves and maintain a critical eye toward the vast flow of information and narratives.
* The information contained herein is current as of August 2025.
* The contents of articles on Meiji.net are based on the personal ideas and opinions of the author and do not indicate the official opinion of Meiji University.
* Information noted in the articles and videos, such as positions and affiliations, is current at the time of production.

