Cognitive warfare
Military activities designed to affect behaviors
Cognitive warfare consists of military activities designed to affect attitudes and behaviors by influencing, protecting, or disrupting individual, group, or population-level cognition.[1][2] It is an extension of information warfare that uses propaganda and disinformation.[1]
NATO General Paolo Ruggiero distinguishes it from other information-related activities by its objectives: "Its goal is not what individuals think, but rather, the way they think."[3][4] Exponents of cognitive warfare aim to influence human thought, reasoning, sense-making, decision-making, and behavior through the manipulation of information and the use of machine learning systems that distribute information on the internet.[1]
Other methods of cognitive warfare include the targeted use of inaudible sound waves (frequencies below 20 Hz) and microwaves to incapacitate enemy forces by disrupting the neurological functions of human targets without causing visible injury.[5][6][7] According to the U.S. National Institutes of Health, infrasound's effects on the human inner ear include "vertigo, imbalance, intolerable sensations, incapacitation, disorientation, nausea, vomiting, and bowel spasm; and resonances in inner organs, such as the heart."[5][6]
Concept
Definition
The academic community has not yet reached a consensus on the definition of "cognitive warfare."[8] Oliver Backes and Andrew Swab of Harvard University's Belfer Center (2019) define cognitive warfare as "a strategy aimed at changing the way a target population thinks, and thereby changing its behavior," while Alonso Bernal and others at Johns Hopkins University (2020) define it as "the weaponization of public opinion by external entities, with the goal of influencing the public and/or government policy, or undermining government actions and/or the stability of government institutions."[9]

Professor Lin Zhengrong of the National Defense University stated that the term "cognitive warfare" first appeared in a report by the North Atlantic Treaty Organization (NATO). That report defined cognitive warfare as a new domain of competition beyond the traditional domains of land, sea, and air, describing it as "an unconventional mode of warfare that exploits individual psychological biases and reflexive thinking, using technological networks to manipulate human cognition, induce changes in thought, and thereby cause negative impacts."[10] Major General Robert H. Scales, former commandant of the U.S. Army War College, summarized NATO's operational philosophy: "Victory will be defined more by the mastery of the human and cultural domain than by the occupation of geographic high ground."[11] Professor Liang Xiaobo of the National University of Defense Technology regards cognitive warfare as an important form of public opinion propaganda, psychological persuasion, and ideological struggle, grounded in modern theory and science and aimed at gaining the initiative over people's thoughts, beliefs, and values.[12]
According to Lin Bozhou, a scholar at the Institute for National Defense and Security Research, scholars from PLA academies proposed the concept of "unrestricted warfare" in 1999, advocating the use of all available means to enable the weak to defeat the strong and force the enemy to concede to one's interests. In 2003, the Chinese Communist Party incorporated the "Three Warfares" theory into the Political Work Regulations of the People's Liberation Army, and in 2014, military research institutions introduced the concept of "cognitive dominance," emphasizing ideological manipulation, influence operations, and strategic information warfare. This suggests that the CCP's use of cognitive and psychological tactics against the people of Taiwan rests on an established theoretical framework.[13]
This type of warfare is also referred to as "influence operations," "cognitive domain operations,"[14] or "cognitive domain warfare."[15][12]
Relationship with other types of warfare
Scholars believe that "cognitive warfare" is a subordinate concept within the frameworks of grey-zone warfare or hybrid warfare.[16]
- Some viewpoints argue that cognitive warfare is a component of information warfare,[17] and information warfare itself is a part of hybrid warfare.[14][18]
- Some scholars argue that information warfare is a subordinate concept within cognitive warfare.[8]
Cognitive warfare may encompass multiple domains, including traditional propaganda warfare, psychological warfare, ideological warfare, and legal warfare.[12][10][19] "Information warfare" can target decision-makers through online social media and physical interpersonal networks (Ventre 2016; Libicki 2020; Prier 2020; Di Pietro, Caprolu, Cresci 2021), and can distort and manipulate voters' cognition and emotions (de Buitrago 2019; Serrano-Puche 2021). "Cognitive warfare," by contrast, not only involves media manipulation but also extends into neuroscience, aiming to influence brain functions beyond conventional mass media channels. Among the related concepts, cognitive warfare is currently the only one that employs neuroscience as a weapon in practice, specifically targeting and influencing the cognitive functions of the brain.[8]
In cognitive warfare, information serves as a weapon.[16] The information used can be real, or partly true and partly false; it need not be entirely "fake news."[20][21] Documents leaked from within a government, or inappropriate remarks and actions by political figures, can be enough to trigger social division.[22] On the distinction between cognitive warfare and information warfare, Rajesh Tembarai Krishnamachari (2004) holds that "cognitive warfare" is a mode of operation aimed at influencing the adversary's consciousness and behavior, using means such as media, social media, culture, and politics to manipulate both the public and the adversary's awareness, whereas "information warfare," as a component of cognitive warfare, focuses on the use of information and technologies such as media, social media, the internet, and electronic and digital technologies to affect the adversary's consciousness and behavior.[23] Liang Xiaobo and other scholars have also pointed out that cognitive warfare relies largely on language as its primary medium of influence.[12][24]
Objectives and downstream effects
"Destabilization" and "influence" are the basic objectives of cognitive warfare, which then serves to spread dissatisfaction or encourage particular beliefs and actions in society, so that the enemy destroys itself from within, making it impossible to resist, deter or divert the attacker's objectives.[25][26] As well as attempting to change the way people think, cognitive warfare also seeks to change the way the audience feels and behaves; if successful, it will likely shape and influence the beliefs and behaviours of an individual or group in favour of the attacker's tactical or strategic objectives.[27] In the most extreme cases, it may allow an entire society to fall apart and no longer have the collective will to resist the attacker, which in turn allows the attacker to subdue a society without resorting to the threat of overt force.[28]
Destabilization
The first basic goal of cognitive warfare is to destabilize the target population (the object of the attack) by destroying the existing unity and trust in society, for example through "negative emotional mobilization,"[29][30] causing it to become preoccupied with internal problems and to lose productivity.
Methods of destabilization include accelerating existing divisions within a group, or introducing new ideas and concepts to pit different groups against each other[31][32] and intensify polarization.
Influence
The second basic goal of cognitive operations is to influence the target population. The attacker manipulates the target group's cognition and understanding of its surroundings, prompting the group to act in ways that serve the attacker's purposes[32] and ultimately to resonate with the attacker's message. Targets of influence include members of the general public, the military, and leaders or figures in the military, political, or business fields.
According to neuroscientists, because the brain is responsible for the emotions that shape human judgment, people are prone to distortions in perception and decision-making when they feel stress and fear.[33] Regarding dissemination effects, a joint study by Taiwan's Academia Sinica, published in the Oxford journal Global Security Studies, found that the effects of cognitive warfare are complex: false information increases the brain's cognitive processing costs even when the audience does not take it up. Under repeated exposure to false messages, audiences' psychological cost of accepting them falls,[8] and those who lack sufficient knowledge of public affairs may be more susceptible because they rely on external cues.[31] However, given other conditions in real-world environments, studies have shown that the actual political effects of audiences receiving fake news are quite limited (Hjorth and Adler-Nissen 2019; Jones-Jang, Kim and Kenski 2021), and audiences may not receive messages directly at all.[8] Some scholars believe that the messages conveyed in cognitive warfare may not achieve the desired effect, that the accuracy of information does not guarantee the outcome, and that even when a psychological effect is achieved, it remains quite subtle.[34]
Downstream effects
According to Masakowski, the objectives of cognitive warfare are to shape or control an enemy's cognitive thinking and decision-making; to manipulate and degrade a nation's values, emotions, national spirit, cultural traditions, historical beliefs, and political will; to achieve adversarial strategic geopolitical objectives without fighting; to influence human and societal reasoning, thinking, and emotions in line with specific objectives; and to degrade a population's trust in its institutions.[1][35] In doing so, Masakowski claims, cognitive warfare weakens and disrupts military, political, and societal cohesion, and undermines and threatens democracy. Masakowski alleges that cognitive warfare has also been used by authoritarian societies to restructure society and groom populations to accept "continuous surveillance," allowing these societies to "remove individuals/outliers who resist and insist on freedom of speech, independent thinking, etc."[1]
Modus operandi
Common types
Common cognitive warfare styles include:
- Media: using media outlets or relevant key opinion leaders (KOLs), through news stories, advertisements, movies and TV shows, and publications, to promote one's views and opinions.[10][36]
- Social media manipulation: using platforms such as TikTok, Xiaohongshu, Twitter, Facebook, and Instagram, together with related key opinion leaders, to manipulate and influence public opinion and behavior.[37] The ability to secure the internet, social media, and software has been recognized as key to national security in the cognitive domain, and these platforms have become the main battleground of international cognitive warfare.[10][12] Tactics include controlling and exploiting whitelisted accounts on social media[38] and using fabricated personal and media accounts to promote favorable information.[39]
- Intelligence manipulation: achieving polarization, incitement, and intimidation by altering, falsifying, and disseminating false or inaccurate information and intelligence, in order to influence the enemy's decisions, actions, or public opinion in the public sphere.[10] When such information is mixed with genuine news, it becomes harder for outsiders to recognize the truth, easier to convince the public, and harder to clarify the facts.[40]
- Cultural influence: influencing the enemy's ideology and values through language and culture, such as music, film, and art. It has been argued that, because countries and nations differ in cultural traditions, religious beliefs, customs, psychological perceptions, and patterns of thinking, and because pluralistic ethnic groups within a country may diverge or even be hostile to one another, it is important to construct and master the cultural cognitive models of different countries or target groups. In particular, strengthening research on the basic cultural characteristics and cognitive behaviors of enemy military personnel, and on the basic perceptions and attitudes of different communities in the target group (including the general population and non-governmental organizations) toward important and sensitive issues, is considered a key measure of cognitive warfare.[12]
- Political propaganda: influencing the awareness and behavior of target audiences through political advocacy, such as speeches, declarations, demonstrations, public relations, public diplomacy, policy reports, and campaign advertisements.[8][10]
Science and technology
With the development of artificial intelligence (AI), cognitive warfare can collect and analyze a wide range of data on different target groups and specific individuals through big data analytics and computing power, smartphones, social media platforms, and similar means, attempting to model and predict the target's thinking, mental and emotional state, social behavior, and public opinion.[12] Burke et al. (2020) likewise argued that AI and big data analytics play a key role in the People's Republic of China's strategic guidelines for winning cognitive warfare.[8] In addition, "Cambridge Analytica"-style techniques can be used to segment groups by attitude, preference, and position through the collection of personal information and political inclinations, and to deliver different information to each segment, linking target groups with the same preferences together to form an "echo chamber."
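As a rough sketch of the segmentation step described above, the following Python example clusters users by engagement features and assigns each cluster its own message variant. It is illustrative only: the feature names, the random data, and the message variants are invented, and it assumes scikit-learn is available.

```python
# Minimal sketch of psychographic segmentation; all features and data are hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical per-user engagement features:
# [political shares per week, likes of partisan pages, hours on platform per day]
users = rng.random((500, 3)) * [20, 50, 8]

# Standardize the features, then group users into attitude segments.
X = StandardScaler().fit_transform(users)
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Each segment would then receive tailored content that reinforces its
# existing preferences (the "echo chamber" effect described above).
variants = ["variant_A", "variant_B", "variant_C", "variant_D"]
for user_id in range(5):
    seg = segments[user_id]
    print(f"user {user_id} -> segment {seg} -> message {variants[seg]}")
```

The same pipeline sketch also works in reverse for defenders: clustering account behavior is a common first step in detecting coordinated inauthentic networks.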
Transmission route
Medium
Cognitive warfare is carried out by an attacker delivering content to the target through a conduit.[20] The media of communication are diverse, ranging from written materials distributed in wars of the past to modern channels such as TikTok, Twitter, and YouTube. Scholar Puma Shen argues that the medium of communication is not the focus of the study of cognitive operations, and that classification by communication process is more meaningful.[20] In 2022, scholars in Taiwan suggested that the dissemination nodes of cognitive warfare may have expanded from specific people, online opinion leaders, organizations, or other channels to states, and studies have demonstrated that dissemination nodes such as "local collaborators" and internet celebrities are key pathways.[41]
Course of events
Categorized by communication process and attack motivation, cognitive warfare can be divided into flows of people, money, and information, each calling for a different response. Flows of people are motivated by the drive to spread ideology: attackers organize disseminators (e.g., social media users) to pass specific information, which may but need not be generated by the attacker, to an audience, or to engage in online propaganda and takedowns. Flows of money are motivated by indirect investment: the attacker merely funds the disseminators, "outsourcing" the generation and spread of information, which usually takes the form of conspiracy theories or storytelling. Flows of information refer to the attacker generating and disseminating the information itself, usually by establishing nodes and producing fake news or easily shared videos, graphics, and explainer packs for online communities, which the nodes then spread to attract a large audience; here the motivation is direct manipulation of information.[20][42]
For transnational communication involving cultural and linguistic differences, Liang argues that multimedia channels should combine the language and culture of the target audience with the cultural and ideological connotations of the home country, while state and private experts and scholars, opinion leaders, and the general public spread the chosen issues simultaneously from "multiple points, multiple positions, and multiple dimensions." These people should be familiar with foreign languages, international communication, and cross-cultural interaction; they can use various issues to offer opinions, build contacts, and accumulate fans or supporters in the public sphere, and at critical moments influence those supporters to achieve the desired communication effect.[12]
Response
Disclosure control and training
Because information on the internet spreads rapidly, unconstrained by time and space, it is often difficult to identify the source and authenticity of a message.[29] It is particularly difficult for countries or regions that pride themselves on "freedom of expression" to legislate direct restrictions on online opinion leaders or controls on online public opinion.[41] Possible measures include enacting legislation to regulate tech giants[8] or requiring online platforms to disclose the flow of sponsorship money from advertising, live streaming, and other forms of online communication, so that audiences can understand the sources of revenue and their possible influence.[43] Since it is also difficult in practice to establish regulations that trace individual financial flows, the Australian government, for example, regularly reports publicly on the activities of "offshore forces."[31] In 2022, the French National Assembly established a commission of inquiry into foreign interference, empowered to investigate attempts by foreign governments, enterprises, groups, and individuals to influence or bribe domestic political parties, leadership circles, or opinion leaders through political or economic means.[43] In 2023, the European Union announced plans to establish an "Information Sharing and Analysis Center" to centralize and analyze cases of information manipulation by foreign actors.[44] Scholars such as Silverstein (2019) advocate that governments ethically regulate the use of neuroscience as a weapon.[8] In addition, the target of an attack must be aware that cognitive warfare is taking place, and must be able to observe and orient before deciding to act (the OODA loop).[22]
Some governments have prohibited public officials from using specific social media software on government computers and cell phones,[45] or are considering revising regulations to standardize the information equipment used in public agencies, while encouraging enterprises to assume their corporate social responsibility and improving the industry's information security management capabilities.[18] Government departments can also publicize the issue through the mass media and set up dedicated investigation units and information clarification mechanisms. At the same time, strengthening the general public's "media literacy" and their understanding of the boundaries of freedom of expression helps society respond more effectively to "hate speech" and to controversial information that mixes truth and falsehood.[46] Defense and national security agencies can counter cognitive warfare through intelligence detection and collection, education and training in information interpretation,[29] and appropriate response postures.[47] There is also a view that government authorities should take the initiative with countermeasures such as "exposure" and "cracking down on nodes,"[48][49] and should raise the public's "digital citizenship awareness."[41]
For transnational communication, the government can coordinate with specialized agencies to train professional, foreign-language personnel who promote flexible values through publicity channels in order to seek external support and recognition.[10]
Media literacy and civic engagement
In the public sphere, civil society needs to understand the boundaries of freedom of expression in order to recognize the fine line between free speech and national security.[46] To improve the public's ability to identify and reason about information, education must continue to build media literacy and critical thinking across all sectors of society.[29][41][8] Audiences should receive news from multiple sources, compare related information across outlets, and dispel myths;[26] trusted and credible "third-party organizations" should be established,[17] along with citizen-run fact-checking mechanisms,[50] and cooperation with non-governmental organizations (NGOs) should be strengthened. At the same time, a variety of niche media expressing different positions and plural viewpoints should be established so that different values can be voiced, and internal consensus should be shaped to unite state and society.[26]
For "local collaborators" who assist in spreading false information, public denunciation and public awareness are ways to counteract those who assist in spreading the information.[51]
Actual cases
USA
According to Reuters, former US President Trump authorized the CIA to conduct a covert operation on Chinese social media. The project began in 2019 and was aimed at turning public opinion against the Chinese government. Three former officials told Reuters that the CIA set up a small team of operatives who used false identities to spread negative comments about the Chinese government and leak disparaging information to overseas media. Two former officials said these operations were also intended to unsettle China's senior leaders and force the government to expend substantial effort searching for cyber intrusions. Chinese Foreign Ministry spokesman Wang Wenbin responded that the United States has for years spread disinformation about China in an organized and planned way, which has become an important means of US cognitive warfare against China.[52][53][54]
In June 2024, Reuters reported that during the COVID-19 pandemic the U.S. Department of Defense had worked to discredit China's Sinovac vaccine in the Philippines by spreading false information. According to the report, the U.S. military used fake online accounts posing as Filipinos to fabricate claims on the social platform X (formerly Twitter), stoking a wave of vaccine boycotts. The report found that the campaign began in the spring of 2020, continued for some time after Biden took office, and was ordered to stop in the spring of 2021. A senior official involved in the operation revealed that "we don't look at it from a public health perspective... but rather study how to discredit China."[55]
The Intercept, a US investigative news organization,[37] reported that the US Department of Defense used Twitter to assist its propaganda activities. Internal Twitter documents show that, at the request of US government departments, a large number of Twitter accounts were whitelisted. The US Department of Defense used these whitelisted accounts to carry out online propaganda abroad, attempting to shape public opinion about Yemen, Syria, Iraq, and Kuwait.
These whitelisted accounts were initially openly affiliated with US government departments, but the US Department of Defense later changed its strategy to hide its relationship with these accounts. Although Twitter executives noticed this, they did not shut down these accounts.
On July 26, 2017, Nathaniel Kahler, then working for U.S. Central Command, emailed Twitter's public policy team asking it to verify one account and whitelist several Arabic-language accounts in order to amplify specific messages. Among these whitelisted accounts, @yemencurrent was used to promote announcements of U.S. drone strikes in Yemen, emphasizing that the strikes "accurately" killed terrorists, and to promote U.S.- and Saudi-backed attacks on Houthi fighters in Yemen.
The Stanford Internet Observatory and Graphika jointly released a report[56][57] titled "Unheard Voice: Evaluating Five Years of Pro-Western Covert Influence Operations," which found an interconnected network of accounts on Twitter, Facebook, Instagram, and five other social platforms that used deceptive tactics to promote pro-Western narratives in the Middle East and Central Asia. This propaganda promoted the interests of the United States and its allies while opposing countries including Russia, China, and Iran. The accounts mainly criticized Russia for the deaths of innocent civilians during its invasion of Ukraine and for atrocities committed by the Russian army; some activity also spread anti-extremist messaging. The accounts used in propaganda directed at Central Asia focused almost entirely on China, with fake personas and sham media accounts mainly emphasizing the Xinjiang re-education camps and the genocide of Muslim ethnic minorities.
According to the New York Post, in October 2020 Joe Biden's presidential campaign urged Michael Morell, a former acting CIA director under the Obama administration, to help organize a joint letter from 50 former intelligence officials suggesting that reporting on the laptop of Biden's son Hunter Biden was Russian disinformation. In private testimony to the House Judiciary Committee, Morell said the approach came around October 17, 2020, shortly after the New York Post published its report. The open letter was then given to the American political news outlet Politico, which on October 19 published an article titled "Hunter Biden story is Russian disinformation, dozens of former officials say," reporting the signatories' view that the New York Post's story had all the typical characteristics of Russian information warfare. In the critical weeks before the election, Twitter blocked the New York Post's reporting on Hunter Biden's laptop, and Facebook restricted users' access to the full report.[58][59] At a hearing, Republicans accused Twitter of collaborating with government agencies and news media to suppress the New York Post report; Twitter executives admitted that suppressing the report had been "wrong," but denied that the US government was involved.[60] In an interview, Facebook founder Mark Zuckerberg said the restrictions on the New York Post report were based on a warning from the FBI: after the report was published, the FBI contacted the Facebook team and warned that Russia had interfered in the 2016 election by spreading false information. Facebook judged that the report fit this pattern and ultimately chose to limit its reach and the ranking weight of the New York Post.[61]
Operation Mockingbird, which originated in the early days of the Cold War, was an alleged large-scale CIA program to influence media and ultimately public opinion.[citation needed]
Russia
Use of techniques
According to a report by the U.S. State Department, Russia's information warfare system consists of five parts: government communication platforms for publishing information, state-funded global information distribution channels, the cultivation of proxy sources, weaponized social media, and the use of online fake news. Silverstein (2019) holds that Russia interfered in the U.S. presidential election and the UK's Brexit referendum through cognitive operations, and according to Connell and Vogler (2017), Russian cyber warfare uses cognitive manipulation as a key element. A 2023 report by the British intelligence agency Government Communications Headquarters stated that countries such as Russia and Iran often carry out cyber attacks of various kinds to spread fake news. Peter Zalmayev, a Ukrainian journalist and director of the non-profit Eurasian Democracy Initiative, said that even before annexing Crimea in 2014, Russia had begun spreading false information in Ukraine, including claims undermining Ukraine's sovereignty and denying its status as a sovereign state, and creating informational confusion to prevent facts from spreading; so far there has been no effective way to curb this.[citation needed] Puma Shen believes that Russia employed cognitive warfare in the Russo-Ukrainian War.
Mainland China
Participants
Main articles: Internet commentators and Office of the Central Cyberspace Affairs Commission
The Chinese Communist Party (CCP) employs large numbers of internet commentators to post information and comments favorable to the People's Republic of China.[62][63] These commentators and online operatives are often referred to by outsiders as the "50 Cent Party" (wumao dang), a term describing individuals allegedly paid by the CCP to manipulate online discourse.[64][65]
These commentators typically pose as ordinary internet users,[66] posting pro-CCP and pro-government remarks, attacking critics of the CCP, or using other information dissemination strategies to influence, guide, and fabricate public opinion online.[64]
On February 27, 2014, CCP General Secretary Xi Jinping, in a speech at the first meeting of the Central Cyberspace Affairs Commission, emphasized the need to "innovate and improve online propaganda, apply the rules of internet communication, promote the main theme, inspire positive energy (Zheng nengliang), vigorously foster and practice the core socialist values, and master the timing, degree, and effectiveness of online public opinion guidance to create a clean and upright cyberspace."[67]
Tactics used
According to The Economist, the People's Republic of China has conducted sustained psychological warfare against the Republic of China (Taiwan) for many years.[68][69][70][71] Other scholars have also discussed how the Chinese Communist Party employs cognitive warfare tactics to influence the people of targeted countries.[72] The main methods of such operations include military intimidation, exerting influence through bilateral exchanges, interference via religious influence, and the spread of disinformation and content farms online. These tactics generally follow a common pattern: direct or indirect threats, applying psychological pressure on those who oppose its policies, attracting target groups, and creating distractions.[73] As of 2023, such operations have also included establishing front companies, hiring freelance writers around the world, and actively recruiting protest participants.[74]
The term "unrestricted warfare," as used in PLA research, is roughly equivalent to hybrid warfare, and the "Three Warfares"—cognitive warfare, psychological warfare, and legal warfare (lawfare)—can all fall under the broader umbrella of information warfare.[75] However, scholar Puma Shen (2021) argues that these traditional classifications fail to encompass newer developments by the Chinese government, such as "digital authoritarianism" (e.g., Huawei's global exports) and the collection (or import) of overseas personal data.[75] Additionally, scholar Cheng Deying (2022) asserts that the Chinese Communist Party's cognitive warfare against Taiwan no longer differentiates between peacetime and wartime and has become pervasive.[76]
Regarding mainland China's cognitive warfare against Taiwan, U.S. official Matthew Pottinger has publicly stated that China has invested massive resources in cognitive warfare against Taiwan, and that these efforts may already be yielding results.[77] The 2022 edition of the China Security Report, published by the Japan Ministry of Defense think tank the National Institute for Defense Studies, noted that mainland China launched over 1.4 billion cyberattacks against Taiwan within a year and attempted to recruit personnel from multinational corporations and military-related sectors.[78][79]
In Canada, John McKay, then a federal Member of Parliament and chair of the House of Commons Standing Committee on National Defence, along with Michel Juneau-Katsuya, former head of the Asia-Pacific desk at the Canadian Security Intelligence Service, publicly stated that the People's Republic of China has been conducting extensive infiltration and influence operations within Canada, prompting increased national vigilance.[80][81][82] Scholars such as Kelton have likewise asserted that the PRC is carrying out cognitive warfare operations in Australia and New Zealand.[73]
According to a 2021 research report titled Chinese Influence Operations: A Machiavellian Moment,[83][84] by Paul Charon, a scholar at the Institute for Strategic Research at the Military School (IRSEM), the think tank of France's Ministry of the Armed Forces, and national security expert Jean-Baptiste Jeangène Vilmer, the People's Republic of China has invested heavily in disseminating disinformation toward two main objectives: undermining the democracy of the Republic of China (Taiwan) and conditioning the Taiwanese public to accept the Chinese Communist Party's eventual annexation. This disinformation is often concentrated in the military domain, aiming to convince the Taiwanese people that an invasion by the PRC is ultimately inevitable, while simultaneously justifying potential military actions initiated by the PRC.[85]
In their 2023 co-authored book U.S.-Taiwan Relations: Will China's Challenge Lead to a Crisis?, Ryan Hass, a senior fellow at the Brookings Institution, Bonnie Glaser, director of the Asia Program at the German Marshall Fund, and Richard C. Bush, former chairman of the American Institute in Taiwan, argue that in addition to the increasingly apparent military threat posed by the PRC, Taiwan faces a growing crisis in the form of Chinese Communist Party political warfare, which is gradually eroding the will of the Taiwanese people. They caution that this latter threat is often overlooked, yet if left unaddressed it may ultimately cause even greater harm to U.S. interests.[86][87]
Online news
The V-Dem research project, a transnational academic survey, found in 2019 that Taiwan was the country most exposed to foreign fake news in the world.[88] According to the 2022 Beijing's Global Media Influence report released by the US human rights organization Freedom House, Taiwan faces a large amount of fake news from mainland China, and the influence of the People's Republic of China on the media of the Republic of China continues to expand. A 2023 research report from the University of Gothenburg in Sweden likewise found that the Republic of China had been the country most affected by foreign fake news for ten consecutive years, with most of that fake news related to the People's Republic of China.[89] The Republic of China Army has analyzed the controversial news spread by the People's Liberation Army; its content mainly aims to create a psychological atmosphere of "military reunification," undermine the prestige of the Republic of China government, and disrupt the morale of the army and the public, with disrupting morale accounting for the majority.[21][90]
Paul Charon and Jean-Baptiste Jeangène Vilmer point out that the CCP's propaganda system usually begins with official media releasing false news, which fake Taiwanese social media accounts then collect and repost; Taiwanese news media subsequently quote and spread the false news without verification, ultimately making it difficult for readers to identify its source.[85] The Republic of China's national security units have also identified a "four-level rumor-spreading structure" used by online water armies to influence public opinion: first, "disposable" fake social media accounts post articles; second, screenshots of the content are shared through Facebook fan pages managed by people outside the Republic of China; third, large numbers of dummy accounts expand the spread; and finally, dummy accounts share the content to well-known public Facebook groups to increase exposure and induce Taiwanese people to keep spreading it on the social media they commonly use. In March 2020, Facebook disclosed that it had shut down more than 60 accounts run by the People's Republic of China's cyber forces that were posing as Taiwanese users and spreading false information about the coronavirus to Taiwan. In May 2023, Facebook's parent company Meta removed more than 100 fake accounts posing as American and European organizations from Facebook and Instagram; the network operated in ways similar to the methods Russia used during the 2016 US presidential election. In addition, in 2022 the Ministry of Justice Investigation Bureau cracked a case in which controversial and false information was released through social media platforms in an attempt to influence the perceptions of the Taiwanese people.[91]
Comparison of cognitive warfare with information warfare
According to Masakowski, cognitive warfare is an extension of information warfare (IW).[1][better source needed] Operations in the information environment are traditionally conducted through five core capabilities: electronic warfare (EW), psychological operations (PSYOP), military deception (MILDEC), operations security (OPSEC), and computer network operations (CNO).[3][92] Information warfare aims at controlling the flow of information in support of traditional military objectives, mainly to produce lethal effects on the battlefield.[3][better source needed] According to Masakowski and NATO General Ruggiero, cognitive warfare degrades the capacity to know and produce foreknowledge, transforming how situations are understood and interpreted by individuals and in the mass consciousness, and it has multiple agnostic applications, including commercial, political, and covert IW and CW military operations.[1][3] The Chinese military refers to operations in the cognitive domain as "cognitive domain operations" (CDO).[93]
Cognitive warfare and data
Using a psychological and psychographic profile, an influence campaign can be created and adjusted in real time by AI/ML models until the desired cognitive and behavioral effects on the individual or population are achieved.[94][better source needed] The U.S. Army and Marine Corps counterinsurgency strategy calls for the use of automated biometric systems to separate insurgents and foreign fighters from the general population.[95] This helps counterinsurgents leverage the population and operational environment against the threat network.[95]
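As an abstract illustration of the adjust-until-effect loop described above, the sketch below uses an epsilon-greedy multi-armed bandit to choose among message variants based on observed engagement. It is a toy model under invented assumptions: the variant names, the simulated engagement rates, and the reward signal are all hypothetical rather than drawn from any cited source.

```python
# Minimal sketch of an adaptive message-selection loop (epsilon-greedy bandit).
# Variant names and engagement probabilities are hypothetical; a real system
# would replace simulate_engagement() with measured audience responses.
import random

variants = ["frame_A", "frame_B", "frame_C"]
true_engagement = {"frame_A": 0.05, "frame_B": 0.12, "frame_C": 0.08}  # unknown to the model

counts = {v: 0 for v in variants}
values = {v: 0.0 for v in variants}  # running mean engagement per variant

def simulate_engagement(variant: str) -> float:
    """Stand-in for a measured behavioral signal (click, share, reply)."""
    return 1.0 if random.random() < true_engagement[variant] else 0.0

random.seed(42)
epsilon = 0.1
for step in range(10_000):
    # Mostly exploit the best-known variant, occasionally explore the others.
    if random.random() < epsilon:
        v = random.choice(variants)
    else:
        v = max(variants, key=values.get)
    reward = simulate_engagement(v)
    counts[v] += 1
    values[v] += (reward - values[v]) / counts[v]  # incremental mean update

print({v: round(values[v], 3) for v in variants})  # estimates approach true rates
print("selected:", max(variants, key=values.get))  # the variant the loop settles on
```

The design point is simply that any measurable behavioral signal can drive automatic reallocation toward whichever framing performs best, which is what "adjusted in real-time by ML models" amounts to in practice.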
Decades of peer-reviewed research show that echo chambers, in the physical world and online, cause political polarization,[96] extremism, confusion, cognitive dissonance, negative emotional responses (e.g., anger and fear), reactance, microaggressions, and third-person effects.[97][98][99][100][101][102][103][104][105][106][107][108][109][excessive citations][verification needed]
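To make the echo-chamber mechanism concrete, here is a toy bounded-confidence opinion model (a Deffuant-style simulation, an assumption of this sketch rather than a model used in the cited studies): agents update opinions only when interacting with peers who already think similarly, and the population fragments into polarized clusters instead of converging.

```python
# Toy bounded-confidence opinion dynamics (Deffuant-style), illustrating how
# interacting only with like-minded peers (an "echo chamber") can yield
# polarized clusters instead of consensus. All parameters are illustrative.
import random

random.seed(1)
N = 200
opinions = [random.uniform(-1.0, 1.0) for _ in range(N)]  # initial opinion spread
threshold = 0.3   # agents ignore anyone whose opinion differs by more than this
mu = 0.5          # convergence rate when two agents do interact

for _ in range(50_000):
    i, j = random.sample(range(N), 2)
    if abs(opinions[i] - opinions[j]) < threshold:
        # Like-minded agents move toward each other; all others never interact.
        shift = mu * (opinions[j] - opinions[i])
        opinions[i] += shift
        opinions[j] -= shift

# Count surviving opinion clusters (values separated by more than the threshold).
clusters = []
for x in sorted(opinions):
    if not clusters or x - clusters[-1][-1] > threshold:
        clusters.append([x])
    else:
        clusters[-1].append(x)
print(f"{len(clusters)} opinion clusters, sizes: {[len(c) for c in clusters]}")
```

Lowering the confidence threshold (a narrower echo chamber) produces more, smaller clusters; raising it toward the full opinion range recovers consensus, which is the qualitative pattern the polarization literature describes.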
Because of psychological perseverance mechanisms such as confirmation bias, this can be very problematic. Nyhan and Reifler (2010) found that attempts to correct false beliefs often reinforce rather than dispel those beliefs among the people who hold them most strongly, a phenomenon known as the backfire effect, "in which corrections actually increase misperceptions."[110][111][112][113]