Israel Influence Tracker

How digital manipulation reshapes conflict, campuses, and civic life

Disinformation, AI, and Democracy Under Fire

How Digital Manipulation Continues to Reshape Conflict, Civic Discourse, and International Diplomacy

In an era marked by rapid technological innovation and unprecedented connectivity, digital manipulation has become a central tool in shaping realities—often with profound consequences for conflict, societal cohesion, and international relations. From deepfake videos and AI-driven disinformation to social media algorithmic biases and civic platform distortions, state and non-state actors are leveraging digital technologies to influence narratives, polarize societies, and manipulate diplomacy. Recent developments highlight an increasingly complex digital battleground where truth is contested and societal stability hangs in the balance.


The Escalating Digital Battlefield: Influence Campaigns and Societal Undermining

Influence Operations and Information Warfare

Modern conflicts are no longer confined to traditional arenas; instead, they unfold vividly online, where influence operations serve as strategic instruments:

  • Digital Censorship and Blackouts:
    Countries like Iran continue to employ digital shutdowns during protests, aiming to suppress dissent and control narratives. Such shutdowns have been linked to over 30,000 deaths in recent years, illustrating how digital restrictions can deepen humanitarian crises and hinder international awareness. These measures limit external scrutiny and manage domestic perceptions through digital silence.

  • Proliferation of Deepfakes and Fabricated Content:
    The Israel–Hamas conflict has seen a surge in deepfake videos and manipulated battlefield footage circulated across social media. These AI-generated falsehoods confuse the public, undermine diplomatic efforts, and fuel societal polarization. As AI tools become more accessible and sophisticated, verifying content authenticity becomes increasingly difficult, eroding trust in genuine sources and complicating fact-checking efforts.

  • State-Sponsored Propaganda and Influence Campaigns:
    Governments including Iran, Qatar, UAE, China, and Russia actively craft narratives portraying conflicts as battles against oppression. They often disseminate disinformation through algorithmically amplified emotional content to sway public opinion and destabilize rivals. For instance, Iran’s influence efforts aim to undermine Israel and shape regional perceptions by cyber operations and disinformation campaigns designed to undermine Israeli legitimacy and foster discord.

  • Manipulation of Civic Platforms:
    Investigations reveal state efforts to manipulate civic knowledge platforms like Wikipedia, YouTube, and other social media. These actions shape public discourse, erode trust in civic knowledge, and make societies more susceptible to misinformation and propaganda. For example, entries related to regional conflicts are often altered or censored to align with state narratives, skewing public understanding.

Surge in Hate Speech, Victimhood Denial, and Algorithmic Amplification

The ongoing violence has triggered a sharp rise in online hate speech and victimhood denial:

  • After Iran’s crackdown, which has been linked to more than 30,000 deaths, antisemitic content and hate speech spread widely across platforms such as TikTok, Twitter, and Facebook. These platforms amplify prejudice, risking the normalization of hate crimes and vandalism, especially in Europe and North America.

  • Conspiracy theories denying or minimizing Jewish victimhood continue to spread, undermining historical truths and inciting violence. Such narratives hinder reconciliation efforts and exacerbate societal polarization, making peace more elusive.

  • The personal story of Ran Gvili, the last Israeli hostage held by Hamas for 843 days, has been widely shared and amplified online. While humanizing the conflict, algorithmic promotion of such narratives also fuels conspiracy theories and antisemitic tropes, deepening societal divides when exploited for political or extremist purposes.

The Role of AI-Enabled Extremist Content

Artificial intelligence has amplified disinformation and hate speech:

  • A recent study by the Anti-Defamation League (ADL) found that leading AI models can be prompted to generate extremist and antisemitic content, making hate speech more persuasive and easier to disseminate at scale.

  • Platforms are developing AI detection tools to identify deepfakes and analyze influence campaigns. Governments and private entities are collaborating to strengthen moderation policies, increase platform transparency, and invest in media literacy programs designed to empower users to recognize manipulation.


Recent Key Developments: Flashpoints, Strategic Moves, and Influence Dynamics

Amplification of Personal Narratives and Societal Polarization

  • Rescue of Ran Gvili:
    Personal stories like Ran Gvili’s rescue serve as potent symbols of resilience. However, algorithmic amplification has exacerbated emotional polarization and antisemitic tropes, exemplifying how narratives can be weaponized to deepen societal divides.

Geopolitical and Influence Warfare Dynamics

  • Iran–Israel Rivalry:
    Iran’s influence campaigns seek to destabilize Israel via cyber operations and disinformation. Iranian officials have threatened escalations, with slogans like “WE FINISH IT!” prominently displayed in murals and propaganda. Recent statements from Iranian authorities, such as "IRGC Threatens to Target ‘Israel First’ if US Attacks,", underscore Iran’s willingness to escalate conflicts, raising fears of direct military confrontation.

  • EU’s Designation of IRGC as Terrorist:
    The European Union’s recent decision to designate Iran’s IRGC as a terrorist organization raises the stakes for Iran’s influence efforts worldwide. This move may provoke retaliatory influence campaigns and cyber operations targeting European and allied interests, further intensifying the digital conflict.

  • U.S.–Israel Relations and Regional Tensions:
    The U.S. continues to support Israel with over $6.5 billion in military aid, including advanced missile systems, as conflicts escalate. Critics warn this support may escalate tensions and fuel influence campaigns aimed at shaping domestic and international perception. Meanwhile, Gulf states like Saudi Arabia warn that U.S. inaction could embolden Iran, prompting regional preparations for potential escalation.

Influence Operations in Latin America and Digital Diplomacy

While Israel advances digital diplomacy through social media outreach and cultural initiatives, Iran, China, and Russia are deploying covert influence tactics in Latin America. Investigations have uncovered attempts to manipulate civic platforms like Wikipedia and YouTube, seeking to distort regional narratives and undermine public trust in local institutions.

Regional Initiatives and Escalations

  • The UAE is actively working to establish a large Palestinian civil and reconstruction hub in southern Gaza, aiming to foster regional influence and counter Iranian efforts. The initiative is designed to promote stability, build alliances, and project soft power through digital influence campaigns and infrastructural investments.

  • Iran’s IRGC has escalated its propaganda with murals depicting Tel Aviv as a target under the slogan “WE FINISH IT!” Statements from Iranian officials reinforce this posture, heightening fears of direct military conflict.

Domestic and International Influence in the U.S.

  • Pro-Israel lobbying groups, particularly AIPAC, are intensifying efforts to influence U.S. policy, including pressuring Democrats and shaping electoral strategies. Recent lobbying expenditures and public advocacy demonstrate how foreign influence campaigns are woven into domestic political debates.

  • Gavin Newsom, governor of California, publicly declared he “never has and never will” accept money from AIPAC, signaling ongoing tensions about foreign influence in U.S. politics.

  • Legislative initiatives like the “Ceasefire Compliance Act of 2023” aim to limit U.S. military aid in conflicts like Gaza, emphasizing greater accountability and preventing U.S.-facilitated violence.


Platform-Driven Sensational Content and Narrative Strategies

Recent analyses reveal how platform algorithms favor sensational, emotionally charged content:

  • Videos such as "US' Arab Ally, Host To 60+ American Fighter Jets For Iran Showdown, Delivers Fatal Blow To Israel" with over 21,700 views are designed to stoke fears and shape perceptions about regional alliances.

  • The recurring question "Why does the US support Israel?" is used to frame narratives that justify or criticize American foreign policy, influencing opinions domestically and abroad.

Such content strategies serve to mobilize emotional responses, polarize societies, and advance geopolitical agendas—often blurring the lines between fact and fiction.


Recent Reports and Implications: The Trump Advisers’ Preference

A notable recent development involves reports that former President Donald Trump’s advisers prefer that Israel strike Iran first. According to sources cited by The Jerusalem Post, senior advisers believe an Israeli military strike against Iran prior to U.S. involvement would align better with political interests, potentially boosting support for escalation. This stance reflects broader debates within U.S. political circles about the timing and nature of potential military action, and raises concerns about digital influence campaigns that could shape public perception and policy.


Moving Forward: Challenges, Priorities, and the Path Ahead

The sophistication of digital influence campaigns and AI-enabled disinformation presents serious threats to democratic stability, conflict resolution, and regional peace:

  • Strengthen detection and moderation tools to rapidly identify deepfakes, disinformation, and extremist content.

  • Increase transparency around influence operations and regulate platform algorithms to limit harmful amplification.

  • Foster international norms and cooperation for information sharing, setting standards for AI use, and countering malicious influence campaigns.

  • Invest in media literacy, public education, and community resilience programs to empower citizens to recognize manipulation and resist disinformation.


Current Status and Broader Implications

The digital landscape remains highly contested and volatile. Critical conflicts—such as Iran–Israel, regional influence efforts, and the ongoing debates over Gaza’s reconstruction—are increasingly shaped by digital influence and misinformation campaigns. Societies worldwide must remain vigilant, implementing technological safeguards, legal frameworks, and educational initiatives to counter disinformation and protect democratic processes.

The evolving digital realm demands a coordinated global response—balancing technological innovation with safeguards against malicious influence—to preserve truth, human rights, and peace in an increasingly contested information environment. Only through multilateral cooperation and robust safeguards can the international community hope to mitigate the risks posed by digital manipulation and uphold the integrity of societal discourse.

Updated Feb 26, 2026