11.12.2025.

Russia and the rise of the disinformation-for-hire industry

The emergence of a global, large-scale disinformation industry has privatized influence operations, giving states strategic reach with plausible deniability.

According to an analysis by EuvsDisinfo, a quiet revolution has taken place in the world of propaganda. Operations once run by authoritarian governments and intelligence agencies have now been handed over to private firms that sell disinformation and deception as a service.

From fake social-media armies to AI-driven smear campaigns, disinformation and Foreign Information Manipulation and Interference (FIMI) have become a global business, offering authoritarian regimes new ways to influence others while denying any involvement.

From state propaganda to paid disinformation

For decades, information operations were tightly controlled by states.

The Soviet Union perfected the craft of dezinformatsiya, and Russia later institutionalized it for the digital era through bodies such as the Internet Research Agency (IRA).

But over the past decade, this model has been commercialized. Disinformation and fraud have become profitable services offered by companies with backgrounds in intelligence, the military, or marketing. These firms operate globally and sell complete FIMI packages that include fake social-media campaigns, hacks, data leaks, and narrative management designed to push false and manipulated content into democratic countries.

Delegating operations as a shield for states

Delegation provides both efficiency and deniability. Authoritarian states are increasingly outsourcing information operations to private intermediaries to shield themselves from diplomatic and legal consequences.

Through this model, malicious actors can also experiment with risky tactics such as AI-generated content, hacking, or deepfakes — operations that would be politically explosive if carried out directly by state agencies.

In doing so, they can target foreign populations with personalized influence campaigns, maintaining plausible deniability by claiming no link to the private entities executing the operations.

Outsourcing also enables information laundering — hiding the true origin of disinformation by passing it through private firms, fake accounts, and intermediary media outlets. As these actors replicate and amplify the message, it begins to appear organic and locally produced. This allows malicious actors to spread tailored narratives while denying any involvement.

All this is the informational equivalent of using mercenaries: the client enjoys the results while avoiding responsibility.

Russia remains a key player in this outsourced ecosystem. Private companies such as the Social Design Agency (SDA) and related structures now run large-scale influence operations that mirror, and in many ways replace, the functions of the old St. Petersburg troll factories.

These firms manage covert online assets, push state-aligned narratives, and provide the Kremlin with an additional layer of deniability.

Undercover journalists have recorded one such firm demonstrating hacking techniques, media infiltration, and the planting of fabricated news. The scale and accessibility of these operations for paying clients show how disinformation has become a global commodity.

Hybrid operations

Modern influence campaigns no longer exist only online; they operate in the hybrid space between digital and physical realities.

The Internet Research Agency (IRA) demonstrated this during the 2016 U.S. elections, when Russian operatives posing as American activists organized real-world rallies, paid participants, and coordinated online amplification. What began as a meme war ended in physical mobilization.

Today’s hybrid operations blend hacking, covertly funded domestic influencers, and clandestine media fronts. Campaign operators create credible-looking news sites and influencer personas to inject tailored narratives into the public sphere.

Once circulated, these narratives mix with authentic content and spread across both digital and traditional media, making manipulation harder to detect.

Automation and AI: the new force multiplier

The original troll-farm model — hundreds of workers manually posting in shifts — is being replaced by AI-driven automation.

Systems like AIMS from “Team Jorge,” as well as newer tools powered by large language models, can manage thousands of fake accounts and generate multilingual, targeted content in real time.

AI allows campaigns that once demanded hundreds of workers to be run by a handful of operators, or even a single individual: what once took an entire troll factory in St. Petersburg now fits on a laptop.

The rise of influence-for-hire firms has created a new strategic imbalance — asymmetric information warfare.

In this asymmetry, autocracies enjoy maximum reach with minimal risk. They are shielded by censorship, control, and deniability. Democracies, by contrast, are maximally exposed: bound by transparency and law, they face high vulnerability with limited protections.

This imbalance is not only political but structural. Authoritarian regimes can use disinformation and AI tools to shape global narratives, interfere in foreign elections, and undermine trust — while avoiding accountability. Democracies must defend themselves in open networks designed for free expression.

Risks for democracy and the road ahead

These operations are already reshaping political realities. Influence-for-hire firms have targeted elections in Africa, Europe, and Latin America.

Disinformation campaigns fuel polarization, delegitimize media institutions, and exploit social divides to weaken democratic cohesion.

The commercialization of disinformation threatens to create a global gray zone where truth becomes optional and accountability unreachable.

As AI tools become cheaper and more capable, these operations are likely to grow in scale and sophistication.

Recognizing this asymmetry — and responding through resilience and regulation — is the only way to prevent truth itself from becoming a commodity.

/The Geopost/