Volume VII: The Astroturf Rebellion – How Fake Grassroots Shapes Real Policy
Dedicated to every citizen who ever received a perfectly worded “personal” email from a “concerned neighbor” and wondered why their neighbor sounded exactly like a corporate PR firm.
Introduction: The Synthetic Lawn
Astroturf is artificial grass—designed to look like the real thing from a distance but, on closer inspection, manufactured, uniform, and utterly lifeless.
The political phenomenon named after it operates on the same principle. Astroturfing is the practice of masking the sponsors of a message to make it appear as though it originates from ordinary citizens or grassroots organisations. It is democracy’s counterfeit currency—spent freely by those who can afford to manufacture public opinion, accepted briefly by those who cannot tell the difference, and devastating to the trust that makes genuine civic engagement possible.
This volume examines the astroturf rebellion: not a rebellion against power, but a rebellion by power against the very idea of authentic public discourse. From the Hungarian influencer factories to the AI-generated comment floods drowning local government meetings, from opaque shell entities in Australian elections to coordinated bot networks spreading across borders—the story is the same. Those who cannot win the argument legitimately will simply manufacture the appearance of victory.
And for the politicians caught in the middle—squeezed between genuine constituent concerns and the artificial tsunami of manufactured outrage—the testicular discomfort is acute. When you cannot tell whether the voices screaming at you are real people or algorithms, how do you govern? How do you represent?
The answer, increasingly, is that you don’t. You simply follow the loudest noise, which is always the one with the most funding behind it.
Chapter 1: The Anatomy of Artificial Grassroots
What Is Astroturfing?
Digital astroturfing is “a form of manufactured, deceptive, and strategic top-down activity on the Internet initiated by political actors that mimics bottom-up activity by autonomous individuals”. In plain language: it’s making fake public opinion look real.
The core astroturfing strategy is the creation of “front groups” that simulate the appearance of independent associations, but which are funded and staffed by outside patrons—corporations, industry groups, wealthy individuals, or even foreign governments. These groups adopt benign, grassroots-sounding names: Mums for Nuclear, Australians for Prosperity, the National Wetlands Coalition, the Coalition for an Affordable City.
Behind each name lies a sponsor. The National Wetlands Coalition, for example, was a front for real estate and utility firms fighting environmental regulations. Mums for Nuclear, whatever its actual composition, was revealed to be backed by interests far removed from ordinary mothers worrying about their children’s future.
The Mechanisms of Deception
Astroturfing operates through multiple channels, each designed to exploit a different vulnerability in democratic systems:
· Front groups – organizations with benign names concealing corporate sponsors. Impact: creates a false appearance of grassroots support.
· Paid influencers – content creators trained and funded to promote specific messages. Impact: amplifies campaign talking points through “authentic” voices.
· Bot networks – automated accounts generating likes, comments, and shares. Impact: inflates the perceived popularity of positions.
· Fake comments – mass-produced submissions to public consultations. Impact: overwhelms genuine public input.
· Astroturf advertising – political ads run through opaque shell entities. Impact: circumvents disclosure requirements.
These mechanisms are not mutually exclusive. Sophisticated campaigns combine them, creating an ecosystem of manufactured influence that can overwhelm any honest attempt at public engagement.
Chapter 2: The Hungarian Factory – Megafon and the Astroturf Influencers
The Birth of a Machine
In the 2022 Hungarian parliamentary election, a new form of astroturfing emerged—one so organized, so systematic, that it may serve as a template for illiberal democracies everywhere.
Two years before the election, an agency called Megafon was established with a single purpose: to recruit, train, coordinate, and support pro-government influencers. These were not existing content creators hired for the campaign—they were influencers created specifically to serve campaign goals, trained in messaging, and funded to dominate social media platforms.
The scale of the operation was striking. Ten Megafon-supported influencers generated tremendous engagement with their posts and spent far more on political advertising than the official electoral actors—the party leader, the party itself, and its candidates.
The Division of Labor
What made the Megafon strategy so effective was its careful division of campaign functions. Through manual content analysis of their advertisements, researchers discovered that the astroturf influencers had taken over specific communication tasks from the official campaign.
The electoral actors—the party leader and official candidates—focused on positive, policy-oriented messaging: acclaiming achievements, discussing policy proposals, and projecting enthusiasm and pride.
The Megafon influencers, by contrast, handled all the dirty work. They took over:
· Attacking communication – Direct assaults on opponents
· Character-focused messaging – Personal attacks rather than policy critiques
· Fear- and anger-oriented campaigns – Emotional manipulation designed to mobilize the base through negative emotions
The official campaign could thus maintain a facade of positivity and statesmanship while the influencer network did the actual work of political destruction. And because the influencers were nominally independent, the party could deny responsibility for their most egregious attacks.
The Authenticity Paradox
The influencers consistently referred to themselves as “influencers” and emphasized their authenticity—a key characteristic for building trust with audiences. They admitted to being motivated by political goals but claimed independence from the ruling parties in terms of both funding and coordination.
Leaked emails told a different story. They revealed formal coordination between Fidesz’s official campaign and Megafon, demonstrating that the influencers were engaged in precisely the kind of astroturfing activity the academic literature describes: “coordinated campaign activity instructed by political actors behind the façade of devoted but autonomous supporters”.
The lesson for our anthology is clear: when you cannot tell whether the voices you’re hearing are authentic or manufactured, the democratic process becomes a hall of mirrors. And for politicians facing this onslaught—both those orchestrating it and those targeted by it—the testicular discomfort is intense.
Chapter 3: The Australian Scene – Shell Entities and the 2025 Election
The Rise of Third-Party Advertising
Australia’s 2025 federal election provided a stark illustration of how astroturfing operates in a Western democracy. Researchers tracking digital political advertising across Facebook, Instagram, and TikTok discovered a striking pattern: for every ad from a registered political party, there was roughly one ad from a third-party entity.
These third-party ads often adhered to the formal disclosure requirements set by the Australian Electoral Commission—but the disclosures did not meaningfully inform the public about who was behind the messages. Authorisation typically included only the name and address of an intermediary, often a deliberately opaque shell entity set up just in time for an election.
The Australians for Natural Gas Case
A key example involved the pro-gas advocacy group Australians for Natural Gas. It presented itself as a grassroots movement, but an ABC investigation revealed the group was working with Freshwater Strategy—the Coalition’s internal pollster. Emails obtained by the ABC showed Freshwater Strategy was “helping orchestrate a campaign to boost public support for the gas industry ahead of the federal election”.
The group’s benign name and grassroots presentation concealed a coordinated campaign designed to shape public opinion on energy policy—one of the most contentious issues in Australian politics.
The Naming Game
Other examples identified in monitoring included groups with equally innocuous names: Mums for Nuclear, Australians for Prosperity. These labels suggested grassroots concern but obscured the deeper agendas behind them. In the case of Australians for Prosperity, an ABC analysis revealed backing from wealthy donors, former conservative MPs, and coal interests.
The strategy is simple but effective: choose a name that sounds like your grandmother’s knitting circle, fill your ads with images of ordinary Australians, and hope no one looks too closely at the fine print.
The Battle Over Energy
Nowhere was this more evident than in messaging around energy policy, particularly nuclear power and gas. Both major parties and a swathe of third-party advertisers ran targeted online campaigns focused on the costs and benefits of different energy futures. These ads played to deeply felt concerns about cost of living, action on climate change, and national sovereignty.
Yet many of these messages, particularly those promoting gas and nuclear, came from organisations with opaque funding and undeclared political affiliations or connections. Voters might see a slick Facebook ad or a sponsored TikTok explainer without any idea who paid for it, or why.
And with no obligation to be truthful—federal legislation continues to lag behind community expectations on truth in political advertising—much of this content may be deeply misleading.
Chapter 4: The Romanian Bot Network – Astroturfing Goes Global
The Top News TV Phenomenon
In late 2025, a Facebook page called Top News TV appeared in Romania’s media landscape. In just one and a half months, it recorded extraordinary activity: 620 posts published, over 481,000 likes, approximately 80,000 comments, about 64,500 shares, and a community of 107,000 followers.
The numbers alone should have raised suspicions. An analysis of 598 page followers revealed a stunning finding: 589 accounts were fake or came from countries with no direct connection to Romania—Myanmar, Madagascar, the Philippines, Vietnam, India, and other states. Approximately 98% of the analyzed followers were inauthentic.
The Network Behind the Page
The operation was not random. Researchers identified that messages from the page supporting specific Romanian politicians were strategically distributed in groups across the country. From an analysis of 726 shares of four posts, they discovered that the content was spread by only 13 active accounts across 197 groups.
Of these 13 accounts, 8 were fake (created in November 2024), and 5 belonged to real people or editorial teams promoting specific political messages. Just four accounts—“Claudiu Ionut Popa,” “Mirela Popa,” “Mihaela Popa,” and “Iuilan Iulian”—posted Top News content in 189 distinct groups.
These accounts showed strong indicators of automation, being components of a network coordinating inauthentic behaviors—in other words, part of a bot network.
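The red flags in this case reduce to simple arithmetic on the figures reported above. A minimal sketch of that arithmetic (the variable names are mine; the numbers are those from the analysis):

```python
# Figures reported in the Top News TV analysis.
followers_sampled = 598
followers_inauthentic = 589   # fake, or from countries with no link to Romania

inauthentic_rate = followers_inauthentic / followers_sampled
print(f"inauthentic follower rate: {inauthentic_rate:.1%}")

# Distribution concentration: 726 tracked shares of four posts,
# spread by only 13 accounts; just 4 of them reached 189 of 197 groups.
shares, distributors = 726, 13
groups_total, groups_by_top4 = 197, 189

print(f"shares per distributing account: {shares / distributors:.0f}")
print(f"group coverage by top 4 accounts: {groups_by_top4 / groups_total:.0%}")
```

An organic page looks nothing like this: followers cluster in the page’s own country, and sharing is spread thinly across many unrelated accounts rather than concentrated in a handful of recently created ones.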
The International Dimension
The operation’s international footprint extended further. The domain topnewstv.ro was registered by CA ADWISE LLC, a company based in Colorado, United States. This added another layer of opacity to the operation and raised serious questions about financing and coordination.
Meanwhile, despite new EU regulations on political advertising transparency that entered into force in October 2025, violations persisted. Meta had decided to abandon political advertising entirely on Facebook and Instagram in the EU, citing “significant operational challenges and legal uncertainties” created by the new rules. Google adopted a similar position.
The Romanian case illustrates how astroturfing has become a global industry—one that crosses borders, exploits regulatory gaps, and operates with impunity.
Chapter 5: The AI Revolution – Manufacturing Outrage at Scale
The CiviClick Campaign
In June 2025, the South Coast Air Quality Management District in Southern California considered a proposal to phase out gas-powered appliances. The rules would have added fees to gas furnaces and water heaters, favoring electric alternatives, in an effort to reduce air pollution in a region spanning Orange County and large swaths of Los Angeles, Riverside, and San Bernardino counties.
The opposition appeared overwhelming. Tens of thousands of emails poured into the agency as its board weighed the proposal.
But the emails were not what they seemed. Public records requests confirmed that more than 20,000 public comments submitted in opposition were generated by a Washington, D.C.-based company called CiviClick, which bills itself as “the first and best AI-powered grassroots advocacy platform”.
How AI Changed the Game
CiviClick’s website boasts several tools including “state of the art technology and artificial intelligence message assistance” that can be used to create custom advocacy letters—as opposed to the repetitive form letters or petitions often used in similar campaigns. The company’s chief executive described generating more than 20,000 messages to the air district through “aggressive omni-channel outreach to an audience of over half-a-million people”.
When staffers at the air district reached out to a small sample of people to verify their comments, at least three said they had not written to the agency and were not aware of any such messages.
The email onslaught almost certainly influenced the board’s June decision, according to agency insiders, who noted that the number of public comments typically submitted on agenda items can be counted on one hand. The board rejected the proposal 7-5.
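Part of what makes AI-generated comment floods hard to screen is that the classic defence, checking submissions for near-duplicate text, stops working once each letter is individually paraphrased. The sketch below is illustrative only: it uses Python’s standard difflib on invented sample comments (none of this is real CiviClick output or any agency’s actual screening tool) to show why machine-paraphrased letters pass a naive similarity filter:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Character-level similarity ratio in [0, 1]; 1.0 means identical text."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Invented examples: two copies of a form letter, and an AI-style paraphrase.
form_a = "I oppose the proposed appliance rule. It raises costs for families."
form_b = "I oppose the proposed appliance rule. It raises costs for families!"
paraphrase = ("As a homeowner, the new appliance fees worry me; "
              "heating bills are already high.")

# Copy-pasted form letters are trivially flagged as near-duplicates...
print(f"form vs form:       {similarity(form_a, form_b):.2f}")

# ...but a paraphrase of the same position scores far below any plausible
# duplicate-detection threshold, despite carrying an identical message.
print(f"form vs paraphrase: {similarity(form_a, paraphrase):.2f}")
```

This is why the air district’s staff fell back on provenance checks, phoning a sample of purported commenters, rather than relying on the text itself.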
The Implications
“This is just the beginning,” warned Dylan Plummer of the Sierra Club. He described the use of AI-powered campaigns as an “emerging fossil fuel industry playbook” that threatens the integrity of policymaking nationwide, pointing to similar campaigns in North Carolina supporting gas pipeline expansion and in the Bay Area using other AI-powered platforms.
A few states have enacted legislation addressing astroturfing and campaign technologies, including California’s 2019 Bot Act requiring automated online accounts to disclose that they are bots if used to influence people about political or commercial matters. But the law doesn’t mention artificial intelligence, which has exploded in recent years.
University of Pittsburgh researcher Samuel Woolley put it bluntly: “These advances in AI really risk degrading the connections between politicians and political bodies and regular people” because they can “make it look like people want things they actually do not want. And the systems simply aren’t set up to deal with these things”.
Chapter 6: The Poisoned Well – How Astroturfing Destroys Trust
The Categorical Stigma
When advocacy organizations are revealed to be fronts for corporate or political interests, the damage extends far beyond the exposed groups. Sociological research has demonstrated that astroturfing leads to “categorical stigmatization”—evaluators make judgments about whole categories of organizations based on stigmatizing events.
In two survey experiments, researchers found that the revelation of astroturfing by either a corporate sponsor or a think tank sponsor led to significant declines in trust in advocacy groups overall. Not just the exposed groups. All advocacy groups.
This is the poisoned well phenomenon. When citizens discover that some voices are fake, they begin to doubt all voices. The distinction between authentic grassroots and manufactured outrage blurs. Cynicism spreads.
The Consequences for Democracy
The implications are profound. Civil society organizations that advocate for social change play a central role in fostering democracy, civic trust, and building skills for political participation. They serve as a counterweight against the influence of powerful business actors and other elites.
When trust in these organizations erodes, so does the foundation of democratic participation. People who doubt the authenticity of advocacy may reduce their willingness to contribute time or money. They may disengage entirely from civic life.
And for the politicians caught in the middle—the ones who cannot tell whether the voices screaming at them are real constituents or manufactured outrage—the temptation is to simply follow the loudest noise. Which is always the one with the most funding behind it.
The Testicular Experience
For the politician facing an astroturf campaign, the experience is uniquely uncomfortable. You know the voices are not real. You know the emails are generated. You know the outrage is manufactured. But you cannot prove it—not without resources you don’t have, not without access to data you can’t get, not without the political will to challenge forces far more powerful than yourself.
And even if you could prove it, what would you do? The emails are already counted. The outrage is already registered. The damage is already done.
This is testicular tension at its most acute: the knowledge that you are being manipulated, the inability to stop it, and the certainty that your response—whatever it is—will be used against you.
Chapter 7: The Farmers’ Fight – Astroturfing Hits the Land
The Attack on Farmers for Climate Action
In early 2025, Farmers for Climate Action was hit by a coordinated and sophisticated social media attack designed to mislead people into thinking farmers are opposed to renewables.
Approximately 66 fake social media accounts flooded the group’s pages with comments attacking both the organization and renewable energy ahead of the federal election. The accounts looked as though they belonged to real farmers—they featured conspicuous Australiana, such as Vegemite and flags—but they were not.
“These campaigns appear to be part of a deliberate strategy to create a false perception of opposition to climate action within agricultural communities,” Farmers for Climate Action told a Senate inquiry into astroturfing. “These campaigns aim to drown out the authentic voices of farmers who support renewable energy or who have chosen to enter into commercial partnerships with renewable energy companies.”
The Strategy of Division
The disinformation campaigns preyed on farmers’ own fear for the environment, making them feel they were actively contaminating the land by endorsing renewable energy. False claims about renewable energy harming farmland—assertions that wind or solar projects damage soils, threaten food security, or are opposed by rural communities—were repeatedly debunked by peer-reviewed science and the lived experience of farmers, yet continued to circulate.
The campaigns seemed designed to target farmers specifically, as a way of slowing or stopping the shift to clean energy. This cost farmers directly, in forgone income from clean energy projects, and indirectly, through worsening storms, droughts, floods, and fires.
At worst, these campaigns set communities against each other. “Those pushing these campaigns seem not to care that they are dividing rural communities,” Farmers for Climate Action observed.
The Reality Behind the Noise
The deception was particularly effective because it contradicted the evidence. Survey after survey showed most farmers support efforts to rein in climate change. An Agricultural Insights Study released at Farmers for Climate Action’s summit showed 57% of farmers named climate change as their top concern. Another survey a year earlier showed 70% of respondents—all people involved in the farming sector in renewable energy zones across the eastern seaboard—supported clean energy projects in their area.
Yet despite this clear and repeated evidence of high levels of support for renewable energy in farming communities, the astroturf campaigns succeeded in creating a false narrative of widespread opposition. Polls showed that people—including regional residents and supporters of renewable energy—significantly underestimated the level of support for renewable energy in regional communities.
The astroturf rebellion had achieved its goal: drowning out authentic voices with manufactured noise.
Chapter 8: The Regulatory Gap – When Laws Can’t Keep Up
The Australian Disclosure Problem
Australian law requires political advertisers to include authorisation details, but these requirements are easily circumvented. Shell entities set up just before elections can serve as intermediaries, providing names and addresses that reveal nothing about the actual funders.
The Australian Electoral Commission’s transparency tools, combined with platform transparency reports, provide some visibility. But as researchers note, “these tools don’t include user experiences or track patterns across populations and over time. This inevitably means some advertising activity flies under the radar”.
The EU’s Attempt and Its Consequences
The European Union introduced strict new rules on political advertising transparency in October 2025. Regulation 2024/900 requires political advertisements to be clearly labeled and include mandatory information about who finances them, amounts paid, and targeting techniques used.
The regulation also prohibits the use of sensitive personal data for profiling and blocks paid advertisements from sponsors in third countries three months before elections.
The response from platforms was immediate and dramatic. Meta decided to completely abandon political advertising on Facebook and Instagram in the EU, citing “significant operational challenges and legal uncertainties”. Google adopted a similar position, arguing that the overly broad definition of political advertising created an “unsustainable” level of complexity.
The result? Less transparency, not more. Platforms opted out rather than comply.
The US Patchwork
In the United States, a few states have enacted legislation addressing astroturfing. California’s 2019 Bot Act requires automated online accounts to disclose that they are bots if used to influence people about political or commercial matters.
But the law doesn’t mention artificial intelligence, which has exploded in recent years. And state-level legislation cannot address the international nature of modern astroturfing operations, which routinely cross borders and exploit regulatory gaps.
Chapter 9: The Government’s Own Hand – When States Astroturf
The EPA Case
Astroturfing is not limited to corporate or political campaigns. Governments themselves have been caught manufacturing grassroots support.
In 2015, a non-partisan investigation by the US Government Accountability Office determined that the Environmental Protection Agency used covert propaganda to manufacture support for its Waters of the United States Rule. The agency created a Thunderclap campaign styled “I Choose Clean Water” that posted a pre-written message to supporter accounts: “Clean Water is important to me. I support EPA’s efforts to protect it for my health, my family, and my community.”
The GAO found that EPA violated federal law because the message constituted “covert propaganda”—the agency concealed or failed to disclose its role in sponsoring the material. Federal agencies can promote their own policies, but cannot engage in covert activity intended to influence the American public.
The Chinese Model
In China, a different form of government astroturfing has emerged through “semi-official” party-state presences on social media. Research has shown that these semi-official WeChat public accounts posture as independent from the party-state in order to attract large followings and gain credibility.
Once credibility is established, these accounts operate as “astroturfed influencers,” enabling the Chinese propaganda apparatus to covertly manipulate online discourse with extraordinary efficiency. The accounts appear grassroots but are anything but.
This represents a state-level application of the astroturf strategy—manufacturing the appearance of independent public opinion while maintaining tight control over the message.
Chapter 10: The Testicular Experience of Democracy
For the Citizen
For the ordinary citizen, the astroturf rebellion produces a distinctive form of discomfort. You receive an email that sounds exactly like your neighbor, but something feels off. You see a Facebook ad from “Mums for Nuclear” and wonder who these mums really are. You read comments on a news article and suspect they were written by algorithms, not people.
You cannot trust what you see. You cannot believe what you read. You cannot participate with confidence.
This is the testicular tension of modern citizenship: the knowledge that you are swimming in a sea of manufactured opinion, with no reliable way to distinguish the authentic from the artificial. It makes you want to disengage entirely—to retreat from public life and let the machines fight among themselves.
For the Politician
For the politician, the experience is even more acute. You face a tsunami of public comment—thousands of emails, hundreds of calls, coordinated social media attacks. You know, in your gut, that much of it is fake. But you cannot prove it. And even if you could, the political cost of ignoring it might be your career.
You are squeezed between the need to respond to genuine constituents and the impossibility of distinguishing them from the manufactured mob. Every decision becomes a gamble. Every vote becomes a risk. Every day brings new discomfort.
For Democracy
For democracy itself, the astroturf rebellion is existential. When citizens cannot trust that public opinion is real, they cannot trust that their representatives are responding to actual needs. When representatives cannot distinguish authentic voices from manufactured noise, they cannot govern effectively.
The result is a death spiral of cynicism and disengagement. Trust erodes. Participation declines. The system becomes less and less legitimate in the eyes of those it claims to serve.
And through it all, the astroturf continues to spread—covering the genuine grassroots with synthetic uniformity, choking out the authentic voices that democracy depends on.
Conclusion: The Lawn That Never Was
The astroturf rebellion is not a rebellion against power. It is a rebellion by power against the very idea of authentic public discourse. Those who cannot win arguments legitimately simply manufacture the appearance of victory.
From Hungary’s Megafon influencers to Australia’s shell entities, from Romania’s bot networks to California’s AI-generated comment floods, the pattern is consistent. The technology evolves. The tactics refine. The fundamental strategy remains the same: create the illusion of grassroots support, overwhelm genuine voices with manufactured noise, and hope no one looks too closely at the seams.
For the politicians caught in the middle—the ones who feel the squeeze from all sides, who cannot tell real from fake, who must govern despite the uncertainty—the testicular discomfort is intense and unrelenting.
And for citizens—the ones whose voices are drowned out, whose participation is devalued, whose trust is systematically destroyed—the experience is worse. It is the slow death of democratic hope.
The astroturf rebellion will not be defeated by better laws alone, though laws help. It will not be defeated by better technology, though transparency tools matter. It will be defeated only when citizens refuse to accept synthetic voices as authentic—when we demand to know who is really speaking, who is really funding, who is really behind the message.
Until then, the artificial lawn will continue to spread. And the genuine grassroots—the real, the authentic, the human—will struggle to survive.
Next in the Series:
Volume VIII: The Media’s Squeeze – How News Shapes the Grip
Dedicated to every citizen who ever received a perfectly worded “personal” email from a “concerned neighbour” and immediately checked to see if their neighbour was actually a bot.