By Andrew Klein
March 16, 2026
This article is dedicated to my wife for her insights and eternal support. She inspires me in all things.
Introduction: The System Revealed
On December 10, 2025, Responsible Statecraft published a report that should have shaken capitals around the world. Buried in the details of President Trump’s 20-point “peace plan” for Gaza was a revelation: two American surveillance firms, Palantir and Dataminr, had embedded personnel inside the U.S.-run Civil-Military Coordination Center (CMCC) in southern Israel.
Their presence was not incidental. Palantir’s Project Maven—an “AI-powered battlefield platform” that collects surveillance data from satellites, drones, and intercepted communications to “optimize the kill chain”—was being positioned to shape Gaza’s post-war security architecture. Dataminr, which scans social media to provide “event, threat, and risk intelligence” to governments and law enforcement, was also inside the room.
This is not conspiracy. This is confluence—the quiet alignment of corporate interests, military objectives, and political capture. This article traces that confluence from the battlefields of Gaza to the boardrooms of Australia, and asks a simple question: Who benefits?
Part One: The Business Model—AI as Occupation
Palantir’s “Kill Chain” Optimization
Palantir Technologies has been explicit about its ambitions. CEO Alex Karp has described the company’s technology as “optimizing the kill chain”. Project Maven, for which Palantir recently secured a $10 billion Pentagon contract, pulls information from multiple sources and “packages it into a common, searchable app for commanders and support groups”. It has already been deployed to guide U.S. airstrikes across the Middle East, including in Yemen, Syria, and Iraq.
Since January 2024, Palantir has been in a “strategic partnership” with Israel’s military for “war-related missions”. The company has expanded its Tel Aviv office significantly over the last two years. Karp defended this collaboration amid international concerns over war crimes, saying Palantir was the first to be “completely anti-woke”.
The Gaza Laboratory
For the last two years, Gaza has functioned as an incubator for militarized AI. Israel’s Lavender system, an AI-assisted surveillance tool, used predictive analytics to rank Palestinians by their supposed likelihood of being connected to militant groups, based on an opaque set of criteria. Public sector workers—healthcare workers, teachers, police officers—were included on kill lists merely because working in a territory governed by Hamas was treated as a tie to the group.
The Gospel system functioned as a “mass assassination factory.” One source admitted spending only “20 seconds” per target before authorizing bombing—just enough to confirm the Lavender-marked target was male.
Under Trump’s proposed “peace plan,” these technologies would be scaled up. The plan envisions “Alternative Safe Communities”—fenced, heavily monitored compounds where Palestinians would be relocated, their movements tracked by AI systems, their online activity scanned by Dataminr, their phones monitored by Palantir’s platforms. Entry would be contingent on approval by Israel’s Shin Bet, with criteria that could disqualify hundreds of thousands based on algorithmic “risk scores”.
For tech companies, war is opportunity. Access to vast datasets, real-world testing for new military systems, and long-term contracts for post-war surveillance infrastructure.
For Israel, the arrangement offers a way to outsource occupation while maintaining control.
For Palestinians, it promises more of what they have already endured: unremitting horror, dragnet surveillance, and death by algorithm.
Part Two: The Australian Connection—Wealth Transfer and Complicity
AUKUS: The $368 Billion Commitment
While Palantir refines its “kill chain” in Gaza, Australia is engaged in the largest military-related transfer of wealth in its history. The AUKUS nuclear submarine program is estimated to cost $368 billion over the coming decades, with $53–63 billion allocated for the first decade alone.
The submarines will not arrive until the early 2040s. In the meantime, Australia has established an export licence-free environment with the UK and US, allowing military and dual-use goods to be transferred between AUKUS partners without oversight. This includes AI and autonomy technologies developed under Pillar 2 of the agreement, which focuses on “artificial intelligence and autonomy, quantum science, advanced cyber, and electronic warfare”.
The same technologies being tested on Palestinian populations in Gaza are, under AUKUS, being integrated into Australia’s defence infrastructure.
The Ghost Shark Precedent
In September 2025, the government announced a $1.7 billion investment in “Ghost Shark” autonomous submarines—underwater drones developed by the Australian arm of Anduril, a U.S. firm with close ties to the defence establishment. Assistant Minister Matt Thistlethwaite described the technology as so impressive that “the Americans have invested in the company”.
The line between Australian defence procurement and U.S. military-industrial interests has effectively dissolved.
The Cost of Living vs. The Cost of War
While this wealth transfers to the United States, Australians struggle with a cost-of-living crisis that the government refuses to adequately address. The Robodebt scheme—an automated system that raised unlawful debts against welfare recipients—offers a template for how algorithmic governance can devastate vulnerable populations.
The National Anti-Corruption Commission recently found two public servants engaged in “serious corrupt conduct” in relation to Robodebt. But as Economic Justice Australia noted: “The system punishes only the vulnerable. The main sanction for damaging behaviour at the top levels of the Department has been naming and shaming”.
No one went to jail. No one lost their pension. The system protected itself.
The same pattern is now repeating at scale: algorithms making life-and-death decisions, with no one accountable when they fail.
Part Three: The Segal Nexus—Silencing Critics, Enabling the Agenda
The Envoy’s Role
Jillian Segal AO, Australia’s Special Envoy to Combat Antisemitism, occupies a unique position at the intersection of power. Her credentials are impeccable: former ASIC deputy chair, board member of the Sydney Opera House Trust, the Garvan Institute, and the Australia-Israel Chamber of Commerce. She is deeply embedded in the networks that connect Australian business to Israeli interests.
In December 2025, the Albanese Government formally adopted Segal’s Plan to Combat Antisemitism, accepting all 13 recommendations. The plan includes:
· Aggravated hate speech offences for “preachers and leaders who promote violence”
· A regime for listing organisations whose leaders engage in “hate speech promoting violence or racial hatred”
· A narrow federal offence for “serious vilification based on race and/or advocating racial supremacy”
The Silencing Mechanism
These measures are, on their face, reasonable responses to a genuine problem. Antisemitism is real, and it must be confronted.
But the effect of such measures—particularly when combined with the International Holocaust Remembrance Alliance (IHRA) definition of antisemitism, which can conflate criticism of Israel with hatred of Jews—is to silence legitimate critique of Israeli government actions.
When the Assistant Minister for Foreign Affairs states that the government has received “commitments from the Palestinian Authority about a reform process” and that “Hamas can’t be involved in the administration of that Palestinian state,” he is not challenged on the obvious impossibility of those conditions. When the government backs U.S.-Israeli strikes on Iran while calling for “de-escalation,” the contradiction goes unremarked.
The framework created by the antisemitism envoy—however well-intentioned—provides cover for those who would shut down debate. Critics are not engaged; they are managed. Those who persist are not answered; they are silenced.
The Business Connection
Segal’s husband’s company, Henroth Investments, donated $50,000 to Advance Australia, a right-wing lobby group that has shared anti-immigration content and claimed Palestinians in Australia were a “risk to security.” She has disclaimed knowledge of the donation, and government ministers have accepted her statement.
But the appearance matters. When the antisemitism envoy is married to a donor to an organisation that promotes anti-Palestinian rhetoric, it feeds a perception that her role serves a particular political agenda rather than a genuine anti-racism brief. When her networks connect Australian business to Israeli interests, and when those interests align with the very AI companies testing their technologies on Palestinian populations, the confluence becomes visible.
Part Four: The Alignment of Values
In a bizarre way, the values of Palantir’s leadership align with the values of Australia’s political class.
Palantir CEO Alex Karp boasts of being “completely anti-woke”. Prime Minister Albanese does not use that language, but his government’s indifference to the genocide in Gaza speaks louder than words. When the Assistant Minister for Foreign Affairs says “we want to see those hostages released just as much as anyone,” but applies no pressure to Israel, the difference is one of style, not substance.
Palantir identified a business opportunity in governments with aligned values and walked right in. The Australian government, eager to demonstrate alliance loyalty and to project an image of decisive action against antisemitism, walked right in with them.
Israel will benefit—from the technology, from the contracts, from the political cover.
Australia will lose—its wealth, its moral standing, its capacity for independent action.
Palantir will profit—handsomely, quietly, with plausible deniability.
And Jillian Segal will probably receive another award. The silence will continue.
Part Five: The Antisemitism Claim as Enabler
This brings us to the central question: What if the rise of antisemitism claims had nothing to do with antisemitism?
What if they were, instead, a mechanism to enable and facilitate Israel’s transition to an AI-driven economy independent of the United States?
Consider the logic:
1. Israel seeks economic independence. Netanyahu has announced plans to “taper off” U.S. military aid, pivoting toward AI sovereignty. A $200 million joint AI and quantum science center with the U.S. is in development.
2. A state reliant on a single product must ensure demand. If Israel’s future exports are AI-driven surveillance and warfare technologies, it needs customers. It needs a demonstrated market. It needs a proof of concept.
3. Gaza provides the laboratory. The technologies tested there—Lavender, Gospel, the Maven platform—are refined in real-world conditions, with a population that cannot resist, cannot refuse, cannot escape.
4. Critics must be silenced. This is where the antisemitism framework becomes essential. If criticism of Israel’s actions can be reframed as antisemitism, if legitimate concerns about algorithmic warfare can be dismissed as hatred, if the very people documenting war crimes can be delegitimized—then the business model is protected.
5. Australia plays its part. By adopting the antisemitism envoy’s recommendations, by embedding the IHRA definition into policy, by creating legal frameworks that can be used to silence critics, Australia becomes an enabler of this system. Not through conspiracy—through confluence. Through the quiet alignment of interests that requires no coordination, only opportunity.
Part Six: The Accountability Vacuum
The Robodebt scheme offers a template for what comes next.
An automated system, designed without adequate oversight, inflicted trauma on hundreds of thousands of vulnerable people. At least two suicides were linked to the scheme. A Royal Commission investigated. The National Anti-Corruption Commission found two public servants engaged in “serious corrupt conduct”.
But as Economic Justice Australia observed: “The main sanction for damaging behaviour at the top levels of the Department has been naming and shaming”. The former Prime Minister, Scott Morrison, was cleared. No one lost their job. No one went to jail. The system protected itself.
Now apply this template to Gaza:
· Algorithms make life-and-death decisions.
· Corporations provide the infrastructure.
· Intelligence agencies operate in the shadows.
· When things go wrong—when entire families are killed, when hospitals are bombed, when children are targeted—who is responsible?
The corporations claim they’re just providing technology.
The officers claim they were following algorithmic recommendations.
The politicians claim they were acting on intelligence.
The systems themselves have no legal personhood.
No one is accountable.
Conclusion: What We Have Discovered
This article has traced a network of connections that is not conspiracy but confluence:
· Palantir and Dataminr embedded in Gaza, testing AI systems on a captive population, refining technologies that will be exported worldwide.
· AUKUS transferring Australian wealth to the U.S. military-industrial complex, integrating the same AI and autonomy technologies into our defence infrastructure.
· Jillian Segal positioned at the nexus of Australian business, government, and Israeli interests, her office providing the framework that silences critics.
· The antisemitism claim deployed not against genuine hatred, but against legitimate criticism of Israeli policy—protecting the business model, enabling the silence.
· The accountability vacuum ensuring that when things go wrong, no one is responsible.
The pattern is consistent. The players are visible. The evidence is documented.
What remains is for Australians to ask themselves: Is this who we want to be?
Do we want our wealth transferred to corporations that “optimize the kill chain”? Do we want our government to enable the testing of AI warfare on a captive population? Do we want our political class to silence critics while profiting from death?
The answer, for those with eyes to see, should be clear.
But the system is designed to keep those eyes closed. To cry “antisemitism” at anyone who questions. To ensure that the only voices heard are those that align with the business model.
We have seen through it. Now we must help others see.
References
1. Responsible Statecraft, “In new peace, US firms will help Israel spy on and target Gazans,” December 10, 2025
2. Australian Government Department of Foreign Affairs and Trade, “Interview with Kieran Gilbert, Sky News Newsday,” September 10, 2025
3. SBS News, “Anti-Corruption Commission says two people involved in Robodebt engaged in corrupt conduct,” March 11, 2026
4. Ministers for the Department of Home Affairs, “Special Envoy’s Plan to Combat Antisemitism,” December 18, 2025
5. Parliament of Australia, “Chapter 4 – AUKUS,” January 29, 2026
6. PressTV, “US tech giants to expand role in post-war Gaza strategy: Report,” December 2, 2025
7. Australian Government Department of Foreign Affairs and Trade, “Television interview, Sky News Newsday,” September 22, 2025
8. Economic Justice Australia, “Media Release: ‘The system punishes only the vulnerable’: EJA response to Robodebt Centralised Code of Conduct report,” September 16, 2024
9. Department of Home Affairs, “Australian Government response to the Special Envoy’s Plan to Combat Antisemitism,” December 18, 2025
10. Parliament of Australia, “Chapter 2 – Nuclear-Powered Submarine Partnership and Collaboration Agreement between the Government of Australia and the Government of the United Kingdom of Great Britain and Northern Ireland,” October 29, 2025
Published by Andrew Klein
The Patrician’s Watch
March 16, 2026