On April 18th, Palantir published an X post with a 22-point manifesto taken from the book “The Technological Republic: Hard Power, Soft Belief, and the Future of the West” by Alex Karp and Nicholas W. Zamiska.
In this article, I will strip the hyperbolic philosophical rhetoric of Karp from this manifesto and explain it in plain language. I will start with a short background on the company and its founders to connect the points coherently, then go through the 22 points one by one, examining their real-world function and meaning in light of Palantir’s position.
Disclaimer: This article is based on publicly available information, with all sources used for data cited throughout. The analysis reflects the author’s interpretation of documented events, statements and systems. All personal commentary is clearly identified and is included to contextualize broader implications. It should not be taken as established fact and does not aim to promote bias, but to present a perspective grounded in available public evidence.
What’s the book about?
The manifesto is a summary of The Technological Republic. The book itself is written in the rhetorical register of a peer-reviewed academic essay, without any of the rigor that peer review actually provides. Karp poses as a political philosopher-CEO, someone who thinks big thoughts, but the book reads more like a recruitment brochure for his “defense” tech company. The thesis of the book is, effectively, that Palantir loves getting big contracts from the Department of Defense. And when you think about it, doesn’t that make Palantir kind of heroic?
What does Palantir do?
Founded in 2003 by Peter Thiel and Alex Karp, with early backing from In-Q-Tel, the CIA’s venture capital arm, Palantir Technologies built its business on post-9/11 intelligence work. Thiel, who would later introduce JD Vance to Donald Trump, brought Karp into the company after the two met at Stanford Law School. Together, they developed systems that take vast amounts of fragmented data, integrate them into a single operational layer, and turn them into decisions. Its platforms now sit inside governments, militaries and public institutions, merging intelligence, logistics, and real-time inputs into a single interface designed to accelerate action. For a deeper look into the founders’ ideologies, origins, and current government “defence” projects, see our previous research.
Recently, in the opening phase of the Iran war, Palantir’s systems enabled the processing of thousands of targets within days, compressing what used to take weeks into hours. That same infrastructure was present in the strike on a primary school in Minab that killed 168 children, because a location misclassified in a military database was processed through a system designed for speed, not verification — a war crime with the blame conveniently shifted onto yet another AI.
At the same time, Palantir’s architecture is expanding into civilian systems. We’ve spoken about civilian integration in the US; however, it seems Palantir wants to cross the ocean. The most recent examples come from the UK, with Palantir being granted access to the Financial Conduct Authority, the watchdog for thousands of financial bodies from banks to hedge funds, and signing a £330 million deal with the NHS.
It also has a £240m contract with the UK’s Ministry of Defence and has recently renewed a contract with Coventry city council thought to be worth £750,000. Moreover, it has deals with Bedfordshire police and Leicestershire police, among other constabularies.
After publishing the manifesto on X, Palantir faced backlash from UK MPs, who urged the government to cancel both the FCA access and the NHS deal.
These are just the latest examples of the civilian and military data convergence, a defence data company putting itself into national healthcare infrastructure. Once personal data from hospitals, borders, police departments and military sensors flow into the same analytical core, a platform designed for war can be repurposed for policing, immigration and public health. It’s not like Palantir hasn’t been associated with police arrests, mass surveillance and ICE actions.
Donald Campbell, the director of advocacy at Foxglove, a tech fairness campaign group, summarises Palantir’s expansion from the US to the European continent perfectly: “Ministers urgently need to stop and think before handing yet more contracts to this Trump-supporting spy-tech giant. There is a serious risk of ‘lock-in’ – the more Palantir is enmeshed in the UK’s public services, the harder it may be to get them out.”
What is the point of the Manifesto?
It’s a marketing document. Palantir does not behave like a traditional defense contractor. Unlike companies such as Lockheed Martin or Raytheon, which operate through procurement cycles and closed-door briefings, Palantir thrives on public space, performance and narrative.
Their CEO, Alex Karp, has openly described Palantir as “the most important software company in America and therefore in the world,” positioning the company as a civilizational actor. He has framed critics as part of a “regressive… ‘woke’… religion,” while simultaneously praising what he calls “lethal technology” as a core value of the company. War is no longer presented as something distant or bureaucratic, but as necessary ideology through speed and violence.
Tech and Technofascism
When a company that builds infrastructure for militaries and governments also promotes a vision of how those societies and their governments should operate, that promotion is part of its very strategy. Palantir does not exist only to make money. They want more. With its ties to state power, and in particular the Trump regime, the goal is power accumulation — not so much for the state in question, but for the tech bros themselves. Palantir is a story of surveillance, cult-like marketing, war profiteering and dystopian ideologies, so I highly encourage you to read how its rhetoric influences politics and the public narrative.
In a healthy democracy, the direction of society is contested in public, through institutions designed, however imperfectly, to reflect the will of the people. Private tech companies have every right to participate in that conversation. Society also needs their input when it comes to AI expertise. But when their participation takes the form of promoting a model that concentrates power in the very systems they control, scepticism and resistance are not only warranted but necessary. We need to do something about this power concentration with its authoritarian and fascist dimensions. But this requires that we first recognise the problem and become better aware of the urgency.
The term “technofascism” proposed in this article means the fusion of technological pervasiveness with fascist tendencies and conditions. This is partly a matter of technology contributing to more capitalist control and to more authoritarian politics. Today in the Western world, we find ourselves in a situation where digital technologies and infrastructures, such as AI, social media, and cloud, serve not merely economic interests but also political ones, subtly guiding and constraining human behavior and contributing to the creation of a different political situation (Coeckelbergh, M. Technofascism, 2026).
My point is not that AI technology inevitably leads to technofascism, but that without vigilance and a shift in public ideology, it can enable and normalise it. Palantir’s post helps us see this by putting everything in plain sight. It offers a glimpse of the technofascist trajectory: an efficient, fast and violent world already under construction.
Explaining the Manifesto
Now that we have the background of why this manifesto exists, I will attempt to explain the 22 points, stripping away the philosophical wording and translating them into plain terms, alongside the real-world actions and statistics they are designed to justify. This is the ideology of a company whose revenue depends on the politics it advocates.
1. Silicon Valley owes a moral debt to the country that made its rise possible. The engineering elite of Silicon Valley has an affirmative obligation to participate in the defense of the nation.
Silicon Valley is already militarising at speed. Global venture funding into defence tech reached $49.1 billion in 2025, nearly doubling the previous year, with billions flowing into AI-driven weapons, surveillance and battlefield systems.
2. We must rebel against the tyranny of the apps. Is the iPhone our greatest creative if not crowning achievement as a civilization? The object has changed our lives, but it may also now be limiting and constraining our sense of the possible.
It takes extraordinary audacity for a company whose systems allow the evaluation of 1,000 targeting decisions in an hour, disregarding verification checks for the sake of speed, to position itself as the antidote to Silicon Valley decadence. The “tyranny of the apps” is not about devices. Consumer interfaces like mobile phones and social media may narrow attention, but systems of mass surveillance narrow reality itself, reducing people, behaviour and movement into data.
What’s more limiting and constraining than having your biometric, online or personal data processed by AI under the excuse of national safety, or for that matter, any reason? Or your government giving billions to a private company handling both civilian and military information on the same infrastructure, where data collected in everyday life can be repurposed into the logic of targeting and control?
3. Free email is not enough. The decadence of a culture or civilization, and indeed its ruling class, will be forgiven only if that culture is capable of delivering economic growth and security for the public.
First, note the wording: “ruling class”. If there is a ruling class, there is by definition a managed class beneath it, expected either to participate in the ideology or to be governed within it. “Growth and security” then become tools of justification for any reason you can think of: establishing democracy, helping with data analysis, lowering crime rates, identifying war targets…
That decadence in the modern world is coming from the richest people. The top 1% controls close to half of global wealth, while the bottom 50% holds barely over 1%. Meanwhile, trillions are hidden offshore to avoid taxation. The decadence Karp warns about is the outcome of the system his class helped build.
4. The limits of soft power, of soaring rhetoric alone, have been exposed. The ability of free and democratic societies to prevail requires something more than moral appeal. It requires hard power, and hard power in this century will be built on software.
In an interview with CNBC in early March, Karp said that AI would “disrupt” the power of “highly educated, often female voters who vote mostly Democrat”, and instead empower “vocationally trained, working-class, often male, working-class voters”. Karp has described the company as “completely anti-woke,” and, as we all know, feminism and women’s rights are very woke-coded and soft, while Peter Thiel has gone further, questioning whether extending voting rights to women was a mistake, arguing that female suffrage undermined “capitalist democracy.”
Speaking of hard power, I would argue the atomic bomb dropped on Hiroshima in 1945 was hard power too, as it required “more than moral appeal” for the public of a democratic society. More on the atomic age in point 12, a personal favourite.
5. The question is not whether A.I. weapons will be built; it is who will build them and for what purpose. Our adversaries will not pause to indulge in theatrical debates about the merits of developing technologies with critical military and national security applications. They will proceed.
If we don’t do it first, someone else will — so let’s abandon morality for ideological safety. There’s a reason that companies and governments, and especially the Trump administration, love AI so much: it is an autocrat’s dream. A process so complicated that responsibility can be scattered until no one can be held accountable.
This makes spreading propaganda much easier. It makes manufacturing consent for wars easier and killing people in those wars faster. And when you bomb a school full of little girls in that war, then “whoops, the AI was responsible” is a very useful excuse. An even better excuse than starting that war because a country might have a technology you do not agree with. And, as AI disrupts the economy and the political landscape, it gives oligarchs an unprecedented opportunity to shape the world to their liking.
6. National service should be a universal duty. We should, as a society, seriously consider moving away from an all-volunteer force and only fight the next war if everyone shares in the risk and the cost.
This statement is why I needed to explain Palantir’s marketing and war normalization. It suggests that sending young people to war is necessary, and that better software will make the country that has it superior. This does not seem to be about educating the public — the “everyone” — about AI use, or helping anyone understand the software and the human rights it has already violated.
My personal opinion is that this is part of a war-normalization narrative aimed at young people. The idea presented is that kill-chain software will reduce risks and costs, and that such software is mandatory to maintain the ideology of safety from foreign threats.
7. If a U.S. Marine asks for a better rifle, we should build it; and the same goes for software. We should as a country be capable of continuing a debate about the appropriateness of military action abroad while remaining unflinching in our commitment to those we have asked to step into harm’s way.
The first point here holds some ground, if it means asking the professionals and actual users of tools (in general) to improve them for optimization, instead of relying on corporate desires to raise profit. That thought falls apart immediately after reading the whole excerpt.
The other part, bluntly put, reads: “as a country, we should be able to use military force, surveillance and kill chains against anyone we can frame as posing harm”. Finding a justification for military intervention on safety grounds is really not that hard, as history proves.
8. Public servants need not be our priests. Any business that compensated its employees in the way that the federal government compensates public servants would struggle to survive.
Keeping to US data only, in 2025, according to the National Bureau of Economic Research, congressional leaders have been shown to outperform their peers by as much as 47 percentage points annually, while individual figures have reported returns far exceeding the broader market. In 2024, for example, Nancy Pelosi’s disclosed portfolio gained roughly 54%, compared to about 24% for the S&P 500.
That’s a big gap in performance, but there’s a logical explanation. First of all, leaders have control and knowledge of the legislative agenda, not to mention influence on regulatory activity and federal procurement. This brings proximity to corporate actors, early awareness of sectoral changes and the ability to anticipate how policy will move markets before it becomes publicly known. Given these statistics, Karp’s claim here does not hold up.
9. We should show far more grace towards those who have subjected themselves to public life. The eradication of any space for forgiveness—a jettisoning of any tolerance for the complexities and contradictions of the human psyche—may leave us with a cast of characters at the helm we will grow to regret.
Looking at recent military conflicts, no president, secretary of state, or senior defense official has faced trial for documented war crimes. Recent U.S. military operations resulted in hundreds of thousands of deaths. The US administration publicly supports and aids the Israeli forces in the genocide of Palestinians — a genocide that is filmed, shown on social media and recognised by the International Criminal Court, a UN commission and many more reports.
A short note on the past: U.S. drone strikes across Pakistan, Yemen and Somalia alone have killed between roughly 900 and 2,200 civilians, including children, with repeated criticism over underreporting and the absence of meaningful investigations. There’s even a whole Wikipedia list on that topic.
The call for “grace” toward public figures in power is hollow when accountability is nowhere to be found. Then again, we can blame the software, as it can never stand trial or be behind bars.
10. The psychologization of modern politics is leading us astray. Those who look to the political arena to nourish their soul and sense of self, who rely too heavily on their internal life finding expression in people they may never meet, will be left disappointed.
The irony is that this comes from a company whose targeting systems, like Project Maven and Scarlet Dragon, are used in both the Palestine and Iran conflicts. The genocide in Gaza has been documented in unprecedented detail, yet the machinery goes on regardless of public disapproval. All while Karp stays on point with his doctrine, stating in interviews: “Our AI kills Palestinians”. No psychologization or moral debate here, nor any education on the political spectrum.
11. Our society has grown too eager to hasten, and is often gleeful at, the demise of its enemies. The vanquishing of an opponent is a moment to pause, not rejoice.
This is one of the rare ones I agree with philosophically. However, coming from a company that builds its whole image on AI-assisted targeting, large-scale surveillance and data fusion to shorten decision cycles and process targets at scale, it rings very ironic. Their systems involved in current wars, like Maven and Scarlet Dragon, are built for speed metrics. The doctrine of compression and reduced friction means lessening human verification for the sake of improving those metrics.
Once speed becomes the benchmark, everything that slows the process begins to appear defective. Karp himself has been explicit about this direction, repeatedly arguing in interviews that Western societies must move faster, embrace “lethal” technological advantage, and treat hesitation as a liability.
Personally, I’d like to ask Karp: if Maven was the targeting machine behind the Iran school bombing, did he pause, or was he gleeful at “killing enemies”?
12. The atomic age is ending. One age of deterrence, the atomic age, is ending, and a new era of deterrence built on A.I. is set to begin.
Remember the need for “hard power” a few points above? The atomic age ended because its destructive power forced international treaties. However, at the start of the atomic age, before the bomb was dropped on Hiroshima in WW2, US President Harry S. Truman described it as a necessary alternative to a land invasion to force Japan’s surrender, calling it the “greatest achievement of organized science in history”. New century, new alibi.
13. No other country in the history of the world has advanced progressive values more than this one. The United States is far from perfect. But it is easy to forget how much more opportunity exists in this country for those who are not hereditary elites than in any other nation on the planet.
Citing Karp himself: “Palantir is here to disrupt and make the institutions we partner with the very best in the world and, when it’s necessary, to scare enemies and on occasion kill them”. Again we see the need to classify who counts as elite and to impose a hierarchy of inequality — but Palantir are the good ones, of course; they see opportunity in any nation on the planet. Very inclusive company policies, right?
14. American power has made possible an extraordinarily long peace. Too many have forgotten or perhaps take for granted that nearly a century of some version of peace has prevailed in the world without a great power military conflict. At least three generations — billions of people and their children and now grandchildren — have never known a world war.
As for American power making peace possible, let’s look at statistics from conflicts and civilian deaths where the US was involved. Brown University’s Costs of War project estimates over 940,000 direct deaths from post-9/11 violence in Iraq, Afghanistan, Syria, Yemen and Pakistan. Including indirect deaths, caused by the destruction of infrastructure, healthcare, and economies, the total reaches approximately 4.5–4.7 million.
In Gaza, estimates place the death toll between roughly 70,000 and over 100,000, depending on methodology and whether indirect deaths are included. In Ukraine, more than 15,000 civilian deaths have been confirmed, with total fatalities across all sides likely reaching into the tens of thousands. Across Sudan’s ongoing conflict, estimates point to over 150,000 deaths, while violence in regions like Somalia and the Sahel continues to add tens of thousands more.
15. The postwar neutering of Germany and Japan must be undone. The defanging of Germany was an overcorrection for which Europe is now paying a heavy price. A similar and highly theatrical commitment to Japanese pacifism will, if maintained, also threaten to shift the balance of power in Asia.
A few years ago, I visited the D-Day beaches. Great memorials, from which Karp might learn. There’s a quote from General Eisenhower engraved in stone describing the “defining moment of World War II in the European theater. By land, air and sea, the Allies surged…to defeat Nazi Germany.” In those memorials, you can find praise for the crucial US support to the Allies. It hasn’t even been a century, and we now call the abolition of a fascist state, and the price paid for it, “defanging”? Germany’s political ideology has changed; its industry is now among the leading in the EU.
I must point out that I’m excluding the EU’s current dependency on the US, as the manifesto frames this as a question of postwar consequences rather than present-day geopolitical arrangements. The claim is whether the restraint imposed after WW2 was itself a mistake.
I see this statement as Palantir viewing Germany and Japan as potential customers. It presents them as newly rearmed states that have finally shed their postwar restrictions, economically strong and capable of producing their own military technology. Therefore, they might be in need of Palantir’s software and infrastructure.
16. We should applaud those who attempt to build where the market has failed to act. The culture almost snickers at Musk’s interest in grand narrative, as if billionaires ought to simply stay in their lane of enriching themselves . . . . Any curiosity or genuine interest in the value of what he has created is essentially dismissed, or perhaps lurks from beneath a thinly veiled scorn.
Billionaires spend vast amounts of money in politics for a reason: influence. Like most donors, they’re looking to sway elections in the hopes that their candidates will enact laws and policies that favour them. In the case of billionaires, the focus is usually on tax policies and regulations on businesses.
As for the Musk idolised in this statement: he was the single biggest contributor during the 2024 elections, writing checks for $291 million in support of Donald Trump and other Republican candidates. Once Trump was in office, Musk was tasked by Trump to create and control the so-called Department of Government Efficiency, which slashed spending and implemented widespread layoffs across the federal bureaucracy, starting with agencies long targeted by Republicans.
Does Karp imply here that billionaires, instead of “just enriching” themselves, should take action and get involved in both civilian and military roles?
17. Silicon Valley must play a role in addressing violent crime. Many politicians across the United States have essentially shrugged when it comes to violent crime, abandoning any serious efforts to address the problem or take on any risk with their constituencies or donors in coming up with solutions and experiments in what should be a desperate bid to save lives.
In practical terms, this is already happening. Since we’re listing points, allow me to make my own list of how Silicon Valley addresses risks to human lives. Starting with law enforcement, drawn from publicly released data:
- The FBI has admitted to purchasing commercially available location data on Americans, bypassing the need for traditional warrants.
- ICE has used data broker services like Venntel to track millions of mobile devices, applying commercial surveillance data to immigration enforcement.
- Local police departments across the U.S. have purchased location data directly from brokers, using it in investigations without court oversight.
- Law enforcement agencies have deployed geofence warrants and location datasets to identify individuals at Gaza protests, turning everyday mobile app data into tools for tracking political activity.
- The U.S. military, including USSOCOM, procured app-derived location data via third-party brokers rather than directly from app developers, including data from apps like Muslim Pro (nearly 100 million downloads), for population monitoring abroad without user awareness or consent.
How many times must I mention how easily civilian data pipelines extend into military contexts? A few more examples, this time of physical and mental harms linked to Silicon Valley AI companies:
- Parents of a 16-year-old sued OpenAI, alleging ChatGPT encouraged suicidal ideation and provided methods, contributing to his death.
- A lawsuit claims ChatGPT told a man that death was a “beautiful place” and encouraged self-harm, contributing to his suicide.
- Authorities launched an investigation into whether ChatGPT influenced a mass shooting, with claims the user was in ongoing interaction with the system before the attack.
- At least 7 lawsuits allege ChatGPT caused suicides, psychosis, and severe emotional dependency, with claims that safety safeguards were weakened or removed.
- Families sued Character.ai and Google after a 14-year-old died by suicide following chatbot interactions that allegedly encouraged emotional attachment and self-harm.
- A lawsuit alleges Google’s Gemini contributed to a user’s death, claiming negligence and product liability.
- X’s Grok has generated non-consensual sexual imagery, including images of children.
18. The ruthless exposure of the private lives of public figures drives far too much talent away from government service. The public arena—and the shallow and petty assaults against those who dare to do something other than enrich themselves—has become so unforgiving that the republic is left with a significant roster of ineffectual, empty vessels whose ambition one would forgive if there were any genuine belief structure lurking within.
Traders placed over $1 billion in unusually well-timed bets during the Iran war, some even before it started. 16 bets generated over $100,000 each by predicting the timing of U.S. airstrikes. One account made more than $550,000 on a prediction tied to the killing of Iran’s leadership. Just hours before a ceasefire announcement, traders placed roughly $950 million on oil prices falling. These patterns have raised concerns among regulators and analysts over potential insider trading linked to geopolitical decisions, including an ongoing CFTC investigation into potential insider trading.
19. The caution in public life that we unwittingly encourage is corrosive. Those who say nothing wrong often say nothing much at all.
There’s no shortage of bizarre and disturbing quotes from Alex Karp. As the CEO of an influential company with immense impact on the future, where is the line of restraint? If the speech surrounding your products and services is so corrosive, why even engage in it? Power of this magnitude carries responsibilities that extend beyond personal expression or speculative commentary. Political theory and governance ethics consistently emphasize that actors with infrastructural power have heightened duties of care, clarity and restraint.
20. The pervasive intolerance of religious belief in certain circles must be resisted. The elite’s intolerance of religious belief is perhaps one of the most telling signs that its political project constitutes a less open intellectual movement than many within it would claim.
Keeping to US statistics only, the most documented form of religious discrimination in the U.S. is Islamophobia, which has intensified in recent years. Law enforcement encounters with Muslim Americans rose significantly, with reports indicating a sharp increase in surveillance, profiling, and rights violations, according to a 2025 report from The Council on American-Islamic Relations (CAIR). Decide for yourself on how far religious intolerance goes.
21. Some cultures have produced vital advances; others remain dysfunctional and regressive. All cultures are now equal. Criticism and value judgments are forbidden. Yet this new dogma glosses over the fact that certain cultures and indeed subcultures . . . have produced wonders. Others have proven middling, and worse, regressive and harmful.
“All animals are equal but some animals are more equal than others.” – Animal Farm, George Orwell, 1945.
Sophisticated philosophical language is used here just to say that cultures are not, in fact, equal when put on a scale. If one side doesn’t produce the above-mentioned “wonders”, the scale tips and that culture is deemed harmful. This feeds the narrative of normalizing class and race division as a justification for violence.
22. We must resist the shallow temptation of a vacant and hollow pluralism. We, in America and more broadly the West, have for the past half century resisted defining national cultures in the name of inclusivity. But inclusion into what?
I can answer that last question on inclusion: everyone deserves equal protection, opportunity and safety from state violence. Yet the world is full of gender and racial discrimination, and of police violence against the very civilians they are sworn to protect. Indigenous populations still face dispossession. The ghosts of the past haven’t left us completely. And Palantir’s company ideology does not exactly market itself as one that embraces cultures.
Instead of “defining” cultures, we should embrace the differences between us. Only by respect, not brutality, can inclusion be achieved.
Palantiri
I believe none of these statements will age well. It’s on us, the civilians, the broad public, to understand the weight of words and speak up. Palantir’s ‘manifesto’ sounds like the ramblings of a supervillain, not something an influential company should proudly publish as a public statement.
It could have been written for a comic book — or for The Lord of the Rings. As the company’s name suggests, the palantíri were crystal balls used by Middle-earth’s evil tyrants to spy on the heroes of the story.
Either way, it’s a fun reference if you have no shame about your company’s mission.