Who is Palantir Technologies?

The company that named itself after an all-seeing magic stone and has been acting like it ever since.
If you've been following along, you already know about Flock Safety — the license plate surveillance company quietly spreading across American cities like a particularly well-funded vine. And you've probably read why that matters.
Today we're going bigger. Much bigger.
Palantir Technologies is to Flock Safety roughly what the Death Star is to a blaster (or whatever they call the pew-pew laser guns).
One is a tool. The other is infrastructure.
And if you want to understand where the erosion of the Fourth Amendment is actually headed (not just the neighborhood-level version, but the governmental, military, healthcare-system, immigration-apparatus version) Palantir is the company you need to know.
Don't worry. They named themselves after a Lord of the Rings artifact: the seeing stones the story's big bad turns to his own ends. I'm going to have some fun with this.
The Seeing Stone: A Brief Origin Story
Founded in May 2003, Palantir emerged directly from the wreckage of September 11th. Peter Thiel — PayPal co-founder, early Facebook investor, professional apocalypse bringer — had watched the intelligence community fail to "connect the dots" before the attacks.
His theory: the government wasn't short on data. It was short on software that could make sense of it.
He was completely correct. His solution, however, was like using a shotgun to kill a housefly.
The name "Palantir" is a deliberate choice. In Tolkien's mythology, the palantíri are seeing stones — dark crystal orbs that let their holders observe distant places, communicate across vast distances, and see things no ordinary person could see. They're also, in the actual books, deeply cursed objects that corrupt most of the people who use them and were ultimately weaponized by Sauron to deceive the leaders of Men.
Tolkien fans, you are welcome to make of that what you will.
Thiel seeded the company with $30 million of his own money and brought in a small crew of Stanford engineers and PayPal veterans. Early funding came from In-Q-Tel (the venture capital arm of the CIA...more on that later) which provided something more valuable than money:
direct access to intelligence analysts and real classified datasets.
By the time Palantir was ready for the commercial market, their product had already been stress-tested against actual national security data in ways no competitor could replicate. (Britannica)
That's not a shady origin story, exactly. It's more of an origin story where the morality is genuinely complicated from the jump, and it never really stops being complicated.
The Money (It's a Lot of Money)
Palantir is not a startup. They are not scrappy. In 2025, the company crossed $4.475 billion in annual revenue, with a growth rate of 56% year-over-year.
U.S. commercial revenue grew 137% in Q4 2025 alone. They are now members of both the S&P 500 and the Nasdaq-100. For 2026, the company is guiding toward approximately $7.2 billion in revenue. (Palantir Investor Relations)
Their market valuation sits at extreme multiples (sometimes 60 to 80 times forward revenue) which analysts note is either the sign of an extraordinary growth company or a bubble waiting to find its pin.
The company also holds about $7.2 billion in cash equivalents on its balance sheet, which suggests they are not particularly worried about the valuation debate.
The business model has two main sides:
Government — contracts with intelligence agencies, defense departments, military branches, and immigration enforcement going back to the company's founding. This is where the name was made. It is also where the controversies live.
Commercial — supply chain management, predictive maintenance, healthcare data integration. This is where the growth is accelerating fastest, and it's the side of the business that should probably get more attention than it does.
What the Software Actually Does

Palantir's core product is frequently misunderstood as a "fancy database" or a surveillance camera system (that's Flock's lane). What they actually build is something more abstract and, in some ways, more significant.
The center of the architecture is something called the Palantir Ontology.
Traditional databases organize information into rows and columns — flat tables of data. The Ontology instead maps data to things: a person, a vehicle, a shipping container, a bank transaction, a troop position, a hospital patient.
It creates a semantic layer on top of an organization's existing data infrastructure, letting non-technical users ask questions in plain language and get answers that pull from dozens of connected systems simultaneously.
"Which aircraft need maintenance in the next 48 hours?" "What vehicles passed this intersection between 2 and 4 AM?" "Which patients in this region are at elevated risk for readmission?" The system finds the answer across whatever data sources have been connected to it.
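To make the idea concrete, here's a toy sketch of what an object-centric semantic layer does: map rows from separate source systems onto typed objects so that one question can span all of them. Every object type, field, and data value here is hypothetical; this illustrates the concept, not Palantir's actual architecture.

```python
# Illustrative only: a toy "ontology" that maps raw records from two
# separate sources onto typed objects, so one question spans both.
# All names and data are hypothetical.

from dataclasses import dataclass

@dataclass
class Aircraft:
    tail_number: str
    hours_since_service: int

@dataclass
class WorkOrder:
    tail_number: str
    status: str  # "open" or "closed"

# Two "source systems": a flight-hours feed and a maintenance system.
flight_feed = [
    Aircraft("N101", 48), Aircraft("N202", 12), Aircraft("N303", 55),
]
maintenance_system = [
    WorkOrder("N303", "open"),
]

def needs_maintenance(threshold_hours: int):
    """Answer 'which aircraft need maintenance?' by joining
    objects across both sources, not by querying either table alone."""
    open_orders = {w.tail_number for w in maintenance_system if w.status == "open"}
    return [
        a.tail_number
        for a in flight_feed
        if a.hours_since_service >= threshold_hours or a.tail_number in open_orders
    ]

print(needs_maintenance(40))  # N101 (hours) and N303 (hours + open order)
```

The point isn't the ten-line join; it's that once every source is mapped onto shared objects, the join is the *easy* part, for any question anyone thinks to ask.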
Their primary platforms: (Palantir Product Documentation)
Gotham
Launched in 2008, Gotham is the defense and intelligence product. Investigative link analysis: connect disparate data streams to find patterns, relationships, and targets. Most commonly cited in counter-terrorism work, including the (unofficial, never confirmed) hunt for Osama bin Laden, where the platform allegedly integrated millions of intelligence documents to identify the courier who led the CIA to the Abbottabad compound.
Foundry
Launched in 2015, Foundry is the commercial enterprise product. Supply chain optimization, predictive maintenance, healthcare data management. Airbus uses it to predict part failures across global airline fleets. The UK's National Health Service uses it to manage patient data across the entire English health system.
Apollo
Apollo is the deployment and infrastructure layer. It pushes updates across every environment Palantir operates in, including disconnected edge devices like drones in active combat zones. It's the part of the stack you never see.
AIP (Artificial Intelligence Platform)
Launched in 2023, AIP is the fastest-growing product by a significant margin. It integrates large language models directly into Gotham and Foundry, allowing organizations to build AI agents that operate within the Ontology. You can run natural language queries against sensitive organizational data and have AI act on the results with real permissions.
The architecture is designed to be sticky. Once an organization has connected its data systems to the Ontology, rebuilt its workflows around it, and trained its people on it, walking away isn't a software switch...it's a multi-year organizational change project.
Palantir knows this. It's not an accident of engineering. It's a feature.
The Government Relationships (Here's Where It Gets Real)
Palantir's government work isn't just a revenue stream. It's the company's origin, its identity, and its most persistent source of controversy.
ICE
Palantir has held contracts with U.S. Immigration and Customs Enforcement since at least 2011. Their software — deployed under contract names like FALCON and HSI Investigate — connects ICE agents to a web of databases including DHS records, commercial data, and law enforcement systems. The ACLU, Amnesty International, and the Electronic Frontier Foundation have argued that this infrastructure directly enabled mass raid operations and family separations during the Trump administration's 2018 zero-tolerance period. In 2025, Palantir was awarded a $30 million no-bid contract to build ImmigrationOS — an AI platform that identifies noncitizens and tracks deportation logistics. (Fortune)
Palantir's standard response to all of this:
the company doesn't make policy. It builds tools. Humans make the decisions. And every data access is recorded in an immutable audit log, which creates accountability that informal systems don't provide.
That argument is not wrong. It is also not complete. We'll come back to this.
The U.S. Military
Palantir holds major contracts with the Army, Air Force, and Special Operations Command, and is actively competing for Pentagon AI initiatives including the Maven Smart System. In 2025, the UN Special Rapporteur on human rights in the Occupied Palestinian Territories named Palantir as an enabler of what the rapporteur described as the unlawful use of force, citing the company's work with the Israeli Defense Forces. Palantir disputed the characterization. (ACLU, EFF, Amnesty International reporting)
The NHS
In the UK, Palantir won a £330 million contract to build the NHS Federated Data Platform — a unified data infrastructure connecting health records across the English health system.
Protests erupted. Patient advocacy groups raised pointed concerns about an American defense contractor with deep ties to U.S. intelligence agencies managing the health records of 56 million people.
The UK government proceeded with the contract anyway. (The Guardian)
In each of these cases, the reasonable defense is real:
the software creates accountability, enables faster decisions, and in some instances, saves lives.
The reasonable concern is also real:
the software creates accountability for the institution — not necessarily for the people being analyzed inside the system.
The CEO Who Says the Quiet Part Very, Very Loudly
Most tech company CEOs communicate in a register of careful, optimistic vagueness.
Alex Karp, Palantir's CEO since its founding, communicates like a man who has decided that opacity is for lesser companies.
Karp holds a JD from Stanford and a PhD in social theory from Goethe University Frankfurt.
He has publicly acknowledged that Palantir's software is used to kill people. He has argued that Silicon Valley has a moral obligation to support Western military interests — and that engineers who won't work on defense projects are historically ungrateful given that the entire tech industry was built on a foundation of state investment and defense-funded research.
He has positioned Palantir explicitly as the "anti-woke" software company in a sector that prides itself on progressive branding.
There's something almost refreshing about a tech CEO who says what the product does instead of wrapping it in frictionless language about "connecting communities" and "making the world smaller."
The man is honest about the stakes.
What's worth noting is that Karp's bluntness serves a strategic function. By stating the most alarming thing explicitly —
yes, this is used to kill people, that's the point, and here's the moral framework that justifies it
— he preempts more uncomfortable revelations. He's already said the worst thing. Everything downstream operates within a framework he constructed.
The Manifesto
Two weeks ago, Palantir posted what they called a "brief" 22-point summary of Karp's book The Technological Republic: Hard Power, Soft Belief, and the Future of the West to the company's X account. It racked up 32 million views. (Fortune)
A few of the highlights:
Silicon Valley owes a "moral debt" to the country that made its rise possible. "Free email is not enough."
The atomic age is ending. The next era of deterrence will be built on AI weapons. American tech companies should build them without pause for "theatrical debates" about ethics — because adversaries "will not pause to indulge" in those debates.
National service should be a "universal duty." The manifesto hints strongly at mandatory military service.
The postwar disarmament of Germany and Japan was an overcorrection that should be revisited.
And the one that really got people talking: "certain cultures and indeed subcultures...have produced wonders. Others have proven middling, and worse, regressive and harmful."
Victoria Collins, Member of Parliament in the United Kingdom, described the manifesto as "the ramblings of a supervillain," adding that "a company that has such naked ideological motivations and lack of respect for democratic rule of law should be nowhere near our public services."
Dave Karpf, writing for TechPolicy.Press, noted that the message of both the book and the manifesto is essentially that Palantir wants to be the weapons manufacturer of the next century and that the government should spend an exceptional amount of money on Palantir products to make that happen.
Eliot Higgins of Bellingcat offered a more pointed framing:
"Palantir sells operational software to defense, intelligence, immigration & police agencies. These 22 points aren't philosophy floating in space, they're the public ideology of a company whose revenue depends on the politics it's advocating."
This is not an abstract question of corporate values.
The ideology in the manifesto is the business model.
A company that profits from expanded surveillance, military AI contracts, and immigration enforcement infrastructure is publishing a document arguing that these things should expand. That's called vertical integration of ideology and revenue. It is at least consistent.
I'll give them that.
The Audit Log Argument
Palantir's most consistent response to civil liberties concerns is architectural: the software records everything.
Every query run against the Ontology. Every analyst who accessed a dataset. Every modification, every export, every flag. The audit trail is indelible, granular, and theoretically enforceable. Rogue analysts can't silently abuse the system. Investigators can reconstruct exactly how a decision was made, who made it, and what data they used.
This is a genuinely meaningful feature. It is not nothing.
Here is the problem with the audit log argument:
it assumes that the institution using the software is interested in auditing itself.
An audit log tells you what happened. It doesn't guarantee that anyone will look at it, act on it, or face consequences for what they find. The log exists inside a system. The system answers to institutional leadership. Institutional leadership answers to political authority.
Whether abuse is recordable is a different question from whether abuse is accountable.
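For a sense of what "indelible" can mean architecturally, here is a minimal hash-chain sketch (a hypothetical design, not Palantir's actual implementation): each entry commits to the one before it, so editing history breaks every later hash. But notice where the accountability actually lives.

```python
# A minimal tamper-evident audit log using a hash chain.
# Hypothetical design for illustration, not Palantir's implementation.

import hashlib
import json

class AuditLog:
    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value

    def record(self, analyst: str, action: str, dataset: str):
        """Append an entry that commits to the previous entry's hash."""
        entry = {"analyst": analyst, "action": action,
                 "dataset": dataset, "prev": self._last_hash}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self._last_hash = digest
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; any edited entry breaks every later link."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev"] != prev:
                return False
            if hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("analyst_7", "query", "vehicle_sightings")
log.record("analyst_7", "export", "vehicle_sightings")
print(log.verify())                   # True: the chain is intact
log.entries[0]["analyst"] = "nobody"  # tamper with history
print(log.verify())                   # False: detectable, IF someone checks
```

The chain makes tampering *detectable*. Nothing in the design makes anyone run `verify()`, read the result, or act on it. That gap is the entire argument above.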
Where Flock Safety Fits Into All This
Here's why this blog series matters as a unit:
Flock Safety and Palantir are not separate stories. They are different layers of the same architecture.
Flock operates at the street level:
cameras on utility poles capturing every vehicle, building a national movement database that any connected law enforcement agency can query retroactively.
Palantir operates at the institutional level:
an Ontology that connects disparate government databases into a unified, queryable intelligence layer.
When Palantir's platforms integrate with municipal license plate reader networks, the "time machine" capability Flock enables doesn't just answer questions like "where was this car last week?" It can answer much more complex questions:
Who does this vehicle's owner associate with? What facilities do they visit? What patterns in their movement correlate with other flagged individuals?
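As a toy illustration of what that layering means (all data and names here are fabricated): once plate reads and an owner/flag database sit behind one query interface, the cross-layer question is a trivial join.

```python
# Purely illustrative toy data: what a cross-layer query looks like once
# street-level sightings and institutional records share one interface.

# Street-level layer: license plate reads as (plate, location) pairs.
sightings = [
    ("ABC123", "clinic_parking"),
    ("ABC123", "warehouse_9"),
    ("XYZ789", "warehouse_9"),
]

# Institutional layer: registered owners and flagged individuals.
owners = {"ABC123": "alice", "XYZ789": "bob"}
flagged = {"bob"}

def co_located_with_flagged(plate: str):
    """Flagged people whose vehicles appeared anywhere this plate did."""
    places = {loc for p, loc in sightings if p == plate}
    hits = {
        owners[p]
        for p, loc in sightings
        if loc in places and p != plate and owners[p] in flagged
    }
    return sorted(hits)

print(co_located_with_flagged("ABC123"))  # ['bob']: a correlation, not proof
```

Three lines of data and a set intersection turn "where was this car?" into "who does this person associate with?" That's the difference a unified layer makes.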
The Bigger Picture (The Part That Actually Concerns Me)

I want to be direct about something, because this is where I personally land after researching all of this.
I am not anti-technology.
I run local AI. I self-host my own infrastructure. I use AI tools as collaborators and think they are extraordinary when used intentionally.
The problem is not technology. The problem is who controls it, who benefits from it, and whether the people living inside these systems have any meaningful say in how they're governed.
What Palantir represents — more than any specific controversy — is the consolidation of extremely powerful infrastructure into the hands of a small number of ideologically motivated actors, with contracts structured to be nearly impossible to exit, expanding into healthcare, military operations, immigration enforcement, and commercial enterprise simultaneously.
Their commercial growth is now outpacing government revenue, which means the same Ontology that connects ICE databases is moving into hospital systems, manufacturing plants, and financial institutions.
The manifesto argues that the question is not whether AI weapons will be built, but who will build them and for what purpose.
That framing, applied to Palantir's broader expansion, is worth turning around:
the question is not whether AI-powered data infrastructure will connect your health records, your movement data, and your financial activity — it's who will build it, who will own it, and what rules will govern it when the administration changes.
A tech-illiterate electorate doesn't know to ask these questions. A regulatory environment that's been systematically weakened doesn't have tools to answer them. And a CEO publishing a manifesto that calls ethical debates "theatrical" is not making a philosophical argument...he's lobbying to remove the guardrails that would slow his company down, while adding guardrails that stop the competition.
That's the enclosure. And it's moving fast.
So What Do You Do With This?
Nothing dramatic. Not yet.
Start by knowing who they are. Read their government contracts (many are publicly available via FOIA, though the most relevant ones are heavily redacted). Watch what happens with ImmigrationOS. Pay attention to which cities sign Palantir contracts alongside Flock deployments. Look up whether your hospital system is on the NHS Foundry model — or heading there.
The architecture of informational power is being built quietly, in procurement documents and government contracts, by a company that has at least been honest about what it believes. The least we can do is take them at their word, understand what they're building, and decide — with clear eyes — whether that's the world we want to live in.
Check out my other Block Flock GR blog posts and the Flock Starter page


Or you can subscribe to my blog for more updates. I have more ideas and plans than I have time to build, but I'd like to think I'm putting out decent work.
Sources & Further Reading
- Palantir Technologies — Britannica Money
- Palantir Wikipedia
- Palantir Investor Relations
- Palantir's 22-Point Manifesto — TechCrunch
- Manifesto Analysis — TechPolicy.Press
- Fortune: Palantir Manifesto Coverage
- Guardian: UK MPs React to Palantir Manifesto
- Euronews: "Ramblings of a Supervillain" Coverage
- Palantir Product Documentation
- ACLU on Palantir and ICE
- EFF on Predictive Policing
- Amnesty International Reporting
- Palantir Forward Deployed Engineering — Medium
- Deseret News: The Tech Lords Have Plans For Us

