I. The Gun on the Mantel
Chekhov’s rule of storytelling says if a gun appears in the first act, it must be fired by the third. In our reality, that gun is data. And if it exists—it will be misused.
That’s not dystopian paranoia. It’s a pattern. It’s history. It’s happening.
Right now, the U.S. government is working with a private company called Palantir Technologies to build one of the most powerful surveillance infrastructures in the world. Seeded with CIA venture money through In-Q-Tel and named after the seeing-stones of The Lord of the Rings, Palantir integrates massive datasets from across civilian and military systems. It is building AI tools to guide government decision-making in law enforcement, immigration, national security, and beyond.
Its namesake—the Palantír—is a fitting symbol. A device that lets its user see across vast distances, into other minds, and into possible futures. But in Tolkien’s world, those who used it were rarely enlightened. They were corrupted. Deceived. Bent toward despair or domination.
What you see in the stone may be true. But it is not the whole truth. And the more you rely on it, the more it shapes you.
So too with Palantir.
II. The Stone That Sees—and Deceives
In Tolkien’s mythos, the Palantír is not a tool of lies—it is a tool of half-truths. It shows real things, but without context. It reveals glimpses, but not meaning. And worst of all, it can be dominated by a stronger will.
Denethor, Steward of Gondor, looks into the stone and sees the armies of Mordor. He despairs. He gives up hope. He tries to burn his own son alive.
Saruman, once the wisest of wizards, uses the stone to see the coming war—and becomes convinced the only path is to join the enemy. He rationalizes betrayal as wisdom. Power as necessity.
Gandalf? He refuses to use the stone. Aragorn uses it once—with restraint, and only to confront Sauron directly. The lesson is clear: the more you gaze into the Palantír, the more it gazes into you.
These are not just fantasy metaphors. They are warnings.
III. The Twofold Danger of Data
"What you collect can be used to control. What you analyze can be used to deceive. What you selectively reveal can be used to rule."
Palantir is not just building dashboards. It is engineering power. And the danger of surveillance data is not one-dimensional. It has two faces:
1. Identification of Targets
Palantir’s platforms allow governments to track individuals in real time across multiple data streams: DMV records, social media, hospital visits, school enrollments, financial activity, and more. It maps relationships. Flags behavior. Predicts threats.
Who becomes a target? Anyone the system decides is inconvenient.
Immigrants have already been hunted with this tech. Police departments have used it to justify preemptive surveillance in Black and brown communities. It is not speculative. It is operational.
2. Manipulation of the Masses
This is the subtler, deeper danger.
The same data that lets Palantir find individuals also lets it model mass behavior. Track sentiment. Predict emotional triggers. Tailor narratives. Feed curated truths into media pipelines. Shape elections, consumer behavior, protest dynamics.
And perhaps most dangerously: those who control this system can selectively leak or conceal data to construct public reality.
Transparency becomes a weapon. A leak becomes a psy-op. The system doesn’t just see the world. It edits it.
IV. If the Data Exists, It Will Be Misused
This is not a theoretical warning. It’s a historical pattern.
COINTELPRO surveilled civil rights leaders.
The NSA spied on millions of Americans without warrants.
ICE used Palantir to track and deport immigrants based on utility bills and license plate scans.
Predictive policing tools flagged Black neighborhoods for increased patrols—because of biased data.
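The feedback loop behind that last example can be made concrete. Here is a minimal sketch, with hypothetical numbers: two districts with identical true crime rates but biased historical records. Patrols go wherever the data says crime is, and only patrolled crime gets recorded, so the initial bias compounds itself.

```python
# Hypothetical illustration of the predictive-policing feedback loop.
# Both districts have the SAME underlying rate of incidents per day;
# only the historical records differ.
true_rate = {"A": 5, "B": 5}    # identical real-world incidents/day
recorded = {"A": 60, "B": 40}   # biased starting data

for day in range(30):
    # Greedy dispatch: send patrols to the district the data flags.
    target = max(recorded, key=recorded.get)
    # Only patrolled crime is observed and recorded.
    recorded[target] += true_rate[target]

print(recorded)  # {'A': 210, 'B': 40}
```

After thirty days the model "confirms" that district A is high-crime and district B is quiet, even though nothing about the underlying world distinguishes them. The bias was in the inputs; the algorithm merely laundered it into a prediction.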
The surveillance state does not stay in its lane. It metastasizes.
The gun is already on the wall. The system is already humming. All that remains is for someone to point it in a new direction.
V. The Illusion of Decision Support
Palantir markets itself as a decision support system. But once an algorithm tells you what to do, it’s no longer support—it’s command.
A model says this person is a threat.
A dashboard says this neighborhood is high risk.
A heat map says this is where the next protest will start.
No one questions the machine. And if it’s wrong? No one is responsible.
This is how morality is automated. This is how injustice becomes procedural.
VI. What Happens When the Stone Changes Hands?
You might support the people in charge today. Maybe you think they’ll use this power for good.
But this system will not be dismantled when the next administration takes office. It will be inherited. Expanded. Refined.
What happens when someone you fear sits at the console?
You don’t need to be the villain to build the villain’s tools. You just need to believe you’re in control.
But the Palantír doesn’t change depending on who holds it. It shapes them.
VII. What We Must Do
Expose the myth of neutral data. All data reflects values, biases, and intent.
Resist the normalization of predictive surveillance. This is not safety. It is preemptive punishment.
Demand legislative limits and real oversight of AI tools used by government agencies.
Support FOIA requests, investigative journalism, and transparency advocates.
Use story, myth, and metaphor to wake people up. People fear the Eye more than a spreadsheet.
VIII. The Eye Opens
"The Stones were meant to serve, not to rule." – Gandalf
The seeing-stone is already glowing. The weapon is loaded. The script is in motion.
If the data exists, it will be misused.
And if we don’t act now, we may all find ourselves in the third act… staring down the barrel of something we helped build.
Get more at taojoannes.substack.com