
AI, AR, Fake Barns, and the Ethics of War

Updated: Mar 10

“Artificial Intelligence, Augmented Reality, Fake Barns, and the Ethics of War”


Image: A screenshot taken on my Meta Quest 2 of the Horizon Worlds VR game Action Land - Farm. You can enter this barn in the game; it's not fake. But is it real?


Exploring the Intersection of Technology and Ethics


In the Black Mirror episode “Men Against Fire,” we see futuristic soldiers battling against the so-called “roaches.” These are human-like creatures that emit disturbing insect sounds when encountered. A virus in their blood poses a threat to humanity’s future.


The soldiers agree to a brain implant called “mass.” The device essentially turns their brains into smartphones: messages and commands pop up in their visual field, reminiscent of Tony Stark’s glasses from Spider-Man: Far From Home: E.D.I.T.H. (“Even Dead, I’m The Hero,” Tony’s message informs Peter). We don’t yet have implants that advanced, but we already have AR smart glasses from Meta (formerly Facebook) and Ray-Ban: https://amzn.to/4oB8oEI.


The Reality of Combat


On his first day in live combat, rookie soldier “Stripe” dreams of his presumed wife waiting for him at home. He encounters a family of “roaches” and kills them. One of the creatures screams as Stripe repeatedly stabs it. Afterward, he finds a device resembling a flashlight that blinks in his face before he rejoins his colleagues.


Stripe returns to base, worried about his mass implant. However, the medical officer and psychologist, Dr. Arquette (played by Michael Kelly of House of Cards), reassures him that everything is fine. Stripe expresses no remorse for the creatures he killed and agrees he would do it again. “Okay, then why are you here?” Arquette laughs. “You did a big thing. You should be proud of yourself.” He then mentions ensuring Stripe gets a “really good sleep,” and we see him on his computer, presumably writing a prescription. But psychologists don’t write prescriptions. Strange.


Stripe dreams of his wife again, but this time, it’s a sexual experience. The “dream” glitches, and multiple images of his wife enter the bedroom on a loop, alongside various stages of their encounter.


Frightened by this glitching dream, Stripe wakes up in the middle of the night. He sees the barracks of sleeping soldiers, all rhythmically moving their fingers, as if each is experiencing the same dream. This raises suspicions about whether anyone is waiting for Stripe at home. We begin to question the extent to which the “mass” implant system does more than relay messages.


The Confusion of Reality


The next morning, Stripe and his team go to an apartment complex to eliminate straggler roaches. His mass implant continues to glitch, allowing him to smell things like grass for the first time in years. Some soldiers are killed, and Stripe enters the building with another soldier.


Inside, Stripe initially sees what he thinks is a roach but discovers it’s a woman hiding from gunfire. Assuming she’s a citizen, he lets her go, only to see her shot by his colleague moments later. This moment sparks confusion: was that woman a roach? Why did she look “normal”?


Confused, Stripe encounters another woman and her two children. He reassures her he won’t hurt them, but then sees his colleague approaching to kill them. Stripe incapacitates his colleague and flees with the “roaches.”


Once they find shelter, we get an exposition scene with the roach. When Stripe wakes up, the roach asks:


Roach: “You see me as I am?”

Stripe: “Of course I see you.”

Roach: “You don’t see roach?”

Stripe: “You ain’t a roach. Roaches is all…”

Roach: “Fucked up?”

Stripe: “Roaches don’t speak.”

Roach: “You just can’t hear us.”

Stripe: “What the fuck are you talking about?”

Roach: “Your implants. Your army implants…”

Stripe: “The Mass system?”

Roach: [Nods] “They put it in your head to help you fight, and when it works, you see us as something other.”


She explains that the flashlight device Stripe saw transmits a virus to the Mass system, causing it to malfunction. This allows soldiers to experience reality as their bodies naturally would without the implants.


Stripe doubts himself, expressing how hideous the roaches appear and how he has seen them firsthand.


Stripe: “No, they’re monsters. I’ve seen them!”

Roach: “The implant made you see this.”


Eventually, Stripe is captured, and the “roaches” are killed. We see another scene with Dr. Arquette, who admits that the Mass implant alters the appearance and sound of the roaches. Their blood contains “impurities” like “cancer” and “substandard IQ.” They cannot be allowed to breed with citizens. The fact that they look and sound just like regular citizens makes them dangerous.


Dr. Arquette: “Humans, you know, we give ourselves a bad rep, but we’re genuinely empathetic as a species. I mean, we don’t actually really want to kill each other. [Laughs.] Which is a good thing, until your future depends on wiping out the enemy. …”


He explains that historically, most soldiers didn’t even fire their weapons. In World War II, only 15-20% of men would pull the trigger. This raises questions about the military's conditioning and the ethical implications of such technology.


The Ethical Dilemma


Dr. Arquette states that the Mass system is the ultimate military weapon. It’s easier to pull the trigger when you’re aiming at the bogeyman.


(I looked into whether those statistics are true, and they’re mostly false or at least misleadingly phrased: https://www.reddit.com/r/AskHistorians/comments/1lgdiej/i_have_heard_that_only_20_of_soldiers_in_the/.)


He plays Stripe’s consent tape, showing him agreeing to the Mass system with his thumbprint and verbally confirming he won’t remember doing so. “Yo, that’s crazy!” the younger Stripe exclaims.


Arquette then forces Stripe to relive the moment he killed the roach on a loop. Now, the roach appears as a human, pleading for his life and family instead of screaming in terror. Stripe screams to make it stop, and Arquette asks if he wants to reactivate the implant.


The screen shifts to an older, decorated Stripe returning home. From his point of view, he sees a beautiful house and the woman from his dreams walking towards him. The camera pans out to reveal a dilapidated house, with Stripe standing in front of it, alone.


The Philosophical Implications


While Black Mirror offers a gripping narrative, my aim here is to explore an epistemic-ethical problem raised by this episode that we are much closer to in 2025 than most people realize. We’re led to believe this episode takes place hundreds of years in the future, but we are already approaching that reality. Soldiers may soon find themselves in similar epistemic-ethical positions, if they aren’t already using this technology.


In the late 19th century, W. K. Clifford argued in “The Ethics of Belief” (1877) that it is always wrong to believe anything on insufficient evidence. I contend that soldiers using AI- or AR-enhanced glasses cannot be in a sufficient epistemic position to act morally. This shifts the weight of moral responsibility onto the government, which may be by design. However, this post focuses on the ethics of individual soldiers using AR glasses, like Meta’s Ray-Ban AR glasses or the upcoming Snapchat AR glasses.


The epistemic and ethical problem with these glasses mirrors the situation in the Black Mirror episode. Soldiers willingly wear devices that alter their vision. They cannot see the code that modifies their perception, making it impossible to trust that they’re seeing reality. If one cannot confirm the reality of their information, they cannot ethically act on it. Thus, seeing an enemy combatant through AI-enhanced AR glasses is not enough evidence to conclude that there is indeed an enemy present or that it would be morally correct to pull the trigger.
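The predicament can be made concrete with a toy sketch. The code below is entirely hypothetical (it is not any real AR SDK, and the function and policy names are my own inventions for illustration): the wearer receives only the rendered label, and a faithful device is indistinguishable from a tampered one from the inside.

```python
# Hypothetical illustration only: a wearer sees the rendered overlay,
# never the policy that produced it.
from dataclasses import dataclass


@dataclass
class Detection:
    ground_truth: str          # what an unaided observer would see
    position: tuple


def render_overlay(detection: Detection, hidden_policy: dict) -> str:
    """Return the label drawn in the wearer's field of view.

    The policy table is opaque to the wearer: the same ground truth
    can be rendered faithfully or silently swapped for something else.
    """
    return hidden_policy.get(detection.ground_truth, detection.ground_truth)


faithful = {}                                   # render everything as it is
tampered = {"civilian": "enemy combatant"}      # a "Mass"-style rewrite

person = Detection(ground_truth="civilian", position=(12.0, 3.5))

print(render_overlay(person, faithful))   # civilian
print(render_overlay(person, tampered))   # enemy combatant
```

The point of the sketch is that nothing in the rendered output distinguishes the two devices; verifying which policy is running requires access the wearer does not have.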


The Fake Barn Analogy


This epistemological point draws from Alvin Goldman’s famous 1976 article, “Discrimination and Perceptual Knowledge.” Goldman describes an example involving fake barns and real barns in a fictional place called “Fake Barn County.”


Fake Barn County is dotted with fake barns: one-sided facades like those built for movie sets. Goldman's point (he credits the example to Carl Ginet) is that the standards for knowledge can shift with the situation or epistemic environment.


Normally, seeing a barn while driving in the countryside is strong evidence of an actual barn’s existence. However, in Fake Barn County, a visual appearance of a barn is not reliable evidence of a real barn. The existence of fake barns undermines the trustworthiness of visual perception.


This analogy applies to the situation with AI-enhanced AR glasses, including headsets like the Meta Quest 3 and Meta Quest 3S. While citizens can easily remove their glasses to confirm their surroundings, military personnel using these devices may not have that option.


The Future of Military Technology


I have no idea if the military is currently using such devices. However, I find it hard to believe they aren’t developing them, especially as they become more popular. It’s crucial to note that there are no reports of mass illusions generated by AR or VR headsets that affect all users simultaneously. Yet, this is entirely possible with devices available for purchase today. What could military contractors achieve with a $10,000 headset? How much “augmentation” might these devices have that users are unaware of?


These questions are vital for considering the current and future implications of technology in warfare. If soldiers agree to wear devices that can alter their perception of reality without their knowledge, I argue that none of them are in a sufficient epistemic position to deliberate about the morality of their actions. If they cannot discern whether the “bogeyman” they see is a regular person, they cannot be held morally responsible for their actions.


Conclusion


This intersection of epistemology, ethics, war, and technology raises profound questions. I hope the military would never alter soldiers’ perceptions without their awareness, but I remain skeptical—especially under the current administration.


Douglas A. Shepardson, Ph.D.


Disclaimer: I hold shares and options (June 2026 and January 2028 calls) of Snapchat (SNAP) and do affiliate marketing for Amazon. (That means I get a small percentage if you buy anything through one of the above links, e.g., 1% on electronic devices.) Some might think it strange I wrote a post about an ethical fear resulting from a technology developed by a company I’m invested in. But if I’ve learned anything watching the market since 2015, it’s that Wall Street doesn’t care about posts like this. If they did, AI would’ve developed a lot slower. I’m also not accusing Snapchat of anything nefarious. One can be interested in a piece of technology and invest in it while simultaneously hoping that it isn’t misused and expressing a philosophical worry about how that could happen.

2 Comments


scruz39
Dec 03, 2025

I found this post to be very interesting. I have followed some of these developments — because I’m concerned. I didn’t consider the extent to which these technologies could be used to dehumanize targets (legitimate and illegitimate alike). Anduril officially released ‘EagleEye’ last month, and it seems to be capable of doing what you described occurred in that Black Mirror episode.

This technology is not even years away… it’s here today.


For other readers: https://www.youtube.com/watch?v=x9B02pFKpJo


The top comment writes, "remember boys, there wont be respawns." That jocular nod to the similarity between wearing EagleEye and playing gun games has 1.7k likes.


I wasn't even aware of that, so thank you for bringing it to my attention! I agree it's extremely concerning, even without the hypothetical AI-update takeovers I proposed above. As a VR gamer who has also proudly beaten five Halo campaigns on Legendary, I could see the immersion of AR in actual combat blurring the boundary between reality and VR/AR gaming.


No soldier is at risk of dehumanizing someone in natural (direct-vision) combat because of playing first-person shooters, or so I'd argue, at least. I can't see anyone confusing the behavioral associations…

