On Drones

John Teschner

There is scarcely any other matter, however, upon which our thoughts and feelings have changed so little since the very earliest times, and in which discarded forms have been so completely preserved under a thin disguise, as our relation to death.

—Sigmund Freud, “The Uncanny,” 1919

I was afraid of ghosts when I was a kid. I lay awake after my parents said goodnight and refused to put my head out of the covers. It was a crude way to manage my fear, but most nights it allowed me to fall asleep after an hour or so. One night, I dreamed that I was still awake and calling out in fear. My father came to the open door, stood there a minute without speaking, then came in. He walked across the room and sat on the edge of the bed. His face was painted like a clown’s. Those were his eyes behind the white paint and red grin, but they met mine with no sign of recognition. We looked at each other like that for a long time. It’s the worst dream I’ve ever had.

Years later, at the Museum of African Art in Washington DC, I encountered the same impassive gaze in an exhibit of masks donated by the estate of Walt Disney. Most had stylized features carved in stained wood. But one was a perfect replica of a human head, right down to the skin attached with nearly imperceptible stitches. Unlike the other masks, it had realistic eyes with corneas and pupils. Only its spiraling horns were inhuman. When I’d seen enough, I strolled a few blocks down the Mall to the National Air and Space Museum, where I could rest my gaze on polished steel and precision rivets.

I wandered through a gallery of World War II aircraft, emerged onto an elevated walkway, and came face-to-face with a Predator drone. It was January 2010; the New York Times had just published the first details of the CIA’s secret drone war in Pakistan, and I wasn’t expecting to encounter a Predator suspended from the ceiling of the Air and Space Museum. It had the simple lines of a balsa-wood glider, and a Hellfire anti-tank missile tucked below each twenty-five-foot wing. Where a cockpit should have been, there was only a smoothly amputated swelling. The longer I looked at the drone, the less it seemed to have in common with the war planes, space capsules, lunar rovers, and high-altitude gondolas around it, and the more with the skin mask in the other museum. Both gave me a feeling of fear and fascination inextricably mingled, the same irresistible urge that had once driven me to read ghost stories in daylight, though I knew I’d pay for it when I was alone in the dark. The feeling was familiar, profound, and impossible to explain: an urgent signal from a source I did not recognize, carrying a message I could not decipher.


In the 1930s, Britain’s Royal Navy invented radio-controlled biplanes for battleships to use as target practice. They were called Queen Bees, hence the term “drone.” Fifty years later, an oil-and-uranium entrepreneur named Neal Blue cleared a stretch of coastal Nicaraguan jungle and built a cocoa plantation. After Sandinista rebels overthrew his friends in the Nicaraguan ruling family, he was brainstorming affordable strategies for regime change when he visualized a swarm of unmanned planes assaulting the national gasoline reserves. In 1986, Blue bought a nuclear research company from Chevron and entered the military R&D business. He ordered his new engineers to build an affordable drone from off-the-shelf parts. A General Atomics Predator unmanned aircraft costs about four million dollars and runs on a snowmobile engine.

Despite Blue’s kamikaze inspiration, the Predator was designed to spy. Its gas-sipping engine can keep it aloft at 15,000 feet for up to forty hours. Its high-resolution cameras can beam real-time footage across the globe, giving “commanders what they had only dreamed of,” according to Malcolm MacPherson in his book Roberts Ridge: “total situational awareness—making them like gods, omniscient and all-seeing.” After September 11, 2001, drones quickly became integral to the War on Terror.

Roberts Ridge chronicles the events of March 3, 2002, the day a team of Navy Seals set out to establish an observation post on the peak of Takur Ghar: a ten-thousand-foot mountain in Shah-i-Kot, The Place of the King, a rugged valley where Afghan fighters had already taken last stands against the invading armies of Alexander, Victoria, and Gorbachev. Drone surveillance seemed to show the peak unoccupied, but the Seals’ Chinook landed in a camouflaged encampment of Chechen Al-Qaeda. In the melee that followed, the Chinook was shot down, two Seals were killed, and the team was stranded on the mountainside. Overhead, a Predator broadcast footage of the firefight to command posts in Bagram, Tampa, Oman, and Washington DC.

Takur Ghar is the highest battlefield in U.S. history, and (according to an academic paper entitled “Pitfalls of Technology: A Case Study of the Battle of Takur Ghar”) a textbook example of the limits of total situational awareness. When a Ranger Quick Reaction Force deployed in a second Chinook to rescue the stranded Seals, a flood of conflicting information and irreconcilable orders poured into the cockpit from officers across the globe: all had access to the drone feed; all were convinced that they understood what was occurring ten thousand feet above the Shah-i-Kot Valley. As a result, the Rangers flew into the same field of fire that had brought down the men they were trying to save. Through the drone’s high-resolution cameras, the officers watched the Chinook crash-land on the peak. Seconds later, the ramp opened, and Corporal Matt Commons ran onto the snow. He fired a few rounds of his M-4 before a bullet hit him in the forehead. Dozens of people watched Matt Commons die. None could help him.

The drone suspended from the ceiling of the Air and Space Museum was the first to be retrofitted with missiles and fire them in combat, under President George W. Bush in 2001. But it was President Obama who made drones a centerpiece of U.S. military strategy. From 2004 to 2007, there were nine confirmed drone attacks in Pakistan. In 2010, there were 118. Remotely patrolling North Waziristan for hours on end, day after day, drone operators acquire a strange sense of intimacy with people they may eventually kill, watching them sleep on rooftops and urinate in the mountain scrub. If a kill order is issued, those people become “squirters,” so named for their tendency to run in all directions, like clowns emptying from a clown car, when they realize a flying robot is about to incinerate them with a Hellfire. After their shift ends, drone operators climb out of their fake cockpits, walk to the parking lots of the office buildings where they work, and drive through the Nevada desert to their homes in the Las Vegas suburbs. CIA officers can order strikes from the comfort of their own homes. One CIA director described fielding a kill request while lying on his couch watching college football. He told them to shoot.

The United States has severed the connection between the warrior and the field of battle. It is a fundamental shift in the nature of warfare, but it is also the logical extension of a trend that began three thousand years ago. In the seventh century BC, Greek bronzesmiths began making helmets with larger and larger nose guards, until they finally hid the face so completely that warriors’ vision and hearing must have been impaired. The utilitarian helmet became an ornamental mask. Cara McCarty, curator of the Cooper-Hewitt, National Design Museum, explains this confounding evolution by citing research showing “a link between anonymity and aggression.” Anthropologists have found that torture and mutilation are more common in cultures where warriors wear masks.

In Wired for War, P.W. Singer describes the day U.S. forces tracked Uday and Qusay Hussein to their hiding place in a Mosul villa. For hours, troops fired rifles, rockets, and .50-caliber machine guns into the building while a Predator beamed live footage around the world. At a base in Qatar, soldiers gathered with snacks around flat-screen TVs to cheer each explosion. Footage like this has become so common on YouTube that there is now a term for it: “pred porn.” To understand how riveting it can be, just find the photograph taken May 2, 2011, of President Barack Obama and his top advisors in the White House Situation Room watching live, night-vision footage from a drone circling the Abbottabad compound of Osama Bin Laden.

Singer compares the scene in Qatar to a Super Bowl party, but for those who actually fly drones, video games are a better analogy. For years, the military tried to keep unmanned aircraft within traditional categories by allowing only pilots to operate them. But it has finally been acknowledged as an unassailable truth that the work is best suited to soldiers who’ve honed their reflexes and multitasking skills by logging thousands of hours on PlayStation and Xbox. Drones transform exotic landscapes into two-dimensional battlefields and enemies into pixelated squirters with predictable evasion patterns. With the only consequence a four-million-dollar Game Over, our warriors can be more fearless than Myrmidons.


Arnold Schwarzenegger became the Terminator because he suggested to James Cameron, the movie’s director, that the humanoid robot shouldn’t blink while firing its weapon. It was an actor’s intuition, but Arnold’s insight explains what the Greeks figured out three thousand years ago and what U.S. Army generals and Hollywood directors in the digital age are scrambling to exploit: masks not only embolden their wearers; they scare the shit out of everyone else.

Berkeley professor Georgina Kleege calls this “the mask of inhumanity.” Kleege, who is legally blind, writes that human beings watch each other’s eyes: dilated pupils signal approachability, friendliness, even sexual arousal. The blind are feared and shunned, often unconsciously, because their unseeing eyes don’t offer the recognition others expect. Schwarzenegger exploited this effect to play a robot assassin, but digital animators and android technicians trying to create sympathetic characters struggle with our universal tendency to become deeply uncomfortable when we see faces that are almost perfectly human, but not quite.

Trying to quantify this emotional response, Japanese roboticist Masahiro Mori graphed a slow but steady rise in people’s positive sense of “familiarity” from industrial robot (think R2-D2) through humanoid robot (think C-3PO), after which the line abruptly plummets below neutral into “unfamiliarity,” passing corpse and bottoming out at zombie before rising even more abruptly to a peak at healthy person. Mori named that sheer drop the Uncanny Valley.

The term “uncanny” is usually traced to a 1919 essay by Sigmund Freud, in which Freud explores the theory that people experience uncanny feelings when they aren’t sure a human-like object (like a ventriloquist’s dummy) is truly inanimate. A topic restricted to puppets and waxworks a century ago, uncanny studies have become more pressing in the age of robots. Today, researchers like Karl MacDorman seek quantifiable explanations for our profound and highly consistent aversion to faces in the Uncanny Valley. Terror management theory (TMT), MacDorman believes, is the key. According to TMT, people are unique among animals in knowing that we will die, and we’ve evolved sophisticated processes of repression to cope with that “potentially terrifying” knowledge. We suppress conscious thoughts of death by rationalizing them out of mind (“I have many good years left!”) and react to unconscious thoughts even more indirectly, by embracing worldviews that promise stability and continuity. This is why so many political ads deal in fear: on the verge of acknowledging our own certain death, our subconscious distracts itself by focusing on comforting membership in a God-blessed, virtuous, and triumphant group. MacDorman hypothesized that faces fall into the Uncanny Valley when they somehow trigger unconscious thoughts of death. In a controlled experiment, he tested whether people who saw a photo of a pretty female android became more patriotic. They did.

To experience the uncanny is to feel the excitement of receiving a profound insight, minus the insight itself. It is the electric thrill of a powerful signal striking an antenna, its source unknown and its significance obscure. It is the difference between my father’s reassuring gaze and the empty grin of a clown, between the vulnerable skin of a face and the rigid bronze of a mask, between an airplane and an unmanned aircraft.


The U.S. has more than seven thousand drones. They track terrorists in Yemen, missiles in North Korea, drug cartels in Mexico, and cattle rustlers in North Dakota. At the Air Force’s “Microaviary,” drones the size of dragonflies and hummingbirds flap their wings and spin their rotors. And they are becoming increasingly capable of making their own decisions. Protocol still requires a human being to give a kill order, but it will soon be morally indefensible to keep us “in the loop.” Computers will make faster, less biased decisions using facial recognition software to identify terrorists and digital mapping to preview missiles’ “bug splats” and protect civilians. Many experts see swarms as the most promising field of drone research. Singer writes that iRobot (manufacturer of the Roomba autonomous domestic vacuum and the Packbot, a popular terrestrial military drone) is coding programs for swarms up to ten thousand strong. The age of drones is less likely to be defined by a flood of information, like the one that led Matt Commons’s Chinook into enemy fire, than by a cloud of robots enveloping their target. Such a swarm would be not only physically devastating for its target, but also, in Singer’s words, “psychologically debilitating.”

As measured by contests ranging from self-driving car races to the famous Turing test, we are closing in on the goal of robot autonomy. Singer presents convincing evidence that artificial intelligence (AI) will be invented in our lifetimes. The event is being called the Singularity because nothing will ever be the same after it. Although, as an MIT professor of AI explains, our programming is so crude that the first sentient machine is certain to be “leapingly, screamingly insane,” many experts believe the first thing it will do is upload itself into the cloud. With the limitless processing power of the Internet, artificial intelligence will quickly evolve the ability to replicate. One researcher at Carnegie Mellon’s prestigious Robotics Institute has written that human extinction is the inevitable result of robotics research.

Freud wrote that we cannot understand the power of the uncanny without understanding that our deepest fear is not of death itself, but of a death that does not occur on our own terms. This is why the unmanned aircraft in the museum triggers feelings that the World War II fighter planes one gallery over do not. It is why bronze helmets became bronze masks. We long to look into living eyes, especially as we die, even if they are our killer’s. To become a “squirter”—or to be harvested by swarms of sentient robots, along with our parents, children, and billions of other human beings—is an end that denies all stories and morals, all hope of greater meaning. It is a death outside the bounds of any terms we can call our own, a terror that remains unmanageable.


I talked with Greg Commons in February 2007, four years and eleven months after his son Matt died on Takur Ghar. I was a reporter for a newspaper in Northern Virginia, and Greg, a high school history teacher, had just convinced the General Assembly to authorize Gold Star Family License Plates for the relatives of soldiers killed in war. We met in his living room. He had a warm, wry smile and drank coffee from a mug that said Old Fart. Greg was the same age as my father, and I was exactly the age his son would have been.

Like me, Matt Commons graduated from high school in 1999. Unlike me, he left college and joined the U.S. Army. Greg, a former Marine, supported Matt’s decision. During Matt’s Ranger training, Greg made the thirty-two-hour round-trip drive to and from Fort Benning once a month. After 9/11, he took his son golfing: “I told him, do your job. Keep your head down, you know, but do your job. And depend on your buddies.” He invited Matt to speak to his history class. Matt did so shortly before he deployed. He wore his battle dress. In the parking lot afterward, Matt told his father he could see himself becoming a teacher one day.

Matt called home every Sunday. “He was learning how to be a young man,” Greg told me. “He was still a kid at heart, like we all are.” Matt couldn’t say where he was, but he asked for hand warmers and insulated socks. Greg went to REI and bought seven pairs. He had to wrap them twice so the Army could remove the outer envelope and write his son’s real location on the blank layer beneath. When Matt called home on his twenty-first birthday, Greg told him they’d have a beer together when he got back. “I could tell in his voice there were things he was seeing and hearing that were making him grow up.” When Matt called the next Sunday, Greg signed off, as always, by saying, “I love you.” The next Sunday, March 3, Matt didn’t call.

On Monday night, Greg was waiting for the eleven o’clock news when he heard a knock at the door. Through the window, he saw an officer in uniform. Later, Greg called Matt’s younger brother, a college freshman in Colorado. “That was one of the more difficult things to do.” He let his son cry and scream, then told him to pull it together, take down his credit card number, and book a flight home. For a week, Greg’s house was packed with people. He wore a mask of strength. On the day of the funeral at Arlington, he was alone in the kitchen and started shaking. “I can’t bury my son,” he said to the empty room. His wife came in. “I can’t bury my son today,” he told her. “You’ll have to do this without me.” She held him; they cried together, and Greg buried his son.

Greg went on to tell me about the Gold Star Family License Plate and about his own military service, guarding American embassies. We talked for hours, far longer than I’d planned. Finally Greg said, “I’m going to tell you something I’ve never told anyone else.”

After Matt didn’t call on Sunday, March 3, Greg went to bed. He opened his eyes at one a.m. “I woke up from a nightmare that was so real. I was on a helicopter with my son, and I was dressed in blue jeans. I had the M-16 rifle that I’d trained with. Everyone else was in fatigues. And we got hit by an RPG, and we got pancaked into a mountaintop. Because it was a dream—I thought at the time, a nightmare—I saw all the guys fall to the floor.” The ramp opened. Greg and Matt ran together onto the snow, both firing their rifles. “There was a momentary lull. I looked at him. He looked at me, and he had this, I don’t know, Dad, this look of doubt, and I smiled at him and told him, It’s going to be all right. Peace came over his facial expression. The doubt disappeared. And then he got hit.”

Matt had been dead five hours when Greg saw him in the dream. He would have no other news of his son until the next night, when the officers came to his door. “I’m a lucky guy,” Greg told me. “I don’t have to wake up in the morning wondering why my son is dead. He died trying to save somebody’s life.”

Both of us were crying. Greg cried because he’d been allowed to see his son’s eyes before the bullet hit him in the forehead. He cried because Matt was dead, and because Matt had died on his own terms. I cried because in Greg’s eyes I saw my father’s, seeing me.

JOHN TESCHNER has coached high school basketball on the Kenyan savanna; written award-winning stories as a reporter in northern Virginia; and trained undergraduate creative writing students to teach at-risk middle schoolers in Georgia. He holds an MFA in creative writing from Georgia College and is currently a grant writer at AchieveMpls and an instructor at Minneapolis College of Art and Design. He writes a monthly column of essays for tropmag.com.