AI, Drones, Empathy, Alienation, and the Gig Economy

By Lambert Strether of Corrente.

The Verge has an important post on content moderation in the corporate hellhole that is Facebook which has implications both for the future of work (or, as we call it, “labor”) and for services that we, as consumers (reproducing our labor power) file mentally as algorithmic or robotic, but are in fact labor, performed remotely. However, I’m going to approach the Verge article indirectly, by looking at drones and drone operators — an earlier, prototype gig economy, if you think of enlistment as a gig — and the stresses that result from whacking faraway brown people remotely. Then I’ll look at Facebook, and then at other forms of remote labor (or, as it seems to be called, “telepresence”).

Working Conditions of Military Drone Operators

From the New York Times, "As Stress Drives Off Drone Operators, Air Force Must Cut Flights":

What had seemed to be a benefit of the job, the novel way that the crews could fly Predator and Reaper drones via satellite links while living safely in the United States with their families, has created new types of stresses as they constantly shift back and forth between war and family activities and become, in effect, perpetually deployed.

“Having our folks make that mental shift every day, driving into the gate and thinking, ‘All right, I’ve got my war face on, and I’m going to the fight,’ and then driving out of the gate and stopping at Walmart to pick up a carton of milk or going to the soccer game on the way home — and the fact that you can’t talk about most of what you do at home — all those stressors together are what is putting pressure on the family, putting pressure on the airman,” Colonel Cluff said.

That's the quote from the Colonel. The Independent has a different version, in "Secret US drone whistleblowers say operators 'stressed and often abuse drugs and alcohol' in rare insight into programme":

From as far as 8,000 miles away in their base in the Nevada desert, the men operated unmanned drones carrying Hellfire missiles, in places such as Afghanistan, Pakistan, Iraq and Yemen.

…. [The operators] said they were encouraged to dehumanise their targets and even referred to the children they monitored with their drones as “tits”, or “terrorists in training”, or “fun-sized terrorists”. The four said they had struggled with depression and even suicidal thoughts since quitting.

The operators said they were supposed to combine signals intelligence, imagery and human intelligence. Often they lacked one or more of these and yet they still proceeded with the kill missions.

“The programme hemorrhages people. We don’t like it. It’s not a good job.”

And in a follow-up story in the Times, "The Wounds of the Drone Warrior":

According to another recent study conducted by the Air Force, drone analysts in the “kill chain” are exposed to more graphic violence — seeing “destroyed homes and villages,” witnessing “dead bodies or human remains” — than most Special Forces on the ground…. Unlike conventional soldiers, they aren’t bolstered by the group solidarity forged in combat zones. … What happens when the risks are entirely one-sided? Lawrence Wilkerson, a retired Army colonel and former chief of staff to Colin Powell, fears that remote warfare erodes “the warrior ethic,” which holds that combatants must assume some measure of reciprocal risk. “If you give the warrior, on one side or the other, complete immunity, and let him go on killing, he’s a murderer,” he said. “Because you’re killing people not only that you’re not necessarily sure are trying to kill you — you’re killing them with absolute impunity.”

So, being a drone operator is “a bad job”[1] because:

  1. The nature of the gig mixes work time and private time (“driving out of the gate and stopping at Walmart”) or rather, converts all time into work time;
  2. The gig makes demands while not providing the tools to meet them (“Often they lacked one or more of these”)
  3. The gig lacks camaraderie ("they aren't bolstered by the group solidarity forged in combat zones")
  4. The gig makes enormous demands on empathy while not allowing the operator to offer help (“exposed to more graphic violence… than most Special Forces on the ground”)
  5. The gig is morally problematic (“If you give the warrior… complete immunity… he’s a murderer”).

Obviously, being a military drone operator is a limit case for “remote control” gigs, but are these characteristics really true for other gigs, like content moderation? I think they are.

Working Conditions of Facebook Moderators

Now let's look at the Verge article (which I recommend you read in full). We'll go through the working conditions for content moderators at Facebook as described by the whistleblowers, and see which of the above characteristics, as they emerged from describing the work of military drone operators, apply:

1. The nature of the gig mixes work time and private time.

Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried.

2. The gig makes demands while not providing the tools to meet them

“The stress they put on him — it’s unworldly,” one of [Keith] Utley’s managers told me. “I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired.”

On the night of March 9th, 2018, Utley slumped over at his desk. Co-workers noticed that he was in distress when he began sliding out of his chair. Two of them began to perform CPR, but no defibrillator was available in the building. A manager called for an ambulance. Utley was pronounced dead a short while later at the hospital, the victim of a heart attack… [T]he moderators who work in these offices are not children, and they know when they are being condescended to. They see the company roll an oversized Connect 4 game into the office, as it did in Tampa this spring, and they wonder: When is this place going to get a defibrillator?

3. The gig lacks camaraderie

[E]mployees I spoke with believed his tenure exemplified Cognizant’s approach to hiring moderators: find bodies wherever you can, ask as few questions as possible, and get them into a seat on the production floor where they can start working.

The result is a raucous workplace where managers send regular emails to the staff complaining about their behavior on the site. Nearly every person I interviewed independently compared the Tampa office to a high school. Loud altercations, often over workplace romances, regularly take place between co-workers. Verbal and physical fights break out on a monthly basis, employees told me.

4. The gig makes enormous demands on empathy while not allowing the operator to offer help

Early on, Speagle came across a video of two women in North Carolina encouraging toddlers to smoke marijuana, and helped to notify the authorities. (Moderator tools have a mechanism for escalating issues to law enforcement, and the women were eventually convicted of misdemeanor child abuse.) To Speagle’s knowledge, though, the crimes he saw every day never resulted in legal action being taken against the perpetrators. The work came to feel pointless, never more so than when he had to watch footage of a murder or child pornography case that he had already removed from Facebook.

5. The gig is morally problematic

“I really wanted to make a difference,” Speagle told me of his time working for Facebook. “I thought this would be the ultimate difference-making thing. Because it’s Facebook. But there’s no difference being made.”

I asked him what he thought needed to change. "I think Facebook needs to shut down," he said.

Working Conditions for Remote Operators Generally

We are told that remote labor ("telepresence") is the future for many jobs. Kara Swisher and Rani Molla of Recode/Decode have a very interesting interview with Louis Hyman, author of Temp[2]. From the transcript:

[LOUIS HYMAN] And I think part of this acceleration I wrote about in the book is this idea of digital migrants. So sometime in the next few years, we will see robots that are tele-operated by somebody else, and I think people aren’t as attentive to this as they need to be. …. I went to a lab a couple of years ago at Berkeley, and you could put on virtual goggles. Like we all now have these — well, I guess six people have the Oculus Rift or whatever. And you can run a robot body through that. And people there were very excited about this towel-folding robot that could see a towel and fold it. And I sat there for an hour waiting for this towel to be folded and it never could. I hate folding so I was super excited to see this. And I put the goggles on and I could fold the towel almost instantaneously… I could reach the robot’s arms and fold the towel. And I realized when I did this it was like, oh wow, I could do this anywhere. And so I can easily imagine the next couple years, some entrepreneur offering very cheap house-space robots the same way that Tesla used its own drivers to train its Autopilot, to use just hundreds of thousands of people around the world through some kind of online labor program in putting on virtual reality goggles somewhere in Bangladesh or Mexico. And then operating these robots.

And then because of machine learning, the robots would learn how to do all kinds of manual tasks….

RM: Right, so everything that can be digitized, will be digitized. A lot of things will be automated.

And it will be digitized by cheap people.

RM: By cheap people.

This is the important part.

Which, of course, Facebook being Facebook, it intends to do. Verge once more:

If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not.

Instead, you would do what Facebook, Google, YouTube, and Twitter have done, and hire companies like Accenture, Genpact, and Cognizant to do the work for you. Leave to them the messy work of finding and training human beings, and of laying them all off when the contract ends. Ask the vendors to hit some just-out-of-reach metric, and let them figure out how to get there.

(Not to mention Amazon warehouse workers.)

Now let's look at a few of these futuristic remote labor gigs, in terms of the five characteristics listed above:

  1. The gig mixes work time and private time
  2. The gig makes demands while not providing the tools to meet them
  3. The gig lacks camaraderie
  4. The gig makes enormous demands on empathy while not allowing the operator to offer help
  5. The gig is morally problematic

I would say that #1 and #2 are "normal" in the sense that most gigs head toward this baseline anyhow, kaching. #3 is, I think, inherent in remote labor; either you're working alone or in a warehouse, and in any case you're under a headset or staring into a screen. I think #5 will most often be a function of #4. So let's look at potential demands on empathy. (It's worth noting that in the literature on telepresence I've read, the developers focus on latency — that is, the response time necessarily created by remote operation. They don't give any thought to the operators at all.)

The first example: Remote pilots of commercial aircraft. CNBC:

Over the last few weeks, analysts at Jefferies have quizzed plane-buying executives at airlines and leasing companies on what they would want from any new Boeing offering.

The researchers said that given the [new offering] could start from a completely fresh design, airline executives see scope for just one pilot to be physically sat in the plane.

A second pilot would be ground based and be able to “monitor several aircraft” at the same time.

Reducing the number of pilots from an airline’s payroll could save a company millions of dollars in salaries and training costs.

Great. Would the remote pilot, for example, have had to follow Ethiopian Airlines Flight 302 all the way down to the ground? I’m guessing yes; one of the stressors for military drone operators is not being able to look away. Has consideration been given to the demands for empathy placed on the remote pilot?

The second example: Emergency medical drones. My Drone Authority:

In some emergency situations, only a few minutes may make the difference between whether someone lives or dies. Delivery drones can bring first aid supplies, needed medicines, blood, and medical equipment. For example, those suffering from a heart attack might get help from an emergency drone. This drone maintains communication contact with paramedics and can deliver a portable defibrillator. A defibrillator is a device that uses a strong electric pulse to restart the heart. The paramedics are able to observe through remote video what is happening and instruct the people giving aid to the heart-attack victim on how to use the defibrillator.

Great. Will the remote paramedics be required to view a heart attack where the treatment is going wrong?

The third example: Remote drivers for robot cars. From Wired:

Livingston is sitting comfortably in his office in Portland, Oregon, when he appears on the screens inside the car and announces he’ll be our teleoperator this afternoon. A moment later, the MKZ pulls into traffic, responding not to the man in the driver’s seat, but to Livingston, who’s sitting in front of a bank of screens displaying feeds from the four cameras on the car’s roof, working the kind of steering wheel and pedals serious players use for games like Forza Motorsport. Livingston is a software engineer for Designated Driver, a new company that’s getting into teleoperations, the official name for remotely controlling self-driving vehicles.

Total creepiness aside, will Livingston be prepared for what happens in case of a car crash, and will he have to monitor the screens — heightening things a little, here — while the bodies are pulled from the flaming vehicle?

In any case, the behavior of Facebook toward its moderators — as well as the general incentives to treat those who will be replaced by AIs as disposable — suggests that in all three cases, remote operators will be seeing events they will not be able to look away from, and which they will remember for the rest of their lives. In a bad way.


If gig workers training the artificial intelligences that will replace them at their horrible jobs would lead to Fully Automated Luxury Communism, I might consider the sacrifice of their empathetic faculties worth it (especially if they were told, truthfully, that was the goal, instead of being treated as disposable and fungible…. lumps of labor). Somehow, however, I don't think that's going to be the case.