Empathy at Scale

How a drab new occupation scrubs social media of the dark shadows of our collective unconscious.

How do we deal with technologies that unintentionally magnify the dark side of humanity?

We live in an era of socio-technical systems: massive virtual machines that process human behavior for commercial ends. Sometimes that’s a social network or a search engine optimized to capture our attention, and sometimes it’s an online shopping service perfectly tuned to stimulate our urge to buy. Whatever the specifics, the algorithms that drive these systems perform an iterative dance between our data and the machine learning processes that make them smarter over time.

The challenge we now face is that this iteration creates feedback loops that can lead to very negative consequences.

Negative Feedback Loops

We’ve all done it. You’re stuck in horrible traffic caused by an accident ahead of you. It’s all those damn lookie-loos, slowing down to get a good look at the carnage as they pass. And still, when you reach the site of the accident — since you’re driving so slowly anyway — you just can’t resist the urge to crane your neck for a peek yourself. Were you driving 60 mph, you probably wouldn’t risk it, but at 15 mph, what harm could it really do to drop down to 8 mph for a better look? And so our morbid curiosity feeds back on itself, leaving us trapped in traffic that our own rubbernecking has made worse.

Social media helps us create similar negative feedback loops. At the simplest level, you might receive a negative reply to a tweet that provokes you into an angry response. That kind of interaction isn’t all that different from the reaction we might have in person, without any algorithmic amplification — just old-fashioned psychological stimulus and response.

Now, let’s say that some of your friends see the exchange and start to pile on with their own outrage. Twitter’s algorithms interpret that flurry of negative reactivity as engagement worth bringing to others’ attention, and the flames are further stoked. We don’t even have to say anything to set this feedback in motion. Just clicking the wow, sad, or angry reactions on Facebook tells the algorithms that something is worth showing to others. Even merely looking at something on social media can signal to the algorithms that it is worth highlighting for others.
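
To see how that loop compounds, here is a minimal sketch in Python of an engagement-ranked feed. The names, weights, and numbers are all invented for illustration; no platform’s ranking system is anywhere near this simple. The point is structural: every reaction, outraged or not, raises a post’s score, and a higher score buys more exposure, which harvests still more reactions.

```python
# A toy model of an engagement-ranked feed. All names, weights, and
# numbers here are hypothetical; real platforms are far more complex.

POST = {"impressions": 100, "angry": 12, "wow": 5, "replies": 8}

# Every signal counts as "engagement," whether the emotion behind it
# is delight or outrage.
WEIGHTS = {"angry": 3.0, "wow": 2.0, "replies": 4.0}

def engagement_score(post):
    """Score a post purely by how much reaction it provokes."""
    return sum(WEIGHTS[key] * post[key] for key in WEIGHTS)

def simulate_feedback(post, rounds=5, reaction_rate=0.05):
    """Each round, a higher score buys more impressions, and a fixed
    fraction of the new viewers react angrily in this toy scenario."""
    for round_number in range(1, rounds + 1):
        score = engagement_score(post)
        new_impressions = int(score * 10)   # exposure scales with score
        post["impressions"] += new_impressions
        post["angry"] += int(new_impressions * reaction_rate)
        print(f"round {round_number}: score={score:.0f}, "
              f"impressions={post['impressions']}")

simulate_feedback(POST)
```

Run it and the score and the impressions climb together, round after round; as far as the ranking is concerned, outrage is indistinguishable from success.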

Filtering the Dross

While most of us can’t seem to stop ourselves from sneaking a peek at an accident as we drive, that doesn’t mean we actually feel good about doing it. And if what we glimpse in doing so happens to be particularly gory, it can even traumatize us.

The dregs of human behavior algorithmically surfaced in social media can be similarly horrifying. The algorithms amplify our curiosity, magnifying the shadow within our collective unconscious until it comes spilling out into our newsfeeds. Even though the result is what those dark recesses of our psyche ‘told’ the algorithms we wanted, it’s not what most of us actually want at a more conscious level. The results can be disturbing — and bad for business.

An Outsourced Conscience

The catalyst for this article was a disturbing piece by Casey Newton on The Verge called “The Trauma Floor.” It details working conditions at Cognizant, one of the major firms that Facebook uses to outsource moderation of content on its network. It paints a gruesome picture, starting with a new trainee named “Chloe” who, soon after being hired, is exposed to a video of a man being murdered while he begs for his life.

Chloe is being trained to spot objectionable content that violates Facebook policy. As such, she is part of a massive psychological shield, an outsourced conscience operated by human employees. This shield is what staves off the backwash of filth that would otherwise overwhelm these hugely profitable networks and make them untenable. But as Newton’s article points out, for those paid to do this work, the psychological impacts can be absolutely corrosive.

“We were doing something that was darkening our soul — or whatever you call it.”

Li, a Facebook moderator at Cognizant

Welcome to the seamy underbelly of our social networks. Moderation companies like Cognizant help Facebook and other networks clear out the dross left over from the algorithmic amplification of our darker human drives. Someday, algorithms will likely take over more of this work, but for now, their effectiveness remains limited. Today, the only way to operate these networks, with algorithmic feedback across billions of users, is to expose lots of humans to some really nasty stuff.
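
To give a rough sense of how that division of labor works, here is a toy sketch in Python of a hybrid moderation pipeline. The classifier, the thresholds, and the queue are all hypothetical stand-ins, not Facebook’s or Cognizant’s actual system. What it illustrates is the structure: the model handles the clear-cut cases, and everything uncertain lands in front of a human.

```python
# A minimal sketch of a hybrid moderation pipeline. The model,
# thresholds, and queue are hypothetical stand-ins, not any real
# platform's system; only the overall shape is the point.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ReviewQueue:
    """Content awaiting a human moderator like Chloe."""
    items: List[str] = field(default_factory=list)

    def enqueue(self, content: str) -> None:
        self.items.append(content)

def violation_score(content: str) -> float:
    """Stand-in for an ML classifier estimating policy-violation risk."""
    trigger_words = {"violence", "gore"}
    hits = sum(word in content.lower() for word in trigger_words)
    return min(1.0, 0.4 * hits)

def moderate(content: str, queue: ReviewQueue) -> str:
    score = violation_score(content)
    if score >= 0.8:
        return "removed"             # model is confident; no human needed
    if score >= 0.3:
        queue.enqueue(content)       # uncertain; a person has to look
        return "pending human review"
    return "published"

queue = ReviewQueue()
print(moderate("cute cat video", queue))            # published
print(moderate("graphic violence footage", queue))  # pending human review
```

Notice where the suffering concentrates: the better the model gets at the extremes, the more the human review queue fills with exactly the borderline material that is hardest to look at.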

Scaling Conscience

Our generation is now building the future interface between human and machine intelligence. Humanity has much intelligence and great beauty to embed into machines. But much darkness gets embedded in the process as well. It is our nature.

The companies behind these gargantuan systems have yet to figure out how to build these anthro-mechanical systems in ways that stifle their sociopathic tendencies. But with acrimony, hatred, and distortions of reality increasingly flowing through these networks, the companies are coming under mounting political pressure to do something.

Yes, part of the solution will be better machine learning algorithms, capable of the nuance needed to deal with the subtleties of moderating ever-changing human behavior. But as Tim Berners-Lee puts it, part of the problem comes down to the way we design the user experience in these systems:

“How do we design systems so that when we drop a drop of hate into it, it falls away and dissipates, but when we drop a drop of love in, it propagates and spreads — preserving free speech but influencing us for the good.”

Tim Berners-Lee
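
One way to read that design challenge as an engineer is to make propagation asymmetric. Here is a toy model in Python, with invented sentiment labels and per-hop multipliers, in which hate decays with every re-share while love compounds. It is a thought experiment under those assumptions, not a workable ranking policy; reliably telling hate from love at scale is, of course, the unsolved part.

```python
# A toy asymmetric-propagation model inspired by the quote above.
# The sentiment labels and multipliers are invented for illustration.

HOP_MULTIPLIER = {"hate": 0.5, "love": 1.3}  # <1 dissipates, >1 spreads

def reach_after_hops(sentiment: str, initial_reach: float, hops: int) -> float:
    """Reach remaining after each re-share hop, damped or amplified
    according to the content's sentiment."""
    reach = initial_reach
    for _ in range(hops):
        reach *= HOP_MULTIPLIER[sentiment]
    return reach

print(reach_after_hops("hate", 100, 5))  # ~3: the drop of hate falls away
print(reach_after_hops("love", 100, 5))  # ~371: the drop of love spreads
```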

Until we figure that out, Chloe and her coworkers will continue to be subjected to the horror of filling in for our missing collective conscience.

