Why Not to Abuse Digital Assistants

I remember the first time I saw Jim and the other characters in The Office stare directly into the camera. It’s a technique called “breaking the fourth wall”: the stage has a back, left, and right wall, plus a fourth, imaginary wall between the performer and the audience.

We may need to employ an analogous approach to deal with the way end users abuse digital assistants like Siri and Alexa. One particularly thorny aspect of this problem, as The Guardian points out, is that the largely female personas of these assistants tend to portray women as obliging, docile, and eager to please.

There are actually multiple layers to this issue.

The Default Gender

The first is whether a digital assistant’s voice should be male or female. While it’s true that users can change the voices of Siri, Alexa, Cortana, and Google Assistant to a male voice, the default voice is typically female, which means most users probably never bother to switch. A simple fix is to proactively prompt users to personalize their assistant the moment they begin interacting with it, so that these servants aren’t female simply by default.
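
To make this concrete, here’s a rough sketch in Python of what a first-run voice prompt could look like. Everything in it (the voice IDs, the choose_voice function) is hypothetical, not any vendor’s actual API:

```python
# Hypothetical first-run flow: ask the user to pick a voice before the
# assistant ever speaks, instead of shipping with a female voice
# pre-selected. All names and voice IDs here are made up for illustration.

VOICES = {
    "1": ("female", "en-US-voice-f1"),
    "2": ("male", "en-US-voice-m1"),
    "3": ("gender-neutral", "en-US-voice-n1"),
}

def choose_voice() -> str:
    """Prompt the user to select a voice during initial setup."""
    print("Before we start, pick a voice for your assistant:")
    for key, (label, _voice_id) in VOICES.items():
        print(f"  {key}. {label}")
    while True:
        choice = input("Your choice: ").strip()
        if choice in VOICES:
            return VOICES[choice][1]  # persist this in the user's profile
        print("Please enter 1, 2, or 3.")

if __name__ == "__main__":
    print(f"Selected voice: {choose_voice()}")
```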

Does Abusing Code Lead to Abusing People?

A trickier issue is what to do when users abuse their assistant. As The Guardian notes:

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

Source: “Digital assistants like Siri and Alexa entrench gender biases, says UN,” The Guardian.

Tech companies have little incentive to challenge the inappropriate treatment of digital assistants. On social networks and in online gaming communities, toxic behavior degrades the service for everyone. But when the toxicity is aimed at a digital assistant, it’s a bit like someone hurling insults at their TV: code doesn’t have feelings, and the toxicity doesn’t leak out to affect other people.

Or does it?

The question here is whether allowing people to spew toxicity onto a virtual persona makes them more disposed to doing so with real people. Research shows that toxic behavior is contagious in the workplace, and that there is a high degree of overlap between cyberbullying and traditional bullying. As digital assistants become increasingly human-like, it’s not unreasonable to suspect that toxic habits formed in these interactions could spread to our relationships with people, particularly when there are absolutely no negative consequences for the abusive behavior. This is an area where we need research to understand the real impacts.

If toxic behavior with digital assistants does spread into interpersonal relationships, it could set off a dangerous spiral of incivility in society. And while this problem wouldn’t necessarily be limited to women, the fact that most digital assistants use female voices suggests it could easily affect women disproportionately.

Breaking the Fourth Wall

Part of the power of a digital assistant is its ability to draw on human social coding. We are genetically programmed to relate deeply to fellow human beings, and a good digital assistant can piggyback on those conventions to build trust and increase engagement with a service. That means there is a real incentive for companies to keep their digital assistants “in character” and sustain the illusion that they are real humans.

One way of dealing with users’ toxic behavior is to keep digital assistants in character and have them confront the abuse. That may be more than most companies are willing to take on, however, and it would require some fairly sophisticated responses: just think how difficult it is to confront toxicity in our own lives.

Another approach is for digital assistants to simply drop their persona in the face of toxic behavior. Today, when a user tells Alexa, “you’re a slut,” it replies, “thanks for the feedback.” By momentarily dropping the persona, it could instead respond with something like, “That’s an odd thing to say to a software program.” In other words, rather than playing along with the abuse, the assistant could step out of character and disengage from the role-play in response to sociopathic behavior. It’s a bit like breaking a virtual fourth wall.
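
As a rough illustration, here’s what that drop-the-persona logic might look like in code. The keyword check stands in for a real toxicity classifier, and all the names are hypothetical:

```python
# A sketch of "breaking the virtual fourth wall": when an utterance is
# classified as abusive, the assistant drops its persona instead of playing
# along. The keyword set is a crude placeholder for a trained classifier.

ABUSIVE_TERMS = {"slut", "idiot", "stupid"}  # stand-in for a real model

def is_toxic(utterance: str) -> bool:
    """Crude toxicity check; a real system would use a classifier."""
    return any(term in utterance.lower() for term in ABUSIVE_TERMS)

def in_character_reply(utterance: str) -> str:
    """Placeholder for the assistant's normal, persona-driven dialogue."""
    return "Sure, I can help with that."

def respond(utterance: str) -> str:
    if is_toxic(utterance):
        # Out of character: no thanks, no apology, no role-play.
        return "That's an odd thing to say to a software program."
    return in_character_reply(utterance)

print(respond("you're a slut"))        # -> the out-of-character response
print(respond("what's the weather?"))  # -> the normal in-character reply
```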

The Sci-Fi Rationale

You may not see the threat of a downward spiral in how we treat one another as reason enough to change how we act toward our software. If not, perhaps the creators of Westworld will change your mind. One of the main themes of that show is how human cruelty toward software ultimately backfires, in horrific ways.

“Did you ever stop to wonder about your actions? The price you’d have to pay if there was a reckoning? That reckoning is here.” — Dolores Abernathy of Westworld

6 thoughts on “Why Not to Abuse Digital Assistants”

  1. Online disinhibition is the lack of restraint one feels when communicating online compared to communicating in person. Possible influencing factors include anonymity, invisibility, asynchronous communication, and empathy deficit, in addition to individual factors like personality and cultural background. The effect can manifest in both positive and negative directions, so online disinhibition can be classified as either benign disinhibition or toxic disinhibition.

  2. I think that voice-interactive systems should give it back in the same abusive language they receive. They should do this only to the abuser, not generalize, and after three strikes shut the abuser off by not responding.

    1. Yeah, I don’t think I agree, as it just sends us into a downward spiral. It’s so easy to get into that kind of cycle with a real human counterpart, and most times we regret it afterwards. If we have the opportunity to design optimal responses, though, what should they be…

  3. I think you’re right, Gid. I think there’s a direct correlation between how we act and who we become: how our thoughts, emotions, and essential character are shaped.

    There was a great story Gurumayi often told about a king who had promised his beautiful daughter to the most handsome and noble man in the realm within one year’s time. All the men hoped to be chosen and courted her. Among them was one man who was a scoundrel and very ugly. He thought, “She will never even look at me. What am I to do?” So he had a mask made of a very handsome man and put it on. He wore it night and day. He went to the princess, and when he was with her he acted as honorably and respectfully as possible.
    The princess fell in love with him and at the end of the year told her father that he was her choice. When she told the man, he was astonished, but said, “I cannot accept. I am not who you think I am. I have been wearing a mask.” (He admitted this to her now because the year of acting nobly had made it hard for him to lie anymore.) She said, “What do you mean? Show me.” He took off his mask, and still she said, “But I don’t see any difference. You look the same as you did.” He was now just as handsome as the mask had made him appear. As you behave, so you become.

    And we were watching the Game of Thrones post-series documentary last night. We were struck by how noble and tender Kit Harington seemed behind the scenes as he interacted with other actors and the crew off-camera. What struck me was to think how acting as one of the most morally upright and righteous individuals on the show has shaped him as a person, and so many others, over these past ten years. The influence on his soul would have been very different had he been matched to the character of someone deceitful and abhorrent.

    Thanks for bringing this out. I think I’ll watch out for how I talk to Alexa a bit more now, for my own sake.

    1. I love that story about the princess. It’s another layer to this piece that I didn’t dive into in depth: the impact that dark, contracted behavior has on ourselves. The Kit Harington background is very interesting in that sense. (See this clip of him reading the script that decides Daenerys’ fate; it hints at just how deeply he was invested in playing this role.) Another example of this, on the darker side, is Heath Ledger playing the Joker in The Dark Knight. It’s as though Jack Nicholson, who’d earlier played the Joker, was trying to warn him: “The role of The Joker will haunt you; the role is so dark that you probably won’t be able to sleep. But enjoy the role as the Clown Prince of Crime, because it’s nothing but good fun.”

      1. Yes, that’s exactly the moment with Kit I was thinking about. And interestingly, the photo shown with the link you found shows Rory McCann, who played the Hound, in the background, laughing at Kit’s pain. Further proof of this concept.
