Anthony Pinter, a Ph.D. student in information science at the University of Colorado Boulder, recently completed a study on people’s experiences with upsetting and unexpected reminders of an ex on Facebook.
His team’s findings are examples of algorithmic cruelty – instances in which algorithms are designed to do something and do it well, but end up backfiring because they can’t fully grasp the nuances of human relationships and behavior.
How has social media made breakups more difficult?
Anthony Pinter: Breaking up with a loved one has always meant making difficult choices: who gets the couch, who gets the fridge, who gets the cat.
But before social media, once the messy details were sorted, it wasn’t too difficult to create the physical, mental and emotional space that research has shown helps the healing process. In the past, you could simply stop going to your ex’s favorite coffee shop. You could box up photos and put them in storage.
Social media has complicated things. Platforms like Facebook are designed to encourage connecting with your network and reminiscing about the past. They recommend upcoming events, suggest people to add as friends, resurface old memories and photos, and highlight what your friends are doing.
But after a breakup, you probably don’t want your news feed alerting you that your ex has made a new friend. Nor do you want to see an old photo with your ex reappear as a “Memory.” And with access to your ex’s online life just a search and a click away, it’s easy to succumb to “Facebook stalking,” in which you periodically check in on their profile to see what they’re up to and whom they’re hanging out with.
Not surprisingly, Facebook has been shown to prolong the healing process after a breakup. At the same time, you might start to realize your ex has already moved on, which can be just as painful.
“Just block your ex,” you’ll hear people say. Why isn’t this enough?
Pinter: First, blocking or unfriending isn’t as simple as it sounds. The act itself takes as little as three clicks, but it’s hard to walk back from: if you ever decide to unblock or refriend someone, social media platforms will often alert the ex that you’ve done so – which can send ambiguous signals and create unwanted expectations.
But yes, platforms like Facebook, Twitter and Instagram have features meant to prevent these unwanted encounters – unfollow, unfriend or block. A few years ago, Facebook even introduced a feature called Take a Break, which effectively mutes someone for a set period of time.
However, people are still seeing reminders of their exes on social media – even when they’ve actively taken advantage of features that supposedly prevent these encounters.
My colleagues and I conducted in-depth interviews with 19 people who had had an unexpected and upsetting reminder of an ex on Facebook.
One participant mentioned that the mother of an ex’s new partner was suggested as a possible friend. Another saw their ex commenting on a mutual friend’s post. In one case, an old photo that Facebook resurfaced via the Memories feature – from a beach vacation the two had taken when they’d been a couple – didn’t even include an image of the interviewee’s ex. But being prompted to think about that vacation was upsetting enough.
What’s really going on here?
Pinter: This is happening because the algorithms still don’t fully understand humans.
While you can tell Facebook you don’t want to see your ex anymore, the algorithm doesn’t realize that the request also extends to peripheral reminders of your ex, like a photo of his or her best friend, or a comment he or she has made on a mutual friend’s wall.
Context matters, but algorithms often don’t have the ability to understand it. Even though that photo from the beach might not have anyone in it, it’s loaded with memories that you’d rather not think about.
In our work, we want to bring attention to what we call the “social periphery” – the satellites of a relationship, romantic or otherwise. Systems like Facebook are built to cultivate community, but the algorithms that undergird them often rely on simplistic representations of people’s experiences, like “relationship status” or “blocked.” Features meant to prevent upsetting encounters in the wake of a breakup or other fraught events similarly rely on these simplistic settings, ignoring the realities of the social periphery.
To the algorithm, the suggestion of the ex’s new partner’s mother is a perfectly reasonable one – you probably share mutual friends who trigger some internal relevance metric. But a human would know better than to make that suggestion.
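To make the gap concrete, here is a minimal, hypothetical sketch of a “people you may know” heuristic that ranks candidates purely by mutual-friend count and filters only explicit blocks. The function, data structures and scoring are illustrative assumptions, not Facebook’s actual system – but they show why someone in the social periphery, like an ex’s new partner’s mother, can still surface even after the ex has been blocked.

```python
# Hypothetical sketch of a naive "people you may know" heuristic.
# Names, scoring and data structures are illustrative assumptions,
# not a description of Facebook's actual system.

def suggest_friends(user, friend_graph, blocked, max_suggestions=5):
    """Rank candidate friends by mutual-friend count.

    friend_graph: dict mapping each user to a set of their friends.
    blocked: set of users this user has explicitly blocked.
    """
    my_friends = friend_graph.get(user, set())
    scores = {}
    for friend in my_friends:
        for candidate in friend_graph.get(friend, set()):
            # Skip the user themselves, existing friends and blocked accounts.
            if candidate == user or candidate in my_friends:
                continue
            if candidate in blocked:
                continue
            # The only signal is mutual-friend count; nothing here encodes
            # the "social periphery" -- e.g. that a candidate is connected
            # to an ex the user has blocked.
            scores[candidate] = scores.get(candidate, 0) + 1

    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:max_suggestions]


if __name__ == "__main__":
    graph = {
        "you": {"mutual_friend"},
        "mutual_friend": {"you", "ex", "ex_partners_mother"},
        "ex": {"mutual_friend"},
        "ex_partners_mother": {"mutual_friend"},
    }
    # Even with the ex blocked, the ex's new partner's mother is suggested,
    # because the heuristic only filters explicit blocks.
    print(suggest_friends("you", graph, blocked={"ex"}))
    # -> ['ex_partners_mother']
```

Because nothing in this kind of data model represents who orbits a blocked ex, the only way to keep such suggestions out would be to encode that richer context explicitly – which is exactly the design challenge the study points to.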
Why do these findings matter?
Pinter: Algorithms are becoming more integrated into our everyday lives, and social media isn’t the only place where we’re seeing these undesirable outcomes occur. For example, as people begin to rely more heavily on voice assistants like Siri or Alexa to send texts, we inevitably run into situations in which the programs mishear us and send a wildly inappropriate message to a boss or parent.
Our findings present a challenge for designers and developers: How can we create algorithms that are better attuned to the deep, lived experiences of the humans who will use these systems? It’s unlikely that there is a one-size-fits-all solution to this problem. On Facebook, features like Take a Break or blocking can be seen as important steps. But it’s clear that there’s a lot more work to do.
Anthony Pinter, Ph.D. Student in Information Science, University of Colorado Boulder
This article is republished from The Conversation under a Creative Commons license. Read the original article. https://creativecommons.org/licenses/by-nd/4.0/