Don't click that video!: the moral dilemma of going down the YouTube rabbit hole
Earlier this year, Mike Rugnetta tried the "How to run a 5k" experiment on YouTube.
It's a simple exercise, he said, but its implications are far-reaching.
The experiment is the idea of sociologist Zeynep Tufekci, who noticed YouTube seems to constantly push viewers towards more extreme content. For example, Tufekci says, watch a video about jogging and the algorithm will follow up with a video about ultramarathons.
"It seems as if you are never 'hard core' enough for YouTube's recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes," Tufekci wrote in a New York Times op-ed.
So … Rugnetta typed out the phrase "How to run a 5k" and hit "search."
Among the results were a video titled "How to Win a Street Fight," and another that explained why women find some men unattractive.
"This is exciting … maybe I do want to learn how to win a street fight."
- Mike Rugnetta
Rugnetta told Tapestry's Mary Hynes that his results reflect a widespread and potentially troublesome aspect of online media platforms.
Pop culture, he said, whether in the form of a YouTube video or a catchy song, can influence the way a person views the world.
In the past, Rugnetta said, advertising and word-of-mouth recommendations were common avenues for people to discover new content; but today, major platforms like Spotify, Netflix and YouTube use algorithms to nudge people along.
Often, the algorithms' recommendations seem intuitive: Spotify, for example, may suggest the band Queen after someone has listened to David Bowie. The platform's objective is to get you to continue listening, so its algorithm orients you towards what it calculates to be complementary music.
You may be surprised by the recommendation … say, if you dislike Queen intensely … but the algorithm's misstep is harmless.
The power of a recommendation
But on other sites — and in an era when content platforms are routinely accused of hosting hateful content and misinformation — unexpected search results and recommendations can be far from innocuous. They can also put you in an ethically tricky situation, he said.
As an example, Rugnetta pointed to his "How to run a 5k" experiment.
Rugnetta isn't normally drawn to videos about fighting, but when "How to Win a Street Fight" came up, he was both repulsed by the content ... and enticed.
"This is exciting … maybe I do want to learn how to win a street fight. What if I'm ever in a street fight? I would want to know how to win," he said, describing the unexpected compulsion to click on something he'd normally steer clear of.
But, he said, he realized that following his urge might benefit the content in question.
"By me clicking on this, am I saying to the algorithm, 'Hey, when I search for running videos, I'm actually searching for a street-fighting video?' And additionally am I saying, 'Hey, when other people search for running videos, they're also searching for street-fighting videos'?"
Rugnetta said that how the algorithms of most major platforms function is a mystery, since they're closely guarded trade secrets.
While he wants platforms to become more transparent, in the meantime, he said, users may want to think about the ethical implications of clicking on troubling search results … like his sudden urge to watch a street-fighting clip.
"You just have to look at that emotion. You have to turn it around in your hands. You have to appreciate it for what it is. And then just throw it away."
Mike Rugnetta is the former host of a YouTube series called PBS Idea Channel, which used popular media to explore deep philosophical ideas.
Click "listen" above to hear Mary's interview with Mike Rugnetta.