A case for reading—and weeding—the comments on social media platforms like Facebook


By Molly Callahan

News at Northeastern

The ousting of broadcaster and noted conspiracy theorist Alex Jones from several social media platforms was a watershed moment, said Joseph Reagle, an associate professor at Northeastern who studies online culture. For the first time, and in a unified way, social platforms are formally defining the kinds of interaction they want to foster rather than letting users run wild.

For much of their existence, social media sites such as Facebook, YouTube, and Twitter, along with Apple’s podcast platform and, more broadly, blogs and comment sections, were seen as marketplaces of ideas, where everyone was welcome to say whatever they wanted.

Recently, Reagle, who wrote a book on the topic called Reading the Comments: Likers, Haters, and Manipulators at the Bottom of the Web, has been thinking of them differently.

“As I saw online communities overtaken by haters and trolls, I started thinking about them more like gardens,” he said. “You can’t just allow people to litter and expect you’ll have a beautiful garden. You have to provide fertilizer; we have to weed the venues where we want to spend our time.”

Each platform is weeding itself differently, Reagle said. Companies like Apple (whose podcast platform Jones used), Facebook, YouTube, and Vimeo banned Jones, InfoWars, and other users whose posts “could lead to harassment or violence,” he said. By not banning Jones or InfoWars, Twitter has taken “more of an absolutist free speech stance,” Reagle said.

Don’t read the comments. Or do.

With or without Jones, internet discussion platforms are notoriously hostile places that quickly devolve into name-calling or Nazi-invoking speech.

Reagle said the biggest problem for online communities that are trying to foster civil discourse is that there are often just too many people chiming in.

“Sometimes you’ll see a call to action at the bottom of a news article that says, ‘6,000 people have posted in response to this; join the conversation.’”


“Imagine 6,000 people in a room trying to have a conversation. That just doesn’t even really make sense,” he said.

High-quality discussion gets lost in the mix when so many people are “joining the conversation.”

In his book, Reagle cited the research of evolutionary psychologist Robin Dunbar to describe what is likely the ideal scale for creating high-quality discussion.

By studying the social habits of primates, Dunbar found that the size of a species’ social groups tracks its brain size. Extrapolating to humans, he estimated that people can maintain no more than roughly 150 stable relationships.

“After that, those relationships break down and you’re more likely to encounter people you don’t know,” Reagle said.

Gossip and public debate have been plastered in public places going back to ancient Rome and Greece. But the internet age is the first time people have been confronted with such an overwhelming glut of discourse and discussion, Reagle said.

Making sense of the chaos

So, how do social media companies help people find the conversations they actually do want to join? Companies like Reddit use an up-voting and down-voting system. Users can up-vote posts they think are worthy and down-vote ones that aren’t. Posts with more up-votes rise to the top of a thread of discussion and those with more down-votes sink to the bottom.
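To make the mechanism concrete, here is a minimal sketch in Python of the general idea behind vote-based ranking, assuming a simple net score (up-votes minus down-votes). The Comment class and rank function are illustrative assumptions only, not Reddit’s or Facebook’s actual ranking code, which factors in additional signals.

```python
from dataclasses import dataclass


@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net score: heavily up-voted posts rise, heavily down-voted ones sink.
        return self.upvotes - self.downvotes


def rank(comments: list[Comment]) -> list[Comment]:
    """Order a thread so the highest-scoring comments appear first."""
    return sorted(comments, key=lambda c: c.score, reverse=True)


if __name__ == "__main__":
    thread = [
        Comment("Thoughtful, sourced reply", upvotes=120, downvotes=4),
        Comment("Glib one-liner", upvotes=15, downvotes=40),
        Comment("On-topic question", upvotes=33, downvotes=2),
    ]
    for c in rank(thread):
        print(f"{c.score:>4}  {c.text}")
```

Even in this toy version, the weakness Reagle describes is visible: the sort only reflects how many people clicked, not why they clicked.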

Earlier this year, Facebook rolled out a similar rating system among a small group of users in the United States and New Zealand to test whether it would encourage better public discussions.

This is a start, Reagle said, but the system ultimately falls victim to a series of complex issues.

There’s the problem of reducing complex human behavior to a simple up or down vote, he said. Even more sophisticated measurement scales, such as the 10-point pain scale widely used in medical care, are subject to each person’s individual experience of pain.


Reagle offered an example from his own life. He noticed that the same reviewer gave his book a certain number of stars on one website and a different number of stars on another website, though both sites used the same scale. When he asked why, the reviewer explained that the different audiences had different understandings of the rating system on each site.

People assign different meanings to online ratings and rankings, Reagle said. “Without understanding the motivation, it’s difficult to standardize them.”

Look no further than the reviews of any Amazon product to see how differently people interpret the same rating scale.

Even if all of humanity could agree on a standard way to use rating systems, such systems can still fall prey to “gamification,” Reagle said.

Rather than working together to promote thoughtful discussion and discourage glib comments, people will often band together to promote comments that align with their personal beliefs, regardless of their quality.

“Where up- and down-voting is supposed to be associated with the sentiment in the post, there’s a long history of people [forming a brigade] to up-vote people and posts that share a similar ideology as themselves,” he said.

“The concern is that people can band together to make something falsely popular,” Reagle said.

What has become clear, in the wake of Alex Jones and Reddit, is that there is no easy answer to the question of what to do with the comments section.

“In this world of so many comments, how do we make sense of it all?” Reagle asked. “By no means is there a panacea. We have to keep going.”
