Young People Want Greater Punishments For Poor Behavior On Social Media

The last few years have seen considerable growth in the so-called “culture wars”, with accusations of cancel culture sitting alongside the rise of the “Karen” meme, both typifying a world that some argue is too fragile and too quick to demand action against opinions or content it finds objectionable.

This arguably reached its peak when Twitter banned Donald Trump from its platform during the storming of the U.S. Capitol by his supporters.  The episode marked the end of an ugly stream of misinformation and tantrums from the former president, with opponents confident that the platform would be better for his absence.

It’s the kind of action that new research from the University of Michigan suggests young people today want to see more of.  The study shows that 62% of adolescents and young adults aged 14 to 24 want a sliding scale of consequences for wrongs committed online, ranging from an apology, through deletion of the offending content, to an outright ban on the offender.

Social harm

The survey also showed that many young people don’t have much faith in the ability of social media companies to provide a fair and satisfactory resolution to any problems they face online.

“Young people’s responses likely mirror shifting tides in how the general population in the U.S. views social media companies right now—with a mix of uncertainty, distrust and concern,” the researchers explain. “In light of these reactions, it’s not surprising that youth would also be experiencing this kind of distrust.”

“Young people are growing up in an era where they are very online, and many will be exposed to critiques of the economy, capitalism, social inequalities and other issues,” they continue. “They have an understanding that social media companies are profit-driven enterprises, and this might clash with companies’ ability or desire to create safe and just experiences.”

Content moderation

All social media platforms have some form of community guidelines, which tend to be enforced through a combination of human moderation and automated technology.  When violations are spotted, the typical response sees the content removed and the offender either warned or banned.  It’s rare, however, for any kind of remediation to be offered to the victim themselves.

Previous researchers have likened this response to a criminal justice system in which the perpetrator is punished but the victim lacks any sense of justice being served.  They argue that a better response would be a more restorative form of justice that emphasizes accountability for the offense alongside reparation for the victim.

“Across studies, we see that there are significant differences in the kinds of harms and severity of harms people experience when they are online,” the researchers explain. “We also see differences in their preferences for how companies respond to those harms.”

Despite the distinct character of the various social media platforms, the researchers suggest that most rely on a one-size-fits-all approach to managing online harassment.  This fails to adequately account for the nature of each community or the needs and preferences of the individuals who frequent it.

“I think it’s time to rethink many of the fundamental premises of social media,” the researchers conclude. “This could include creating regulation that is focused on reducing harm and increasing well-being, or it could include a healthier public-funded ecosphere of smaller social media platforms that create diverse opportunities to participate online, or it could include more creative approaches to design that prioritize justice, healing, slowing down, community accountability and other alternative approaches for how people might be together in shared spaces.”
