A hot potato: If an alien race decided whether to destroy humanity based on how people act online, we'd be in a lot of trouble. But research suggests that while the internet can be a cesspit of hate-filled toxicity, the vast majority of this content comes from a small percentage of highly active users.

The online world isn't a true reflection of everyday life for most people. We're not met with a barrage of abuse from angry strangers as we walk down the street. There's no palpable sense of anger and hatred simmering away under the surface all the time, exploding for seemingly trivial reasons.

Jay Van Bavel, a professor of psychology at New York University, along with Claire Robertson and Kareena del Rosario, published a paper looking at the disconnect between the online and real worlds.

According to the trio's findings, the internet, or social media specifically, is a funhouse mirror version of social norms. It amplifies extreme voices while muting the moderate, nuanced, sensible, and diplomatic ones.

Much of this distortion comes from just a handful of people who are almost always online and posting. Van Bavel writes that just 10% of users produce around 97% of political tweets.

X, formerly Twitter, is a good example of this phenomenon. The platform has hundreds of millions of users, yet the vast majority of its political posts come from a tiny fraction of them. One example is owner Elon Musk, who posted 1,494 times in his first 15 days of implementing government cuts as head of DOGE.

It's a similar story with misinformation. Just 0.1% of users share 80% of fake news. A report by the Center for Countering Digital Hate (CCDH) cited by the White House in 2021 found that the vast majority of Covid-19 anti-vaccine misinformation and conspiracy theories on Facebook originated from just 12 people, who were dubbed the "disinformation dozen."

Instagram, Reddit, and other platforms all have the same problem. While only a comparatively small number of users engage in toxic behavior or share misleading content, they post so much and so often that they are responsible for the overwhelming majority of it.

It's these super-users who shape our collective impression that the internet represents the worst side of humanity, writes Van Bavel. Instead of seeing a representative sample of opinions, we're exposed to a flood of extreme, emotionally charged content.

Platform design and algorithms also play a part, amplifying these posts for maximum engagement so more users see them. This can cause people to exaggerate their beliefs or repeat outrageous claims to get more attention – rage bait posts have become particularly popular in recent times, mostly because this engagement tactic tends to work.

Because of this, many people believe the world is a more polarized, angrier, hate-filled, and deluded place than it actually is. Not everyone has an us-versus-them attitude, or an unnerving willingness to threaten you with death because you liked/didn't like the new Superman movie.

In a series of experiments, users were paid a few dollars to unfollow the most divisive political accounts on X. After a month, they reported feeling 23% less animosity toward other political groups. The effect stuck: almost half the participants declined to refollow those accounts once the study was over. So, if you want to be less pessimistic about people and the world we live in, try to ignore or avoid the few angry online trolls who represent the worst of us.

Masthead: Ziko liu