Social media (SoMe) has become an inextricable part of our lives, connecting us with loved ones, informing us and entertaining us. But SoMe also harbours significant dangers.

Tempers flared at the January 2024 US Senate hearing of the heads of the major social media platforms. Senators clashed heatedly with top executives from Meta (Facebook, Instagram), TikTok, X (Twitter), Snap and Discord over the dangers their services pose to children and young people.

A deluge of misleading messages

The daily volume of data published on social media is astonishing. According to Statista, a staggering 68.7 billion social media posts were published worldwide every day in 2022. Of these, 53.6 billion were text posts, 13.4 billion were photos and 1.7 billion were videos. This figure is projected to have climbed to 72.1 billion daily posts in 2023.

The average person spends a considerable two hours and 28 minutes daily on social media. A 2023 Pew Research Center study found that 84 per cent of social media users encountered political content and 66 per cent saw fake news in the previous month.

Estimates suggest that every user sees an average of 100 posts daily. Of those deemed newsworthy, a worrying 50 per cent are now believed to be fake news or propaganda designed to mislead. Given the explosion of user-generated SoMe content, the shortage of verified posts from reputable media sources suitable for fact-checking is concerning.

Algorithms and the filter bubble trap

Social media thrives on the ‘bait and hook’ approach, tailoring content to appeal specifically to each user. This keeps users engaged, thanks to the platforms’ sophisticated, often AI-powered algorithms.

Users unwittingly reveal their preferences by posting, liking, sharing and commenting, allowing the algorithms to learn what ‘ticks their boxes’, what they want to communicate, see, hear and what stirs their emotions.
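In spirit, the feedback loop described above can be sketched in a few lines of code. This is a purely hypothetical, minimal illustration — the signal names, weights and topics are invented for the example, not taken from any real platform — but it shows the mechanism: every like, comment or share nudges the user’s interest profile, and the feed then surfaces whatever scores highest against that profile.

```python
# Hypothetical sketch of an engagement-driven feed ranker (illustrative only;
# real platforms use far more signals and machine-learned models).
from collections import defaultdict

# Assumed relative strength of each engagement signal (invented weights).
SIGNAL_WEIGHT = {"like": 1.0, "comment": 2.0, "share": 3.0}

def update_interests(interests, topic, signal):
    """Nudge the user's interest in a topic each time they engage with it."""
    interests[topic] += SIGNAL_WEIGHT[signal]
    return interests

def rank_feed(posts, interests):
    """Order candidate posts by the user's learned interest in their topic."""
    return sorted(posts, key=lambda p: interests[p["topic"]], reverse=True)

interests = defaultdict(float)
for topic, signal in [("politics", "share"), ("politics", "like"), ("sport", "like")]:
    update_interests(interests, topic, signal)

feed = rank_feed([{"id": 1, "topic": "sport"}, {"id": 2, "topic": "politics"}],
                 interests)
# The politics post (interest 4.0) now outranks the sport post (1.0) —
# which is exactly how engagement feeds back into what the user is shown next.
```

The self-reinforcing quality is visible even in this toy version: whatever the user engages with most is what they are shown more of, which invites yet more engagement with the same topic.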

From 2008 to 2014, Stanford professor Michał Kosiński, a leading psychometrics specialist, developed an algorithm that could assess a user’s personality from their Facebook likes alone. With just 70 likes, it judged users better than their friends could; with 150 likes, it outperformed their parents; 300 likes surpassed their partners; and a mere 350 likes were enough to outdo even the users themselves.

This is precisely how the narrow selection of 100 daily posts, presented to each user from a vast pool of over 72 billion published posts, comes about. That’s a mere 0.00000014 per cent of the entire SoMe sea. Figuratively, it’s like holding just one grain of sand from a square kilometre of desert.
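As a back-of-the-envelope check, the share works out as follows (assuming the Statista projection of 72.1 billion daily posts and the estimate of 100 posts seen per user per day cited above):

```python
# Back-of-the-envelope check of the share of all posts a single user sees.
daily_posts_worldwide = 72.1e9   # Statista projection for 2023
posts_seen_per_user = 100        # estimated posts a user sees per day

share = posts_seen_per_user / daily_posts_worldwide
print(f"{share:.2e} of all posts, i.e. {share * 100:.8f} per cent")
# → 1.39e-09 of all posts, i.e. 0.00000014 per cent
```

Note that the raw fraction (about 1.4 billionths) and the percentage (about 0.00000014 per cent) differ by a factor of 100 — an easy place for such figures to go astray.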

An objective view of the world? Hardly

Does this paint an accurate picture of reality, of the world? Is it well-founded, verified information? Absolutely not!

In an era when our sense of perspective, objectivity and certainty is eroding, reliable, fact-checked information is more critical than ever. Yet countless users allow themselves to be steered by this filtered stream, forming opinions and judgements on its basis, essentially undergoing a degree of brainwashing.

They consume only what aligns with their pre-existing worldview and assumptions about the world, people, politics and current events.

This is what they focus on, value and find captivating. In turn, their assumptions and views are repeatedly confirmed and reinforced, creating a distorted reality that aligns solely with their expectations.

This ultimately leads to a negative, often fear-laden spiral of worry about the future, fostering withdrawal from reality and responsibility. This is why we view the power of social media with considerable concern.

Is social media the true culprit?

No, not necessarily. Social media platforms themselves aren’t the true villains. Instead, they expose a deeper issue: the tendency of many individuals to neglect actively seeking accurate and diverse information. They opt for the convenience of being passively ‘fed’ content, often falling prey to algorithms that reinforce their existing biases.

This, unfortunately, leads them to miss out on the rich tapestry of reality, encompassing the positive, the successful, the beautiful, the good and the objective. They become trapped in a distorted echo chamber, perceiving themselves as helpless victims of the world, rather than empowered shapers of their own lives.

This is strongly reminiscent of the famous social experiment ‘The Third Wave’, which history teacher Ron Jones (b. 1941) carried out in 1967 with pupils at Cubberley High School in Palo Alto, California, and which was adapted for the screen in 1981 as The Wave.

Jones wanted to familiarise his students with the topic of autocracy during a project week and let them experience how a dictatorship is created. As he manipulated information and the rules of the game, the situation quickly spiralled out of control. The momentum it triggered forced him to call the experiment off on the fifth day.

“The Wave is frighteningly realistic, timeless and highly topical,” critics wrote.

Despite all the ‘realism’, let’s remain optimistic!

Recognising the challenges with social media doesn’t mean succumbing to negativity. Rather, it empowers us to take action. Let’s actively seek diverse information, engage in civil discourse and uplift others with messages of hope and action.

By fostering a level-headed, yet hopeful community, we can navigate the complexities of the online world and create a more positive impact. Despite all our ‘realism’, let us remain one thing above all: optimistic!

Reinhold M. Karner, FRSA, is an entrepreneurship and start-up evangelist, multiple chairman (e.g. AP Valletta), corporate philosopher, entrepreneur, author, university lecturer and fellowship connector of the Royal Society for Arts, Manufactures and Commerce (RSA) for Malta and Austria.
