The Negative Societal Effects of Algorithmic Content

When Facebook launched in 2004, most "social media" sites served content in a chronological feed: users simply saw the most recent posts from the accounts they followed, and the feed had a definite end point, since scrolling back far enough caught the user up on everything they had missed and content would begin repeating itself. Since then, as the major social media players have become financialized, nearly everything has shifted toward personalized algorithmic content.

Algorithmic content is now everywhere: music, videos, search engines, and more. It is the dominant content-serving model, largely because it supports targeted advertising, which funds most of the internet. This shift has had negative effects on the world overall, although as a Canadian I can only speak with confidence to the effects in my own country; broadly, these approaches have made Canada less pleasant for most who reside here, even for individuals who do not use a computer or phone.

Algorithmic content has created, or exacerbated, a number of negative societal outcomes.

YouTube's Algorithm: A Concrete Example

YouTube's content algorithm is extremely polarizing and poorly constructed. While Facebook (Meta) receives most of the criticism, YouTube's algorithm is equally dangerous, if not more so, since it is consumed much more heavily by younger internet users. Once a user clicks on a topic, they are fed an endless stream of echo-chamber content, selected mostly on engagement metrics and with little regard for substance. Additionally, if a user searches for a very specific topic that undoubtedly exists somewhere in YouTube's archives, they are instead fed what YouTube "thinks" they want to see.

As a male user who mostly browses YouTube while logged out, with fresh cookie sessions, there is one type of content YouTube pushes to me more than anything else, especially once I have clicked on anything remotely related to it. It is hard to label with a single word; if I had to give it a genre, I would call it sad & lonely alpha-male content. It floods the feed if YouTube decides you are a typical male, and it includes topics like guns, politics, getting rich, and many pointless videos whose thumbnails are close-ups of women's bodies in tight clothes. Below are a few examples of videos YouTube showed me in a fresh session, mixed in with other sports content, after viewing a single "NHL highlights" video from an Ontario IP address to identify myself as a man and refreshing the YouTube home page:

Angry Political Content

Pierre Poilievre YouTube Video Thumbnail

Shill Content

YouTube Shill Shilling Home Gym Products Video Thumbnail

Women's Body Thumbnails

YouTube thumbnail using a woman's body as click-bait

If I click on any of the above videos and refresh the home page again, I am once again fed more of the same. To continue the example, I clicked on the angry political piece above, then returned to the home page and refreshed. I am now served sports and politics videos with content like this mixed in:

Transphobic Content

YouTube thumbnail shaming Elliot Page for being trans

Gun Content

YouTube thumbnail promoting guns

Misinformation Content

YouTube thumbnail promoting false information

...and more women's body thumbnail click-bait.

Within two clicks, starting from sports, it is easy to see how quickly the YouTube algorithm begins promoting negative content to the Typical Canadian Man. I believe this is partly why Ontario is such a collectively miserable province: not because of YouTube alone, but because of the combined weight of negative algorithmic content consumed by generally unhappy people (which makes them even less happy). If I happen to be an individual already sympathetic to a portion of those views, I will likely click on some of these videos and end up, in the long run, with a highly customized feed of negative content, without ever again being exposed to differing viewpoints.

The effects of this radicalization toward extreme viewpoints, combined with the lack of any mechanism for validating the integrity of video content (which is neither possible nor something I am advocating for), have very real ramifications for the general public. In the case of the Typical Canadian Man, if it is not already obvious, a couple of clicks deliver a heavy dose of content promoting hatred, violence, and emotionally driven false information. YouTube's algorithm undoubtedly breeds mistrust, hate, and the potential for violence in the individuals most at risk of consuming this content in large quantities.

The Business Case for Reverting to Chronological Content Feeds

Reverting to user-curated, chronological content feeds is the best way to avoid the pitfalls of algorithmic content. This is why many comparatively small content sites continue to exist despite the vast gap in staff and funding compared with the major internet players: the mechanism for content discovery is still controlled by the user. YouTube happens to be an exception because it is a near-monopoly in its domain; it can afford to botch content delivery this badly without losing many users.

Most of today's most-used web properties began as chronological feeds: Google, Facebook, Instagram, LinkedIn, etc. That is what users wanted, so they adopted these services, and that mass adoption is what gave them their power. When given the choice, most users prefer chronological feeds to personalized algorithms. This is partly why Bandcamp maintains its popularity even against the capital of Spotify: chronological feeds are a much better system for music discovery.
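The difference between the two models can be sketched in a few lines of code. The names below (`Post`, `chronological_feed`, `engagement_feed`, the sample accounts) are illustrative only, not any platform's actual API. The point is that a chronological feed is just a follow-list filter plus a timestamp sort, both controlled by the user, while an engagement-ranked feed ignores the follow list entirely:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int      # e.g. seconds since epoch
    engagement: float   # the click/watch-time score a platform optimizes for

def chronological_feed(posts, following):
    """User-curated feed: only followed accounts, newest first.
    Has a natural end point: once you scroll past the oldest unseen
    post, you are caught up."""
    return sorted(
        (p for p in posts if p.author in following),
        key=lambda p: p.timestamp,
        reverse=True,
    )

def engagement_feed(posts):
    """Algorithmic feed: ranks everything by engagement score,
    regardless of who the user actually follows."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

# Hypothetical sample data
posts = [
    Post("friend_a", timestamp=100, engagement=0.1),
    Post("friend_b", timestamp=200, engagement=0.3),
    Post("rage_bait_channel", timestamp=150, engagement=9.9),
]
following = {"friend_a", "friend_b"}

chrono = chronological_feed(posts, following)   # friend_b, then friend_a
ranked = engagement_feed(posts)                 # rage_bait_channel first
```

In the chronological version, the hypothetical high-engagement rage-bait channel never appears, because the user does not follow it; in the engagement-ranked version, it is promoted to the top of everyone's feed precisely because it provokes the most clicks.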

If you are building a content-based product, use a chronological feed, and abandon the failed experiment of personalized algorithms.