
Caught in the Web: Personalised Algorithms and the Impact on the Wellbeing of Young People

News 16th November 2023


What is algorithm personalisation?

Social media algorithms collect data from users based on their interactions with different types of online content to recommend similar content that the user is likely to enjoy. Probably the most pervasive example of this today is TikTok, a social media platform showing short-form videos.

The main feature of this platform is the ‘For You’ page (FYP), which offers users an endless stream of content based on their previous interactions. This means users primarily view content from accounts they are not subscribed to; you don’t have to follow a creator for their content to appear in your feed. Since the TikTok boom, many other platforms, including YouTube and Instagram, have created their own versions of the For You Page, feeding users content they have not actively followed. Whilst this allows users to discover more content aligned with their interests, personalised algorithms also carry a series of risks.

What are the risks?

Exposure to unsuitable content

One such risk is the unpredictability of what content the user sees. Users are often exposed to content beyond what they explicitly choose to follow. This makes it difficult to control what young people see on the internet: they could be exposed to unsuitable content, and if they interact with it, the algorithm will show them more of the same.

This challenges the ability of parents and guardians to curate a safe online environment for their children, as the content may include elements or topics that are contrary to their values or deemed inappropriate for their age.

Targeted advertising

Personalised algorithms allow companies to market products to a specific audience. Creators being paid to talk about certain products, or using affiliate links (links through which the creator receives a commission on any resulting purchase), may encourage exaggerated or false advertising.

The targeted nature of personalised algorithms amplifies this risk, as users are constantly bombarded with content and endorsements that align with their preferences. This constant exposure can create a sense of trust and familiarity, making it challenging for individuals to critically assess the authenticity of the information or the merits of the promoted products.

Infinite Scrolling

Many platforms based around personalised algorithms, including TikTok and Instagram, have an infinite scrolling feature, meaning the tailored content is never-ending. Short-form videos or pictures are highly engaging, and when tailored to the user, they can become addictive.

Young people, still developing self-regulation skills, may find it challenging to resist continuously scrolling, with reports that many can spend around 2 hours a day on TikTok alone. This can negatively impact sleep, mental health, and academic performance, highlighting the need for responsible online usage and balanced screen time.

Echo-Chambers

Personalised algorithms restrict exposure to diverse opinions and perspectives, reinforcing confirmation bias. This can create online echo chambers: environments where individuals are predominantly exposed to information, opinions, and perspectives that align with their existing beliefs and preferences.

This is disproportionately dangerous to young people, as they are still in the process of forming their values and beliefs and have less exposure to real-world perspectives. One of the many benefits of the internet is that it allows users to learn about people from different backgrounds and their views; however, targeted content may undermine this.


What can be done to minimise the risk?

Parents, teachers, and caregivers can take a proactive role to minimise the risk personalised algorithms pose. Traditional advice regarding monitoring internet activity and ensuring young people only follow appropriate accounts, whilst still important, is not the complete answer.

As young people are digital natives, having grown up with technology as a central component of daily life, they are likely to be one step ahead of their parents, teachers and caregivers when it comes to internet usage. Additionally, a recent report from the EU Kids Online survey has shown that young people don’t tend to report seeing inappropriate content to their parents. Therefore, equipping them with the correct information and tools to make them positive digital citizens is likely the best way forward.

Along with the traditional block, unfollow, and mute functions detailed in most current online safety recommendations, users can now mark algorithmically recommended content as ‘not interested’. This allows young people to dictate the types of content they don’t want to see more of.

Additionally, young people should know the consequences of interacting with content (that it will show up more often) and be encouraged to interact only with content they want to see more of. This prevents the amount of inappropriate content they see from increasing, and also discourages leaving hateful or upsetting comments. Together, these actions can help young people to control personalised algorithms.

Despite the reduced control over the content that is presented to you, age restrictions are still in place for most platforms, including TikTok and Instagram. Ensuring young people have set the correct age on their profile will prevent content that is deemed by the platform to be inappropriate for their age from being shown to the user.

Screen-time notifications can be set up on most devices to prevent young people from spending too long scrolling. Whilst it sounds controlling, it can also just be used as a gentle reminder of how long they have spent on a particular app that day, which may encourage them to take a break.

Encouraging digital literacy is less of a quick fix than the advice given above; however, it is perhaps the most important, allowing young people to take back control of their internet use. Teaching young people to be critical about what they see online, from influencer marketing to political views, is arguably the most effective way to protect them from harm.

Southwest Grid for Learning (SWGfL) has some valuable resources on current online safety topics. Furthermore, ProjectEVOLVE is a free digital literacy toolkit with a variety of teaching and learning resources.

External Links

SWGfL: www.swgfl.org.uk/

ProjectEVOLVE: www.projectevolve.co.uk/

