
What Are Algorithms Teaching Our Children?
Algorithms – the hidden rules that decide what we see online – are now a constant part of childhood. Social media and search engines (from TikTok to YouTube and Google) all use algorithms that learn from what a child clicks, watches or likes, then “serve up” more of the same. Childnet explains that these systems “learn from our online behaviour and decide what content to show us”. No two children see exactly the same feed: what appears in one child’s “For You” page on TikTok or Instagram might be completely different from another’s, even if their interests are similar.
These algorithms can have positive side‑effects (like helping a child find more slime videos or craft tutorials if that’s their hobby). But they also pull children deeper into particular worlds – from influencer culture to fashion feeds to political content – often without us realising how fast it happens. Ofcom finds that 6 in 10 UK children aged 8–17 are aware that platforms use algorithms to tailor content. Many even admit they feel a “lack of control” over this process. In practice, this means children can be subconsciously guided into narrow interests, ideals or purchases simply by what they’ve already viewed.
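For readers who like a concrete picture, the “more of the same” loop is simple enough to sketch in a few lines of code. This is a toy illustration only – real recommendation systems are vastly more complex – and the topics, weighting rule and numbers below are invented for the example:

```python
from collections import Counter

def next_feed(watch_history, feed_size=10):
    # Toy engagement-weighted rule: a topic's share of the next feed
    # grows with the SQUARE of how often it has been watched, standing
    # in for the way real systems over-promote whatever gets the most
    # engagement. (This weighting is our assumption, not any platform's.)
    counts = Counter(watch_history)
    total_weight = sum(n * n for n in counts.values())
    feed = []
    for topic, n in counts.most_common():
        feed.extend([topic] * round(feed_size * n * n / total_weight))
    return feed[:feed_size]

# A child taps just one extra crafts video on day one...
history = ["crafts", "crafts", "fashion", "gaming"]

# ...and (in this toy model) watches everything served for four days.
for day in range(4):
    history.extend(next_feed(history))

crafts_share = Counter(history)["crafts"] / len(history)
print(f"crafts share of everything watched: {crafts_share:.0%}")
# prints: crafts share of everything watched: 86%
```

One extra tap turns a 50/50 mix into a feed that is almost entirely one topic within a few simulated days – the same “rabbit hole” dynamic the rest of this article describes.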
Skewed Career Aspirations
Ask your child what they want to be when they grow up, and increasingly they name jobs like “YouTuber,” “influencer,” or other online content roles. Recent UK surveys confirm this shift. For example, a 2019 LEGO poll found about 30% of 8- to 12-year-olds (UK and US) named vlogger/YouTuber as their dream job – three times as many as those who wanted to be an astronaut. Likewise, a UK study of 11- to 16-year-olds found 17% aiming to be a “social media influencer” and 14% a YouTuber – putting these roles ahead of veterinarian or teacher in popularity. Even back in 2018, telecoms giant O2 reported 30% of British children wanted to be vloggers, with another 14% eyeing software developer careers.
Why the boom? Algorithms on Instagram, TikTok and YouTube relentlessly highlight online fame and riches. Every day, kids see videos of influencers earning money, travelling, and gaining followers. The constant exposure to these curated success stories skews their idea of a “normal” career path. Without context, a child can assume getting rich and famous online is easy – an aspiration amplified by platforms that reward flashy content.
Why it matters: This influencer culture can make realistic jobs seem dull by comparison. It also subtly teaches that popularity (likes/followers) equals success. UK regulators have noticed. Ofcom’s media literacy research shows many children struggle to identify paid promotions, and one 14-year-old girl admitted “I have bought things that influencers use because it looks really good in their videos.” Over time, children may come to measure their worth by online attention rather than personal goals.
What parents can do: Talk about career paths beyond social media. Highlight adult role models and explain how algorithms work. Emphasise that influencers are a small group – remind them that for every online star, there are many people with steady, respected jobs. You might even invite them to watch some normal work-themed videos together (e.g. school library videos or a STEM content creator) so their feed isn’t all pop-star content. Above all, keep the conversation open about what they enjoy learning and creating, not just what’s popular on-screen.
Body Image and Self-Esteem
Algorithms and visuals go hand-in-hand on image-heavy platforms like Instagram or TikTok. By design, they keep showing children more pictures and videos similar to what they liked before. This can be dangerous for young minds forming their self-image. Research shows that 40% of UK teens admit social media images have made them worry about their own bodies. The pressure is highest on girls: in one UK survey 54% of girls (versus 26% of boys) said photos on social media made them feel ashamed or anxious about how they look.
Algorithms tend to exaggerate these pressures. If a child “likes” a popular fitness or fashion post, the system will serve up more content pushing that same ideal body or look – be it super-slim models, “perfect” selfie filters, or heavily edited celebrity snapshots. Studies find kids often internalise these ideals: one 14-year-old girl told researchers, “Makeup and skincare have definitely become more viral on TikTok… You get to see new products and influencers tell you about them… I have bought things that influencers use because it looks really good in their videos”. In short, she’s learning that buying the trending makeup yields social validation.
Why it matters: Continual exposure to polished images can harm self-esteem and body confidence. Many young people compare themselves unfavourably to what they see. UK mental health experts warn this can lead to anxiety, dieting or even depression in severe cases. Ofcom notes girls are less likely than boys to recognise influencer adverts (only 51% of girls spotted paid posts vs 62% of boys), making them more susceptible to marketing that promotes an “ideal look.” When children can’t distinguish reality from PR, they learn to judge themselves by unrealistic standards.
What parents can do: Encourage critical thinking about images. Explain that social media is full of editing and sponsorship. For example, point out that even pretty Instagram photos often have filters or lighting tricks, and celebrities are often photoshopped. Promote body positivity: praise qualities like confidence and health over appearance. Consider following accounts that celebrate real bodies (there are UK-led campaigns and teachers who promote “body confidence” content). And if you notice your child fixating on images, gently ask questions (“How does this picture make you feel? Do you think it’s real or edited?”) to help them process. Remember, regular conversations about online life build resilience. Lastly, use safety tools: many apps allow you to report or hide content that’s upsetting, and some platforms (like Instagram) have “Restrict” or “Muted Words” features to limit hateful or pressuring content.
Political and Ideological Leanings
Algorithms don’t just push images; they also influence what news or opinions young people see. Because the system feeds children more of whatever they clicked, a child who watches one political or religious video may soon find their feed dominated by similar content. Ofcom’s research finds that children are well aware of this “rabbit hole” effect. One 15-year-old described it plainly: “It’s like an algorithm. If you watch [violent content], you get more of it.” A 12-year-old added, “With the algorithms, if you have looked at gore, you are more likely to see the [animal cruelty] video.”
This same mechanic applies to ideas and news. If a teenager watches one conspiracy video or reads an extreme viewpoint, the algorithm is likely to serve more of the same – because it thinks that’s what the user “likes”. Over time this can create echo chambers where a child is rarely exposed to alternative perspectives. UK regulators are paying attention: under the new Online Safety Act, platforms must now assess how their algorithms affect the spread of harmful content to kids. The law explicitly points out that harm can occur “when an algorithm repeatedly pushes content to a child in large volumes over a short space of time”.
Why it matters: If left unchecked, this can subtly skew a young person’s worldview. A child might assume an extreme or misleading viewpoint is “normal” if their feed rarely shows the opposite. For example, even UK teens have been accidentally exposed to hateful online groups or polarised political messages via shared posts or algorithmic recommendations. The effect can be anxiety, fear or even radicalisation if they stumble into harmful communities. Ofcom research also notes that children feel upset or anxious when violent or hateful content suddenly appears on their feeds. So, it’s not just adults at risk; kids can be nudged toward beliefs too.
What parents can do: Stay involved in what your child views online. Encourage a diverse diet of information: for example, show them multiple news sources or talk through current events (ask, “What did you think of that video you saw on topic X?”). If you spot them watching something extreme, gently discuss it and provide context. The most important tip is to talk openly about online experiences: make sure they know they can ask questions if something confuses or worries them. You can also adjust settings – many platforms let you block or “mute” accounts that spread hate, and you should report any illegal or extremist material directly (and remind kids they can do so too). Finally, teach them the word “algorithm” itself: letting a child know “this is a computer program picking more videos like that” can demystify why they suddenly see a lot of similar content.
Consumerism and Material Values
Lastly, algorithms often steer children toward consumer culture. Every “like” they make, every video they watch, becomes data that can trigger new ads or product placements. For instance, if a child browses toy unboxing videos, their feeds will likely fill up with similar flashy toy ads or influencer toy reviews. The goal of many social media platforms (and the brands that advertise there) is to encourage buying and status symbols.
Young people are savvy shoppers in the making: studies show influencer marketing massively boosts sales. One UK survey noted that sales driven by social influencers jumped 37% year-over-year. Children see peers or stars promoting branded clothes, games or beauty products and internalise the idea that new stuff equals being cool. We already saw a UK teen telling researchers she bought makeup products just because they looked good in an influencer’s videos. Over time, this teaches children to measure worth by what they own.
Why it matters: Algorithms can amplify materialistic messages. Rather than valuing creativity or kindness, kids may start thinking they need the latest sneakers or gadgets to fit in. This can hurt self-esteem (if they can’t afford trendy items) and lead to unhealthy spending or screen addiction. In fact, Internet Matters reports parents worry that excessive screen time and online pressure are affecting children’s health and values.
What parents can do: Limit exposure to commercial content where possible. Explain that many videos or posts they see are trying to sell something. When browsing together, point out obvious ads: “Look, that game is advertised by a celebrity – do you really need it?” Consider using ad-blockers or apps’ built-in “ad-free” kids modes (YouTube Kids, for example, has fewer ads than regular YouTube). Encourage hobbies that don’t revolve around buying things – like sports, crafts or books. A simple trick: have “offline days” with no screen time, to remind kids that life is fun without the latest phone notifications or shopping.
Tips for Parents: Building Critical Thinking and Healthy Habits
- Talk regularly about their online life. As Internet Matters puts it, the most powerful step is to “check which apps your children use” and “talk regularly about online safety”. Make it a normal family habit to ask “What did you watch today? What was good or bad about it?” without judgment. This open dialogue alone builds your child’s confidence to question what they see.
- Explain algorithms in simple terms. You might say: “Social media has a little robot that notices what you watch and then shows you more like it.” When your child gets, say, ten more fashion videos after liking one, point out it’s the algorithm at work. Helping them recognise patterns (if you click one thing, similar things come next) lets them take a step back.
- Use platform safety tools. Make sure privacy settings and parental controls are on. For example, YouTube and TikTok have “Digital Wellbeing” or “Family Safety” modes, age filters and screen-time limits. Activate these together with your child and explain why. Teach kids how to report or hide content: for example, TikTok can “reset” recommendations by clearing watch history. If your child keeps seeing upsetting or inappropriate content, show them how to block it or start fresh.
- Set boundaries on screen time. Algorithms prey on limitless scrolling. Agree on reasonable daily limits and encourage breaks (for example, a family “rule of 2s”: around two hours a day, with regular breaks each hour). Internet Matters recommends balancing “active” (interactive) screen time with offline activities. Consider charging devices outside the bedroom at night to protect sleep.
- Model healthy behaviour. Kids learn by example. Be mindful of your own screen habits and talk about yours: “I sometimes find myself scrolling too much on Instagram; I try to take breaks.” Show curiosity and scepticism about everything online, so they see it’s normal to double-check facts and motives.
- Encourage diverse interests. Help them follow a variety of accounts – some fun, some educational, even some foreign-language or nature feeds – so their “algorithm feed” isn’t a loop of the same narrow content. Share books, documentaries or offline experiences to broaden their horizons.
Algorithms are not going away, but by understanding them we can lessen their grip. Support your children in questioning what they see online, and in seeking out balanced, healthy content. By talking openly, setting rules together, and teaching them to hit “pause” on constant feeding, we help them build a critical filter of their own.