Start with the context of kids begging their parents for a phone or to join different social media platforms
Snapchat - to take photos with cool filters (59% of teens)
Instagram - to stay connected with friends (62% of teens)
TikTok - to watch funny cat videos (two-thirds of teens use it)
Parents say yes, so what are you in for?
Stock video, shot from behind, of a child texting or tapping on their phone
Zoom in
Screen-recording of making a new social media account on three main platforms:
Instagram
TikTok
Snapchat
Set a timer
Statistics
Facebook, Instagram and TikTok allow children, some as young as 13 years old, to be directly targeted with a stream of harmful content within 24 hours of creating an account.
New TikTok accounts in our study were recommended self-harm and eating disorder content within minutes of scrolling the app’s For You feed.
Suicide content was recommended within 2.6 minutes
Eating disorder content was recommended within 8 minutes
A new TikTok account set up by a 13-year-old user that views and likes content about body image and mental health will be recommended that content every 39 seconds.
About half of teens (48%) say social media platforms have a mostly negative effect on people their age
In the online world, many young people are exposed to violent, abusive, misleading, or sexual content that they're not developmentally ready for. In fact, a 2022 eSafety report states that 62% of teens had been exposed to harmful content online.
Suddenly, they see something harmful
Zoom out into a wider mosaic of harmful content reports
Blurred
Video captions
Kids today are constantly asking their parents for smartphones and access to social media platforms. Snapchat attracts 59% of teens who want to take photos with cool filters. Instagram draws 62% of teens who want to stay connected with friends. And TikTok captures two-thirds of teens who love watching funny cat videos and entertaining content.
Once parents give permission, children quickly set up accounts across multiple platforms. They create Instagram profiles to connect with friends, join TikTok to access entertaining videos, and sign up for Snapchat to use fun filters. But what happens next? The clock starts ticking: within just 24 hours, these platforms begin targeting young users with potentially harmful content.
The statistics are alarming. Research shows that TikTok's algorithm can recommend suicide-related content to new accounts within just 2.6 minutes of scrolling. Eating disorder content appears within 8 minutes. For a 13-year-old account that views and likes body image and mental health content, that material is recommended every 39 seconds. Studies also reveal that 62% of teenagers have been exposed to violent, abusive, misleading, or sexual content online that they're not developmentally ready to handle.
Despite the risks, the reality is complex. Nearly half of teenagers (48%) acknowledge that social media platforms have mostly negative effects on people their age. Yet they continue using these platforms due to peer pressure, fear of missing out, and social connection needs. This creates a troubling paradox where young people recognize the harm but feel compelled to participate anyway, making them particularly vulnerable during crucial developmental years.
So what are parents really in for when they say yes to social media? They're opening the door to immediate exposure to harmful content, algorithm-driven targeting of their child's vulnerabilities, and significant mental health risks. The loss of childhood innocence happens faster than ever before, requiring constant parental monitoring and intervention. The key is parental awareness and digital literacy: understanding these platforms' true nature and implementing protective measures before handing over that first smartphone.