A recent study suggests that YouTube’s recommendation algorithm may expose children to graphic and violent gun-related content despite the platform’s content moderation policies. Researchers created profiles mimicking typical nine-year-old and 14-year-old boys and found that accounts following YouTube’s recommended videos were fed an onslaught of violent content, including videos related to school shootings.
AP News reports that despite Google’s strict YouTube content moderation policies, a recent study contends that the platform’s recommendation algorithm may expose kids to graphic and violent content.
Researchers at the Tech Transparency Project, a nonprofit organization, conducted a study to better understand the relationship between YouTube videos and gun violence. They created YouTube personas mimicking the behavior of typical nine- and 14-year-old boys interested in video games. The team found that accounts following YouTube’s recommendations were bombarded with an average of 12 firearm-related videos per day, including disturbing material related to school shootings, tactical gun training, and how-to videos for firearm modifications.
“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at a gun shop — but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project. “It’s not the video games, it’s not the kids. It’s the algorithms.”
The study also found that similar videos frequently reappeared under slightly different names after being taken down for breaking YouTube’s rules, calling into question the effectiveness of the platform’s content moderation and its ability to protect young users from harmful content.
The company has defended its child safety measures, noting that accounts for users under 13 are linked to a parental account and that users under 17 must have their parents’ permission before using the service. A YouTube spokesperson said, “We offer a number of options for younger viewers… which are designed to create a safer experience for tweens and teens.”
Breitbart News reported in 2022 that YouTube was preventing video creators from marking their horror videos as appropriate only for viewers over the age of 18. In fact, the platform insisted that horror videos were safe for kids:
Ars Technica reports that Google’s content moderation process has made yet another error, this time flagging horror videos as “for kids” and preventing creators from changing the age range to 18 and up. One example comes from the horror series Local58TV, created by Kris Straub.
The YouTuber checked his account over the weekend to find that his not-for-kids content was picked up by YouTube’s moderation AI and marked as for kids.
“For Kids” means that the content can be included in the “YouTube Kids” app designed for children, which features “safe” curated YouTube videos for younger users. The “Kids” designation means the videos are forced to comply with the U.S. Children’s Online Privacy Protection Act (COPPA), which requires that comments on videos be disabled.
The study echoes similar concerns raised about TikTok, which has also come under fire for allegedly steering its younger users toward harmful content. As these platforms remain popular among kids and teenagers, demand for stricter content moderation and protection mechanisms continues to grow.
Read more at AP News here.