TikTok Reportedly Leads Child Accounts to Pornographic Content In Just a Few Taps
A new study has found that TikTok directs children's accounts to adult videos after only a few taps.
Testing Approach
A campaign organization set up simulated profiles using a 13-year-old's birth date and activated the app's "restricted mode", which is designed to limit exposure to inappropriate content.
Researchers found that TikTok suggested sexualized and explicit search terms to the simulated accounts, even though they were created on clean phones with no prior browsing data.
Troubling Search Prompts
Search phrases suggested under the "recommended for you" feature included "very very rude skimpy outfits" and "explicit content featuring women", and escalated to terms such as "graphic sexual content".
For three of the accounts, the adult-oriented recommendations appeared immediately.
Rapid Access to Explicit Content
After just a few taps, the research team found pornographic content ranging from women flashing to graphic sexual acts.
The organization said the material sought to evade detection, typically by embedding it within an otherwise benign image or video.
For one profile, the process took just two taps after signing in: one on the search feature and a second on the recommended term.
Regulatory Context
The research organization, whose mandate includes investigating technology companies' influence on societal welfare, said it performed two batches of tests.
The first took place before child safety measures under the British online safety legislation came into force on 25 July, and the second after the rules took effect.
Concerning Discoveries
Investigators noted that two videos featured someone who appeared to be below the age of consent, and said both had been reported to the Internet Watch Foundation, which tracks online child sexual abuse material.
The research organization alleged that the app was in breach of the online safety law, which requires platforms to prevent children from viewing inappropriate material such as explicit content.
Regulator's Position
A spokesperson for Ofcom, which is tasked with overseeing the legislation, said: "We appreciate the work behind this study and will analyze its conclusions."
Ofcom's codes of practice for complying with the act state that online services carrying a medium or high risk of showing harmful content must "modify their programming" to remove inappropriate videos from young users' feeds.
The platform's own rules prohibit pornographic content.
Platform Response
The video platform said that after being contacted by the research group, it had removed the violating content and made changes to its search suggestion feature.
"Upon learning of these assertions, we took immediate action to investigate them, take down videos that contravened our rules, and introduce upgrades to our search suggestion feature," commented a spokesperson.