The Popular Video Platform Allegedly Leads Children's Profiles to Pornographic Content Within a Few Clicks

According to a new study, the widely used social media app has been found to guide minors' profiles to pornographic content after only a few taps.

Testing Approach

A campaign organization established simulated profiles using a minor's date of birth and turned on the platform's content restriction feature, which is meant to limit exposure to adult-oriented content. Investigators observed that TikTok proposed sexualized and explicit search terms to the simulated accounts, which were set up on new devices with no prior browsing data.

Concerning Search Suggestions

Keywords proposed under the "you may like" feature included "very very rude skimpy outfits" and "inappropriate female imagery", and then escalated to terms such as "graphic sexual content". For three of the accounts, the sexualized searches were recommended immediately.

Fast Track to Adult Material

After just a few taps, the study team encountered pornographic content ranging from women flashing to explicit intercourse. Global Witness said the content appeared designed to bypass moderation filters, usually by embedding the video within a harmless-looking image or clip. For one profile, the process took just two interactions after logging in: one click on the search bar and one on the proposed query.

Legal Framework

Global Witness, whose remit includes researching digital platforms' effects on societal welfare, reported performing two batches of tests: the first before child protection rules under the British online safety legislation came into force on 25 July, and the second after the rules took effect.

Alarming Results

Researchers added that several clips appeared to feature someone below the age of consent and had been reported to the online safety group, which tracks online child sexual abuse material.
The campaign group asserted that the video platform was in violation of the UK safety legislation, which obliges digital platforms to prevent children from encountering harmful material such as explicit content.

Government Position

A spokesperson for the UK communications regulator, which is charged with enforcing the law, commented: "We acknowledge the work behind this research and will review its results." Ofcom's codes for complying with the legislation specify that online services carrying a significant risk of showing harmful content must "modify their programming" to filter out inappropriate videos from children's feeds. The app's policies ban adult videos.

Platform Response

TikTok said that, after being notified by the organization, it had removed the offending videos and made changes to its search recommendations. "Upon learning of these allegations, we acted promptly to examine the issue, remove content that violated our policies, and implement enhancements to our recommendation system," a spokesperson said.