Britain’s data regulator is gathering information on Snapchat to determine whether the US instant messaging app is doing enough to remove underage users from its platform, according to two people familiar with the matter.
In March, Reuters exclusively reported that Snap Inc, the owner of Snapchat, had removed only a small number of children under 13 from its platform in Britain in the previous year, even though the UK media regulator Ofcom estimates that thousands of underage users remain on the app.
Under UK data protection law, social media companies must obtain parental consent before processing the data of children under 13. Most platforms set a minimum age of 13, but they have had mixed success in keeping younger children off their services.
Snapchat declined to say what specific steps, if any, it has taken to reduce the number of underage users.
A spokesperson from Snapchat commented, “We share the objectives of the Information Commissioner’s Office (ICO) in ensuring that digital platforms are appropriate for different age groups and adhere to the responsibilities outlined in the Children’s Code.”
Before opening a formal investigation, the ICO typically gathers information about a reported violation. It may issue an information notice, a formal request for internal data to assist the inquiry, before deciding whether to penalise the person or organisation under scrutiny.
Last year, Ofcom discovered that 60% of children aged 8 to 11 had at least one social media account, often by using a fake birthdate. Ofcom also found that Snapchat was the most popular app among underage social media users.
The ICO has received several complaints from the public about Snap’s handling of children’s data, according to a person familiar with the situation. Some of the complaints concerned Snapchat’s insufficient efforts to keep young children off its platform.
To assess any potential breaches by Snap, the ICO has engaged with users and other regulatory bodies, the sources said.
An ICO spokesperson said the regulator continuously monitors and assesses the measures Snap and other social media platforms take to keep underage children off their services.
A decision on whether to formally investigate Snapchat will be made in the coming months, the sources said.
If the ICO finds that Snap has breached its rules, the company could be fined up to 4% of its annual global turnover.
Social media companies such as Snapchat face pressure worldwide to better police content on their platforms.
Data obtained by the NSPCC (National Society for the Prevention of Cruelty to Children) indicated that Snapchat accounted for 43% of cases in which indecent images of children were distributed via social media.
Earlier this year, the ICO fined TikTok £12.7 million ($16.2 million) for mishandling children’s data. The ICO stated that TikTok did not take adequate measures to remove underage users.
TikTok had previously stated that it had invested significantly to prevent users under 13 from using the platform, and its safety team of 40,000 individuals worked tirelessly to maintain a secure environment.
Although Snapchat blocks users from signing up with a birthdate that puts them under 13, other apps take more proactive steps to keep underage children off their services.