By Martin Coulter
LONDON (Reuters) -Britain’s data regulator is gathering information on Snapchat to establish whether the U.S. instant messaging app is doing enough to remove underage users from its platform, two people familiar with the matter said.
Reuters reported exclusively in March that Snapchat owner Snap Inc (NYSE: SNAP) had removed only a few dozen children aged under 13 from its platform in Britain last year, while UK media regulator Ofcom estimates it has thousands of underage users.
Under UK data protection law, social media companies need parental consent before processing data of children under 13. Social media firms generally require users to be 13 or over, but have had mixed success in keeping children off their platforms.
Snapchat declined to give details of any measures it might have taken to reduce the number of underage users.
“We share the goals of the ICO (Information Commissioner’s Office) to ensure digital platforms are age appropriate and support the duties set out in the Children’s Code,” a Snap spokesperson said.
“We continue to have constructive conversations with them on the work we’re doing to achieve this,” they added.
Before launching any official investigation, the ICO generally gathers information related to an alleged breach. It may issue an information notice, a formal request for internal data that may aid the investigation, before deciding whether to fine the individual or organisation being investigated.
Last year, Ofcom found 60% of children aged between eight and 11 had at least one social media account, often created by supplying a false date of birth. It also found Snapchat was the most popular app for underage social media users.
The ICO received a number of complaints from the public concerning Snap’s handling of children’s data after the Reuters report, a source familiar with the matter said.
Some of the complaints related to Snapchat not doing enough to keep young children off its platform, the source said.
The ICO has spoken to users and other regulators to assess whether there has been any breach by Snap, the sources said.
An ICO spokesperson told Reuters it continued to monitor and assess the approaches Snap and other social media platforms were taking to prevent underage children accessing their platforms.
A decision on whether to launch a formal investigation into Snapchat will be made in the coming months, the sources said.
PLATFORM PRESSURE
If the ICO found Snap to be in breach of its rules, the firm could face a fine equivalent to up to 4% of its annual global turnover, which according to a Reuters calculation would equate to $184 million based on its most recent financial results.
Snapchat and other social media firms are under pressure globally to better police content on their platforms.
The NSPCC (National Society for the Prevention of Cruelty to Children) has said that figures it obtained showed Snapchat accounted for 43% of cases in which social media was used to distribute indecent images of children.
Richard Collard, associate head of child safety online for the NSPCC, said in response to the Reuters report on Tuesday that the charity was hugely concerned about the use of Snapchat by children under 13.
“Snapchat users as young as 11 and 12 are talking to Childline about how they are sending nude images and communicating with adults on the platform,” he said.
“It is vital we see stronger action to ensure young children are not using the platform and older children are being kept safe from harm.”
Earlier this year, the ICO fined TikTok 12.7 million pounds ($16.2 million) for misusing children’s data, saying the Snap competitor did not “take sufficient action” to remove underage users from its platform.
A TikTok spokesperson said at the time that it “invested heavily” to keep under-13s off the platform and that its 40,000-strong safety team worked “around the clock” to keep it safe.
Snapchat does block users from signing up with a date of birth that puts them under the age of 13. However, other apps take more proactive measures to prevent underage children accessing their platforms.
For example, if a user under 13 is blocked after entering their real date of birth on TikTok, the app continues to block them from creating an account on subsequent attempts.
($1 = 0.7833 pounds)