How TikTok Violated EU’s Privacy Laws

TikTok App. Photo by Solen Feyissa on Unsplash

Popular social media platform TikTok has been fined €345 million by European regulators for a series of lapses in its settings that could put children at risk.

by David Bass

September 25, 2023

Used by 134 million people monthly in Europe alone, TikTok is one of the world’s most popular social media platforms. 

The €345 million fine was issued by Ireland’s Data Protection Commission, the “independent authority responsible for upholding the fundamental right of individuals in the EU to have their personal data protected.” It is the biggest privacy fine ever levied on TikTok, and the fifth-largest ever imposed on any tech company under the General Data Protection Regulation (GDPR), the European Union’s data privacy and security law.

The penalty comes amid high tensions between the European Union and China, following the EU’s announcement that it plans to probe Chinese subsidies of electric cars.

The issues

The €345 million fine concerns the way that the application processed children’s data, including how age verification processes operated and how young people’s data was shared.

One issue concerns the period from July to December 2020, when TikTok unlawfully made the accounts of users aged 13 to 17 public by default, making it possible for anyone to watch and comment on their posts.

It is also claimed that the wide-reaching consequences of making content and accounts public were not fully explained to young people using the platform, and that the platform did not tackle the risk of under-13 users accessing the app.

Another concerning issue was that TikTok’s “family pairing” scheme gave an adult control over a child’s account settings without checking whether the adult paired was a parent or guardian.

Further issues with the “family pairing” feature included less stringent settings, which allowed adults to turn on direct messaging for young users without their consent.

Concerns over children and social media

The European Union has for some time been worried about the possible dangers of unchecked social media use by children. In previous years, proposals were made, though never enacted, to ban Facebook, Twitter, Instagram and other services from processing data belonging to under-16s without parental consent.

The regulatory bodies have also highlighted the disturbing trend of so-called “dark patterns.” These are user interface designs that are alleged to subtly manipulate users into disclosing excessive information.

The regulator found that TikTok used pop-ups to nudge teenagers into making their accounts and videos public, and has ordered the misleading designs to be altered within the next three months.

The ruling shows that European regulatory bodies are now taking direct action on the issue — and the size of the fine would also indicate that they are making a statement on safeguarding children from the harmful side of the platforms that are almost an obsession with many young people.

TikTok’s defence

TikTok has stated that it believes the Irish Data Protection Commission’s decision is wrong, specifically the size of the fine, and that many of the criticisms are no longer relevant.

“We respectfully disagree with the decision, particularly the level of the fine imposed,” said Morgan Evans, a TikTok spokesperson. “The [Data Protection Commission]’s criticisms are focused on features and settings that were in place three years ago, and that we made changes well before the investigation even began, such as setting all under-16 accounts to private by default.”

The company has also given assurances that it is now doing more to address the issue of underage users, saying it will change any misleading designs and add measures such as altering the pop-ups youngsters see when they first post.

TikTok has faced similar, if less heavy, fines in the past. It was previously fined £12.7m in Britain by the Information Commissioner’s Office (ICO) for illegally processing the data of 1.4 million children under 13 who were using its platform without parental consent, and the app is banned from UK government phones due to security concerns.

This article was originally published on IMPAKTER.