Record-Breaking Fine For Video Sharing App

TikTok Faces Massive Fine For Underage Users

Short-form video sharing app TikTok has been handed the largest ever fine for a US case involving children’s data privacy. The company has agreed to pay $5.7 million and implement new measures to handle users who say they are under 13.

The Federal Trade Commission (FTC) said the Musical.ly app, which was later acquired and incorporated into TikTok, knowingly hosted content published by underage users. It has ordered TikTok to delete the data.

Additionally, as of Wednesday, TikTok users in the US will be required to verify their age when they open the app. However, like many social networks, age verification is implemented on a trust basis – a person signing up simply has to lie about their date of birth in order to get around the check.

“We care deeply about the safety and privacy of our users,” the firm said. “This is an ongoing commitment, and we are continuing to expand and evolve our protective measures in support of this.” Despite this, TikTok said it would not be asking existing users in other countries, including the UK, to verify their age, as the settlement only applies to the US.

After being one of the most downloaded apps of 2018, TikTok has an estimated base of 1 billion users worldwide. But the FTC was concerned about how old some of those users were. Its report said the Musical.ly app had 65 million users in the US, a “large percentage” of whom were underage.

TikTok’s parent company, China-based ByteDance, acquired Musical.ly in 2017, and incorporated it into TikTok, discontinuing the original Musical.ly app. The apps allowed members to create short videos, set to music, to share with other users.

“For the first three years [of its existence], Musical.ly didn’t ask for the user’s age,” the FTC’s statement read. “Since July 2017, the company has asked about age and prevents people who say they’re under 13 from creating accounts. But Musical.ly didn’t go back and request age information for people who already had accounts.”

The FTC noted media reports suggesting adults on Musical.ly had contacted children who were obviously under 13 because “a look at users’ profiles reveals that many of them gave their date of birth or grade in school”.

According to the regulator’s complaint, Musical.ly was contacted by more than 300 concerned parents in just a two-week period in September 2016. While the profiles of the children involved were subsequently deactivated, the content those children had posted was not deleted.

The FTC said TikTok would be fined because of what it saw as Musical.ly’s failure to adhere to the basic principles of the Children’s Online Privacy Protection Act, known as COPPA. Obligations include being upfront about how children’s data is collected and used, as well as providing a mechanism to inform parents that their child is using the service and to obtain their consent.

The company was also said not to have responded adequately to parents’ requests to delete data, and to have held on to that data for longer than was reasonable. TikTok would not share estimates of how many underage users had been, or still were, on the platform.
