EU probes TikTok over alleged failure to protect minors


The European Commission has initiated formal proceedings to determine whether TikTok violated the Digital Services Act (DSA) in areas including the protection of minors.

The ByteDance-owned company is also being probed over advertising transparency, data access for researchers, and the risk management of addictive design and harmful content.

The Commission stated that the investigation is based on preliminary findings, including an analysis of TikTok’s risk assessment report from September 2023 and TikTok’s responses to the Commission’s formal Requests for Information on illegal content, the protection of minors, and data access.


Under the EU’s DSA, which went into effect last month, penalties for substantiated breaches can reach up to 6% of a company’s global annual turnover.

Providing details of the probe, the European Commission, in a statement released on Monday, said the proceedings will focus on:

  • “The compliance with the DSA obligations related to the assessment and mitigation of systemic risks, in terms of actual or foreseeable negative effects stemming from the design of TikTok’s system, including algorithmic systems, that may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’.
  • “Such assessment is required to counter potential risks for the exercise of the fundamental right to the person’s physical and mental well-being, the respect of the rights of the child as well as its impact on radicalization processes. Furthermore, the mitigation measures in place in this respect, notably age verification tools used by TikTok to prevent access by minors to inappropriate content, may not be reasonable, proportionate and effective;
  • “The compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems;
  • “The compliance with DSA obligations to provide a searchable and reliable repository for advertisements presented on TikTok;
  • “The measures taken by TikTok to increase the transparency of its platform. The investigation concerns suspected shortcomings in giving researchers access to TikTok’s publicly accessible data as mandated by Article 40 of the DSA.”
