Online platforms fail to assess risks in annual reports, study says

The use of online services by minors is often linked to addictive design which in turn influences their mental health.
Copyright Claire Savage/Copyright 2023 The AP. All rights reserved.
By Cynthia Kroet

Reports required under the Digital Services Act must be sent to the European Commission every year.


Reports that major online platforms must draw up under EU rules to assess risks on their services, including features that could influence users’ mental health, are falling short, according to a study by the DSA Civil Society Coordination Group (CSCG) published on Monday.

The group, which includes the Center for Democracy and Technology and privacy advocates Mozilla and Access Now among others, said that the reports do not “adequately assess and address the actual harms and foreseeable negative effects of platform functioning.”

The first batch of annual reports by websites designated as very large online platforms (VLOPs) by the European Commission in 2023 under the Digital Services Act (DSA), including Facebook, TikTok and Google, came out last November.

The CSCG report says that the exercises should focus “more thoroughly on risks stemming from platform design, in particular recommender systems, which amplify harmful content that contribute to risks such as mental health issues and political polarisation.”

“Design choices, particularly those driven by engagement metrics, can significantly contribute to systemic risks […]. Despite this, many reports focused primarily on content moderation rather than addressing how platform design itself might be a root cause of harm,” the report said. 

Besides these exercises, platforms are also required to conduct ad hoc risk assessments before launching new features or products in the EU.  

In TikTok’s case, the company decided last year to withdraw its TikTok Lite rewards program from the EU market after the Commission raised concerns about its impact on users’ mental health.

The Commission has opened several investigations under the DSA and sent requests for information to platforms about their recommender systems, including to X and Temu.
