In a report adopted with a broad majority in the European Parliament’s Internal Market and Consumer Protection Committee, lawmakers argue that digital platforms should be less addictive, focusing on child protection and the harms of social media.
The issue of addictive design, the practice of capturing users’ attention so that they spend as much time on a platform as possible, has been on the table since last year, when the Internal Market and Consumer Protection Committee (IMCO) prepared its resolution on “Addictive design of online services”.
One justification for the legislation is that while addictions to drugs, alcohol, tobacco, and gambling are regulated, there is no equivalent regulation for addiction to digital platforms or social media.
“The IMCO Committee is united: no self-discipline can beat the addictive design tricks we all face online today,” rapporteur Kim Van Sparrentak told Euractiv.
“This can have a huge impact on mental health and even brain development. If we do not act now, this will impact generations to come. The EU needs to lead the way and act against the addictive design of online services,” she explained.
Reviewing legislation focusing on minors
In the text, dated 18 October, the Parliament called on the Commission to assess which existing legislation or policy initiatives should be revised to counter addictive design, for instance by reviewing the Unfair Commercial Practices Directive, the Consumer Rights Directive, and the Unfair Contract Terms Directive.
In doing so, vulnerable groups such as children should be taken into account. The text also suggests reviewing the definitions of “consumer”, “vulnerable consumer”, and “trader”.
The Commission should also fund further research on addictive design, particularly its effects on children and adolescents, to understand the “underlying issues” and potential solutions.
The document also calls on the Commission to prohibit addictive practices and techniques not yet banned under other legislation.
Policy initiatives should be implemented on “safety by design digital services and products for children which can foster compliance with children’s rights”, the report says.
Regarding social media platforms, users should be able to access third-party apps, the report argues.
Consumers should also be able to turn off “attention-seeking features” (“right not to be disturbed”) but also to turn them on if they would like to, “possibly with an attached mandatory warning”.
According to the text, this would offer consumers “real choice and autonomy” instead of an “information overload”.
Moreover, platforms should not have features that monopolise users’ attention or subconsciously influence them.
For minors especially, automatic locks “after a preset time of use”, restrictions on use between certain hours, or a weekly screen time summary should be provided. Screen time summaries should be accompanied by in-app awareness-raising campaigns focusing on problematic online behaviours and potential risks.
The text says that self-control strategies should be included in educational guidelines, prevention plans, and awareness-raising campaigns.
Such designs are created to maximise “activity, engagement, content production, network development and data sharing”, the text says, in the service of data monetisation, that is, the use of data for a measurable economic benefit to the company.
However, other services might work with subscription-based models, which can also contain addictive designs.
The text notes that children “rarely disconnect from social media” and feel insecure without their phones, and that pressure from social media can lead to mental health issues to which the younger generation is especially vulnerable. Gaming addiction, it adds, is “recognised as a mental health disorder by the World Health Organisation”.
More research is also needed on this topic, considering that while some features may not affect adults, they can harm children. Continuous research is especially needed considering the speed at which social media platforms are developing.
The lawmakers said that social media features such as endless scrolling, autoplay, and goal-setting mechanics such as “streaks” should be removed.
Digital asymmetry, dark patterns, and recommender systems
The text references the EU’s Unfair Commercial Practices Directive, which aims to regulate unfair commercial practices while selling products or services between businesses and consumers.
The amendments suggest that the directive should address the digital asymmetry facing consumers and enforcers, who are often “in the dark on what happens behind the interfaces of online services due to a lack of knowledge and insight”.
National authorities, or the Commission, should also ensure that services are free of dark patterns, practices that trick users into doing something they did not intend, such as buying overpriced insurance.
The new text also requires that services avoid misleading or addictive design “by design”, and suggests that online service providers share their experimentation dashboards to provide more transparency.
Instead of banning interaction-based recommender systems, which suggest content or products to users based on their previous interactions, the amendments now call for the Commission to assess their addictive and mental health effects.
Once the Parliament’s report is adopted in plenary, it will feed into the EU executive’s ongoing fitness check of the current consumer law. While there is no exact date for the plenary vote yet, it will most likely occur in December or January.
[Edited by Luca Bertuzzi/Nathalie Weatherald]