The EU has agreed on another ambitious piece of legislation to police the online world.
Early Saturday morning, after hours of negotiations, the bloc agreed to the broad terms of the Digital Services Act, or DSA, which will force tech companies to take more responsibility for the content that appears on their platforms. The new obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stronger measures against the spread of misinformation. Companies face fines of up to six percent of their annual turnover for non-compliance.
“The DSA will update the basic rules for all online services in the EU,” European Commission President Ursula von der Leyen said in a statement. “It gives practical effect to the principle that what is illegal offline should be illegal online. The larger the size, the greater the responsibilities of online platforms.”
Margrethe Vestager, the European Competition Commissioner who has spearheaded much of the bloc’s tech regulation, said the law would “ensure platforms are held accountable for the risks their services may pose to society and citizens.”
The DSA should not be confused with the DMA, or Digital Markets Act, which was agreed in March. Both acts affect the tech world, but the DMA focuses on creating a level playing field between companies, while the DSA deals with how companies police content on their platforms. As a result, the DSA is likely to have a more immediate impact on internet users.
Although the legislation only applies to EU citizens, its effects will be felt in other parts of the world, too. Global tech companies may decide it is more cost-effective to implement a single content-moderation strategy and treat the EU's comparatively strict regulations as the benchmark. Lawmakers in the US who are interested in reining in Big Tech with their own regulations have already begun looking to the EU's rules for inspiration.
- Targeted advertising based on a person’s religion, sexual orientation, or ethnic origin is prohibited. Minors cannot be targeted with advertising at all.
- “Dark patterns” – confusing or misleading user interfaces designed to guide users to certain decisions – will be banned. The EU says that, as a general rule, canceling subscriptions should be as easy as subscribing.
- Large online platforms like Facebook will need to make the workings of their recommendation algorithms (e.g., those used to rank content in the News Feed or suggest TV shows on Netflix) transparent to users. Users must also be offered a recommendation system that is not based on profiling. In the case of Instagram, for example, this would mean a chronological feed (as recently introduced).
- Hosting services and online platforms will need to clearly explain why they have removed illegal content, and must give users the ability to appeal such removals. The DSA itself, however, does not define what content is illegal; it leaves that definition to individual member states.
- The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks are evolving.”
- Online marketplaces must maintain basic information about merchants on their platform to track down people selling illegal goods or services.
- The big platforms will also have to introduce new strategies to deal with disinformation during crises (a provision inspired by the recent invasion of Ukraine).
The DSA, like the DMA, will distinguish between technology companies of different sizes, placing greater obligations on larger ones. The largest companies, those with at least 45 million users in the EU, such as Meta and Google, will face the most scrutiny. These companies have lobbied hard to dilute the DSA's requirements, particularly those concerning targeted advertising and the handover of data to outside researchers.
Although the general terms of the DSA have now been agreed upon by EU member states, the legal language still has to be finalized and the act officially voted into law. That last step, however, is considered a formality at this point. The rules will apply to all affected businesses 15 months after the act is voted into law, or from January 1st, 2024, whichever is later.