The Digital Services Act: The Internet’s Safety Enforcer in Europe

On August 25, 2023, the European Union’s Digital Services Act (DSA) went into effect, holding technology companies like Google and Facebook accountable for the content posted to their platforms. Although the law applies only within the EU, its effects reach much further: many companies will have to adjust their content policies worldwide to comply. The DSA’s primary goal is to “foster safer online environment,” making companies legally accountable for their platforms’ safety: they must prevent and remove “illegal goods, services, or content” while giving users the ability to report such posts. The act also aims to “establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.” Under the DSA, the European Commission now has the power to investigate how these tech companies safeguard their platforms and to fine them “up to 6% of their global annual revenue if they don’t comply.” As technology companies navigate this new regulatory landscape, the DSA stands as a pivotal development in shaping the future of the digital realm.

The unprecedented Internet law applies to any online platform with a presence in the EU, holding it legally liable for content on its platform such as misinformation, Russian propaganda, and criminal activity. The EU defines “digital services” as a “large category of online services, from simple websites to internet infrastructure services and online platforms,” with the law focused on “online intermediaries and platforms” like e-commerce sites, social media platforms, and app stores. The DSA is tiered: the most stringent rules apply to “very large online platforms” like Facebook and Amazon as well as two “very large online search engines,” Google and Bing. This push for transparency is one of many EU efforts to build a compliant and safe Internet; those who do not comply face large fines (hundreds of millions of euros) and a possible “EU-wide ban.”

The DSA revolves around five central issues: illegal products, illegal content, protection of children, racial and gender diversity, and a ban on “dark patterns.” The sale of illegal products and services (e.g., drugs, solicitation) is forbidden under the DSA, affecting e-commerce platforms such as Amazon and Facebook Marketplace. Illegal content covers material such as Russian propaganda, election interference, hate crimes, and online crimes like harassment and child abuse; the aim of eliminating it is to protect “fundamental rights recognized by law across Europe,” such as freedom of speech and data protection. Because parents cannot police everything their children encounter on the Internet, the EU also prohibits targeting children “with advertising based on their personal data or cookies.” Social media giants must now reconfigure their policies and systems to ensure a “high level of privacy, security, and safety of minors” and demonstrate compliance with this new rule to the European Commission. In addition, platforms are required to “redesign their content ‘recommender systems’ to reduce risks to children” and undergo “a risk assessment of negative effects on children’s mental health and present it to the commission in August.” On racial and gender diversity, social media platforms may no longer use “sensitive personal data including race, gender and religion to target users with adverts.” For e-commerce, the EU has also introduced a safeguard protecting shoppers from “manipulative practices [that] exploit[ed] consumers’ vulnerabilities.” The EU justice commissioner underscored the prevalence of “dark patterns,” noting that 42 websites “used fake countdown timers with fake deadlines for purchasing, with 70 sites ‘hiding’ important information such as delivery costs or the availability of a cheaper option.”

Beyond those five central issues, consumers can now report non-compliance and see changes implemented. The DSA also bans tech companies from “ranking their own services more favorably than others” and requires that they “stop making it difficult to uninstall pre-loaded software and apps.” According to a European Commissioner, Big Tech companies face higher “societal stakes,” as the DSA’s rules include “additional measures to assess and mitigate societal risks that are linked to their advertising systems, for example in catalyzing disinformation.” The European Commission also introduced a voluntary code of practice, endorsed by 44 Big Tech companies including Google and Facebook. Not all of Big Tech welcomed the DSA, however: Elon Musk pulled X (formerly known as Twitter) out of the EU’s voluntary code of practice on disinformation in early 2023, reversing course only after EU officials threatened an EU-wide ban. As part of that voluntary exercise, EU representatives visited X’s headquarters to “carry out a mock exercise with Twitter staff to test its controls on Russian propaganda, fake news, and criminal activity.” With EU elections taking place in 2024, Thierry Breton, the European Commissioner for Internal Market responsible for the DSA’s enforcement, stressed that the EU will be “extremely vigilant” about all tech companies’ compliance with the DSA.

The EU’s focus on transparency and user safety also extends to the Digital Markets Act (DMA), under which “gatekeepers” (tech giants Alphabet, Amazon, Apple, ByteDance, Meta, and Microsoft) are barred from abusing their market power and must allow new companies to enter the market. The DMA also forbids gatekeepers from tracking user activity across the Internet for targeted advertising without users’ explicit consent. The EU’s General Data Protection Regulation (GDPR) is another example of the bloc’s drive to ensure online privacy. Contrasting the EU with its American counterparts, Breton states that with the DSA in place, “Europe is now effectively the first jurisdiction in the world where online platforms no longer benefit from a ‘free pass’ and set their own rules.” He added: “Technology has been ‘stress testing’ our society. It was time to turn the tables and ensure that no online platform behaves as if it was ‘too big to care.’”

With the DSA now in effect, companies are reshaping users’ online experience: removing content with notice, limiting certain posts’ visibility, or demonetizing some content with a proper explanation. Instagram and Facebook were among the first to announce their compliance, stating that European users can “tailor their feeds to see posts shared by accounts they follow, or in chronological order.” TikTok, in turn, said users can choose to be shown videos based on their location or popularity rather than depending on the company’s algorithm. Snapchat said it was complying with the DSA by “making it impossible for advertisers to use teenagers’ data to show them personalized ads.” These measures stem from the EU’s recognition of the “major long-term risks that the[se] platforms pose for society,” such as mental health harms, disinformation, and propaganda. Renate Nikolay, the deputy director of the directorate supervising the DSA, told a discussion of activists, consumer representatives, lawyers, tech executives, and academics: “My expectation is that throughout the DSA enforcement saga, we will see a change in the business structures of platforms.”

However, the safe environment the DSA promises is only as good as its actual enforcement. The DSA’s sister laws, the DMA and the GDPR, focus on market competition and privacy compliance, respectively. The DSA, meanwhile, places accountability directly on companies and carries a harsher penalty for those resisting its rules: fines of up to 6% of a company’s annual revenue, compared with the GDPR’s 4%. The European Commission will have powers similar to those of antitrust investigators to ensure that these companies comply with the DSA and its regulations, funded with €45 million for 2024 drawn from an annual levy on the Big Tech companies. Breton also has the aid of artificial intelligence (AI) and computer science specialists at the European Commission’s European Centre for Algorithmic Transparency (ECAT), which cooperates with national EU digital regulators. The intensity of the EU’s enforcement effort stems from decades of criticism over the EU’s lack of governance and tech companies’ lack of accountability. Julian Jaursch, a digital policy expert at the Berlin-based think tank Stiftung Neue Verantwortung, noted that there is “a lot of pressure on the Commission, and the Commission has acknowledged that publicly, that it needs to deliver and deliver early and fast.”

At the same time, obstacles to the DSA persist, one being a lack of resources (staff, competence, and money) to adequately address non-compliance across all tech companies. The Commission is aiming for “123 full-time staff to enforce the DSA in 2024 and estimates it will roughly need an extra 30.” For comparison, the British regulator estimates needing 350 people to monitor 30-40 tech companies for the UK’s counterpart law, the Online Safety Bill. Meanwhile, e-commerce platforms like Amazon and Zalando, a fashion retailer, have argued that they are not “very large online platforms” and thus should not be subject to the DSA’s additional obligations; Facebook and Twitter have made similar arguments to avoid explaining their operations in further detail. Suzanne Vergnolle, a technology law professor at the Cnam Institute, is critical of these objections: “We shouldn’t be fooling ourselves. Companies will come with their armies of lawyers to find all of the smallest procedural flaws to crush cases just like they do in competition or data protection cases.”

On the EU’s side, some pieces of the DSA are also not yet in place, pointing to a potentially rocky start to implementation. EU member states have until February 2024 to appoint national representatives to enforce the DSA, including vetting researchers who “will access platforms’ data.” The Board of Digital Services Coordinators, the network of national regulators, plans to approve additional “detailed standards” for online platforms in enforcing the DSA’s fight against disinformation. Martin Husovec, a law professor at the London School of Economics and Political Science, states that the EU doesn’t “have the full DSA operational because of these missing institutions.” And although the DSA says users can report non-compliance or contest the takedown of their content, the actual process is not spelled out, according to Politico. Beyond that user reporting process, other detailed rules and procedures on “how large companies need to assess and limit major societal risks” are still being discussed rather than formalized into legal writing.

Another area of concern is that the European Commission is still determining the auditing process for “companies’ assessment and mitigation reports.” The Big Four auditing firms, PwC, Deloitte, Ernst & Young, and KPMG, are cited as the only companies capable of managing this auditing process, according to Sally Broughton Micova, a professor at the University of East Anglia. In the DSA’s first days, EU officials stated that their first cases will focus primarily on “low-hanging fruit, like explicit legal obligations requiring platforms to release specific information at a deadline.” Given the complexity of issues such as disinformation and the deterioration of mental health, the EU acknowledges that cases on those fronts will take longer than others, as the “evidentiary thresholds aren’t immediately obvious.”

The DSA has enormous potential to make the Internet a safer space for users, but it falls to the European Commission to tread carefully in the process. Though implemented only in EU countries, the DSA’s effects may expand beyond geographical borders, as the EU hopes to make it a “standard for effective intervention for the protection of fundamental rights online.” The EU has succeeded before in setting standards for both the bloc and the world, as seen when the iPhone 15 series adopted a USB-C charging port globally after the EU’s mandate. The EU is well aware of this reach, hoping that the DSA will set “a benchmark for a regulatory approach to online intermediaries also at a global level.” As the DSA continues to evolve and take effect, it becomes increasingly clear that this endeavor to create a safer digital environment is a multifaceted process with dynamic outcomes. While the initial implementation occurs within the European Union, its influence extends far beyond the EU’s borders, marking a significant shift in the global digital landscape.
