The Digital Services Act (DSA) is one of the most significant pieces of legislation in the European Union’s effort to create a safer and more transparent digital space. Introduced alongside the Digital Markets Act (DMA), the DSA focuses on regulating how digital services operate, aiming to protect consumers, ensure accountability for online platforms, and foster a fair digital economy.

For businesses operating in or offering services within the EU, understanding the DSA is crucial. It brings new obligations, from content moderation to transparency, and introduces stiff penalties for non-compliance. In this post, we’ll break down the key elements of the DSA, explore its impact on different types of businesses, and discuss how companies can prepare for this new regulatory landscape.
1. What is the Digital Services Act?

The Digital Services Act (DSA) is designed to regulate the way digital platforms and services handle content, data, and user interactions within the European Union. It applies to a broad range of digital services, from small websites and online retailers to social media giants and search engines.
The DSA introduces rules aimed at:
- Protecting consumers from illegal content, counterfeit goods, and dangerous products.
- Ensuring greater transparency in how online platforms moderate content, recommend services, and handle user data.
- Improving accountability by requiring large platforms to mitigate systemic risks such as disinformation, cyber threats, and illegal content.
In short, the DSA sets out to create a more secure, trustworthy, and transparent digital environment for businesses and users alike.
2. Key Provisions of the Digital Services Act

The DSA outlines a number of important obligations for businesses, depending on their size and role in the digital ecosystem. Here are the key provisions that companies must be aware of:
- Content Moderation Responsibilities: Platforms are required to put measures in place to identify, remove, or limit the spread of illegal content, such as hate speech, counterfeit products, or unlicensed services. Additionally, businesses must provide a clear mechanism for users to flag illegal content; a minimal sketch of such a notice-and-action flow appears after this list.
- Transparency on Algorithms: Platforms must disclose how their recommendation systems work, especially those driven by algorithms. For example, online platforms that use automated systems to recommend content or products must explain the factors that influence these recommendations.
- Ad Transparency: Online platforms are required to provide transparency on online advertisements, including who placed and paid for each ad and the main parameters used to determine who sees it.
- Risk Mitigation for Large Platforms: The DSA introduces specific obligations for Very Large Online Platforms (VLOPs), designated by the European Commission as platforms with more than 45 million average monthly active users in the EU (roughly 10% of the EU population). These platforms must assess and mitigate systemic risks, such as the spread of harmful content, data breaches, and disinformation campaigns. This includes conducting annual independent audits to evaluate their risk management processes.
- Due Diligence for Online Marketplaces: E-commerce platforms must take steps to ensure that sellers on their platforms comply with EU regulations. This includes verifying the identity of sellers and ensuring the traceability of products to prevent the sale of counterfeit goods and dangerous items.
- User Empowerment: Users must have greater control over the content they see online, including the ability to modify or turn off algorithm-based content curation. Platforms should make this process easy and accessible to all users.
- Reporting Mechanisms: Platforms need to provide regular reports on the actions they have taken to remove illegal content and address the risks identified. These reports must be accessible to regulators, users, and the general public.
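To make the notice-and-action obligation more concrete, here is a minimal sketch of how a platform might capture and validate a user report of illegal content. The field names, categories, and completeness check are illustrative assumptions for this post, not a schema prescribed by the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class ReportCategory(Enum):
    """Illustrative categories a user might select when flagging content."""
    HATE_SPEECH = "hate_speech"
    COUNTERFEIT_GOODS = "counterfeit_goods"
    UNLICENSED_SERVICE = "unlicensed_service"
    OTHER = "other"


@dataclass
class ContentReport:
    """A user-submitted notice about allegedly illegal content."""
    content_url: str                  # exact location of the flagged item
    category: ReportCategory
    explanation: str                  # why the reporter considers it illegal
    reporter_email: Optional[str] = None
    good_faith_confirmed: bool = False
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def is_actionable(report: ContentReport) -> bool:
    """Basic completeness check before a report enters the moderation queue."""
    return bool(report.content_url) and bool(report.explanation.strip()) and report.good_faith_confirmed


# Example: a flagged counterfeit listing entering the review queue
report = ContentReport(
    content_url="https://example-marketplace.test/listing/123",
    category=ReportCategory.COUNTERFEIT_GOODS,
    explanation="Listing uses a protected brand logo without authorisation.",
    reporter_email="user@example.com",
    good_faith_confirmed=True,
)
print("Queue for review:", is_actionable(report))
```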
3. Impact on Different Types of Businesses

The DSA imposes different obligations based on the size and nature of the business. Here’s how it impacts various types of companies:
Small and Medium-Sized Businesses (SMBs)
- Lower Compliance Burden: Small and medium-sized businesses are generally subject to lighter regulatory burdens under the DSA. However, they must still comply with the baseline rules on illegal content, transparency, and user protection.
- E-Commerce Platforms: Online marketplaces, even smaller ones, will need to take steps to verify the identity of sellers and ensure the safety of products sold on their platforms. This could involve developing better monitoring systems or partnering with third-party verification services.
Very Large Online Platforms (VLOPs)
- Higher Compliance Requirements: Platforms with more than 45 million monthly active users in the EU face stricter regulations. They must implement risk management frameworks, conduct independent audits, and provide greater transparency around how their services operate. For platforms like Facebook, Google, and Amazon, this means an overhaul of content moderation processes and internal controls.
- Systemic Risk Management: VLOPs are required to address broader societal risks, such as disinformation, political manipulation, and public health threats. This may involve working with external experts, adopting advanced content moderation technologies, and collaborating with EU regulators.
Digital Advertisers
- Ad Transparency: Businesses that rely on targeted digital advertising must comply with the new transparency rules. Advertisers will need to disclose how consumer data is used to target users and provide clarity on ad placements, which could change how ads are bought and sold; a hypothetical per-impression transparency record is sketched below.
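As a rough illustration of what ad transparency can look like in practice, the sketch below attaches a per-impression record that answers "who placed this ad, who paid for it, and why am I seeing it?". The structure and its fields are assumptions made for this example, not a format mandated by the DSA.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AdTransparencyRecord:
    """Per-impression metadata a platform could surface in a 'Why this ad?' panel."""
    advertiser_name: str
    paid_by: str                           # entity that financed the ad, if different
    targeting_parameters: dict[str, str]   # e.g. language, region, broad interest
    shown_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def why_am_i_seeing_this(self) -> str:
        """Render a plain-language explanation for the user."""
        reasons = ", ".join(f"{k}: {v}" for k, v in self.targeting_parameters.items())
        return (f"This ad was placed by {self.advertiser_name} and paid for by {self.paid_by}. "
                f"It was shown to you based on: {reasons or 'no targeting parameters'}.")


record = AdTransparencyRecord(
    advertiser_name="Acme Shoes",
    paid_by="Acme Retail Group",
    targeting_parameters={"language": "German", "region": "Bavaria", "interest": "running"},
)
print(record.why_am_i_seeing_this())
```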
4. Compliance Challenges and Penalties

Non-compliance with the DSA can result in serious consequences for businesses. Penalties range from fines of up to 6% of a company's global annual turnover to, as a last resort for persistent infringements, temporary restrictions on access to the service within the EU market.
For businesses, the biggest challenges include:
- Managing Content Moderation at Scale: Ensuring that illegal content is swiftly identified and removed can be difficult for large platforms. This may require investing in advanced moderation technologies and expanding moderation teams.
- Balancing Transparency and User Privacy: Providing transparency on how algorithms work and how data is used in advertising can conflict with user privacy requirements. Businesses will need to strike a careful balance to avoid infringing on users’ privacy while adhering to the DSA’s transparency mandates.
- Operational Costs: For very large platforms, implementing the required risk mitigation measures, audits, and compliance checks could result in significant operational costs. Smaller platforms may also struggle with the additional burden of ensuring compliance with new rules on content moderation and seller verification.
5. Opportunities for Businesses

While the DSA introduces new compliance obligations, it also presents opportunities for businesses to build trust with users and differentiate themselves in the marketplace:
- Improved Trust and Brand Reputation: By adhering to the DSA’s transparency and content moderation standards, businesses can build greater trust with their customers. Consumers are becoming more aware of how their data is used, and companies that prioritize transparency may gain a competitive edge.
- Enhanced User Experience: Platforms that give users more control over the content they see, including the option to turn off algorithmic recommendations, could enhance user satisfaction and engagement. Businesses that prioritize user empowerment will likely see higher retention rates.
- Safer Marketplace Environments: For e-commerce platforms, the DSA’s focus on verifying sellers and ensuring product safety creates a more secure shopping environment. This can lead to stronger consumer confidence and reduced risks of liability for counterfeit or dangerous goods.
6. Preparing for the DSA: Steps for Businesses

To ensure compliance with the DSA and take advantage of its benefits, businesses should:
- Conduct a Compliance Audit: Review existing practices around content moderation, data transparency, and seller verification to identify gaps in compliance.
- Invest in Content Moderation Tools: Ensure that your platform has robust tools in place to identify and remove illegal content. For larger platforms, consider using AI-powered tools to assist with moderation at scale; a minimal triage sketch follows this list.
- Improve Algorithm Transparency: Work on providing clear explanations of how your recommendation algorithms function. This may involve updating user-facing policies and offering more user control over algorithmic content.
- Verify Sellers and Products: If you operate an online marketplace, establish clear procedures for verifying seller identities and ensuring product safety. This may require partnering with third-party verification services or developing new in-house systems.
- Stay Informed on Regulatory Changes: The DSA is part of a broader set of digital regulations in the EU. Ensure that your business stays updated on future amendments and related laws such as the Digital Markets Act (DMA).
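To illustrate the kind of AI-assisted triage mentioned in the second step above, here is a minimal sketch that routes items to automatic removal, human review, or no action based on a classifier score. The `score_item` stub and the thresholds are placeholders; a real deployment would call a trained model or a third-party moderation API and tune the thresholds against its own policies.

```python
from dataclasses import dataclass


@dataclass
class ModerationDecision:
    action: str        # "remove", "human_review", or "keep"
    score: float
    reason: str


def score_item(text: str) -> float:
    """Placeholder for an ML classifier; a real system would call a trained model here."""
    suspicious_terms = ("counterfeit", "replica", "unlicensed")
    hits = sum(term in text.lower() for term in suspicious_terms)
    return min(1.0, 0.4 * hits)


def triage(text: str, remove_threshold: float = 0.9, review_threshold: float = 0.4) -> ModerationDecision:
    """Remove clear violations, escalate borderline cases, keep everything else."""
    score = score_item(text)
    if score >= remove_threshold:
        return ModerationDecision("remove", score, "high-confidence policy violation")
    if score >= review_threshold:
        return ModerationDecision("human_review", score, "borderline; needs human judgement")
    return ModerationDecision("keep", score, "no signal of illegality")


print(triage("Genuine leather boots, handmade in Italy"))
print(triage("Top-quality replica watches, unlicensed resale welcome"))
```

Keeping a human-review band between the two thresholds reflects a common design choice: automate removal only for high-confidence violations and escalate borderline cases to moderators, which also makes the resulting transparency reports easier to justify.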
Conclusion
The Digital Services Act represents a significant shift in how digital platforms are regulated in the European Union. While it introduces new obligations for businesses of all sizes, it also creates an opportunity for companies to enhance transparency, improve user trust, and foster safer online environments.
Businesses that proactively embrace the DSA’s requirements and align their operations with its principles will be well-positioned to succeed in the evolving digital landscape of the EU.