
2025 Data law trends

4. New global regulations are changing our digital operations

By Rachael Annear, Richard Bird, Gernot Fritz, Rixa Kuhmann, Janet Kim, Laura Knoke, Tristan Lockwood, Christina Moellnitz, Sean Quinn, Lutz Riede

IN BRIEF

Over the past year, a global push to regulate the safety, accountability, and transparency of online services has begun to crystallize. In late 2023, the EU Digital Services Act came into force alongside the passage of the UK Online Safety Act, signaling a significant shift in how digital intermediaries are regulated.

While the US has yet to pass federal legislation, both state and federal regulators, invoking concerns about privacy and consumer rights, and state lawmakers, focusing on children’s safety, have worked to address the gap.

Beyond the EU, UK, and US, laws like the Australian Online Safety Act are contributing to an expanding landscape of digital regulation. The full impact – both intended and unintended – of these developments will unfold over the coming years.


Digital intermediaries have long been subject to general laws and an assortment of targeted obligations. However, the EU Digital Services Act and the UK Online Safety Act reflect first attempts at the comprehensive regulation of online harm, as well as various other perceived risks and challenges arising from digital intermediaries related to transparency and accountability. They come at a time when lawmakers and regulators are also keenly focused on competition and consumer issues in digital ecosystems, with reforms such as the EU Digital Markets Act and UK Digital Markets, Competition and Consumers Act imposing parallel obligations on so-called digital ‘gatekeepers.’

Adopting the lexicon of Australia’s 2021 Online Safety Act – an early, industry-led framework passed by Australia’s federal lawmakers – many jurisdictions are increasingly framing the issue of digital risk as a question of online safety, especially that of children.

In the US, the Kids Online Safety Act – a sweeping bill passed by the Senate that would impose a duty of care on covered platforms, along with various safeguarding, disclosure and transparency requirements – reflects mounting bipartisan efforts at a federal level to regulate in this space. Despite uncertainty as to whether it has the necessary traction to pass the House, the bill signals the intent with which many lawmakers are confronting the issue.


The debate over online safety is just beginning; emerging technologies and processes that are being developed now may well fundamentally change our expectations of the way we participate in life online.

Rachael Annear, Partner

 

|  | UK Online Safety Act | EU Digital Services Act | Australia Online Safety Act |
| --- | --- | --- | --- |
| Extra-territorial scope | Yes | Yes | Yes |
| In force | Yes – requirements coming into force on a rolling basis until 2026 | Yes – all provisions in force | Yes – requirements coming into force on a rolling basis |
| Key topics | Child safety, illegal content, adult user empowerment, fraudulent advertising | Illegal content, societal risk, digital traders | Child safety and illegal content |
| Services subject to the most extensive obligations | Categorized services that meet both UK monthly active user and functionality thresholds | Very large online platforms and very large online search engines (45 million or more monthly active EU users) | Social media, electronic messaging, search engines, app distribution |
| Regulator | Ofcom | European Commission and Member State enforcement agencies | eSafety Commissioner |
| Fines | Up to £18 million or 10% of global annual revenue, whichever is greater | Up to 6% of worldwide annual turnover | Up to AU$782,500 (2024) |


While the US debates the merits and constitutionality of laws seeking to improve online safety, accountability and transparency, the UK, EU and various other jurisdictions have moved forward with robust reforms that may ultimately drive global standards.

Tristan Lockwood, Senior Associate

US state lawmakers have been more successful in passing various narrower online safety reforms, with an increasing number of states adopting laws requiring age verification to access online pornography, and requiring age verification and parental consent for minors to access social media. However, constitutional challenges have halted the enforcement of many such laws. In July 2024, the US Supreme Court agreed to hear a challenge to a Texas law requiring age verification to access online pornography, in a decision that may bring some certainty to the future of such requirements in 2025.


The prospect of US federal online safety legislation, a growing number of state initiatives and mounting state and federal enforcement actions make for an uncertain compliance landscape in the US.

Janet Kim, Partner

A free speech challenge has also halted the enforcement of the California Age-Appropriate Design Code Act ahead of its July 2024 effective date. The law, which is modeled on the UK’s Age-Appropriate Design Code, requires businesses to prioritize children’s privacy and protection when designing digital products or services likely to be accessed by under-18s.

Despite constitutional uncertainty surrounding age-gating and age-appropriate design requirements in the US, such laws are gaining traction elsewhere. The UK Online Safety Act and draft Codes of Practice issued by the online safety regulator, Ofcom, seek to impose potentially sweeping requirements to enforce highly effective age assurance to prevent children from accessing pornographic and other harmful content. Jurisdictions elsewhere in the world are looking to the UK’s design-focused Age-Appropriate Design Code as a model. For example, the Singaporean privacy regulator this year adopted Advisory Guidelines for Children’s Personal Data that mirror many of its requirements. Likewise, the EU Digital Services Act requires online platforms to introduce measures to ensure a high level of privacy, safety and security for minors, with the European Commission planning to issue detailed guidelines outlining specific expectations in 2025.


The EU Digital Services Act was a watershed moment. But with a broad interpretation of risk assessment and mitigation requirements, proactive enforcement, and codes of practice and guidelines in the pipeline, its full implications remain to be seen.

Lutz Riede, Partner

Looking forward, the debate around the costs and benefits of such laws, especially how they may impact the free speech and other interests of adult users, looks set to intensify.

A common thread in the legislative efforts canvassed above is increasing requirements to provide user transparency around content moderation rules and outcomes, along with the operation of recommender systems on platforms. In various jurisdictions around the world, a lack of transparency is also increasingly being used as a hook by regulators and private litigants in privacy and consumer cases targeting online platforms.

In the US, the concept of ‘dark patterns’ has been formalized in several state consumer privacy laws, including prohibitions on the use of dark patterns to obtain consent. Additionally, the Federal Trade Commission has continued to express its keen interest in dark patterns through several actions, public workshops and a staff report titled Bringing Dark Patterns to Light, which argues that dark patterns are an unfair or deceptive business practice that may be subject to enforcement action.

This emphasis on transparency is also apparent in the EU’s AI Act, which imposes transparency obligations designed to enable users to recognize when they are interacting with an AI system and to detect synthetically generated content and deepfakes, and to enable deployers to understand an AI system’s design and be informed of its use. These obligations support accountability for AI-based decisions made by companies and public authorities, while additional risk management and training-data transparency requirements apply to the most capable and impactful AI models.

Mounting transparency expectations are also apparent in more traditional contexts, such as the enforcement of privacy laws, with many privacy regulators emphasizing the importance of transparency when issuing guidance on the development and deployment of AI systems.

Looking ahead

As we move forward, we anticipate that more jurisdictions will introduce laws aimed at enhancing the safety, accountability, and transparency of digital intermediaries. As these regulations evolve, we expect regulators to:

  • Leverage new laws to tackle perceived risks and address control deficiencies.
  • Utilize transparency mechanisms to bridge the information gaps between digital service providers and consumers.
  • Focus on service providers that fail to adhere to their terms of service and public statements, particularly regarding content moderation.

With this shifting regulatory landscape, it’s essential for providers to consider any structural changes necessary to ensure that their product development, launch, and monitoring processes, along with compliance design and assurance frameworks, are robust and fit for purpose in the medium and long term.
