Regulatory & Legal Updates

Marina El Hachem, Associate (marina.elhachem@bsalaw.com)
  • Published: January 13, 2026
  • Title: UAE Federal Decree Law No. 26 of 2025 on Child Digital Safety: Liability for Digital Platforms and Internet Service Providers
  • Practice: Technology, Media & Telecommunications (TMT)

Federal Decree Law No. 26 of 2025 on Child Digital Safety (the “CDS Law”) establishes a comprehensive federal framework governing the protection of children in the digital environment in the United Arab Emirates (UAE). Issued on 1 October 2025 and entering into force on 1 January 2026, the CDS Law aims to create coordinated governance mechanisms at both federal and local levels.

Scope of Application:
A defining feature of the CDS Law is its broad jurisdictional scope. It applies not only to digital platforms and internet service providers (“ISPs”) operating within the UAE, but also to foreign platforms and service providers that target users (companies or individuals) in the country. The CDS Law applies this dual nexus of “operating within the State (the UAE) or targeting users in the State” consistently across its provisions, including those relating to privacy, age verification, platform controls, and disclosure obligations. As a result, foreign entities engaging UAE users, particularly where children may access or be exposed to their services or content, fall within the regulatory perimeter.

Governance and Regulatory Oversight:
The CDS Law introduces a multi-layered governance structure by establishing the Child Digital Safety Council, chaired by the Minister of Family. The Council is mandated to set national strategy, propose legislation, coordinate implementation across authorities, and advise on standards, tools, and indicators related to child digital safety. In parallel, the Cabinet is tasked with issuing a platform classification system that will tier digital platforms (whether domestic or foreign) according to risk, usage, and impact. This classification regime will determine the intensity of obligations, verification procedures, and disclosure requirements applicable to each category.

Regulatory oversight is further supported by “concerned entities,” which are responsible for monitoring platform content, notifying security agencies of criminal content involving children, and coordinating with the Public Prosecution and the Ministry of Interior on investigations and judicial processes. Authorities are empowered to impose administrative measures for non-compliance, including partial or full blocking and closure, as well as to conduct periodic verification of compliance and reporting obligations. An Administrative Penalties Regulation, to be issued by the Cabinet, will specify enforcement mechanisms, responsible authorities, and appeal procedures.

Privacy and Data Protection Obligations:
Digital platforms are prohibited from collecting, processing, publishing, or sharing personal data of children under the age of 13 without explicit, documented, and verifiable parental consent. Platforms must also provide mechanisms for easy withdrawal of consent and clearly disclose their privacy practices. Access to children’s personal data must be restricted to authorised personnel on a data-minimisation basis, and the use of such data for commercial purposes or targeted advertising to children is expressly prohibited.
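The consent gate described above can be made concrete with a minimal Python sketch. All names here are hypothetical: the CDS Law prescribes the obligation, not an implementation, and the procedures for obtaining and verifying parental consent are still to be defined by the Cabinet.

```python
from dataclasses import dataclass
from typing import Optional

UNDER_AGE_THRESHOLD = 13  # CDS Law threshold triggering the parental-consent requirement

@dataclass
class ParentalConsent:
    guardian_id: str
    verified: bool          # consent must be explicit, documented and verifiable
    withdrawn: bool = False # platforms must support easy withdrawal of consent

@dataclass
class User:
    age: int
    consent: Optional[ParentalConsent] = None

def may_process_child_data(user: User) -> bool:
    """Permit processing of personal data only where the CDS Law's consent gate is satisfied."""
    if user.age >= UNDER_AGE_THRESHOLD:
        return True  # the under-13 gate does not apply
    c = user.consent
    return c is not None and c.verified and not c.withdrawn

def may_target_advertising(is_child: bool) -> bool:
    """Targeted advertising to children is prohibited outright, with or without consent."""
    return not is_child
```

Note that withdrawal of consent immediately closes the gate again, reflecting the requirement that withdrawal be as easy as the grant.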

The Cabinet is expected to further define the categories of data that may be collected from children under 13 and the procedures for obtaining and verifying parental consent. The CDS Law also contemplates potential exemptions for educational and health platforms, subject to Cabinet decision and the implementation of appropriate safeguards.

Age Verification and Platform Controls:
Platforms must implement age verification mechanisms that are proportionate to the risks associated with their services and aligned with the forthcoming classification system. Controls are to be calibrated based on the potential impact of content on children. Beyond age assurance, platforms are required to deploy enhanced child protection measures, including default high-privacy settings for child users, tools to enforce age limits, content blocking and filtering, age classification systems, and restrictions on targeted advertising.
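The logic of risk-proportionate age assurance and privacy-by-default might be sketched as follows. The risk tiers and verification methods below are invented placeholders, since the actual classification system awaits the Cabinet's decision.

```python
from enum import Enum

class RiskTier(Enum):
    """Hypothetical platform risk tiers; the real taxonomy will come from the Cabinet."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Hypothetical mapping: stricter age assurance as platform risk increases.
VERIFICATION_METHOD = {
    RiskTier.LOW: "self-declared age",
    RiskTier.MEDIUM: "age estimation plus self-declaration",
    RiskTier.HIGH: "verified ID or equivalent hard check",
}

def default_settings(is_child: bool) -> dict:
    """Child accounts start from the most protective defaults, per the CDS Law."""
    if is_child:
        return {"profile_public": False, "targeted_ads": False, "dm_from_strangers": False}
    return {"profile_public": True, "targeted_ads": True, "dm_from_strangers": True}
```

The key design point is that child accounts never need to opt in to protection: the protective configuration is the starting state, and any relaxation would require a deliberate (and, under the CDS Law, often prohibited) change.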

The CDS Law also mandates features designed to mitigate excessive engagement by children, such as limits on usage and mandatory breaks, alongside robust parental control tools. Platforms must provide clear and accessible reporting mechanisms for child pornography and harmful content, deploy technical capabilities (including artificial intelligence and machine learning) for proactive detection, and immediately report relevant matters to concerned entities. Compliance with takedown orders, reporting directives, and periodic transparency requirements is expressly required.
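The usage-limit and mandatory-break features can be illustrated with a simple decision function. The specific thresholds are invented for the sketch; the CDS Law leaves such parameters to implementing regulation.

```python
from datetime import timedelta

# Hypothetical thresholds; the CDS Law mandates the feature, not specific values.
DAILY_LIMIT = timedelta(hours=2)
BREAK_INTERVAL = timedelta(minutes=45)

def session_action(total_today: timedelta, since_last_break: timedelta) -> str:
    """Decide whether a child session may continue, must pause, or must end."""
    if total_today >= DAILY_LIMIT:
        return "end_session"    # daily usage limit reached
    if since_last_break >= BREAK_INTERVAL:
        return "require_break"  # mandatory break before continuing
    return "continue"
```

In practice such a check would run alongside parental control tools, which could tighten (but not loosen) the platform's own limits.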

Obligations of Internet Service Providers:
ISPs are subject to a distinct but complementary set of obligations. Policies and regulatory instructions issued by the Telecommunications and Digital Government Regulatory Authority (“TDRA”) will require ISPs to implement network-level content filtering, safe-use measures when the user is a child, and caregiver agreement terms linked to parental control tools. ISPs must provide parental control software and comply with immediate reporting obligations. The TDRA is empowered to review ISP policies and oversee ongoing compliance, reinforcing accountability at the infrastructure layer.

Prohibited Content and Activities:
The CDS Law prohibits children’s access to betting and online commercial games. Both platforms and ISPs are required to implement the technical and administrative measures necessary to prevent such access, including age verification, parental controls, and content blocking mechanisms.
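At its core, the access prohibition reduces to a category check keyed to the user's verified child status. The category labels below are hypothetical; a real deployment would hang this check behind the age-verification and parental-control layers described above.

```python
# Content categories the CDS Law bars children from accessing (labels are illustrative).
BLOCKED_FOR_CHILDREN = {"betting", "online_commercial_games"}

def is_accessible(category: str, is_child: bool) -> bool:
    """Block restricted categories for verified child users; allow them otherwise."""
    return not (is_child and category in BLOCKED_FOR_CHILDREN)
```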

Compliance Timelines and Legal Risk:
The CDS Law takes effect on 1 January 2026. Entities subject to its provisions at the time of issuance are required to align their operations within one year from the effective date, subject to any extension granted by the Cabinet. For both domestic and foreign platforms, the combination of extraterritorial reach, immediate reporting obligations, mandatory cooperation with authorities, and the potential for blocking or closure creates significant compliance exposure, even in the absence of a physical presence in the UAE.

Practical Implications for Digital Platforms and ISPs:
The CDS Law establishes a demanding framework. Privacy-by-default for children, robust age verification aligned with platform risk, and verifiable parental consent for under-13 data processing are now baseline expectations, coupled with strict prohibitions on the commercialisation of children’s data and targeted advertising. ISPs face enhanced supervisory oversight and network-level safety obligations.

While the forthcoming Cabinet classification decision and Administrative Penalties Regulation will further shape operational controls, transparency requirements, and appeal pathways, regulated entities should already be preparing for compliance. In practical terms, digital platforms and ISPs, whether established in the UAE or targeting its user base, should recalibrate content governance, safety engineering, and data protection frameworks in anticipation of the CDS Law’s entry into force, and prepare for sustained regulatory engagement under the new regime.