A Guide to AI Regulation in the DIFC

      Although artificial intelligence (‘AI’) has risen to prominence only recently, generating widespread discussion about its transformative potential, its underlying principles and development have a much longer history. That history matters because AI’s growing capabilities now intersect with an already complex data protection landscape.

      Today, with AI integrated into numerous processing activities and data protection laws established across most jurisdictions, regulators are confronting a critical question: should existing data protection frameworks govern AI, or is a new regulatory approach necessary?

      The DIFC’s approach – Regulation 10

      While the EU opted for a separate AI-specific regime, the Dubai International Financial Centre (‘DIFC’) has chosen to integrate AI regulation within its existing Data Protection Law (‘DPL’). To that end, the DIFC issued the revised Data Protection Regulations in September 2023, with Regulation 10 specifically addressing the requirements for AI systems.

      Regulation 10 uses the encompassing term ‘System’, which includes AI and similar tools. Its definition aligns with international standards such as the Organisation for Economic Co-operation and Development (‘OECD’) guidelines. Although broadly defined to cover systems with autonomous decision-making capabilities, the regulation permits the development and use of AI systems that process personal data for purposes that are human-defined, human-approved, or defined by the AI system itself, provided the latter occurs solely on the basis of human-defined principles and within human-defined constraints.

      The regulation also defines ‘Deployer’ as the equivalent of a controller and ‘Operator’ as the equivalent of a processor under the DPL. Thus, a firm developing an AI system for its own use acts as the deployer, while a third-party vendor supplying an AI system would be the operator, and the acquiring firm would be the deployer.

      What is the Starting Point of Compliance?

      Compliance considerations should begin at the stage of conceptualising the development of, or ahead of acquiring, an AI system. Regulation 10 must be adhered to whenever personal data is processed for use in, or to facilitate the learning processes of, any AI system. Firms can investigate the requirements in greater depth by reviewing the helpful guidance issued by the DIFC Commissioner; the requirements include:

      Data Protection Impact Assessment (‘DPIA’): Given AI’s novel nature and potential for heightened risks to individuals’ rights and freedoms, a DPIA is mandatory under the DIFC DPL. The DPIA should specifically address the risks associated with the AI system and outline mitigation strategies.

      Principles: Beyond the privacy by design principles already embedded in the DIFC DPL, AI systems should incorporate principles such as ethics, fairness, transparency, security, and accountability into their design and administration.

      Clear and explicit notices: Regulation 10 introduces transparency requirements in addition to those already stipulated under the DIFC DPL. Any website or application utilising AI systems must provide a clear and explicit notice at the time of initial use or access. This notice should detail the following (an illustrative way of capturing this content is sketched after the list):

      • the human-defined purposes, principles, and limits governing the processing of personal data
      • the output generated and its usage
      • the underlying principles of the AI system’s development and design
      • the potential impact of the AI system’s use on individuals’ rights
      • any relevant codes, certifications, or principles guiding the AI system’s design or development.
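
      By way of illustration only, the notice content above could be captured in a structured form that a website or application then renders at the point of initial use or access. Regulation 10 does not prescribe any particular format, and all class and field names in the following sketch are our own assumptions.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class AISystemNotice:
    """Illustrative container for the notice content listed above.

    The field names are assumptions made for this sketch; they are not
    terms defined in Regulation 10. The notice itself must still be
    presented clearly at the point of initial use or access.
    """

    human_defined_purposes: List[str]   # purposes, principles and limits governing the processing
    output_description: str             # the output generated and how it is used
    design_principles: List[str]        # principles underlying the system's development and design
    impact_on_rights: str               # potential impact of the system's use on individuals' rights
    codes_and_certifications: List[str] = field(default_factory=list)  # relevant codes, certifications or principles

    def as_plain_text(self) -> str:
        """Render the notice as plain text for display at first access."""
        lines = [
            "How this service uses an AI system:",
            "Purposes and limits: " + "; ".join(self.human_defined_purposes),
            "Output and its use: " + self.output_description,
            "Design principles: " + "; ".join(self.design_principles),
            "Potential impact on your rights: " + self.impact_on_rights,
        ]
        if self.codes_and_certifications:
            lines.append("Codes and certifications: " + "; ".join(self.codes_and_certifications))
        return "\n".join(lines)
```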

      Demonstrating the evidentiary capacity of AI systems: This is a crucial requirement under Regulation 10, involving both technical and organisational measures. Evidence must be provided to affected parties, relevant entities, or the regulator, as applicable, regarding the following (a brief sketch of a human-intervention trigger follows the list):

      • the system’s compliance with audit and/or certification requirements
      • any algorithm(s) designed to trigger human intervention when personal data processing might lead to unfair or discriminatory impact
      • a risk and impact assessment concerning potential unjust bias or high-risk processing
      • any algorithm(s) that prompt human intervention where personal data processed by the AI system needs to be accessed by competent authorities, or where processing might violate consent requirements for cookies and direct marketing.
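
      Regulation 10 does not prescribe how such a human-intervention trigger should be built. The sketch below is purely illustrative: the risk score, threshold, and function names are our own assumptions, and it simply shows the general pattern of an automated check that withholds a decision and escalates it to a human reviewer when a potentially unfair or discriminatory impact is detected.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ProcessingDecision:
    """Minimal stand-in for a decision produced by an AI system."""
    subject_id: str
    outcome: str
    fairness_risk_score: float  # 0.0 (no concern) to 1.0 (high concern); how this is scored is firm-specific


# Assumed threshold above which a human must review the decision.
FAIRNESS_REVIEW_THRESHOLD = 0.7


def apply_or_escalate(decision: ProcessingDecision,
                      escalate: Callable[[ProcessingDecision], None]) -> bool:
    """Return True if the decision may be applied automatically.

    Where the risk score suggests a potentially unfair or discriminatory
    impact, the decision is withheld and passed to a human reviewer,
    reflecting the evidentiary requirement described above.
    """
    if decision.fairness_risk_score >= FAIRNESS_REVIEW_THRESHOLD:
        escalate(decision)  # hand the case to a human reviewer
        return False
    return True


# Example usage with a placeholder escalation hook.
if __name__ == "__main__":
    decision = ProcessingDecision(subject_id="applicant-001",
                                  outcome="decline",
                                  fairness_risk_score=0.85)
    apply_or_escalate(decision,
                      escalate=lambda d: print(f"Escalated {d.subject_id} for human review"))
```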

      AI Register: Firms are also required to maintain an AI register, analogous to the record of processing activities mandated by the DIFC DPL. This register should detail the following (an illustrative register entry is sketched after the list):

      • the processing activities conducted by the system
      • its use cases
      • how individuals can access information within the AI system
      • third parties with whom data is shared
      • the lawful basis for such sharing.
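
      As with the notice above, Regulation 10 does not mandate a particular format for the register. The sketch below shows one hypothetical way a firm might record a register entry; every class name, field name, and example value is an assumption made purely for illustration.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class AIRegisterEntry:
    """Illustrative AI register entry covering the items listed above."""
    system_name: str
    processing_activities: List[str]  # processing activities conducted by the system
    use_cases: List[str]              # the system's use cases
    access_information: str           # how individuals can access information within the AI system
    third_party_recipients: List[str] = field(default_factory=list)  # third parties with whom data is shared
    lawful_basis_for_sharing: str = ""                               # the lawful basis for such sharing


# Hypothetical example entry for a client-onboarding assistant.
example_entry = AIRegisterEntry(
    system_name="Onboarding Assistant (hypothetical)",
    processing_activities=["Identity document extraction", "Client risk scoring"],
    use_cases=["Client onboarding", "Periodic KYC refresh"],
    access_information="Data subjects may submit an access request to the firm's DPO",
    third_party_recipients=["Cloud hosting provider"],
    lawful_basis_for_sharing="Contractual necessity",
)
```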

      Additional Obligations for AI Systems Used for High-Risk Processing Activities: Recognising the heightened privacy risks associated with AI systems used for high-risk processing activities, Regulation 10 imposes supplementary obligations on firms. If an AI system is used for commercial purposes and involves high-risk processing, the following additional requirements apply:

      • The AI system must be certified under the certification scheme established by the DIFC Commissioner
      • An Autonomous Systems Officer (‘ASO’) must be appointed, possessing competencies, status, role, and tasks as advised by the DIFC Commissioner.

      The DIFC has published a certification framework outlining the requirements and procedures for obtaining certification. Firms must comply with these requirements, including establishing an internal governance framework for AI systems and implementing other organisational and technical measures. Certifications will be granted by accreditation bodies approved by the DIFC.

      Recent Developments

      Accelerator program: The DIFC recently soft-launched the Regulation 10 Accelerator program, designed to encourage positive AI adoption in alignment with regional ethical values as well as the DPL and Regulation 10. This AI accelerator serves as a sandbox where AI systems can be tested against privacy by design principles and Regulation 10 requirements. The Accelerator allows firms developing, acquiring, or currently using AI systems to assess their systems’ compliance posture against the DPL and Regulation 10, and to identify and understand the inherent risks associated with their use.

      Enforcement date: The DIFC has clarified that the enforcement of Regulation 10 is planned to commence early in 2026, and that firms should be working towards achieving certification of appropriate systems.

      ASO Public Survey: The DIFC recently launched a public survey, inviting individuals to provide feedback on the expected role of the ASO. The survey can be accessed here.

      Guidance release: Whilst the DIFC has already released numerous helpful guides, a further guidance document is expected in due course to clarify the requirements, the ASO role, and the application process for the Accelerator program.

      Practical Steps for Firms

      1. Identify AI systems in use or planned: conduct a thorough inventory of all existing and planned AI systems within the organisation that process personal data. This includes understanding their functionalities and intended purposes.
      2. Review available resources:
      • Regulation 10 here
      • FAQs here
      • Guidance document here
      • Accreditation and Certification Framework here
      3. Conduct a DPIA: for any AI system processing personal data, especially new or novel technologies, initiate a DPIA early in the process. This assessment should specifically focus on the unique risks posed by the AI system to individuals’ rights and freedoms.
      4. Embed AI principles into the design: ensure that privacy considerations and AI principles are integrated into the design and development (or acquisition and implementation) of all AI systems from the outset.
      5. Consider the Regulation 10 Accelerator Program: if developing, acquiring, or currently using AI systems, consider participating in the DIFC’s Regulation 10 Accelerator program. Firms can benefit from legal, technological, privacy, and policy testing of their AI systems. At present, firms may apply via [email protected]
      6. Plan for an ASO Appointment: if engaged in high-risk processing, identify and plan for the appointment of an ASO. Consider whether an existing employee can fulfil this role or if a new appointment is necessary based on the DIFC Commissioner’s future guidance.
      7. Prepare for Compliance and Certification: start implementing the requirements under Regulation 10. If the AI system is used for commercial purposes involving high-risk processing, begin understanding the requirements of the DIFC Commissioner’s certification framework. Once accreditation bodies are established, prepare to undergo the certification process.

       

      How Can Waystone Help?

      We have assisted more than 80 clients in the ADGM, DIFC, and the UAE onshore with their data protection requirements, including implementing complex, multi-jurisdictional data protection frameworks, advising on cross-border transfers, incorporating data protection principles, and drafting suitable documentation in accordance with the relevant data protection regulations and laws.

      Waystone Compliance Solutions is well-positioned to support you in maintaining a compliant data protection framework, whether by providing an experienced outsourced Data Protection Officer, offering ongoing data protection support, or educating and training your in-house Data Protection Officer on the regulatory requirements.

      For further details, please contact the Data Protection Team.
