25/05/2023
Briefing

In many cases, fintech providers will themselves need to ensure compliance with certain regulatory regimes, as well as help their customers to do so. In this article, we provide an overview of some key regulations fintech providers should be aware of when offering services to their regulated customers, including new obligations under the Digital Operational Resilience Act (“DORA”), the revised Directive on security of network and information systems (“NIS2”) and the soon-to-be-finalised Artificial Intelligence Act (“AI Act”).

Existing regulatory obligations

Data protection

  • Fintech providers that process personal data must comply with the relevant requirements of the EU General Data Protection Regulation (“GDPR”) and the relevant domestic implementing legislation (in Ireland, the Data Protection Act 2018). In many cases, a fintech provider is likely to process personal data on behalf of its customer when providing services and will therefore act as a processor. However, to the extent that a fintech provider uses customer personal data for its own purposes (e.g. for product improvement), it will likely act as a controller.
  • As a processor, a fintech provider’s GDPR compliance priorities will include ensuring that appropriate security measures are in place. Compliance with industry standards such as ISO 27001, ISO 27002, SOC 2 and/or PCI DSS (the Payment Card Industry Data Security Standard) can assist with this. Putting in place robust information security policy documentation and conducting tabletop exercises are also important means of helping to ensure compliance with security requirements under the GDPR. Fintech providers should also ensure that their customer contracts include appropriate processor provisions so as to help their customers comply with Article 28 of the GDPR.
  • Where a fintech provider acts as a controller of customer personal data, the GDPR security requirements outlined above will remain important. Another compliance priority for fintech providers that act as controllers will be meeting the transparency requirements of the GDPR. This can be achieved by ensuring that an appropriate data protection notice is communicated to data subjects (e.g. staff or clients of the fintech provider’s customers), either by the fintech provider itself or by its customers.

NIS Directive

  • It is possible that some fintech providers (e.g. providers of cloud computing services) will be subject to the Network and Information Systems Directive (EU) 2016/1148 (the “NIS Directive”), transposed in Ireland by the European Union (Measures for a High Common Level of Security of Network and Information Systems) Regulations 2018 (S.I. No. 360 of 2018).
  • The NIS Directive is the first EU-wide cybersecurity law not specific to personal data and is focussed on ensuring that in-scope entities apply minimum cybersecurity standards and comply with reporting obligations in relation to significant cyber incidents.
  • The NIS Directive is due to be replaced by NIS2, which we discuss below.

Contractual requirements

CBI Outsourcing Guidance

  • The Central Bank of Ireland Cross-Industry Guidance on Outsourcing (“CBI Outsourcing Guidance”) is applicable to all regulated firms operating in Ireland. These outsourcing rules are focussed on ensuring that regulated firms effectively manage and mitigate risks associated with the outsourcing of critical services.
  • Where a fintech provider contracts with a regulated firm in Ireland, the contract between the parties will need to incorporate certain provisions prescribed by the CBI Outsourcing Guidance. In many cases, these provisions are focussed on granting the regulated firm (i.e. the fintech provider’s customer) broad rights designed to help it effectively oversee and manage its relationship with the service provider, including audit rights, information rights, termination rights and exit rights. While compliance with the CBI Outsourcing Guidance will ultimately be the responsibility of the fintech provider’s regulated customer, the fintech provider will nonetheless need to be familiar with the CBI Outsourcing Guidance and, in particular, the contractual provisions that must be included in the parties’ contract. Please see our previous article here, which provides more detail on contractual requirements under the CBI Outsourcing Guidance.

Changing regulatory landscape

NIS2

  • NIS2 is an evolution of the requirements of the NIS Directive rather than a drastic overhaul of existing cybersecurity requirements in the EU. In particular, NIS2 will apply to an expanded scope of entities, which now includes ICT managed service providers, ICT managed security service providers and trust service providers who issue qualified certificates and signatures for the purposes of the eIDAS Regulation. This means that NIS2 may apply to a wider range of fintech providers than the existing NIS Directive. It will therefore be important for fintech providers to determine whether they fall within the scope of NIS2 and, if so, to align their existing incident response and cybersecurity risk management frameworks with the new requirements.
  • Notably, fintech providers will need to ensure that their internal incident management processes align with the shorter incident notification timelines under NIS2 (an initial early warning within 24 hours of becoming aware of a significant incident, followed by a more detailed notification within 72 hours) and with the more prescriptive obligations as to the cybersecurity measures that must be maintained by in-scope entities (e.g. incident handling processes, access controls, encryption, multi-factor authentication and supply chain security).
  • Member States are required to transpose NIS2 by 17 October 2024, with its measures applying from 18 October 2024, so there is some lead time for in-scope fintech providers to meet the relevant requirements.

DORA

  • DORA is focussed on operational resilience in the financial sector and is part of the European Commission’s Digital Finance Strategy. DORA is designed to uplift existing ICT risk management requirements for financial institutions and to consolidate these requirements into a single legislative instrument.
  • Interestingly, while DORA applies to a range of regulated firms such as credit institutions, insurance undertakings and investment firms, it will also apply to certain major ICT service providers, which could include some major fintech providers. This will result in such providers becoming subject to direct supervision by European financial regulators for the first time. This will happen where such providers are designated by European financial regulators as being “critical” to the proper operation of the financial sector in the EU. Following this designation, the provider will be assigned a lead European financial regulator.[1] That lead regulator will assess whether the provider has in place a comprehensive and effective framework to manage the ICT risks it presents to its regulated firm customers. The lead regulator can also issue recommendations to the provider and require it to take remedial action (e.g. requiring that certain provisions are included in contracts with regulated firms or imposing certain controls around sub-contracting).
  • DORA will apply from 17 January 2025 and, in advance of that date, major fintech providers should start to consider whether they may be deemed a critical provider. If so, it would be sensible to invest time and resources in identifying and leveraging existing cybersecurity, business continuity and governance policies and procedures that could help the provider align with DORA’s key requirements, and in identifying any material compliance gaps within the provider’s organisation as against those requirements.

AI Act

  • The AI Act, which is yet to be finalised, will regulate AI systems that create, or may be used in ways that create, a “high risk” to individuals’ rights and freedoms. High risk AI systems include (amongst others) any AI system that is a safety component of an already regulated product, like a medical device, or that is used to make decisions relating to an individual’s employment. Importantly for some fintech providers, they also include AI systems used for risk assessment and pricing in relation to life and health insurance, or used to evaluate a credit score or determine the creditworthiness of an individual. In practice, most AI systems, such as systems to flag fraudulent transactions or to verify a customer’s identity, are unlikely to be considered high risk, and the AI Act will not be relevant to such systems. However, the creation or use of AI systems that are deemed to be high risk will require consideration of the following:

o   Risk assessment: Providers of high risk AI systems will need to conduct a risk assessment to identify potential risks and harms associated with their AI systems, including risks related to bias, discrimination and privacy.

o   Transparency: Any company using a high risk AI system must ensure that a clear explanation is provided as to how the system makes decisions, to enable users to interpret its output and use it appropriately.

o   Training and testing: Data sets used to support training, validation and testing must be subject to appropriate data governance and management practices and must be relevant, representative, accurate and complete.

o   Conformity assessment and monitoring: Providers of high risk AI systems must ensure the system undergoes the relevant conformity assessment procedure (prior to placing the system on the market/putting the system into service) and implement and maintain a post-market monitoring system by collecting and analysing data about the performance of the high risk AI system throughout the system’s lifetime.

  • For fintech providers that create high risk AI systems, this means that they will need to: (i) exercise care in relation to the provenance of data sets used to train their AI models; (ii) conduct appropriate risk assessments to identify and address any potential risks that the AI system may create (not unlike a data protection impact assessment); (iii) comprehensively document how the AI model makes a decision so that regulated firms can meet their obligations to explain how the AI system works to their individual customers; and (iv) prepare to undertake a conformity self-assessment with regard to the quality management and technical documentation of the AI system.
  • At the time of writing, the Council of the EU and the European Parliament have each set out their positions on the AI Act, incorporating amendments relating to “foundation models” and “general purpose” AI such as ChatGPT. Trilogue discussions will begin shortly, and it is currently expected that the AI Act will be finalised and come into force this summer, with its provisions applying 24 months later (based on the 16 May draft), likely in Q3 2025.

[1] The lead financial regulator will be one of the European Banking Authority (EBA), the European Securities and Markets Authority (ESMA) or the European Insurance and Occupational Pensions Authority (EIOPA).