Response to consultation on proposed RTS in the context of the EBA’s response to the European Commission’s Call for advice on new AMLA mandates


Question 2: Do you have any comments regarding Article 6 on the verification of the customer in a non face-to-face context? Do you think that the remote solutions, as described under Article 6 paragraphs 2-6 would provide the same level of protection against identity fraud as the electronic identification means described under Article 6 paragraph 1 (i.e. e-IDAS compliant solutions)? Do you think that the use of such remote solutions should be considered only temporary, until such time when e-IDAS-compliant solutions are made available? Please explain your reasoning.

  1. Fourthline understands that the EBA’s overall objective is to propose an RTS that is risk-based and proportionate where possible, and conducive to effective outcomes while keeping associated compliance costs to a necessary minimum. Fourthline also understands that, for CDD requirements, the EBA wishes to further harmonise the way due diligence measures are applied across the EU by specifying what information obliged entities shall collect to comply with their CDD, SDD and EDD requirements.
  2. In general, Fourthline supports these purposes of the Draft RTS. Fourthline would like to emphasise its alignment with the AMLR’s clear intention to allow market players to develop secure, accessible and innovative means of verifying customers’ identity and performing customer due diligence, including remotely, while respecting the principle of technology neutrality (recital 74 of the AMLR). Fourthline also underlines the importance of facilitating the remote performance of customer due diligence obligations by means of identification solutions set out in Regulation (EU) No 910/2014 (recital 66 of the AMLR).
  3. Fourthline would like to take the opportunity to respond to some of the questions in the Draft RTS and to highlight a number of points that would benefit from further clarification.
  4. An assumption that can be distilled from the phrasing of Question 2 is that the level of protection against identity fraud is by definition high for eIDAS-compliant solutions. Fourthline would like to explain by way of this letter that this is not necessarily the case for all eIDAS solutions when they are used to comply with obliged entities’ CDD requirements. It is important that a sufficiently trustworthy mechanism is in place to establish a link between the eID or EUDI wallet and its legitimate holder (“holder binding”) who wants to onboard as a client with an obliged entity. An eID usually requires the user to log in with a PIN code. For the EUDI wallet, it is expected that users will need to authenticate themselves with PIN codes, swipe patterns and/or biometric checks such as face or fingerprint recognition (https://github.com/orgs/eu-digital-identity-wallet/projects/36/views/1?pane=issue&itemId=88863108&issue=eu-digital-identity-wallet%7Ceudi-doc-architecture-and-reference-framework%7C348).
  5. Fourthline would like to point out that when a natural person uses an eID or EUDI wallet to open a bank account, the obliged entity does not obtain sufficient assurance that the person authenticating with the eID or EUDI wallet (the holder) is indeed its legitimate owner. This risk has been pointed out by other parties, including the Dutch Partnership for Cyber Security Innovation in a recent post (https://pcsi.nl/en/news/protect-the-european-digital-identity-wallet-with-a-biometric-lock/). PIN codes and device-native biometric methods that rely on, for example, swipe patterns, face or fingerprint recognition have a number of security vulnerabilities. For instance, criminals can disable biometric authentication by turning off Face ID on an unlocked device, or can steal the credentials by looking over the victim’s shoulder (shoulder surfing). Alternatively, identity fraud may occur when a criminal buys these credentials or forces the victim to hand them over.
  6. Identity fraud becomes much more difficult when server-side biometric authentication is used. In that case, biometric data provided by the owner of the EUDI wallet (or eID) at activation can be compared with the biometric data presented at the moment of authentication (a simplified sketch of such a check is included after this list). The existing EBA guidance on liveness detection, as included in the EBA Guidelines on the use of Remote Customer Onboarding Solutions under Article 13(1) of Directive (EU) 2015/849, could be applied both at onboarding and at authentication.
  7. In summary, while the Draft RTS facilitates the use of technological remote onboarding solutions for the purposes of CDD requirements, obliged entities relying on Article 6 (1) of the Draft RTS must effectively assume holder binding, even in cases where the person presenting the eID or EUDI wallet may not be its legitimate holder. Fourthline foresees that obliged entities may instead rely only on QES, or on alternative routes for onboarding their clients, including the method prescribed under Article 6 (2) of the Draft RTS. This seems detrimental to the purpose of the AMLR and the Draft RTS. To enable obliged entities to facilitate the use of eIDs and EUDI wallets by their clients, we recommend exploring whether Article 6 (1) of the Draft RTS may be amended in such a way that server-side biometric authentication of natural persons is included as a prerequisite. To the extent that Article 31 (2), (3) and (4) of the Draft RTS leave room to request server-side biometric authentication on a case-by-case basis, we would advise including this as a mandatory field, to create a level playing field and to strengthen anti-fraud measures.
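
To illustrate the server-side check referred to in paragraph 6 above, the following is a minimal, purely illustrative sketch in Python. The function names, the embedding-based comparison and the similarity threshold are our own assumptions for illustration only and are not prescribed by the Draft RTS, the AMLR or the eIDAS framework. The sketch only shows the principle: the biometric reference captured at wallet activation is compared, on the server side, with the biometric sample presented at authentication, in combination with a liveness check.

```python
# Minimal, illustrative sketch of server-side biometric holder binding.
# All names, the embedding representation and the threshold are assumptions
# made for illustration only.

from dataclasses import dataclass

import numpy as np


@dataclass
class EnrolledHolder:
    """Biometric reference captured when the eID / EUDI wallet was activated."""
    holder_id: str
    reference_embedding: np.ndarray  # e.g. a face embedding from the activation capture


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two biometric embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def verify_holder(
    enrolled: EnrolledHolder,
    authentication_embedding: np.ndarray,
    liveness_passed: bool,
    similarity_threshold: float = 0.85,  # illustrative value, not a prescribed standard
) -> bool:
    """Server-side check: the person authenticating now must pass a liveness
    check and match the biometric reference stored at activation."""
    if not liveness_passed:
        return False
    similarity = cosine_similarity(enrolled.reference_embedding, authentication_embedding)
    return similarity >= similarity_threshold
```

Because the comparison and the liveness check are performed server-side rather than on the holder’s device, they cannot be bypassed by disabling device-native biometrics or by using stolen PIN codes alone, which is the anti-fraud benefit described in paragraph 6.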

Question 7: What are the specific sectors or financial products or services which, because they are associated with lower ML/TF risks, should benefit from specific sectoral simplified due diligence measures to be explicitly spelled out under Section 4 of the draft RTS? Please explain your rationale and provide evidence.

  1. We note that Article 76(5) AMLR stipulates that obliged entities may adopt decisions resulting from automated processes, including profiling or AI systems, subject to three conditions. One of these conditions is that certain important decisions in relation to the (prospective) customer, such as the decision whether to enter into a business relationship, should be subject to meaningful human intervention to ensure the accuracy and appropriateness of such a decision (Article 76(5)(b) AMLR).
  2. We understand the rationale behind this specific restriction to be the avoidance of biased automated systems that could be detrimental to (prospective) customers. It is doubtful, however, whether human intervention adds any value to the CDD process in cases where the risk assessment is low or medium and all identity verification checks have been passed ("greened out"). Such a file would in principle be approved by the obliged entity in any event.
  3. In our view, fully automating the CDD process for such a CDD file therefore does not impact any of the (prospective) customer's rights. The accuracy of such fully automated processing is already sufficiently covered by the GDPR (Articles 5(1)(a) and 5(1)(d) GDPR). In practice, this entails that such a fully automated process should be thoroughly tested before being put to use and regularly monitored to eliminate any risk of inaccuracy or bias.
  4. Moreover, we recall that one of the key principles of the AMLR is that obliged entities should follow a risk-based approach, as set out, amongst others, in recitals 52 and 65 of the AMLR. It is, in our view, appropriate to apply such a risk-based approach to low- and medium-risk CDD files where all checks have been passed, as set out above (a simplified illustration of such a routing rule is included after this list).
  5. We would therefore kindly request AMLA and/or the EBA to provide guidance on the meaning of "meaningful human intervention" as set out in Article 76(5)(b) AMLR, with a focus on interpreting "meaningful" in terms of added value rather than time spent.
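
Purely by way of illustration, the following minimal sketch in Python expresses the routing logic described in paragraphs 2 to 4 above. The risk categories, names and the rule itself are our own assumptions for illustration and are not a proposal for specific regulatory text.

```python
# Illustrative routing rule only: fully automated approval is limited to
# low- and medium-risk CDD files in which every check has been passed;
# all other files are escalated for meaningful human review.

from enum import Enum


class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


class Decision(Enum):
    AUTO_APPROVE = "auto_approve"
    HUMAN_REVIEW = "human_review"


def route_cdd_file(risk: RiskLevel, all_checks_passed: bool) -> Decision:
    """Route a CDD file: automated approval only when the risk is low or medium
    and all identity verification checks have been passed ('greened out')."""
    if all_checks_passed and risk in (RiskLevel.LOW, RiskLevel.MEDIUM):
        return Decision.AUTO_APPROVE
    return Decision.HUMAN_REVIEW
```

Under such a rule, human intervention remains reserved for the files where it can genuinely add value, which is consistent with the interpretation of "meaningful" requested in paragraph 5.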

Name of the organization

Fourthline B.V.