As the EBA is seeking harmonisation by introducing these Guidelines, we consider that these requirements should be set at a lower level. However, the Guidelines appear to push technical requirements towards the application of EDD for many remote customer onboarding sessions, regardless of AML risk. This has a significant impact, especially for e-money products, where ease of use is a central feature. Some sections of the Guidelines set a high threshold for customer onboarding irrespective of the risk associated with the business relationship.
Guideline 4.2 does not define what information to collect, but rather sets the requirements for the process itself. The EBA’s objective is clear: standardisation of requirements across the market. However, the risk assessment forms a substantial part of CDD, determining which controls are to be applied to each customer. The extensive requirements that have been set out, for example to obtain information on the purpose and intended nature of the business relationship and on the source of funds, are required for EDD, but not for CDD or SDD. This is not in line with the risk-based approach to CDD, does not appear to achieve the EBA’s objective, and will raise operational costs and detrimentally impact customers by creating friction at onboarding.
The EMA also wishes to comment on the proposed timeline of application of the final version of the Guidelines. The impact of the proposed changes on existing remote customer onboarding arrangements of many PSPs will be significant. This includes IT implementation, which may be required for a number of financial sector operators. Outsourced service providers’ ability to meet the conditions under the Guidelines would need to be checked, and providers may potentially need to be changed. Therefore, we propose that the time afforded to in-scope entities to align with the requirements in these Guidelines be extended to 12 months after publication in all official EU languages.
The definition of Digital Identity refers to “a material or immaterial unit that contains person identification data, which is used to verify the identity of the user and for authentication purposes in an online service”. The current definition is confusing and is not aligned with the subsequent definition of Digital Identity Issuer. Specifically, the container/unit (physical or digital) that contains digital identity credentials or attributes does not form part of these credentials or constitute digital identity. The reference to authentication is also confusing, since user authentication is distinct from user identification and requires the deployment of additional security controls to govern user access to issued digital identity credentials/attributes. We suggest that the Digital Identity definition be revised to refer only to “credentials or attributes which are used to verify the identity of the user”.
The ‘Biometric Data’ definition includes a reference to “the behavioural characteristics of a natural person”, which can be used to allow or confirm the unique identification of a natural person. Unlike biometric data that leverage physiological characteristics (iris, face, fingerprint, voice), it is not clear that current state-of-the-art behavioural biometric technologies can deliver unique identification of a natural person with equivalent levels of assurance. Additionally, all behavioural biometric solutions require multiple interactions with a user to establish a robust user profile that can subsequently be polled to confirm user “identity”; such solutions will likely be of limited use for the remote onboarding of new customers. We suggest that the reference to behavioural biometrics be removed from the definition of Biometric Data.
The current ‘Impersonation Fraud Risk’ definition is too narrow; it should be expanded to include cases where duplicate, forged or altered identity data are used to identify a person/entity. The impact of such identity fraud use cases is growing with the emergence of synthetic identities and of deep/AI-generated fakes.
Paragraph 23: “The use of a fully automated remote customer onboarding solution, which is highly dependent on automated algorithms, without or with little human intervention, does not exempt financial sector operators from their duty to carry out ongoing monitoring of the reliability and adequacy of the solution.”
EMA: A number of financial sector operators use Artificial Intelligence (“AI”)-based solutions to carry out at least part of the remote onboarding process, e.g. to review a photo submitted by the customer and establish its validity and likeness. Paragraph 23 introduces a requirement for those PSPs to review the performance of such AI solutions, which will require additional support from the supplier. To meet this requirement, quality controls that are currently proprietary to the provider of the solution (and may constitute core/proprietary IP) will have to be shared with financial sector operators.
Additionally, this requirement will demand specialist, in-depth technological knowledge, e.g. of AI-based solutions, and may not be fitting for a compliance function. A new skill set will have to be developed; this may not be possible in the short term and will carry operational costs. Any data being fed into the systems can be reviewed for accuracy, but in-depth knowledge of the AI solution and how it works would be required to ensure the accuracy of its output.
Paragraph 28: The reference to customer device location (IP) data at the end of this paragraph is confusing. It is not clear how such data can be leveraged to establish a customer’s identity remotely.
EMA: We suggest that the last sentence of paragraph 28 is removed.
Paragraph 32: “Financial sector operators should implement specific steps during the remote customer onboarding process to obtain information on the purpose and intended nature of the business relationship in accordance with the provision 4.38 of the EBA Risk Factor Guidelines. In particular, they should take risk-sensitive steps to gather information from their customers to identify the nature of their personal, professional or business activities and expected source of funds, and verify the accuracy of this information as necessary.”
EMA: We note that paragraph 4.38 of the EBA Risk Factor Guidelines refers to risk-based measures to understand the purpose and nature of the business relationship; it does not require firms to obtain or gather information from customers. The requirement to obtain information on the nature and purpose of the business relationship is provided in the context of EDD in paragraph 4.64(a)(ii) of the EBA Risk Factor Guidelines, rather than standard CDD, as proposed in these draft Guidelines.
This requirement to obtain or gather information from customers on the purpose and nature of the business relationship is excessive, as it extends beyond the requirements under the EBA Risk Factor Guidelines as well as Directive (EU) 2015/849 on the prevention of the use of the financial system for the purposes of money laundering or terrorist financing (“4MLD”), whose Art. 13(c) refers to ‘assessing and, as appropriate, obtaining information on the purpose and intended nature of the business relationship’.
An assessment under Art. 13(c) 4MLD could also be assumption-based in circumstances where the purpose and intended nature of the business relationship is clear from the product proposition. The EMA requests the removal of the obligation to obtain information on the nature and purpose of the business relationship independent of the risk. The requirement to ‘assess’ the purpose and intended nature of the business relationship as a first step should be added to the draft Guidelines to align them with 4MLD. Obtaining information should be an additional, risk-based step.
Paragraph 35: “In situations where the customer’s own device allows the collection of relevant data, for example the data contained in the chip of a national identity card, financial sector operators should use this information to verify the consistency with other sources, such as the submitted data and other submitted documents.”
EMA: This appears to be an overly prescriptive requirement to use this information for verification in every instance where a customer’s device allows the collection of such data. The provision could be interpreted as mandating a specific CDD measure wherever devices allow such data to be collected. This would be an excessive control, given the number of technical factors involved in the CDD process, limitations related to user-defined configurations of customer devices that can interact with chip-based documents, and security concerns about the storage/transmission of such data over telephony networks. The EMA suggests that this provision be further qualified by specifying that such data is not required to be collected in every instance but can optionally be collected – where technically possible – for verification through an additional source.
Paragraph 40: “Where the ML/TF risk associated with a business relationship is increased, financial sector operators should use remote verification processes that include liveness detection procedures examining whether the video, picture or other biometric data captured during the remote customer onboarding process belong to a living person present at the point of capture, or real-time videoconference.”
EMA: The use of a ‘liveness detection procedure’ is helpful for situations with increased ML/TF risk. However, this should not be the only control that can be leveraged to mitigate the increased risk. It would be useful to allow the application of other controls/technical solutions to confirm the authenticity of images captured during a remote customer onboarding session.
The rationale for the application of EDD measures under paragraph 46, in addition to the requirement under paragraph 40 which is intended to mitigate increased ML/TF risk, is not clear. In our view, paragraph 40 should be included among the options provided under paragraph 46.
Paragraph 41: “In case of legal entities, financial sector operators should verify the identity and the information provided in the documents and attributes reviewed as part of the identification process, through a reliable and independent source of information such as public registers, where available.”
EMA: Although beneficial ownership (“BO”) registers are available in the EU, in some member states (e.g. France, Germany) these are only available for a fee. Additional examples of sources would be welcome.
Paragraph 42: “The individual remote customer onboarding process should be discontinued and redirected, where possible, to a face-to-face verification, in the same physical location.”
EMA: The reference to ‘the same physical location’ in the context of a remote customer onboarding session is unclear – clarification would be helpful. For reference, financial sector operators already employ a variety of fallback processes to complete customer onboarding if the quality of the identification data provided in a remote session does not allow the unambiguous identification and/or verification of the customer.
Paragraph 45: “Where possible, financial sector operators should use remote customer onboarding solutions that include randomness in the sequence of actions to be performed by the customer for verification purposes. Where possible, financial sector operators should also provide random assignments to the employee responsible for the remote verification process to avoid collusion between the customer and the responsible employee.”
EMA: (i) We perceive that a requirement to introduce randomness in the sequence of onboarding actions is overly prescriptive; it should not be identified as the only solution. The EMA would welcome the adoption of a more flexible approach, allowing individual financial sector operators to implement risk-based measures.
Overall, it would be more helpful to set out the outcome the EBA aims to achieve, leaving the implementation of specific controls to attain the intended outcome to financial sector operators.
(ii) Random assignment to avoid collusion is also a very prescriptive measure, raising a number of issues:
● This requirement will particularly impact smaller firms which do not have a large number of employees to assign randomly to remote customer onboarding sessions;
● It is unclear how the EBA expects compliance in the context of outsourcing, which is common industry practice. The EBA already prescribes, in the Guidelines on outsourcing arrangements, various requirements to ensure effective governance arrangements of financial sector operators, including risk assessments and oversight of the operation of such functions. Article 19 of PSD2 requires that “outsourcing of important operational functions [...] shall not be undertaken in such way as to impair materially the quality of the payment institution’s internal control and the ability of the competent authorities to monitor and retrace the payment institution’s compliance with all of the obligations laid down in this Directive.” Given these requirements, the proposed introduction of additional random assignments to entities to which activities are outsourced appears to be redundant and overly prescriptive;
● It is furthermore unclear why employees should have random assignments, given that firms are required to establish internal policies, controls and procedures, including for employee screening under Article 8(4)(a) 4MLD. This provision appears to doubt the effectiveness of existing employee screening arrangements and the integrity of employees and goes beyond the scope of the 4MLD requirement.
The EMA suggests replacing these prescriptive, detailed measures with a requirement to put in place controls such as quality assurance (e.g. sample checking, reports, reviews of previously completed customer onboarding activities by a different person than the original employee) to avoid collusion between the customer and the responsible employee.
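The quality-assurance alternative proposed above can be illustrated with a minimal sketch (all names and the 10% sampling rate are hypothetical, chosen purely for illustration): a random sample of completed onboarding cases is re-reviewed by a different person than the original employee, which addresses the same collusion risk without randomising live assignments.

```python
import random

def sample_for_qa(completed_cases, reviewers, sample_rate=0.1, seed=None):
    """Randomly sample completed onboarding cases and assign each to a
    second reviewer who is not the original handler (four-eyes check).

    completed_cases: list of dicts with 'case_id' and 'handled_by' keys.
    reviewers: list of reviewer identifiers.
    Returns a list of (case_id, qa_reviewer) pairs.
    """
    rng = random.Random(seed)
    sample_size = max(1, int(len(completed_cases) * sample_rate))
    sampled = rng.sample(completed_cases, sample_size)
    assignments = []
    for case in sampled:
        # Exclude the employee who performed the original verification,
        # so the QA check is always carried out by a different person.
        eligible = [r for r in reviewers if r != case["handled_by"]]
        assignments.append((case["case_id"], rng.choice(eligible)))
    return assignments
```

The sampling rate and reviewer pool would, in practice, be set on a risk-sensitive basis by each financial sector operator.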
Paragraph 46: “In addition to the above, and where appropriate to the ML/TF risk presented by the business relationship, financial sector operators should use of one or more of the following controls:
a) the first payment is drawn on an account in the sole or joint name of the customer with an EEA-regulated credit or financial institution or in a third country that has AML/CFT requirements that are not less robust than those required by Directive (EU) 2015/849;
b) send a randomly generated passcode to the customer to confirm the presence during the remote verification process. The passcode should be a single-use and time-limited code;
c) capture biometric data to compare them with data collected through other independent and reliable sources;
d) telephone contacts with the customer;
e) direct mailing (both electronic and postal) to the customer.”
EMA: The EMA acknowledges the intention to specify methods to be used in higher risk situations, which appears to build on the EBA Risk Factor Guidelines. However, this list is overly prescriptive and limited in scope. Both 4MLD and the EBA Guidelines on ML/TF Risk allow the use of a wider range of controls. These measures should be risk-based rather than an obligation to pick one or a combination of only five specified controls.
Additionally, this prescriptive approach is unlikely to be future-proof, in the context of fast-paced development of new technical solutions.
To support a more flexible, technology-neutral, risk-based approach, an additional option should be added allowing for equivalent steps to be taken.
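For reference, the single-use, time-limited passcode envisaged in paragraph 46(b) is a well-understood control. A minimal sketch of such a mechanism follows (the 6-digit length and 5-minute validity window are illustrative assumptions, not values specified in the draft Guidelines):

```python
import secrets
import time

PASSCODE_TTL_SECONDS = 300  # illustrative 5-minute validity window

def issue_passcode():
    """Generate a single-use numeric passcode with an expiry timestamp."""
    code = f"{secrets.randbelow(10**6):06d}"  # 6-digit code from a CSPRNG
    return {"code": code,
            "expires_at": time.time() + PASSCODE_TTL_SECONDS,
            "used": False}

def verify_passcode(record, submitted, now=None):
    """Accept the passcode only once, and only before it expires."""
    now = time.time() if now is None else now
    if record["used"] or now > record["expires_at"]:
        return False
    ok = secrets.compare_digest(record["code"], submitted)
    if ok:
        record["used"] = True  # mark consumed so the code is single-use
    return ok
```

This illustrates that the control itself is straightforward to implement; the EMA’s concern is not with any individual measure but with the closed nature of the list.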
Paragraph 51: “Financial sector operators should ensure that when the customer is onboarded using their digital identity this occurs in a secure environment, and, where possible, strong authentication is applied when verifying their digital identity.”
EMA: The reference to “strong authentication” in this paragraph is not clear, and strong authentication (multi-factor authentication?) is not defined in these Guidelines. It is also not clear why this paragraph is required at all. Paragraph 49 already requires financial sector operators to ensure that risks related to the use of digital identities to perform remote customer onboarding are identified and mitigated. In this context, we propose that paragraph 51 be omitted from the final draft.
The text of Paragraphs 52 and 54 is largely identical; the former should be retained and the latter removed.
Paragraph 62: “In addition to complying with requirements set out in the applicable EBA Guidelines on ICT and security risk management, financial sector operators should use secure communication channels to interact with the customer during the remote customer onboarding process. Secure protocols and strong and widely recognised encryption techniques should be used to safeguard the confidentiality, authenticity and integrity of the exchanged data at rest and in transit.”
EMA: It is not clear whether the use of fixed-line/mobile telephony networks or video-based teleconference applications (currently used by financial sector operators during customer onboarding) qualifies as a secure communication channel that can be used to interact with customers. Additional clarity is required on the attributes of secure communication channels that can be used for service provider–customer interactions during a remote customer onboarding scenario.
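By way of illustration, for web-based channels the “secure protocols and strong and widely recognised encryption techniques” requirement is typically met by enforcing certificate-validated TLS at a minimum protocol version. A minimal sketch (the TLS 1.2 floor is an illustrative assumption; the Guidelines do not name specific protocols, which is precisely the ambiguity noted above):

```python
import ssl

def make_client_context():
    """Create a TLS client context that validates certificates and
    hostnames and refuses protocol versions below TLS 1.2."""
    ctx = ssl.create_default_context()  # enables cert + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx
```

No equivalent, commonly accepted baseline exists for ordinary telephony, which is why the qualification of telephone and video channels requires clarification.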