With regard to the section on subject matter, scope and definitions, the European Banking Federation (EBF) stresses the importance of ensuring a level playing field, both in terms of regulatory requirements and supervisory standards. We caution that the proposed guidelines may create differences between the entities that are subject to them and those that fall outside their scope. Therefore, the guidelines should apply not only to obliged entities in the financial sector but to all obliged entities within the scope of the AMLDs. We also note that the guidelines will have a major impact on the work on the EU Digital Identity Wallet, which will be used by other types of actors. The risks associated with remote onboarding are not limited to AML-related risks or to obliged entities under the AMLDs; they concern all types of actors that deal with customers through a remote channel and have to manage the related risks.
We highlight that during the course of the customer relationship, there may be additional circumstances where the use of remote channels may be appropriate (e.g. the customer requests new products, or the financial sector operator carries out customer due diligence (CDD) measures). Clarification is needed as to whether the proposed Guidelines are envisaged to also apply to situations other than onboarding. We maintain that the scope of the guidelines should be expanded to encompass ongoing customer reviews. Particularly given the significant costs related to the employment of such remote onboarding technology, we believe that limiting its application to onboarding only would be a missed opportunity. However, this should not require automatic reviews of existing business relationships with a disproportionate impact on obliged entities.
Furthermore, the second subparagraph of Art. 13(1) of AMLD4 requires that obliged entities also verify that any person purporting to act on behalf of the customer is so authorised, and identify and verify the identity of that person. We seek clarification as to whether the proposed Guidelines also apply to cases where the financial institution carries out due diligence measures on the representative of the customer (both natural and legal persons). The EBA should ensure that financial sector operators have leeway and flexibility when applying remote CDD measures to customers’ representatives in accordance with the risk-based approach.
Regarding the scope of the guidelines, we note that there is no reference to screening customers against sanctions lists. We consider this issue to be of great importance, especially given that non-face-to-face onboarding processes usually require a certain degree of immediacy, which can be at odds with the technology available to carry out these processes and with the capacity of back-office staff and analysts to perform the relevant analysis and decision-making.
With regard to the definitions, the EBF would like to highlight the following points:
• The definition of “digital identity” does not refer to a specific person (e.g. the customer). However, the definition of “Digital Identity Issuer” refers to “the customer’s identification”. In our view, the language used in the proposed definitions should be harmonised. We therefore suggest changing the definition to refer to “the user’s identification”.
• In the draft guidelines, the impersonation fraud risk is defined as “the risk that the customer uses another person’s (natural or legal) details without the consent or knowledge of the person whose identity is being used”. Nonetheless, it may not only be the customer who uses another person’s details, but e.g. the customer’s representative or another third party. Hence, we propose changing the word “customer” to “a person”.
• Additionally, we suggest clarifying whether the definition also encompasses cases where fraudsters use data with the consent/knowledge of the victim, e.g. the so-called “friendly frauds”. Such cases are linked to the definition of “payer acted fraudulently”. We would hence welcome clarification in that regard.
• With regard to the definition of “biometric data”, it would be better to use the GDPR definition, which is more precise: “personal data resulting from a specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. The definition in the draft guidelines does not include the highlighted part. This clarification is fundamental because, in the case of a photo for example, the photo itself is not “biometric data” unless specific technical processing is applied that makes it “biometric”.
• In addition, it is important to consider the possible correlation of the definition of "biometric data" with the requirements of PSD2, e.g. EBA Opinion on the implementation of the RTS on SCA&CSC (dated 13 June 2018), where it is stated that “An element based on inherence is typically based on biometrics (including behavioural biometrics), provided they comply with the requirements under Article 8 of the RTS”.
• We also suggest deleting the reference to the absence of physical contact, and amend the relevant paragraph as follows: “These guidelines set out the steps financial sector operators should take to comply with their obligations under Article 13(1) of Directive (EU) 2015/849 when performing the initial customer due diligence (CDD) to onboard new customers, using remote channels, [ DELETE without physical contact]”.
• As a general remark, we recommend aligning the guidelines with the relevant legal framework, e.g. the eIDAS Regulation and refer to the eID schemes laid down therein. The definitions used throughout the guidelines should be harmonised with other relevant regulatory acts.
• Point 9: In the definition of “Digital Identity”, it would be helpful to have a clarification on the meaning of “immaterial unit”.
• We suggest including definitions of “contract” and “multi-purpose device”, considering also the references in other regulatory acts such as Art. 9(2) of the RTS on SCA&CSC. The definitions should leave enough flexibility for the financial sector.
• Finally, for the purpose of establishing a unified and clear approach in the provisions of the proposed guidelines, we consider it necessary to introduce:
i) a clear definition of the phrase “credible and independent sources” (i.e. what this phrase implies), which also leaves operators room to make their own risk-based assessment of the quality of the source;
ii) an obligation for the beneficial owner to present identity documents. Financial operators should be able to rely on a stronger legal basis to request from all legal-entity clients the identity documents of their beneficial owners. The rules on identifying beneficial owners and verifying their identity should be left to national legislation and aligned with Article 13(1)(b) of the fourth AML Directive and paragraph 1.24 of the EBA ML/TF Risk Factors Guidelines.
According to Article 13(1)(b) of the fourth AML Directive, CDD comprises identifying the beneficial owner and taking reasonable measures to verify that person's identity so that the obliged entity is satisfied that it knows who the beneficial owner is. Paragraph 1.24 of the EBA ML/TF Risk Factors Guidelines states that “Initial CDD should include at least risk-sensitive measures to: a) identify the customer and, where applicable, the customer’s beneficial owner; b) verify the customer’s identity on the basis of reliable and independent sources and, where applicable, verify the beneficial owner’s identity in such a way that the firm is satisfied that it knows who the beneficial owner is.”
The EBF maintains that remote customer onboarding should be governed by clear policies and procedures to be able to follow the guidelines and to allow flexibility when carrying out remote onboarding. However, the detailed requirements proposed in Guideline 4.1 would be burdensome to implement and, possibly, overlap or interfere with the existing national regulations while adding administrative costs. In combination, these factors might hamper the adoption and use of remote onboarding which would undermine the objectives of the Guideline. As an example, according to paragraph 12, “the AML/CFT compliance officer should, as part of their general duty prepare policies and procedures to comply with the CDD requirements, prepare remote customer onboarding policies and procedures and ensure that those remote customer onboarding policies and procedures are implemented effectively, reviewed regularly and amended where necessary.” While we acknowledge the importance of these tasks, we do not deem it necessary that they should all be the responsibility of the AML/CFT compliance officer only. For example, the drafting of such policies would most likely require the involvement of risk and tech specialists to ensure that they are technologically neutral and focused on ML/TF risks.
Moreover, this approach seems to contravene Section 41 of the EBA Consultation Paper on Draft Guidelines on the Role of the AML/CFT Compliance Officer, whereby the AML/CFT Compliance Officer’s role is to ensure that adequate policies and procedures are in place and maintained, not to draft such procedures. From an operational perspective, the client onboarding process will often fall within the remit and responsibility of the front line/business and not the AML/CFT Compliance Officer, whose role is to provide feedback and guidance where necessary. We therefore suggest rewording this section so that the role of the AML/CFT Compliance Officer is to oversee and provide guidance in the drafting of CDD procedures rather than being solely responsible for drafting them.
Additionally, the provisions on internal policies and procedures do not seem to be sufficiently risk-based. Consequently, financial sector operators need to apply mandatory requirements that do not necessarily contribute to a more effective ML/TF risk management. The requirements apply to all cases of remote onboarding, regardless of the risk posed by the channel in question.
It further remains unclear whether the policies and procedures can also be used to support some of the ongoing monitoring obligations (e.g. updating customer information). If this is the case, we recommend making it explicit in the text.
Finally, the EBF would like to make the following specific recommendations in relation to this section:
• In para. 10:
o Para. 10(a) seems too unclear to include in policies/procedures; it would be preferable to address it in an internal technical standard for the solution. Moreover, in its current form the provision does not include the financial products and services that the entity offers to new clients. Within a policy or a procedure, it is necessary to indicate the product or service being offered, since, depending on the country, the onboarding process may contemplate different due diligence measures, and the procedures should therefore include the specifics of each of them with an end-to-end vision.
o In letter c), we suggest amending the text to highlight that the financial sector operator exercises discretion. We hence propose the following rewording: «which solution the financial sector operator, at its discretion, might apply to a certain category».
o In letter d), we suggest deleting “and the information that is necessary to identify the customer and verify their identity” since this is already foreseen in letter e).
o In the same spirit, regarding paragraphs 10(d) and 10(e), we believe these requirements are already covered in standard CDD and EDD procedures and should not be repeated. Regarding paragraph 10(g), it is unclear whether this should apply to existing solutions.
o Letter f): for the sake of clarity, we recommend adding “if any” given that for low-risk situations, no human intervention may be required.
• With regard to para. 14, whereby an end-to-end evaluation of the onboarding solution is requested prior to implementation, we suggest that this requirement be adapted so that it reflects the evaluation of dependencies between the components/services that are part of the onboarding service. For those that have no dependencies on others (only on the master process), we ask that an individual assessment of the process be accepted. Only in cases in which there are dependencies between services should an interoperability analysis of the services be delivered. In general terms, Section 4.1.3 on the pre-implementation assessment of the remote customer onboarding solution is not sufficiently detailed when referring to the scope of the pre-implementation assessment process. It would be useful for this part of the guidelines to be clarified in order to achieve a common approach (e.g. a standardised matrix for the assessment of the onboarding solution).
• Paras. 19-24 contain very far-reaching requirements for routines and controls. However, several of these requirements should already be addressed under other legal requirements, e.g. when evaluating KYC procedures and controls; the remote channels are just one of the tools in the onboarding and KYC processes. We also suggest including in para. 19 that this information should be accessible to the entity with a certain immediacy in case the service is provided by a third-party provider. Moreover, as regards this section, clarification and guidance would be necessary as to what monitoring is considered adequate in the fulfilment of the requirements, while also carefully considering the appropriate level of detail to avoid excessive regulatory burden. In particular:
o If the monitoring of service indicators (an SLA with the supplier, for example) is sufficient.
o If it would be necessary to periodically review the parameterization of the tool/model.
o If it is also necessary to periodically test the effectiveness of the tool.
We highlight the importance of knowing which actions would be considered appropriate, given that each type of action requires different resources, personnel, etc.
As a general remark, this section of the draft guidelines is less detailed and less helpful to financial institutions as regards the types of acceptable innovative technologies and acceptable forms of digital documentation, both of which were part of the original EU Commission request. This is understandable given the very different situations across the EU. However, financial institutions would welcome clearer guidance on what is and is not acceptable when onboarding a customer remotely for identification and verification purposes. This should be supplemented by clear legal bases laid down in the upcoming AML Regulation on the legal grounds for processing personal data for AML/CFT purposes, as well as provisions enabling financial institutions to make use of new technologies in accordance with the requirements and safeguards laid down in the GDPR.
As stated in the summary of the draft guidelines, the Commission is of the view that the CDD rules laid down in Directive (EU) 2015/849 do not provide sufficient clarity and convergence about what is, and what is not, allowed in a remote and digital context and has therefore asked EBA to issue guidelines. In our view, to meet this objective, the guidelines need to be more specific and detailed with regards to types of acceptable innovative technologies and acceptable forms of digital documentation.
Regarding paragraph 25(b), the EBF cautions that capturing these kinds of data is not needed in all remote customer onboarding cases if financial sector operators can corroborate the data in other ways from other reliable sources. Most natural persons establishing a business relationship with a financial sector operator will have a unique personal identification number which can be verified in a central public database. Furthermore, as part of the signing of the agreement, the new customer signs with an eID which is connected to the unique personal identification number, rendering impersonation more difficult.
Data storage/record keeping requirements
Para. 26 appears to require that all pictures, videos and other identification proof collected during the remote identification process be stored securely. We would like to point out that Member States may have different requirements for the collection and retention of CDD information. Therefore, the guidelines should allow financial institutions more discretion to determine which information they consider necessary to keep where national legislation does not set binding record-keeping requirements.
The EBF further notes that some of the requirements in the guidance seem to already be covered by the KYC requirements enshrined in the AMLDs. For example, para. 28 states that “Financial sector operators should have appropriate mechanisms in place to ensure the reliability of the information automatically retrieved, referred in the previous paragraph […]”. Ensuring the reliability of information is an existing requirement in the context of the KYC process. In this part, the guidelines seem to reiterate existing requirements without providing practical guidance on how to fulfil them.
Regarding paragraph 31, we would like to highlight that a natural person may also have a representative. Paragraphs 27 and 28 would seem to apply to natural persons only when they are customers, but not when they are acting on behalf of another customer (be it a natural person or a legal entity). Clarification is hence needed in this regard.
Furthermore, we believe that Section 4.2.3 on identifying legal entities should provide more detail, considering that legal entities can often have complex structures. This should include criteria that provide additional clarity and security based on measures that obliged entities may consider and apply to encompass those CDD dimensions in the remote onboarding context. These criteria should take into account that banks may still profile the prospective customer as a normal- or low-risk customer in a remote context in full compliance with AMLD requirements. This would be without prejudice to ongoing monitoring obligations regarding the business relationship, which would include monitoring the transactional behaviour/pattern, operations, products and the dynamics of all the relevant individual ML/TF risk factors identified and managed after the initial onboarding.
More guidance should also be provided under section 4.2.4 Nature and Purpose of the Business Relationship, especially in relation to source of wealth and how such information may be verified through remote means on a risk-sensitive basis. Potential high-risk scenarios such as dealing with PEPs are not dealt with in detail, leaving room for different interpretations by national authorities, which then may lead to fragmentation and contribute to a non-level playing field.
With regards to the provision in para. 25 whereby financial sector operators shall ensure that “any technical shortcomings that might hinder the identification process, such as unexpected connection interruptions, are detected and investigated”, in most cases such shortcomings fall outside of financial institutions’ responsibility. Therefore, often there is no ground for conducting an investigation. We therefore suggest amending the wording whereby in such cases the process is interrupted and subsequently resumed.
In paragraph 4.2.1 we would also suggest indicating good practices related to security (fraud), such as geolocation and identification of the device used, as in many cases they represent valuable information to be used in subsequent investigations or, for example, to apply reinforced measures in more economically vulnerable geographic areas that are more prone to the possible appearance of "mules" or "digital strawmen". A fair balance needs to be achieved between the rights of an individual, including privacy, and the legitimate interests of financial sector operators.
In section 4.2.1(b) we propose indicating the need for such data to be immediately accessible in case the service is provided by a third-party provider.
The guidelines establish in paragraph 33 the possibility of accepting the use of photocopies in the identity verification process under certain conditions. A solid and strong identity verification process should only allow the use of authentic documents to ensure the security of the process. The capture of the authentic document can be done through one or more images (photographs of the original document) and/or a video (of the original document). Various regulations across EU Member States require the use of original documents and explicitly exclude the possibility of using photocopies or other types of representations of the document. From a security point of view, the use of a photocopy prevents the verification of certain additional security measures present in the original document, such as textures characteristic of the original print, micro-writing and holograms. An attacker can use a photocopy to perform a spoofing operation. The security features embedded in an ID document (specifically passports and EU/EEA ID cards) are next to impossible to verify completely unless a natural person has the document in hand and is able to twist, turn and expose it to different lighting.
The emphasis should be put on the validation that is possible when the document is not physically examined by an employee. This may include scanning the RFID chip of passports and ID cards and comparing the picture in the chip with a live photo of the customer applying to be onboarded. As a good practice, this adds an extra layer of security on top of the OCR scanning of the MRZ and VIZ of the passport. Another advantage of RFID scanning is that the obliged entity can be certain that the person validating their identity has the actual document in hand, thus removing the risk of copied passports in circulation. RFID scanning will not mitigate impersonation risks completely, but it is a great leap forward. Relying solely on MRZ scanning carries higher fraud and ML/TF risks, as MRZ zones are relatively easy to falsify, even with limited IT skills. By contrast, falsifying an RFID response is extremely difficult compared to falsifying an MRZ. Paragraph 35 mentions this briefly, but to future-proof the guidance, it would be beneficial to outline the possibilities available to validate IDs remotely.
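The ease of falsifying an MRZ follows from its design: its only integrity mechanism is a set of publicly documented check digits (ICAO Doc 9303), which anyone can recompute for altered data. A minimal sketch of that check-digit algorithm:

```python
def mrz_check_digit(field: str) -> int:
    """ICAO Doc 9303 check digit: map characters to values
    (0-9 as-is, A-Z as 10-35, filler '<' as 0), apply the
    repeating weights 7, 3, 1 and take the sum modulo 10."""
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field.upper()):
        if ch.isdigit():
            value = int(ch)
        elif ch.isalpha():
            value = ord(ch) - ord("A") + 10
        else:  # '<' filler character
            value = 0
        total += value * weights[i % 3]
    return total % 10

# An attacker who alters a data field simply recomputes the digit,
# e.g. for the specimen document number used in ICAO Doc 9303:
print(mrz_check_digit("L898902C3"))  # 6
```

Because this check carries no cryptographic protection, internal consistency of the MRZ proves nothing about authenticity; it is the digital signature over the RFID chip contents that makes chip falsification so much harder.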
The use of non-original documents may be relevant in low-risk situations, with controls and remedial measures adjusted as the business relationship and transactional behaviour/patterns evolve, on the basis of effective, predefined risk-sensitive criteria.
In general, the guidelines introduce an approach to onboarding processes that is very different from the one currently established, moving from detailed requirements to an accountability model, which would force banks to establish their own detailed procedures.
Furthermore, the EBF considers scans of paper-based documents, especially those without certifications or similar, to be problematic from the perspective of verifying a person's identity, as they increase the risk of fraud and ID theft. More generally, this section of the proposed guidelines seems outdated, e.g. as regards the use of paper copies and scans thereof. We consider that the physical identity document used should be chip-based and readable through Near Field Communication.
Section 4.2.1 (Identifying the customer) states the following: “26. The identification proofs collected during the remote identification process, which are required to be retained in accordance with Article 40(1) point (a) of Directive (EU) 2015/849, should be time-stamped and stored securely. The content and the quality of stored records including pictures and videos, should be available in a readable format and allow for ex-post verifications.” The wording of this paragraph entails that pictures and videos of the remote onboarding session should always be retained. It is worth pointing out that different countries set different requirements for the record-keeping of KYC information, and the required level of detail varies. In light of the above, the guidelines should leave more room for financial sector operators to conduct their own assessment of which information they consider necessary to keep on record where national legislation does not include detailed record-keeping requirements.
The quality of the evidence is key. In this sense, and regarding identity documents, it is worth mentioning that although the remote customer onboarding process may include a video recording (i.e. a video call or a non-assisted video), obtaining a separate photo of the identification document used by the customer is the best way of guaranteeing that the evidence is of sufficient quality and clarity. In addition, we propose including the following good practices for verifying the identity document as recommendations in the guidelines:
1. Verify that the document has not been photographed on a screen (replay-attack).
2. Verify that the photograph of the face printed on the document is authentic and has not suffered any type of alteration or impersonation, e.g. verifying that an attacker has not placed a photo on top of the original photo.
3. Exclude the possibility of using images from the user's device's camera roll. The use of this type of images does not ensure the integrity of the process, since the images from the camera roll may have physical or digital modifications or alterations.
4. Use of biometric technology to verify that the user's identity is not duplicated. This type of technology compares the face printed on the ID card against the printed face of users who have already registered in the system. This makes it possible to find duplicate identities and avoid situations in which an attacker uses different manipulated documents, all with the same printed face, but with different personal data (name, surname, ID number, etc.).
The introduction of this measure would help significantly to avoid the use of the financial system by false identities, frequently used by criminal organisations.
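As an illustration of the duplicate-identity check described in point 4, the following sketch assumes that a biometric engine has already produced a numeric face embedding for each enrolled user; the function names, threshold and embedding format are hypothetical, not drawn from the guidelines:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def find_duplicate_identities(new_embedding, enrolled, threshold=0.6):
    """Return the IDs of already-enrolled users whose document face
    matches the face on the newly presented document, i.e. candidate
    duplicate identities to be escalated for manual review."""
    return [user_id for user_id, embedding in enrolled.items()
            if cosine_similarity(new_embedding, embedding) >= threshold]
```

A match above the threshold would not in itself prove fraud; it flags the case for review, for instance where the same printed face appears on documents with different personal data (name, surname, ID number, etc.).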
According to paragraph 33, “[financial sector operators] should take steps to have sufficient assurance as to the reliability of the copy provided. This may include verifying […].” We propose replacing the word “verifying” with, for example, “ensuring”, “examining” or “reviewing” to ensure that the requirement is not confused with the verification obligation under the AML/CFT legislation which might lead to additional requirements for financial sector operators.
As regards paragraph 33.a, clarification would be needed on the situations where the document cannot be compared to official databases because such databases do not exist.
As regards paragraph 33.b, the document may include personal or privileged information that is not necessary for the CDD purposes and should not be stored or otherwise processed due to data protection requirements. Therefore, it should be allowed to hide a part of the document when the financial sector operator has justified grounds.
With regard to para. 35, the provision laid down therein provides for the need to read the chip when the user's device has the capacity to employ this technology and the document contains a chip. We think that the use of NFC reading technology should not be mandatory but only a recommendation, or should at least be limited to high-risk transactions. We maintain that it should not be required for online customer onboarding.
NFC chip reading of ID documents is a process that generates a lot of friction from a user experience point of view. The NFC chip reader is located in a different position on each mobile device. Therefore, it is not obvious to the user how to position his/her device to read the chip in his/her document. Nor is it easy for the service provider to explain how the user should position the device, as this positioning differs depending on the mobile device and the document to be read. For these reasons, the chip reading process is associated with significant drops in the conversion funnel of the process. This causes higher acquisition costs for financial institutions and hinders the digitalisation of citizens.
Moreover, the wording of paragraph 35 establishes the need to read the document chip "where the customer's own device allows the collection of relevant data". In other words, it makes the reading of the chip conditional on the device used by the user. In practice, this wording forces the digital identity verification process to be carried out through a mobile app (iOS or Android) and excludes the possibility of using web technology, because the web environment does not currently have the ability to read the chip of identity documents. A user with a device capable of chip reading would therefore be forced to perform the digital identity verification process in a mobile app, since the wording conditions its use only on the capabilities of the mobile device and not on the capture environment used (iOS/Android or web).
Regulations and recommendations should be technologically neutral and should not condition users and financial institutions to one predetermined system. Limiting the digital identity verification process to the native environment is a significant constraint for financial institutions.
For the above reasons, we consider that the reading of the document chip should be an optional feature for remote processes, being mandatory only for high-risk processes.
We further suggest the deletion of para. 36, given that the quality of copies of documents does not allow for verification of the security features embedded in the official document.
Finally, we ask for confirmation or clarification that in cases different from the ones listed in Section 4.3, the guidelines would also apply in case of use of instruments regulated by the eIDAS Regulation.
As regards paragraphs 38 and 41, we would like to stress that, as far as foreign nationals or foreign corporates are concerned, verifying the validity of official documents by checking against public registers is often an extremely difficult, if not impossible, task for a private financial sector operator. It should therefore not be set as a minimum requirement until there are appropriate EU registers, a network of interconnected national public registers, or another similar EU-wide mechanism that would enable such validity verifications and would be available to financial sector operators. We further maintain that national governments also need to provide documents that are sufficiently verifiable and reliable in order to fulfil the minimum requirements.
With regard to para. 38(c), we would like to point out that there are national differences in the rules regarding assurance of representation (mandate or entitlement to act). In our view, assurance of any mandate or entitlement to act should follow the rules and common practices of the jurisdiction in question. Furthermore, we believe that additional guidelines on how the mandate/entitlement to act can be verified through a remote onboarding system would be beneficial, given that this type of verification usually involves a manual review of specific clauses of the constitutive documents of the legal entity in question, which vary from one entity to another, with the differences being more marked for entities incorporated in different jurisdictions. In addition, it would be helpful if the Guidelines allowed the validation to be conducted on a risk-based approach, so that appointed directors are deemed to be entitled to act on behalf of the legal entity in respect of which they are appointed, without further verification being required in all cases. Finally, given the national differences which affect the cross-border rendering of services, it would be beneficial if this matter were also handled in a uniform manner throughout all Member States by having legal representatives identified in easily accessible national business registers.
Para. 39 concerns the use of biometric data, and we would welcome more technical guidance on how this should be done remotely. Furthermore, the use of biometric data is only recommended, not imposed by law, leaving financial operators with only the explicit consent of the data subject (Article 9(2)(a) GDPR) as a legal basis for processing this special category of personal data. The problem is that, where processing rests on this legal basis and the data subject exercises their right to erasure under Article 17 GDPR, the controller must comply with the request if there is no other legal ground for the processing. As long as no other legal basis exists for the processing of such data by financial operators, it would therefore be almost impossible to retain biometric data once a data subject requests its deletion. Although it is not appropriate to require firms to use biometric identification, financial operators should be able to rely on a stronger legal basis permitting them to process and store biometric data used in remote customer onboarding processes where appropriate, while observing the pertinent technical and organisational security measures, taking into account the associated risks.
Para. 39 also prescribes that the system should unambiguously verify that the person performing the process is the same person who appears in the document. We believe it is necessary to establish quantifiable biometric accuracy requirements to ensure that sufficiently accurate technology is used. For this reason, we propose noting as good practice an accuracy criterion requiring the biometric engine to rank within the top 50% of biometric engines evaluated at any given time. This avoids the use of biometric engines with low accuracy while preserving the possibility of sourcing from a large number of biometric engine manufacturers.
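The top-50% criterion described above could, assuming engines are ranked by an error rate from an independent benchmark (e.g. a false non-match rate), be expressed as a simple filter; the function, engine names and figures below are purely illustrative assumptions, not taken from the draft guidelines:

```python
def top_half_engines(error_rates: dict[str, float]) -> set[str]:
    """Return the engines in the better-performing half of a benchmark.

    `error_rates` maps an engine name to its benchmark error rate
    (lower is better); names and figures are hypothetical.
    """
    ranked = sorted(error_rates, key=error_rates.get)
    # Keep the best-performing half (at least one engine).
    return set(ranked[: max(1, len(ranked) // 2)])

# Illustrative benchmark figures, not real evaluation results.
benchmark = {"engine_a": 0.004, "engine_b": 0.012,
             "engine_c": 0.031, "engine_d": 0.090}
print(top_half_engines(benchmark))
```

Under such a scheme, the acceptable set would shift automatically as benchmark rankings are updated, which is the intended effect of the proposed criterion.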
Furthermore, it appears difficult to ascertain that the biometric data indicated in an identity document are "unequivocally referable" to a single natural person. For example, where the document indicates height, eye colour and hair colour, as is the case in certain jurisdictions, these attributes can change considerably during the document's period of validity.
Para. 40 prescribes certain measures "where the ML/TF risk associated with a business relationship is increased", which leaves room for interpretation: it is unclear what is meant by increased ML/TF risk. Existing AML/CFT legislation uses the terms lower and higher ML/TF risk, subject to 'simplified' and 'enhanced' due diligence measures respectively, but it is common for financial institutions to use a medium or normal risk category as well. "Increased" risk could be read as covering all situations other than lower ML/TF risk, or only situations of higher ML/TF risk. We therefore suggest using the term "enhanced" to clarify the scope.
Additionally, para. 40 introduces the need to employ a liveness detection process during digital identity verification. We believe requirements should be established to ensure that the liveness detection mechanisms implemented are sufficient to guarantee the security of the process. Specifically, ISO/IEC 30107 "Information technology - Biometric presentation attack detection" establishes requirements on how biometric liveness detection technology should be evaluated. The use of liveness detection technologies that have been evaluated according to ISO/IEC 30107 could be included as good practice.
Lastly in relation to para. 40, we are aware that certain national personal data protection authorities have limited the use of the biometric data for meeting the requirements of the AML/CFT legal framework. This approach is not aligned with the present proposed guidelines.
In light of the above, we ask EBA to enable the use of this technology through the present proposed guidelines. This could be achieved through explicit guidance on the use of this type of technology with the essential involvement of the national authority, which should issue clear instructions regarding the use of biometrics in non-face-to-face identification and authentication processes.
With regard to para. 42, we strongly recommend deleting the requirement of face-to-face verification in the same physical location and instead requiring, for example, a video conference (cf. para. 44).
In relation to para. 44, the guidance could include examples of good practice regarding the records to be kept when a video conference is used as a means to verify the identity of the customer, such as:
(a) at least an audio recording of the video call or the entire video call itself, which includes the entire conversation between the official of the financial sector operator and the customer;
(b) screenshots taken during the video call, which shall include an image of the customer as well as the date and time displayed by the video conference tool; and
(c) when the identification document is produced by the customer during the video call, screenshots of the identification document (all relevant pages or sides) will need to be recorded. The photographic evidence of identity, as well as all the information on the identification document, should be clearly visible and legible in the screenshots.
Additionally, we agree with para. 44(b) on the involvement of staff with sufficient knowledge of the applicable AML/CFT regulation and of the security aspects of remote verification. Staff should be sufficiently trained to anticipate and prevent the deliberate use of deception techniques. However, remote onboarding processes and tools are becoming highly sophisticated and encompass several high-tech domains: biometric acquisition, biometric liveness detection, and authentication of the holder of an electronic identity document. All of this may involve the use of AI and machine learning, which raises the question of explainability. Banks thus need not only trained staff but also a specialised team supplemented by a second line of defence. We believe the interaction of AI and machine learning with the remote onboarding process should be addressed in these guidelines.
Para. 44(c) mandates financial institutions to “develop an interview guide defining the subsequent steps of the remote verification process as well as the actions required from the employee” which “should include guidance on observing and identifying psychological factors or other features that might characterise suspicious behaviour during remote verification”. This adds to the already burdensome requirements both under the proposed guidelines and those issued by national competent authorities. The burden is further aggravated by paragraph 44(b), which mandates that staff be sufficiently trained on the same issues. Consequently, we call for paragraph 44, including its subparagraphs, to be redrafted in a way that provides financial sector operators with more latitude to design, document, and implement their remote verification processes in a risk-sensitive manner.
With regard to para. 45, where human operators are involved in the process, random assignment of the employees responsible for remote verification should be a clear recommendation, not merely applied "when possible". It should be defined as part of the internal control framework in order to avoid collusion between the customer and the responsible employee. However, in our view para. 45 creates confusion with regard to the proposed interview guide, which requires randomness in the sequence of actions to be performed by the customer. If the intention is to mitigate possible conflicts of interest between the customer and the responsible employee, it should be noted that financial sector operators are already regulated in this regard. In any event, it would appear more proportionate and balanced to follow a risk-based approach and introduce mitigating measures for any specific areas where such risks have been identified. We further consider that the requirement is not clear as to the type of random actions the user may be asked to perform. For example, a random action may be to prompt the user to say a random sequence of numbers displayed on the screen of the mobile device; another may be to prompt the user to move their head in randomly chosen directions as a liveness detection mechanism. Clarification is required on the type of random actions to be performed by the user during the digital identity verification process.
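To illustrate the two distinct kinds of randomness discussed above, a minimal sketch follows: (1) random assignment of the reviewing employee, to mitigate collusion, and (2) a random challenge presented to the customer as a liveness measure. The names, challenge types and parameters are illustrative assumptions, not taken from the draft guidelines:

```python
import secrets

# Hypothetical set of head-movement prompts used as liveness challenges.
HEAD_MOVEMENTS = ["turn left", "turn right", "look up", "look down"]

def assign_reviewer(reviewers: list[str]) -> str:
    """Pick the reviewing employee at random rather than by a fixed rota,
    so that a customer cannot predict (or arrange) who will review them."""
    return secrets.choice(reviewers)

def make_challenge() -> dict:
    """Build one random challenge: a spoken digit sequence plus a
    randomly chosen head movement (both parameters are illustrative)."""
    return {
        "digits": "".join(secrets.choice("0123456789") for _ in range(6)),
        "movement": secrets.choice(HEAD_MOVEMENTS),
    }

reviewer = assign_reviewer(["alice", "bob", "carol"])
challenge = make_challenge()
print(reviewer, challenge)
```

Because the challenge is generated per session, a replayed recording of an earlier session would not match it, which is the anti-replay rationale behind randomised action sequences.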
Regarding para. 46, in line with the risk-based approach and in order to allow financial sector operators the required flexibility, we believe the examples provided in the subparagraphs should be regarded as possible controls rather than a prescriptive menu of options, not least in view of ongoing technological developments.
Para. 47 provides that “where financial sector operators resort to digital identity issuers to identify and verify the customer, which are qualified trust services in accordance with Regulation (EU) No 910/2014, or to any other digital identity issuer regulated, recognised, approved or accepted by the relevant national authorities as referred to in Article 13(1)(a) of Directive (EU) 2015/849, paragraphs 38 to 45 should not be applied”. We ask for clarification whether only paras. 38-45 should not be applied, or also Section 4.3 on Document Authenticity & Integrity.
As regards the use of Digital Identities, we would like to point out that even with nationally acceptable Digital Identities, not all national competent authorities consider them as sufficient and, thus, require additional measures to be performed. Combined with the substantial requirements on policies and procedures proposed in these Guidelines, the overall impact on the availability and use of remote customer onboarding could be substantial.
In para. 45, the guidelines provide that any contract with the customer must be entered into using the same certificate. However, this approach does not take into account that not all contracts can be signed digitally under national civil law. This concern is further aggravated by the fact that the term "contract" is not defined in the guidelines. Moreover, particularly if the scope of the proposed guidelines remains limited to customer onboarding, it should be noted that during the ongoing customer relationship the certificate the customer uses may change, just as passports and other ID documents are renewed. It would therefore be impossible for the customer to use the same certificate for the whole duration of the relationship with the bank.
In relation to para. 50, which allows for the acceptance of digital identity issuers other than qualified trust services in accordance with Regulation (EU) No 910/2014 or those "regulated, recognised, approved or accepted by the relevant national authorities", and which requires financial sector operators to assess the level of assurance of such digital identities, we ask for further clarification on the exact criteria and processes to be applied when such an assessment is performed. Furthermore, we ask that these criteria also be checked against the provisions of the eIDAS Regulation, so as not to undermine the use of qualified trust service providers (QTSPs).
Finally, we ask for clarification whether "electronic certificate", mentioned in the first sentence of para. 53, means "electronic signature".
In our view, the proposed Guidelines would seem to add further to the existing regulatory complexity regarding reliance on third parties and outsourcing, an area that is already extensively regulated. The provisions of the guidelines add to the risk that obliged entities abstain from outsourcing due to over-regulation.
In accordance with para. 58 of section 4.6, an individual assessment of each component/third party is requested for each financial institution that will use the service. While this approach may be suitable for individual FinTechs or financial institutions, it is customary for financial groups to generate efficiencies and homogeneity in risk coverage by relying on common technology providers across the group.
These suppliers, in addition to passing the banking group's admission processes (procurement, third-party risk, etc.), must pass the tests that the compliance function at holding level carries out to ensure the effectiveness and efficiency of the solution (CRR, filtering, monitoring, KYC, or combinations thereof), always associated with a type of business, size of entity, and products offered to customers.
If the supplier's solution does not pass these tests, it cannot be used anywhere in the group. If it does pass, the solution is approved for use within the group, always associated with the type of business, size of entity, and products offered.
We therefore request that the complete analysis of the AML solution not be required for each new FinTech that uses a specific AML solution; instead, it should be possible to deliver the original analysis carried out at group level, together with a study of the suitability of the solution for the group's obliged entity in which it is intended to be deployed.
Finally, we question the approach whereby the use of digital identities should not be considered as outsourcing as set out in para. 60.
The EBF fully agrees that ICT and security risk is of the utmost importance, also in situations of remote customer onboarding. Onboarding customers solely via audio would also increase the risk of fraud, as the fraudster could be the one in contact with the bank's contact centre while the prospective customer is lured into using, for example, their digital ID simultaneously.