Response to consultation on draft Guidelines on the use of remote customer onboarding solutions

1. Do you have any comments on the section ‘Subject matter, scope and definitions’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

EBA’s proposal for guidelines on remote onboarding to financial services has been discussed thoroughly among leading providers of identity proofing services in Europe: Ariadnext, ElectronicID, IDnow, Innovalor, Signicat, SK ID Solutions, Ubble and ZealID (+ likely more to be added). These providers agree on the following joint statement to EBA:

We welcome the publication of EBA’s guidelines on remote onboarding to financial services as an opportunity, firstly, to recognize remote customer onboarding as a viable alternative to physical presence and, secondly, to harmonize requirements for remote customer onboarding across the single European market for financial services. However, we find that the requirements proposed by the current draft guidelines are not aligned with previous work, notably:
• ETSI TS 119 461 Electronic Signatures and Infrastructures (ESI); Policy and security requirements for trust service components providing identity proofing of trust service subjects (July 2021)
• ENISA report: Remote ID proofing – analysis of methods to carry out identity proofing remotely (March 2021)
• ENISA report: Remote identity proofing: Attacks & countermeasures (January 2022)

The ETSI standard proposes requirements for different use cases that all reach a ‘baseline’ level of identity proofing suitable for qualified and other trust services, notably for issuing qualified certificates, which is on par with electronic identification at level ‘substantial’. The ‘baseline’ level is explicitly defined as corresponding to face-to-face identity proofing by a trained operator, which is also the benchmark for remote identity proofing as defined by Article 24.1(d) of the eIDAS Regulation (Regulation (EU) No 910/2014).

The proposed EBA guidelines, like the ETSI standard, refer to electronic identification at level ‘substantial’ and to qualified signatures, but the requirements for onboarding by remote use of identity documents are not up to the level of assurance that should be expected for the finance industry, nor up to the requirements that ETSI proposes as necessary to reach the ‘baseline’ level by such means. The ENISA report from March 2021 surveys the state of requirements across European countries. Our experience, as well as a comparison of ENISA’s survey against the requirements of the proposed EBA guidelines, is that EBA’s proposed requirements are also below existing national requirements for remote onboarding under the AML directive.

We strongly suggest that EBA aligns the guideline requirements with the requirements of ETSI TS 119 461, whose development was funded by the European Commission. The standard is the result of a thorough consensus process by many experts, including national security authorities and supervisory bodies, actors in the trust services industry, and providers of identity proofing services. Aligning identity proofing requirements for qualified trust services and for onboarding to financial services (and even for issuing of digital identity) is beneficial for both regulatory and commercial reasons. Providers of identity proofing services would be able to offer uniform services across sectors, thus optimizing their investments. Onboarding for a financial service could be used directly for onboarding to a qualified trust service and/or for issuing a digital identity, and vice versa.

The proposal for a revised eIDAS Regulation will result in harmonized requirements for identity proofing for trust services and for the European Digital Identity Wallet. ETSI TS 119 461 is expected to be a core building block in this harmonization. Using the upcoming revised eIDAS Regulation as the vehicle for harmonized identity proofing, also in the finance industry, can bring large benefits.

EBA promotes a risk based approach to requirements for remote onboarding. This is also the approach taken by the ETSI standard, building on the risk classification presented in ENISA’s March 2021 report. The requirements of the ETSI standard are targeted at mitigating these risks to the ‘baseline’ level.

We understand that EBA's objectives are to remain non-prescriptive regarding technologies and to allow fast implementation of the guidelines. ETSI TS 119 461 follows the same non-prescriptive approach: it defines different use cases that all reach the ‘baseline’ level of identity proofing and is flexible regarding the definition of new use cases. Regarding fast implementation, all of the providers listed above, and several other actors, have technologies and/or services that by different means fulfil the requirements of ETSI TS 119 461. If EBA aligns with the ETSI standard, many providers across Europe are ready to supply compliant products and services.

2. Do you have any comments on Guideline 4.1 ‘Internal policies and procedures’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

We have no specific comments on 4.1.

3. Do you have any comments on the Guideline 4.2 ‘Acquisition of Information’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

"32. Financial sector operators should implement specific steps during the remote customer onboarding process to obtain information on the purpose and intended nature of the business relationship in accordance with the provision 4.38 of the EBA Risk Factor Guidelines. In particular, they should take risk-sensitive steps to gather information from their customers to identify the nature of their personal, professional or business activities and expected source of funds, and verify the accuracy of this information as necessary"

Despite "38" and given that all solutions for remote identification should be GDPR compliant: what about asking the user for his/her informed consent that what she is doing is for a certain purpose (e.g. opening a bank account). This to prevent that the user ends up with having someting completely different. Obviously, the consent should be recorded and stored by the operator. We propose to add a guideline that addresses user consent practices.

4. Do you have any comments on the Guideline 4.3 ‘Document Authenticity & Integrity’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

"33. Where the financial sector operators accept paper copies, photos or scans of paper-based
documents in the course of remote customer onboarding without having the possibility to
examine the original identification document, they should take steps to have sufficient
assurance as to the reliability of the copy provided. This may include verifying"

In our opinion this guideline does not contribute to preventing ML and TF. It is very hard for an operator (or even a specialised party) to determine the authenticity of an identity document and the data on it based on a copy/photo/scan of the document. Why does EBA keep promoting this solution for remote identification? Nowadays there are much better solutions for verifying the authenticity of the identity document itself (i.e. clone detection) and the authenticity of the data on it, for instance by means of NFC technology. That is also the reason why there is a chip on all modern identity documents: to provide assurance. Moreover, these solutions allow for straight-through processing of the data and prevent errors due to manual data entry and/or OCR-ing of data. (A minimal sketch of the chip-based data check is given below.)
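
To illustrate the kind of assurance the chip offers, the sketch below shows the core of ICAO 9303 passive authentication: the hash of each data group read over NFC (e.g. DG1, the machine-readable zone, and DG2, the facial image) is compared with the hash signed in the Document Security Object (SOD). Reading the chip, parsing the SOD and verifying its signature chain up to the issuing state's CSCA certificate are omitted, and the function and parameter names are our own; clone detection additionally relies on Active or Chip Authentication, which is also out of scope for this sketch.

```python
import hashlib

def passive_authentication_check(data_groups: dict[int, bytes],
                                 sod_hashes: dict[int, bytes],
                                 hash_alg: str = "sha256") -> bool:
    """Compare the hash of each data group read from the chip with the hash
    listed in the (signature-verified) Document Security Object. A match for
    every data group shows the data has not been altered since issuance."""
    for dg_number, dg_content in data_groups.items():
        expected = sod_hashes.get(dg_number)
        if expected is None:
            return False  # data group not covered by the SOD
        if hashlib.new(hash_alg, dg_content).digest() != expected:
            return False  # content was altered after issuance
    return True
```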

Is a lost/stolen check of the identity document needed? If so, please take into account that such a check does not add much value when the registration process is combined with biometric verification of the document holder. After all, that verification proves that the user possesses a valid identity document and is actually holding it, so it cannot have been lost or stolen.

"35. In situations where the customer’s own device allows the collection of relevant data, for
example the data contained in the chip of a national identity card, financial sector operators
should use this information to verify the consistency with other sources, such as the
submitted data and other submitted documents."

Why is this not also the case for copies of identity documents as specified in 33? Here too we would expect a cross-check between the data on the copy of the document and other sources. We propose to fix this inconsistency.

What about identity documents shown during a video session with the operator? Shouldn’t there be guidelines for the operator to check the authenticity of the identity document shown during a video session, e.g. asking the holder to put a finger in front of the document, checking optical security features, etc.?

5. Do you have any comments on the Guideline 4.4 ‘Authenticity Checks’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

"39. Where the remote customer onboarding solution involves the use of biometric data to verify the customer’s identity, financial sector operators should make sure that the biometric data have enough uniqueness to be unequivocally referable to a single natural person. Financial sector operators should verify the unambiguous match between the biometric data indicated on the submitted identity document and the customer being onboarded."

In terms of quality there is a huge difference between the face image printed on the identity document (and on a copy thereof) and the one stored on the chip of the document. The latter has a far higher resolution and will significantly enhance the performance of biometric face matching (see the sketch below). Are both solutions allowed? In line with our remarks above, we would appreciate it if EBA were more explicit about not using copies of identity documents in remote identification processes.
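
Purely to illustrate why the image source matters, below is a minimal face-matching sketch assuming the open-source face_recognition library; the file paths and the decision threshold are illustrative assumptions, not requirements taken from the guidelines. The comparison itself is identical for both sources, but a low-resolution printed portrait (or a scan of it) yields noisier embeddings and therefore less reliable match decisions than the chip portrait.

```python
import face_recognition  # open-source wrapper around dlib's face embeddings

def match_selfie_to_portrait(portrait_path: str, selfie_path: str,
                             threshold: float = 0.5) -> bool:
    """Compare the portrait taken from the document (ideally the chip image)
    with the live selfie; the threshold is illustrative and would need
    calibration against known false accept/reject rates."""
    portrait = face_recognition.load_image_file(portrait_path)
    selfie = face_recognition.load_image_file(selfie_path)

    portrait_encodings = face_recognition.face_encodings(portrait)
    selfie_encodings = face_recognition.face_encodings(selfie)
    if not portrait_encodings or not selfie_encodings:
        return False  # no face found in one of the images

    distance = face_recognition.face_distance([portrait_encodings[0]],
                                              selfie_encodings[0])[0]
    return bool(distance < threshold)
```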

"42. In situations where the evidence provided is of insufficient quality resulting in ambiguity or
uncertainty so that the performance of remote checks is affected, the individual remote
customer onboarding process should be discontinued and redirected, where possible, to a
face-to-face verification, in the same physical location."

How many attempts are allowed for a remote onboarding, including passport scan attempts, face matching attempts, login attempts with eIDAS authentication means, etc.? (An illustrative sketch of such limits is given below.)
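
To make the question concrete, the sketch below shows the kind of per-step attempt limits we would expect the guidelines to address; all names and numbers are our own illustrative assumptions, not proposals.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AttemptLimits:
    """Illustrative per-step retry limits for a single onboarding session."""
    document_capture: int = 3   # passport/ID scan or NFC read attempts
    face_match: int = 3         # selfie-to-document comparison attempts
    eid_login: int = 3          # logins with eIDAS authentication means
    session_total: int = 5      # hard cap before redirecting to face-to-face

def should_redirect_to_face_to_face(attempts_used: dict[str, int],
                                    limits: AttemptLimits = AttemptLimits()) -> bool:
    """True when any per-step limit or the overall session cap is exceeded."""
    if sum(attempts_used.values()) > limits.session_total:
        return True
    return any(attempts_used.get(step, 0) > getattr(limits, step)
               for step in ("document_capture", "face_match", "eid_login"))
```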

"43. Where financial sector operators use photograph(s) as a mean to verify the identity of the
customer by comparing it with a picture(s) incorporated in an official document, they
should:"

What is meant by “a picture(s) incorporated in”? The picture in the chip of the document, or also the one on the holder page of the document itself?
If the latter is also allowed, why pose all kinds of requirements for the selfie when the quality of the source is not good either? Otherwise, pose strict requirements for the quality of the copy/picture/scan of the identity document to be provided. Governments recommend adding a watermark to such scans to prevent their reuse for other purposes (an illustrative sketch follows below). Moreover, the scan may also contain privacy-sensitive information that the financial operator is legally not allowed to process (e.g. a citizen service number); users may blank out such information, which makes it harder for the operator to do an authenticity check.
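
As an illustration of such a watermark, the sketch below overlays a purpose text on a scan using the Pillow imaging library; the wording, placement and transparency are our own assumptions, and a production solution would follow the relevant government guidance.

```python
from PIL import Image, ImageDraw, ImageFont

def watermark_scan(scan_path: str, out_path: str,
                   text: str = "Copy for remote onboarding only") -> None:
    """Overlay a semi-transparent purpose watermark across a document scan
    so the copy cannot easily be reused for other purposes."""
    base = Image.open(scan_path).convert("RGBA")
    overlay = Image.new("RGBA", base.size, (255, 255, 255, 0))
    draw = ImageDraw.Draw(overlay)
    font = ImageFont.load_default()
    for y in range(0, base.size[1], 150):          # repeat the text down the page
        draw.text((20, y), text, fill=(128, 128, 128, 120), font=font)
    Image.alpha_composite(base, overlay).convert("RGB").save(out_path)
```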


"43. c) perform liveness detection verifications, which may include procedures where a
specific action from the customer to verify that he/she is present in the
communication session or it can be based on the analysis of the received data and
does not require an action by the customer;"

Maybe it is better to speak of presentation attack detection (PAD) instead of liveness. Should the liveness detection include some kind of challenge-response mechanism, or is merely detecting movement of the face sufficient? The first is more reliable and is recommended (a minimal sketch is given below). See also 45 about randomness in the remote identification process.
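
As an illustration of what a challenge-response approach adds, the sketch below asks the user to perform a few randomly chosen actions and verifies each one within a short time window; capture_video and analyse_response are placeholders for vendor-specific capture and PAD components, not real APIs.

```python
import random
import time

CHALLENGES = ["turn head left", "turn head right", "blink twice", "smile"]

def challenge_response_liveness(capture_video, analyse_response,
                                rounds: int = 3, timeout_s: float = 5.0) -> bool:
    """Randomised challenges are harder to defeat with replayed or pre-recorded
    video than passive motion detection alone."""
    for _ in range(rounds):
        challenge = random.choice(CHALLENGES)
        started = time.monotonic()
        clip = capture_video(prompt=challenge, max_seconds=timeout_s)
        if time.monotonic() - started > timeout_s:
            return False  # response too slow; possible replay or stream injection
        if not analyse_response(clip, expected_action=challenge):
            return False  # requested action not observed in the clip
    return True
```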

"43. d) in the absence of human verification, use strong and reliable algorithms to verify if the photograph(s) taken match with the pictures retrieved from the official document(s) belonging to the customer or representative"

This obviously also holds for liveness detection / PAD. We propose to require strong and reliable algorithms for PAD/liveness detection as well.


"44. Where financial sector operators use a video conference as a mean to verify the identity of
the customer, they should:"

What about the security of the video conferencing system as a whole, to prevent deep-fake attacks or manipulation of the video streams? We propose to add guidance on the security of video solutions.

"46. In addition to the above, and where appropriate to the ML/TF risk presented by the business relationship, financial sector operators should use of one or more of the following controls:"

Shouldn’t operators do a periodic re-check of the identity of the user in the case of a remote onboarding, e.g. ask the user to scan his/her identity document and perform holder verification every two years?

6. Do you have any comments on the Guideline 4.5 ‘Digital Identities’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

"54. Financial sector operators should take steps to minimize the risk that the customer's identity is not the claimed identity, taking into account at a minimum the risk of lost, stolen, suspended, revoked or expired identity evidence"

This requirement does not seem to fit under ‘Digital Identities’, unless you are referring to lost/stolen/suspended authentication means. The term ‘expired identity evidence’, however, seems to refer to identity documents. If so, we propose to move this requirement to the section on authenticity checks above (section 4.3 or 4.4).

7. Do you have any comments on the Guideline 4.6 ‘Reliance on third parties and outsourcing’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

No comments.

8. Do you have any comments on the Guideline 4.7 ‘ICT and security risk management’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

No comments.

Name of the organization

InnoValor