Response to consultation on draft Guidelines on the use of remote customer onboarding solutions


1. Do you have any comments on the section ‘Subject matter, scope and definitions’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

Comment 1

Regarding the definition of “Biometric Data” included in paragraph 9 of the Guidelines:

This definition is based on the concept already defined by Regulation (EU) 2016/679 of the European Parliament and of the Council, of 27 April 2016, on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), known as GDPR.

In this sense, Article 4(14) of the GDPR includes the following definition: “‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.

The undersigned considers that the Guidelines should include the same definition already set by the GDPR. The current definition of “Biometric Data” omits part of the GDPR definition (marked with asterisks in the following paragraph) and should preferably be amended, as it may otherwise be confusing:

Biometric Data: “personal data *resulting from specific technical processing* relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.

4. Do you have any comments on the Guideline 4.3 ‘Document Authenticity & Integrity’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

Comment 2

1.- The text establishes in paragraph 33 the possibility of accepting the use of photocopies in the identity verification process under certain conditions. Specifically, it mentions the following:

“Where the financial sector operators accept paper copies, photos or scans of paper-based documents in the course of remote customer onboarding without having the possibility to examine the original identification document, they should take steps to have sufficient assurance as to the reliability of the copy provided.”

A solid and strong identity verification process should only allow the use of authentic documents to ensure the security of the process. The capture of the authentic document can be done through one or more images (photographs of the original document) and/or a video (of the original document).

Numerous regulations in different European Union Member States require the use of original documents and explicitly exclude the possibility of using photocopies or other types of reproductions of the document:

- In Spain, SEPBLAC establishes through the note “AUTORIZACIÓN DE PROCEDIMIENTOS DE VÍDEO-IDENTIFICACIÓN”, the need to use an original document. Specifically, it establishes that “El sujeto obligado deberá obtener y conservar en los términos del artículo 25 de la Ley 10/2010 una fotografía o instantánea del anverso y reverso del documento de identificación utilizado”. [Unofficial translation: “The regulated entity must obtain and keep, in accordance with the terms of Article 25 of Law 10/2010, a photograph or snapshot of the front and back of the identification document used”.]
- In Spain, the “Orden ETD/465/2021, de 6 de mayo, por la que se regulan los métodos de identificación remota por vídeo para la expedición de certificados electrónicos cualificados”, is complemented by the “Anexo F.11 de Herramientas de vídeoidentificación de la Guía de Seguridad de las TIC CCN-STIC-140”. This guide establishes the obligation to capture images of the identity document.
- In Italy, the document “DISPOSIZIONI IN MATERIA DI ADEGUATA VERIFICA DELLA CLIENTELA PER IL CONTRASTO DEL RICICLAGGIO E DEL FINANZIAMENTO DEL TERRORISMO” establishes the obligation to capture images of the identity document.
- In Germany, the Federal Financial Supervisory Authority (BaFin) establishes on its “Circular 3/2017 (GW) - video identification procedures” several authentication checks that shall be done during the video identification process, that necessarily require the use of an original document (i.e. “During the video transmission process, the respective employee must create photos/screenshots which clearly show the person to be identified as well as the front and reverse of the identity document used by this person for identification purposes and the information held on this document”; “Only identity documents with security features that are sufficiently forgery-proof, clearly identifiable and therefore verifiable both visually in white light and using the available image transmission technology (see list under B.VI) as well as which have a machine-readable zone may be used during the video identification process as proof of identity pursuant to anti-money laundering regulations”, etc.).
- In Poland, the Urzad Komisji Nadzoru Finansowego (UKNF) issued in 2019 guidelines regarding client identification and verification of their identity in banks and credit institutions based on video verification methods, and requires that, in the course of a video call, “podczas której pracownik banku uzyskuje możliwość bliższej obserwacji klienta i oryginałów dokumentów przedstawionych przez klienta”. [Unofficial translation: “during which the bank employee has the opportunity to closely observe the customer and the original documents presented by the customer”.]
- Outside the European Union, Mexico has one of the most advanced regulations in terms of digital identity verification for the bank onboarding process. In Mexico, the CNBV establishes the need to use an original document.
(see https://www.dof.gob.mx/nota_detalle.php?codigo=5619057&fecha=21/05/2021).

On the other hand, from a security point of view, the use of a photocopy prevents the verification of security features that are only present in the original document, such as the textures characteristic of the original print, micro-writing and holograms.

An attacker can use a photocopy to perform a spoofing operation. For example, an attacker could perform an attack based on the following steps:
- Capture a photograph of an original and authentic document.
- Edit the photograph of the above document using a photo editing tool. By this editing, the attacker could alter the personal data of the document, the address, etc.
- Print the altered document on paper.
- Finally, capture an image of the altered photocopied document during the digital identity verification process.

For this reason, accepting photocopied documents is a security hole. We consider that the EBA Guidelines should not represent a step backwards from the requirements of the most advanced digital onboarding systems and technologies by allowing the use of photocopied documents, which would weaken the credibility and robustness of the European system as a whole.

2.- The quality of the evidence is key, as the Guidelines state on several occasions throughout the proposed text. In this sense, and regarding identity documents, it is worth mentioning that although the remote customer onboarding process may include a video recording (i.e. a video call or a non-assisted video), obtaining a separate photo of the identification document used by the customer is the only way to guarantee that the evidence (the photo of the identification document) is of sufficient quality and clarity.
This is stated in the “AUTORIZACIÓN DE PROCEDIMIENTOS DE VÍDEO-IDENTIFICACIÓN” published by SEPBLAC in Spain, which establishes that “La fotografía o instantánea obtenida deberá reunir las condiciones de calidad y nitidez que permitan su uso en investigaciones o análisis, no reputándose válida a estos efectos la mera captura de fotogramas del proceso de video-identificación”. [Unofficial translation: “The photograph or snapshot obtained must meet the quality and sharpness conditions that allow its use in investigations or analyses; the mere capture of frames from the video-identification process is not considered valid for these purposes.”]
In this sense, the aforementioned UKNF guidelines also consider, as an enhanced security measure, the need to guarantee a high level of quality of the identification document photographs taken: “porównanie zdjęć z dowodu tożsamości bezpośrednio z wizerunkiem klienta oraz ze zdjęciem twarzy (wskazane bez okularów), przy zagwarantowaniu odpowiednio wysokiego poziomu jakości wykonywanych fotografii”. [Unofficial translation: “comparison of the photos from the identity card directly with the customer's appearance and with a photo of the face (preferably without glasses), while guaranteeing a sufficiently high level of quality of the photographs taken”.]

In addition, it is proposed to verify the following security measures in the identity document:

- Verify that the document has not been photographed on a screen (replay-attack).
- Verify that the photograph of the face printed on the document is authentic and has not suffered any type of alteration or impersonation. For example, it must be verified that an attacker has not placed a photo on top of the original photo.
- The possibility of using images from the user's device's camera roll must be excluded. The use of this type of images does not ensure the integrity of the process, since the images from the camera roll may have physical or digital modifications or alterations.
- Finally, the use of biometric technology to verify that the user's identity is not duplicated should be proposed in the guidelines. This type of technology compares the face printed on the ID card against the printed face of users who have already registered in the system. This makes it possible to find duplicate identities and avoid situations in which an attacker uses different manipulated documents, all with the same printed face, but with different personal data (name, surname, ID number, etc.). The introduction of this measure would help significantly to avoid the use of the financial system by false identities, frequently used by criminal organizations.
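As an illustration of the duplicate-identity check described in the last point, a minimal sketch is given below. It assumes that a biometric engine has already produced a face embedding for the portrait printed on the new applicant's document and for each enrolled customer; the function names, the use of cosine similarity and the threshold value are our own assumptions, not a prescribed implementation.

```python
import numpy as np

# Hypothetical similarity threshold; in practice it would be calibrated on the
# biometric engine's score distribution to meet a target false match rate.
DUPLICATE_THRESHOLD = 0.80

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_possible_duplicates(new_embedding: np.ndarray,
                             enrolled: dict[str, np.ndarray]) -> list[str]:
    """1:N search: compare the embedding of the face printed on the new
    applicant's document against the embeddings of already enrolled customers
    and return the customer IDs whose similarity exceeds the threshold."""
    return [customer_id
            for customer_id, embedding in enrolled.items()
            if cosine_similarity(new_embedding, embedding) >= DUPLICATE_THRESHOLD]

# Usage sketch: any returned matches would be escalated for manual review
# rather than automatically rejected.
# matches = find_possible_duplicates(embedding_of_new_document, enrolled_gallery)
```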


Comment 3

The text establishes in paragraph 35 the need to read the chip when the user device has the capacity to employ this technology and the document contains a chip. Specifically, it states the following:

“In situations where the customer’s own device allows the collection of relevant data, for example the data contained in the chip of a national identity card, financial sector operators should use this information to verify the consistency with other sources, such as the submitted data and other submitted documents.”

The use of NFC chip reading technology should be a recommendation rather than a mandatory requirement or, at most, should be required only for high-risk transactions; it should not be required for all online customer onboarding processes.

NFC chip reading of ID documents is a process that generates a lot of friction from a user experience point of view. The reasons for this are as follows:

- It is not obvious for a user to know the position of the NFC chip in the document. Therefore, carrying out the reading process on the mobile device is complex.
- The NFC chip reader is located in different positions on each mobile device. Therefore, it is not obvious for a user to know how to position his/her device to read the chip in his/her document. Nor is it easy for the service provider to explain how the user should position his/her device, as this positioning differs depending on the mobile device and the document to be read.

Due to the above reasons, the chip reading process is associated with significant drop-offs in the conversion funnel. This leads to higher acquisition costs for financial institutions and hinders citizens' adoption of digital services.

On the other hand, the wording of paragraph 35 requires reading the document chip "where the customer's own device allows the collection of relevant data". In other words, it conditions the reading of the chip on the device used by the user. In practice, this wording forces the digital identity verification process to be carried out through a mobile app (iOS or Android) and excludes the possibility of using web technology, because the web environment does not currently have the ability to read the chip of identity documents. A device with chip reading capability would therefore be forced to perform the digital identity verification process in a mobile app, since the wording conditions its use only on the capabilities of the mobile device and not on the capture environment used (iOS/Android or web). Regulations and recommendations should be technologically neutral and should not steer users and financial institutions towards one predetermined system.

Restricting the digital identity verification process to the native environment is a very significant limitation for financial institutions.

For the above reasons, we consider that the reading of the document chip should be an optional feature for customer onboarding processes, and mandatory only for high-risk processes.

On the other hand, in those situations where the reading of the NFC chip is carried out, it is considered necessary to require the comparison between the personal data stored in the NFC chip and the personal data printed on the ID card. In particular, personal data such as first name, last name, date of birth and ID number should be compared. Additionally, the face printed on the document and the photo stored on the NFC chip should be verified to correspond to the same person, with the use of biometric technology. The above techniques are intended to ensure that the capture of the document image and the data obtained from the chip correspond to the same document.
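A minimal sketch of the proposed consistency checks is given below. It assumes that the chip data (DG1/DG2 of an ICAO 9303 document) and the OCR results from the document image have already been extracted into plain dictionaries, and that the face comparison score is supplied by a biometric engine; the field names, the score scale and the threshold are illustrative assumptions.

```python
from dataclasses import dataclass

# Personal data compared between the NFC chip and the printed document;
# field names are illustrative.
COMPARED_FIELDS = ("first_name", "last_name", "date_of_birth", "document_number")

@dataclass
class ConsistencyResult:
    mismatched_fields: list[str]
    face_match: bool

def check_chip_against_printed(chip_data: dict, printed_data: dict,
                               face_match_score: float,
                               face_threshold: float = 0.8) -> ConsistencyResult:
    """Compare the personal data read from the NFC chip with the data printed
    on the document, and check that the chip portrait and the printed portrait
    belong to the same person (score provided by the biometric engine;
    the threshold is an assumption)."""
    mismatched = [field for field in COMPARED_FIELDS
                  if chip_data.get(field, "").strip().upper()
                  != printed_data.get(field, "").strip().upper()]
    return ConsistencyResult(mismatched_fields=mismatched,
                             face_match=face_match_score >= face_threshold)
```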

5. Do you have any comments on the Guideline 4.4 ‘Authenticity Checks’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

Comment 4

The document establishes in paragraph 39 that the system must unambiguously verify that the person performing the process is the same person who appears in the document. Specifically, it states the following:

“Where the remote customer onboarding solution involves the use of biometric data to verify the customer’s identity, financial sector operators should make sure that the biometric data have enough uniqueness to be unequivocally referable to a single natural person. Financial sector operators should verify the unambiguous match between the biometric data indicated on the submitted identity document and the customer being onboarded.”

On the other hand, paragraph 43 d) establishes similar requirements. Specifically it states the following:

“Where financial sector operators use photograph(s) as a mean to verify the identity of the customer by comparing it with a picture(s) incorporated in an official document, they should: (...) d) in the absence of human verification, use strong and reliable algorithms to verify if the photograph(s) taken match with the pictures retrieved from the official document(s) belonging to the customer or representative.”

The undersigned believes that it is necessary to establish quantifiable biometric accuracy requirements to ensure the use of sufficiently accurate technology.

The National Institute of Standards and Technology (NIST), under the U.S. Department of Commerce, is a world-renowned authority on setting the highest technical standards in the most advanced technologies.

In practice, in relation to biometric technology, NIST has established itself as the international laboratory to which biometric engine developers submit their technology for assessment. This results in a global ranking of entities (companies, institutes, universities, etc.) according to the quality of their biometric engines.

When we speak of quality, it should be understood in a broad sense, since NIST evaluates levels of reliability or accuracy, unbiased performance, etc.

The name of the NIST assessment to which the technology would be submitted for evaluation is the Face Recognition Vendor Test (FRVT) 1:1:
https://www.nist.gov/programs-projects/face-recognition-vendor-test-frvt-ongoing.

NIST evaluates face verification technology in different categories. Each of them represents a use case. The reference category that should be considered is VISABORDER.

We believe that a balanced and demanding criterion in terms of biometric accuracy would be to require that the biometric engine has been evaluated by NIST with a false non-match rate (FNMR) of less than 0.0100 at a false match rate (FMR) of 1E-6 in the VISABORDER category.

However, the accuracy of biometric engines improves considerably every year. For this reason, as an alternative to the previous proposal, it is proposed to establish an accuracy criterion that requires the biometric engine to rank within the top 50% of the engines evaluated at any given time. This avoids the use of biometric engines with low accuracy while maintaining the possibility of using a large number of biometric engine manufacturers.
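To make the accuracy metric underlying both criteria concrete, the sketch below shows how a false non-match rate at a fixed false match rate can be derived from mated (genuine) and non-mated (impostor) comparison scores. This is an illustration of the metric only, under the assumption that higher scores mean greater similarity; it is not NIST's evaluation code.

```python
import numpy as np

def fnmr_at_fmr(genuine_scores: np.ndarray,
                impostor_scores: np.ndarray,
                target_fmr: float = 1e-6) -> float:
    """False non-match rate (FNMR) at the decision threshold whose
    false match rate (FMR) does not exceed target_fmr.

    genuine_scores: similarity scores of mated comparisons.
    impostor_scores: similarity scores of non-mated comparisons.
    A meaningful estimate at FMR = 1e-6 requires a very large number
    of impostor comparisons."""
    # Threshold chosen so that at most target_fmr of impostor scores pass.
    threshold = np.quantile(impostor_scores, 1.0 - target_fmr)
    # Genuine comparisons rejected at that threshold are false non-matches.
    return float(np.mean(genuine_scores < threshold))

# Criterion from the text: the engine would pass if
# fnmr_at_fmr(genuine, impostor, 1e-6) < 0.0100 on the relevant dataset.
```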

The submission of biometric engines to NIST is free of charge. It is also currently a continuous assessment, to which entities can submit a biometric engine every four months.


Comment 5

The document establishes in paragraph 40 the need to employ a liveness detection process during digital user identity verification. Specifically, it states the following:

“Where the ML/TF risk associated with a business relationship is increased, financial sector operators should use remote verification processes that include liveness detection procedures examining whether the video, picture or other biometric data captured during the remote customer onboarding process belong to a living person present at the point of capture, or real-time videoconference.”

Likewise, the document establishes in paragraph 43 c) a requirement oriented in the same direction. This paragraph states the following:

“Where financial sector operators use photograph(s) as a mean to verify the identity of the customer by comparing it with a picture(s) incorporated in an official document, they should: (...) c) perform liveness detection verifications, which may include procedures where a specific action from the customer to verify that he/she is present in the communication session or it can be based on the analysis of the received data and does not require an action by the customer.”

We believe that requirements should be established to ensure that the liveness detection mechanisms implemented are sufficient to guarantee the security of the process.

Specifically, ISO/IEC 30107, “Information technology – Biometric presentation attack detection”, establishes requirements on how biometric liveness detection technology should be evaluated. We consider it necessary to require the use of liveness detection technologies that have been evaluated according to ISO/IEC 30107 and can demonstrate compliance.
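For reference, ISO/IEC 30107-3 expresses presentation attack detection performance mainly through APCER (attack presentation classification error rate) and BPCER (bona fide presentation classification error rate). The sketch below shows how these rates are computed from labelled test presentations; the variable names and the boolean encoding of decisions are our own assumptions.

```python
def apcer(attack_decisions: list[bool]) -> float:
    """APCER: proportion of attack presentations (e.g. printed photos,
    screen replays) wrongly accepted as bona fide.
    attack_decisions[i] is True if presentation i was accepted as live."""
    return sum(attack_decisions) / len(attack_decisions)

def bpcer(bona_fide_decisions: list[bool]) -> float:
    """BPCER: proportion of bona fide (genuine live) presentations
    wrongly rejected as attacks.
    bona_fide_decisions[i] is True if presentation i was accepted as live."""
    return sum(1 for accepted in bona_fide_decisions if not accepted) / len(bona_fide_decisions)

# An evaluation against ISO/IEC 30107-3 reports APCER per presentation attack
# instrument species together with BPCER; the acceptable values are a policy
# decision and are not fixed by the standard itself.
```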

On the other hand, section “3. Verification Process: Liveness detection” of part “5.1 Draft cost-benefit analysis / impact assessment” in “5. Accompanying documents” considers whether liveness detection should be used in all cases or only in some cases.

The document ultimately concludes that liveness detection should only be required in certain cases. Specifically, the text states the following:

“The preferred option is mandatory liveness detection in high risk cases only (Option 2). It is important to define criteria for situations highly dependent on the technology with little or no direct human intervention. In this context, EBA considered that the reliability of the verification process increases significantly when the process resorts to liveness detection.”

We believe it is necessary to require liveness detection in all cases (Option 3: mandatory liveness detection in all cases). Biometric liveness detection technology is key to user identity verification, as the EBA itself notes in the paragraph quoted above.

Without a biometric liveness detection process, it cannot be guaranteed that the person carrying out the process is who he or she claims to be, which poses a risk to financial institutions.

In addition, the use of liveness detection technology does not introduce excessive friction for the user during the identity verification process. Taking into account the balance between cost and benefit, it is therefore worthwhile to implement liveness detection mechanisms in all cases. As mentioned above, we believe that the standards set in these Guidelines should be demanding enough to give the digital onboarding system of the European banking sector an adequate degree of security and robustness.


Comment 6

The document establishes in paragraph 45 the need to employ a sequence of random actions by the user. Specifically, it states the following:

“Where possible, financial sector operators should use remote customer onboarding solutions that include randomness in the sequence of actions to be performed by the customer for verification purposes. Where possible, financial sector operators should also provide random assignments to the employee responsible for the remote verification process to avoid collusion between the customer and the responsible employee.”

We consider that the requirement is not clear in relation to the type of random actions that the user can perform.

For example, a random action may be to prompt the user to say aloud a random sequence of numbers displayed on the screen of the mobile device. Another option may be to prompt the user to move his or her head in randomly chosen directions, which also serves as a liveness detection mechanism.
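By way of illustration, random challenges of the two kinds just mentioned could be generated as in the sketch below; the challenge lengths, the set of head movements and the use of Python's secrets module are our own assumptions.

```python
import secrets

def generate_digit_challenge(length: int = 6) -> str:
    """Random sequence of digits the customer is asked to read aloud during
    the video capture; generated with a cryptographically secure source so
    it cannot be predicted or pre-recorded."""
    return "".join(secrets.choice("0123456789") for _ in range(length))

def generate_head_movement_challenge(steps: int = 3) -> list[str]:
    """Random sequence of head movements requested as a liveness challenge."""
    movements = ["turn left", "turn right", "look up", "look down"]
    return [secrets.choice(movements) for _ in range(steps)]
```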

Clarification is required on the type of random actions to be performed by the user during the digital identity verification process.


Comment 7

The document establishes in paragraph 46 that “financial sector operators should use of one or more of the following controls:
a) the first payment is drawn on an account in the sole or joint name of the customer with an EEA-regulated credit or financial institution or in a third country that has AML/CFT requirements that are not less robust than those required by Directive (EU) 2015/849;
b) send a randomly generated passcode to the customer to confirm the presence during the remote verification process. The passcode should be a single-use and time-limited code;
c) capture biometric data to compare them with data collected through other independent and reliable sources;
d) telephone contacts with the customer;
e) direct mailing (both electronic and postal) to the customer.”
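As an illustration of control (b) in the list quoted above, a single-use, time-limited passcode could be handled along the lines of the sketch below. The code length, the validity window and the in-memory storage are assumptions made for illustration; a real deployment would typically rely on an existing OTP service and secure, per-session storage.

```python
import secrets
import time

PASSCODE_VALIDITY_SECONDS = 300  # assumed 5-minute validity window

# In-memory store for the sketch only: session id -> (code, issue time).
_issued: dict[str, tuple[str, float]] = {}

def issue_passcode(session_id: str, length: int = 6) -> str:
    """Generate and register a random numeric passcode for a session."""
    code = "".join(secrets.choice("0123456789") for _ in range(length))
    _issued[session_id] = (code, time.time())
    return code

def verify_passcode(session_id: str, submitted: str) -> bool:
    """Accept the passcode only once and only within its validity window."""
    entry = _issued.pop(session_id, None)  # pop enforces single use
    if entry is None:
        return False
    code, issued_at = entry
    return (time.time() - issued_at <= PASSCODE_VALIDITY_SECONDS
            and secrets.compare_digest(code, submitted))
```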

We consider it necessary to include a compensatory measure that allows the detection of duplicate identities. We propose the use of biometric facial identification technology (1:N), to compare the user's photograph at the time of onboarding with the photographs of existing customers of the financial institution. Through this comparison, the financial institution can ensure that the person trying to become a customer is not already registered with the institution with other personal data. Some European financial institutions already use this compensatory measure as part of their digital onboarding process as an additional anti-fraud mechanism.


6. Do you have any comments on the Guideline 4.5 ‘Digital Identities’? If you do not agree, please set out why you do not agree and if possible, provide evidence of the adverse impact provisions in this section would have.

Comment 8

We agree with the cost-benefit analysis performed by the EBA in section “4. Digital identities” of part “5.1 Draft cost-benefit analysis / impact assessment”, which concludes in favour of letting financial sector operators determine whether a Digital Identity Issuer is sufficiently independent and reliable to support the initial CDD process, subject to certain conditions set out in the Guidelines.

This conclusion is in line with the risk-based approach, and enables the use of new technological solutions on the basis that they must meet the minimum requirements of reliability in verifying the identity of users that the financial sector operator has deemed necessary for each type of customer, operation, etc.

Furthermore, it overcomes the currently limited use of qualified electronic certificates in Europe. This was, for example, recently recognized in Spain by the Secretary of State for Social Security and Pensions in its “Resolución de 25 de mayo de 2021, de la Secretaría de Estado de la Seguridad Social y Pensiones, por la que se habilitan trámites y actuaciones a través de los canales telefónico y telemático mediante determinados sistemas de identificación y se regulan aspectos relativos a la presentación de solicitudes mediante formularios electrónicos”, which states that “se constata que el uso de certificados electrónicos o de cl@ve como medios de identificación no está suficientemente extendido entre los ciudadanos”. [Unofficial translation: “it is noted that the use of electronic certificates or cl@ve as a means of identification is not sufficiently widespread among citizens”.]

Name of the organization

Veridas Digital Authentication Solutions, S.L.