European Banking Federation

The European Banking Federation (EBF) welcomes the holistic approach of the European Supervisory Authorities’ discussion paper on the growing trend of digitalisation and automation of financial services. New technological tools should indeed be adapted to the needs of customers, who expect financial institutions to revolutionise the way they manage their finances by instantly tailoring their investment portfolios, proposing new products that are directly and quickly accessible on their mobile devices, etc. Today a new generation of investors/customers exists which is financially literate but lacks the liquid assets needed to access most private banking services.

In the EBF’s view, the discussion paper can be considered rather balanced in identifying the benefits and risks for consumers and businesses. The automated financial advice process should be designed with a clear focus on investors’/consumers’ needs and on how the service will add value to them. Importantly, it should also incorporate appropriate safeguards to ensure sufficient clarity on protection and potential liabilities. The notion of personalisation of advice should be preserved and remain adapted to the type of product sold to customers, in order to avoid any misunderstanding or risk of mis-selling that would be detrimental to customers.
We believe it is important to stress the potential benefits that regulated financial services providers can bring to the market by offering certain automated financial services under the existing regulations, provided that barriers to digitalisation are removed where necessary in order to create a level playing field.
In order to strike the right balance between the development of innovative solutions and the preservation of sufficient safeguards, a more in-depth analysis of the benefits and risks for consumers and financial institutions should be encouraged before considering any regulatory or supervisory action.

The EBF broadly agrees with the characteristics described in the discussion paper and acknowledges the approach undertaken to:
i) Focus primarily on the phenomenon of the automation of financial advice and not on the provision of advice itself. Indeed, the definition is in most cases common to all types of advice, with or without human intervention. Consequently, the definition should be linked to the absence of, or very limited, human interaction and not to the provision of advice itself. Both types of advice, with or without human intervention, may rely on similar tools (client personal information and algorithms and/or decision trees), and exactly the same methodologies can be applied. Moreover, it is anticipated that in many cases it may not be possible to easily determine whether a tool qualifies as automated financial advice. The degree of human intervention may vary, and certain automated solutions will be used to support advice that is based on human intervention.

ii) Consider the existing framework of applicable European and national legislation, which might, to an extent, mitigate some of the risks arising as a result of the phenomenon of automation in financial advice (including, for example, the Markets in Financial Instruments Directive (MiFID 2) for investment advice, and the current Data Protection Directive and the adopted General Data Protection Regulation for the processing of personal data).
However, the EBF would like to point out that the Discussion Paper deals with a large variety of aspects of automation in financial advice. Certain forms of automation in financial advice are already widely adopted and commonly accepted (e.g. providing online investment advice when a client purchases financial instruments online, having a customer complete a MiFID questionnaire online, having a customer provide the information needed to apply for a mortgage credit online, etc.). Other forms of automation in financial advice are currently less developed, and it remains to be seen if and when they will ever be used and accepted on a large scale (e.g. fully automated asset management or robot-advice). When assessing the benefits and risks of automation in financial advice and the need for potential further action, it is key to avoid imposing “wholesale” solutions that would hamper the further digitalisation of financial services in areas where prudential and customer risks are already properly regulated, simply because of perceived risks related to more “exotic” forms of digitalisation.

Furthermore, digital credit-scoring processes, which allow a consumer to obtain a loan/credit based on his/her input, may also be perceived as automated financial advice. Even though the final output may only be a simple yes/no answer to the consumer’s credit application, it constitutes indirect automated financial advice on whether or not the loan in question is within the financial capabilities of the applicant.

It is therefore important to develop precise and appropriate definitions and delimitations in order to avoid confusion. In our view, a clear distinction should be made between the use of an automated tool and the provision of automated financial advice: e.g. investment services should be regulated under MiFID, but not other types of services such as comparison websites.
We believe that this distinction also has to be clear for the customer and should not be based on the customer’s own perception. It could otherwise lead to uncertainties, e.g. a situation where identical processes are considered as having different consequences.
Advice may be offered based on the same technical tools through different channels (branch, online, Skype, phone, etc.) and through different facilitators/devices (robot, platform, face-to-face, etc.).

In line with what we have mentioned previously, automated advice is not necessarily an online platform; for example, it could be provided through an ATM, by telephone or by a robot. The key element of the definition is the “do-it-yourself” style: the use of some kind of device without direct human support, or with very limited human intervention.

It is important to stress that although the characteristics of automated financial advice limit human intervention, permanent access to an operator (via online chat, mail or telephone) may be provided to help the customer along the process. However, being supported by a finance specialist who complements the automated tool, clarifies information on demand or provides explanations during the process implies a higher level of service than interacting solely with a machine, and is not considered automated advice.

Currently, “decision trees” appear as the dominant form of automated advice, but looking ahead we would expect to see automated financial advice tools being increasingly based on machine learning and self-learning algorithms.

Machine learning involves the construction of algorithms that learn from and make predictions on data. Such algorithms thus operate by building a model from example inputs in order to make data-driven predictions or decisions, rather than strictly following static program instructions. This leaves the fine-tuning of the models to computers. However, if such tools become widespread and their criteria converge, the concentration risk of the system may increase. We believe that similar methodologies may be applied and may give similar outputs.
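The contrast drawn above between static program instructions and models fitted from data can be illustrated with a minimal sketch; the risk categories, thresholds, sample data and function names below are purely hypothetical and do not describe any actual advice tool:

```python
# Illustrative only: a hard-coded rule vs. a rule whose threshold is fitted
# from labelled examples, leaving the fine-tuning to the computer.

# Static decision tree: the age threshold is fixed by a human designer.
def static_risk_class(age):
    return "conservative" if age >= 60 else "dynamic"

# "Learning" step: derive the threshold from example data instead.
def fit_threshold(examples):
    # examples: list of (age, label) pairs, label in {"conservative", "dynamic"}
    cons = [age for age, label in examples if label == "conservative"]
    dyn = [age for age, label in examples if label == "dynamic"]
    # The midpoint between the two group averages stands in for model fitting.
    return (sum(cons) / len(cons) + sum(dyn) / len(dyn)) / 2

def learned_risk_class(age, threshold):
    return "conservative" if age >= threshold else "dynamic"

examples = [(70, "conservative"), (65, "conservative"), (30, "dynamic"), (40, "dynamic")]
threshold = fit_threshold(examples)  # 51.25 here: derived from data, not hard-coded
```

If many providers fit comparable tools on comparable data, the fitted criteria converge, which is precisely the concentration risk mentioned above.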
In general, automated advisors use algorithms and model portfolios to allocate investments based on clients’ objectives and risk tolerance. Banks are usually better equipped to provide detailed, personalised financial planning services. We observe a rapid emergence of robot-advisors for wealth management in the US, but the phenomenon is now also spreading to Europe. As explained above, certain forms of automation in financial advice are already widely adopted and commonly accepted (e.g. providing online investment advice when a client purchases financial instruments online, having a customer complete a MiFID questionnaire online, having a customer provide the information needed to apply for a mortgage credit online, etc.). Other forms of automation in financial advice are currently less developed, and it remains to be seen if and when they will ever be used and accepted on a large scale (e.g. fully automated asset management or robot-advice).
It is also important to note that the automated financial advice tools developed by banks focus more on the provision of information, comparison websites and calculators. A clear distinction should therefore be made between the use of an automated tool and the provision of automated financial advice, and consequently also between MiFID and non-MiFID services. In our view, investment services should be regulated under MiFID, but not other types of services such as comparison websites.
i) Other financial entities are moving toward a fully integrated digital model. In general, automated advisors use algorithms and model portfolios to allocate investments based on clients’ objectives and risk tolerance. Banks are usually better equipped to provide detailed, personalised financial planning services. We observe a rapid emergence of robot-advisors for wealth management in the US, but the phenomenon is now also spreading to Europe.
ii) The automated financial advice tools currently developed by banks are more focused on the provision of information, comparison websites and calculators. As mentioned in the response to question 2, even if the characteristics of automated financial advice limit human intervention, permanent access to an operator (via online chat, mail or telephone) is usually provided to help the customer along the process. The combination of automated financial advice and personal advice without media disruption might be a possible option too.
In addition, automated financial advice initiatives are currently under evaluation (concerning “execution only” but also discretionary management and investment advice).
iii) It can concern all types of clients, but especially the new generation of investors/customers, who are used to new technologies and ready to use innovative tools and access products in a seamless way. It will lead to a democratisation of these services, as investors who could not access a personal advisor will now be able to invest with expert financial advice (no longer restricted to customers who meet specific requirements such as a certain minimum amount to invest). Automated financial advice should be seen as an alternative or complementary tool to personalised advice, in case the customer demands human interaction.
Regarding the importance of differentiating automated tools not designed to provide financial advice from automated financial advice:
The presumption that, whatever the characteristics of the tool, the service provided is considered financial advice as long as the consumer perceives it as such might create liability issues and barriers to the provision of certain services. There are indeed, for example, some tools provided by financial institutions which are not designed to give financial advice but merely to guide investors/customers in making their own investment decisions on a well-informed basis. There is no clear stance on how to treat this type of service. This leaves institutions with concerns about regulatory risk, possibly preventing them from investigating the merits of, and developing, such services. Such platforms and tools should not leave room for interpretation. It should be made clear to clients whether or not they are receiving automated advice.

A clear distinction should therefore be made between the use of an automated tool and the provision of automated financial advice, and consequently between MiFID and non-MiFID services. One alternative would be to address automated financial advice in compliance with specific regulations (e.g. pure investment advice under MiFID (investment services and MiFID products)). This topic should be included in future discussions of automation in financial advice, in order to further clarify the boundaries.

 Further consideration should also be given to regulatory obligations:
- These may vary among the different Member States and according to the products (e.g. pre-contractual and contractual requirements, reflection periods, signing of contracts, etc.). The paper states that automated financial advisers could facilitate cross-border transactions. In our view, the problem is not necessarily technical but regulatory.
- These could impose important standards which require interactions with clients prior to a new relationship (e.g. know-your-customer due diligence, Anti-Money Laundering/Combating the Financing of Terrorism (AML-CFT) and Foreign Account Tax Compliance Act (FATCA)/Common Reporting Standard (CRS) aspects, etc.). Those standards are enhanced considerably for remote/online selling, which should also be the case for automated financial advice.
- Although certain forms of automated advice may fall within the remit of MiFID or the Insurance Distribution Directive (IDD), these regulations do not deal specifically with automated advice. There is also a lack of a level playing field, as some new actors have the capacity to offer the same services as banks or insurers without being subject to the same rules.

In addition, the lack of financial literacy might represent an obstacle. Raising awareness of the importance of financial education and financial literacy should be a priority (see in this regard the OECD working papers).
We consider that the potential benefits are, in general, accurately described; however, some elements need to be elaborated or further clarified:

B1: Consumers pay less when they receive advice through automated tools

Benefits relating to cost reduction are relative. Even if consumers might pay less when they receive advice through automated tools (B1), this is not necessarily always the case; it may depend on the quality of the tool. Actual development and maintenance costs for the tool might be very high at inception and should also be taken into account, as well as other running costs, such as cybersecurity or compliance aspects, which will have to be reflected in the price of the service.

A cost comparison should be based on the same service, but this remains very difficult to achieve for several reasons: for example, advice enhanced by human support through supervised entities cannot be compared with a tool that does not need to comply with any quality requirements and is not supervised by any regulator. Customers need to be well informed about the different service categories.

Even if the provision of automated advice might appear cheaper and not to require any thorough understanding or knowledge, it may imply a higher risk for clients, as their decisions will be based on automatic algorithms that may lead them to be guided, for example, by the potential profitability of certain assets that may be suitable only for high-risk profiles. However, if the client redefines his/her profile in order to invest in a suitable product, an advisor may contact the customer to explain the risks.

B2. A wider range of consumers has access to advice through automated tools

Accessibility remains the main benefit to the consumer.
Considering the rise of new technologies and the globalisation of access to the Internet from any electronic device, an automated tool allows a broader range of customers (until now restricted to those who met specific requirements, such as a certain minimum amount to invest) (B2) to access certain types of products in a more seamless way. However, it is important to underline that the customers concerned will necessarily have to be familiar with online tools, so only a certain category of consumers will be able to use and access advice through automated tools.

B4: Consumers obtain financial advice in a faster, easier and non-time-consuming way

This argument is based on the idea that answering a few questions and specifying a few criteria can produce a result on a par with advice obtained personally in a branch.
It is important to recall that the supervisory requirements under MiFID 1, but certainly also under MiFID 2, oblige advisers to do much more than just collect a few details about clients. Instead, they are required to make a comprehensive analysis of clients’ personal and financial circumstances and to identify clients’ investment preferences together with clients, taking due account of these circumstances. Clients sometimes pursue conflicting financial objectives, and these have to be discussed jointly by client and adviser and reconciled. Sometimes clients overlook important needs, particularly when it comes to financial protection against risks; where insurance is concerned, this is only revealed after an adviser takes full stock of all existing or, as the case may be, missing insurance policies. Only if a tool offered online does the same should it be “classifiable” as advice. Whether the data input required is actually less time-consuming for consumers is doubtful. If such a questionnaire is to be compiled, for example, by means of extensive profiling through evaluation of contract data, usage data and internet footprint data, it should be pointed out that substantial data protection obstacles first have to be overcome in this respect.

B5: Consumers receive more consistent advice when they use automated tools

What is evidently meant is that automated advice is free of any subjective and personal assessments on the part of the adviser, offering a wider perspective (taking into consideration potential and up-to-date options). This may be perceived as an advantage in individual cases. Nevertheless, it should not be underestimated that precisely the adviser’s personal experience can also be extremely beneficial to a recommendation. After all, it is again ignored in this context that current automated product selection tools do not conduct any check similar to a suitability assessment, so that automated advice and face-to-face advice cannot in fact be compared. In particular, automated advice does not normally allow any processing of individual descriptions of complex personal situations. Also, whether the term “personalised feedback” can be used in connection with automated advice (see section 38 of the Discussion Paper) is questionable (see also our above comments on section 36).
In our view, the same tools, methodology and outputs may be offered with or without human interaction.
Automated financial advice could facilitate:
i) The provision of information which is kept up to date on an ongoing basis:
⁻ The clients’ portfolios are adjusted in accordance with market conditions and any other relevant factors.
⁻ The client may always have access to his/her updated position at market prices.
ii) Record keeping (e.g. offering the possibility to follow up and review former investment decisions and the grounds on which they were adopted).
iii) Financial inclusion: it will lead to a democratisation of these services, as investors who could not access a personal advisor will now be able to invest with expert financial advice (no longer restricted to customers who meet specific requirements such as a certain minimum amount to invest).
In this respect, it is worth pointing out that there is a fear that the new requirements on the ban of inducements under MiFID 2, while having a different impact in the various Member States, are likely to increase the cost of advice for retail clients. An advisory service which is highly computerised, such as robot-advice, will certainly be less expensive and could therefore be more attractive for clients with lower assets under management.
iv) An automated financial advice service is driven by the direct inputs of the client, and may therefore offer more opportunities for services tailored to the client’s needs.
/
/
We consider that the potential benefits are, in general, accurately described. The provision of advice through automated tools is ongoing and eliminates subjective interpretations and human mistakes.
Costs are likely to be lower, generally through a reduction of fixed costs, but we do not believe that automation necessarily incurs fewer costs overall. The impact on costs should be specifically analysed, considering that:

i) The costs of developing the tools and of granting permanent online access to reliable sources of data would presumably be high, especially at the beginning, if only a few consumers use the tool. The costs for financial institutions will be significant because banks will have to bear the costs of both distribution channels at the same time (branch network and automated online advice).
ii) The possibility of some human contact would probably always be kept.
iii) Qualified personnel would probably be needed to permanently check and verify the accuracy of the outcome of the tool, the algorithms used, etc.
iv) Although potential human errors could be reduced, there may be other costs derived from misuse of the tool, flaws in the use of the tool or any other issues that could imply civil liability (controls to prevent cyber-crime and ensure cyber-security, as well as data protection, might also have an impact on costs).
In addition, regarding section 42 of the Discussion Paper which mentions that institutions can use automated financial advice tools to deliver a more standardised “consumer experience” free of human interpretation: To allow proper comparison, comparable standards would have to be defined for both face-to-face and online advice. For example, a tool that matches available products solely against the clients’ risk appetite can hardly be called “advice”. See also response to question 6.

We also believe, as stressed above, that a further differentiation based on the quality of a tool’s output should be made. Such a differentiation is required so as to avoid any conflict with rules on the provision of advice already in place, e.g. in the area of investment advice on financial instruments.
The use of automated tools would allow entities to maintain accurate and up-to-date records of all processes. It would enable them to fully reconstruct and track all steps in transactions and communications with clients, and to provide evidence in the event of consumer complaints, but also in administrative proceedings or lawsuits in which clients are involved, or when supervisors or courts request it.
/
/
As a general comment, the EBF welcomes the description of the potential risks to consumers identified in the discussion paper. The EBF would, however, like to provide further clarifications on the potential risks identified.

R 1: Consumers make unsuitable decisions as a result of lack of information, and reduced opportunity to fill the gaps or seek clarification.

While consumers making unsuitable decisions potentially represents a risk for any kind of service provided online, it should be considered that this is mitigated by the possibility for customers to have permanent access to an operator (via online chat, mail or telephone) providing assistance along the process. In some cases, providing automated financial advice might even avoid potential conflicts of interest due to human intervention. It is also important to stress that, in general, as for any financial advice, specific legal requirements would have to be fulfilled. This risk can also be reduced by implementing a high-quality automated decision tree containing a number of feedback loops and control questions.

However, regarding online platforms and websites, some of our main concerns remain, such as:
- The risk that clients consider and treat other types of services as advice. The risk of confusion could arise from the fact that the customer based his/her decision on a price comparison with limited information, or confused a recommendation with advice.
- Receiving advice without realising it: the client should provide clear consent and express his/her understanding before receiving the advice.
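The feedback loops and control questions referred to above can be sketched as follows; the questionnaire fields and the consistency rule are invented for illustration and do not reflect any actual tool:

```python
# Illustrative only: a control question triggered by inconsistent answers,
# looping the customer back instead of proceeding straight to a recommendation.
def control_questions(answers):
    """Return follow-up questions raised by a consistency check.

    answers: dict with hypothetical keys 'risk_appetite' ("low"/"high")
    and 'accepts_large_losses' (bool).
    """
    follow_ups = []
    if answers["risk_appetite"] == "low" and answers["accepts_large_losses"]:
        follow_ups.append(
            "You indicated a low risk appetite but also said you can accept "
            "large losses. Please confirm your risk appetite."
        )
    return follow_ups
```

Only when the control questions return no follow-ups would such a tool move on to a recommendation.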

R2: Consumers receive unsuitable advice as a result of not being made aware how information they input is used by the automated tool.

This risk is not specific to the provision of automated financial advice, since it could also arise in a face-to-face relationship. The fact that a customer does not understand how the information is processed does not make the provision of the automated financial advice or the recommendation invalid. The most important aspect should remain that customers obtain suitable assistance or responses to their questions.
Requiring the customer to understand how the information is used by the automated tool would require complex mechanisms to be put in place and could even lead to unintended consequences, such as customers providing ‘socially adequate replies’. Such behaviour might lead to the provision of advice which is no longer suitable to the real needs of the customer.

R3: Consumers receive unsuitable advice as a result of biases in the tool that they are not aware of

This is primarily an issue of the design of the tool. For that reason, it is important that firms using these tools are required to comply with the appropriate sectoral regulation. For instance, many of these risks can be managed by a proper application of the MiFID legislation, which provides for duties to act in a fair and honest manner, to provide fair and non-misleading information, and to prevent and manage conflicts of interest. The Consumer Rights Directive, the Unfair Contract Terms Directive, the Distance Marketing of Financial Services Directive, the Misleading and Comparative Advertising Directive and the common rules on liability provide further protection against biases in the tool.

R4: Consumers have limited and/or unclear information about the extent to which the tool produces recommendations tailored to them.

The risk would be mitigated if the specific service being provided is clearly explained to the client. Even if the advice is more standardised, this does not preclude customers from receiving advice tailored to their needs. Indeed, if the customer changes any of the data, the recommendation will vary.

R5: Consumers do not understand who is providing advice because of the fragmented nature of the advice process

The fact that consumers do not understand who is providing advice because of the fragmented nature of the advice process is indeed an important risk, which may arise when the financial institution does not develop the tools itself but relies on agreements with one or several other entities. It could indeed create a reputational risk due to a lack of updated information on the websites, with both legal and commercial effects. However, what is important is to make sure that the entity which provides the advice/guidance is the same one which collects the data. The risk might also be mitigated by ensuring that the customer is able to identify the entity providing the service.

R6: Consumers are unaware that the personal data they input in the tools is used in ways they did not envisage when they provided it.

These risks are no different from those in a face-to-face relationship. The current data protection legislation and the implementation of the recently adopted General Data Protection Regulation should provide sufficient safeguards in this regard, notably by ensuring that the customer gives his/her unambiguous consent for the use of the personal data provided. Hence, this is not a risk specific to the automated provision of services.
It is also important to note that the EU General Data Protection Regulation was issued particularly also with the increased use of new media in mind. There is therefore no need for a data protection regime specifically for automated financial advice. The provisions of the EU General Data Protection Regulation particularly address the concern voiced in Section 67 that financial institutions may use the data they receive from consumers in ways not originally envisioned by consumers.

However, it is important to clarify that the statement that “Automated tools could also increase the possibility for so called ‘social engineering’ to occur because as consumers become increasingly accustomed to providing personal information online, this increases the potential that they could fall for phishing and other scams designed to trick consumers into revealing personal financial information, over email or via websites” is not accurate in our view. The general precautions consumers are expected to take in any release of personal information online must be followed at all times, and banks must ensure that the tools used reinforce these precautions for consumers.

R7. Consumers make unsuitable decisions because of limitations or assumptions within the tool.

This is not a risk specific to the provision of automated financial advice; if anything, it would normally even be lower than the risk inherent in face-to-face relations, which are always subject to subjective interpretations. This is a risk that should be dealt with through the application of the general requirements on disclosure, conflicts of interest, and fair and non-misleading information under MiFID.

R8: Consumers make unsuitable decisions because there are errors in the tool.

This risk, which can also arise in the case of face-to-face advice, can be reduced to a minimum by testing the decision trees and algorithms before implementing the tools.
We are of the opinion that the risk of consumers making unsuitable decisions due to a mistake in the decision tree or algorithm is minor and easier to audit than the risk of incorrect advice due to human failure in a face-to-face advisory discussion. Furthermore, such tools will be checked for errors on a regular basis to prevent liability issues.
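Such pre-implementation testing amounts to ordinary software verification; the toy advice rule and the signed-off test cases below are hypothetical:

```python
# Illustrative only: regression-testing a toy advice rule before release.
def recommend(horizon_years, risk_appetite):
    # Invented rule: short horizons or a low risk appetite point to bonds.
    if horizon_years < 3 or risk_appetite == "low":
        return "bond fund"
    return "equity fund"

def run_release_checks():
    # Each (inputs, expected) pair encodes an outcome signed off in advance;
    # any failing pair would block deployment of the tool.
    cases = [
        ((2, "high"), "bond fund"),
        ((10, "low"), "bond fund"),
        ((10, "high"), "equity fund"),
    ]
    return all(recommend(*inputs) == expected for inputs, expected in cases)
```

Re-running such checks whenever the decision tree or algorithm changes is what makes automated errors easier to audit than human ones.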

R9: Consumers suffer detriment because the automated financial advice tool they use is hacked and the underlying algorithm is manipulated

It is important to note that this does not correspond to a risk specific to automated financial services: any information technology (IT) system gives rise to this risk, and financial institutions have to establish appropriate controls to detect and reduce the risk of accidental changes in their IT systems (due to extended compliance requirements).

R10: Consumers make unsuitable decisions because the tool facilitates them to move too quickly through the process.

While this risk is not merely theoretical, it is inherent to any type of digitalisation process, in other industries as well. There is no evidence that it is impossible to reconcile the potential benefits of convenient, easily accessible solutions with the requirements of customer protection. Moreover, in many cases, online procedures make it possible to “force” a customer to go through a certain process and review certain information, which is not possible with paper documentation that is simply handed over to the customer, or which the customer does not have time to review during a physical meeting.
New technological solutions come with risks, and these risks should be adequately managed. In our view, this risk is not specific to, or to be differentiated from, advice provided face to face. On the contrary, the customer may take more time to answer the questions (and even modify the answers at a later stage). We therefore do not consider that this risk is specific to automated financial advice. In addition, we consider that this risk can be reduced by using high-quality decision trees containing feedback loops and control questions to avoid a ‘fast click-through’ process.

R11: Consumers lack motivation to act on advice given by automated tools where such tools do not facilitate an end-to-end process

This is a very remote risk, since:

i) The client who accesses this kind of tool is looking for the service and, therefore, it is unlikely that he/she will lose motivation as a consequence.
ii) Many tools will be designed to facilitate the completion of the process or, otherwise, will inform the customer about such limitations beforehand.

R12: Consumers lose out as a result of automated advice tools being based on similar algorithms, resulting in many consumers taking the same actions in relation to the same types of products/services

Theoretically, this is indeed a risk. However, similar risks have been identified before in respect of algorithmic and high-frequency trading. The recently adopted Market Abuse Regulation has provided an adequate regulatory response, allowing market participants to continue to draw on the benefits of the use of algorithms while limiting the potential risks for the market and for customers. This should also be possible in respect of the use of algorithms in the context of automated financial advice. Moreover, automated tools normally use their own algorithms, differentiated from those of others, and an automated financial advice service is driven by the direct inputs of the individual client.


R13: Consumers may no longer be given the opportunity to access any human financial advice

As already explained above, automated advice is likely to evolve as a complementary or alternative tool to the services provided by human advisors, whether provided by the same or by different financial entities. Situations where consumers may no longer be given the opportunity to access any human financial advice will very much depend on the preferences of the consumers themselves and on the type of product. It is generally observed that customers who seek automated advice are normally not very keen on personal contact with their financial entities and, conversely, that people who value personal advice are less likely to use automated tools. Customers will continue to have permanent access to an operator (via online chat, mail or telephone) to provide assistance along the process, or to benefit from personalised advice if need be (although this will then no longer fall within the scope of automated advice).
R14: Financial institutions may be exposed to litigation and subsequent reputational risk due to faulty automation.

It is important to underline that the risk of litigation is always present for any product or service and may indeed affect the reputation of the bank. Accordingly, this will need to be dealt with through proper (legal) risk management. The risk of errors from faulty automation can, however, be mitigated, if not removed, by testing the decision trees and algorithms before implementing the tools. We are of the opinion that the risk of unsuitable decisions by consumers due to a mistake in the decision tree or algorithm is easier to control and audit than the risk of incorrect advice due to human failure in a face-to-face advisory discussion. As with any other type of online service, tools will need to be checked for errors on a regular basis to minimise liability issues.
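Testing a decision tree against a table of expected outcomes before deployment, and again at each regular check, can be as simple as the following sketch. The advice rule, the input parameters and the expected-outcome table are hypothetical illustrations, not a prescribed methodology.

```python
# Hypothetical advice rule and a regression check run before each release:
# every known input profile must still map to the outcome previously
# reviewed and approved; any deviation blocks the release.

def advise(horizon_years, risk_score):
    """Illustrative decision tree mapping a client profile to a
    hypothetical product class (risk_score on a 1-10 scale)."""
    if horizon_years < 3 or risk_score <= 2:
        return "money_market"
    if risk_score <= 6:
        return "balanced_fund"
    return "equity_fund"

# Expected outcomes for representative profiles, reviewed by compliance.
EXPECTED = [
    ((2, 5), "money_market"),   # short horizon dominates
    ((10, 4), "balanced_fund"),
    ((20, 9), "equity_fund"),
]

for inputs, expected in EXPECTED:
    assert advise(*inputs) == expected, f"regression for inputs {inputs}"
print("all advice regression checks passed")
```

Because the whole decision tree is explicit code, every branch can be exercised automatically in this way, which is precisely why such errors are easier to audit than human failure in a face-to-face discussion.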

Financial institutions might be concerned by the allocation of liabilities when another company develops the tool and provides the service (especially when the entities are located in different jurisdictions). The customer will indeed always direct a complaint to the entity with which it has a relationship, without prejudice to the allocation of liabilities among the different entities involved in the advice process.

We would draw a distinction between the risk of faulty automation and the risk arising from how the information outputs are used. Even a flawed algorithm could provide accurate information to a customer in a given scenario, yet over the long term the way in which the information is construed, and the investment decisions based on it, might not be correct, and liability might therefore be placed incorrectly on the advice provider.

Without a clear distinction as to whether the output information itself was faulty, or whether consumer detriment arose as a consequence of how the information was construed, too much liability would lie with the advice provider.

R15: If providers of automated advice tools also offer consumers the possibility to engage with a human advisor as an alternative means to obtain advice, consumers may overuse that alternative means so as to supplement the automated advice on the product/service

Section 82 states that there is a danger for financial institutions that consumers may use a human adviser to supplement automated advice. We do not see this as a danger, but as an option that consumers should remain free to choose in the future. Should consumers wish to combine elements of digital and human advice in the advisory process, it should nevertheless be ensured that the result is a complete, all-in-one advisory process.

Automated advice is likely to evolve as a complementary or alternative tool to the services provided by human advisors, whether provided by the same or by different financial entities. It is, however, generally observed that customers who seek automated advice are normally not very keen on personal contact with their financial entities and, conversely, that people who value personal advice are less likely to use automated tools.

R16: Legal disputes may arise due to unclear allocation of liability

In this section a scenario is described in which financial institutions may face risks due to the fact that the party responsible for providing an automated tool cannot be clearly identified. Such a situation is not legally permissible in our view, since under Article 5 of Directive 2000/31/EC (E-Commerce Directive) the service provider must be named on the respective website.

Financial institutions might be concerned by the allocation of liabilities when another company develops the tool and provides the service (especially when the entities are located in different jurisdictions). The customer will indeed always direct a complaint to the entity with which it has a relationship, without prejudice to the allocation of liabilities among the different entities involved in the advice process. The risk of an unclear allocation of liability should, however, not be overestimated, as the general rule is that a firm cannot waive liability towards a customer because it outsourced a service or involved subcontractors in providing it. Nevertheless, a clear allocation of liabilities should be set out in the agreement with the outsourcing providers.

As regards the importance of distinguishing automated tools not designed to provide financial advice from automated financial advice: the presumption that, whatever the characteristics of the tool, the service provided is considered financial advice as long as the consumer perceives it as such might create liability issues and barriers to the provision of certain services. There are, for example, tools provided by financial institutions which are not designed to give financial advice but merely to guide investors/customers in making their own investment decisions on a well-informed basis. There is no clear stance on how to treat this type of service. This leaves institutions with concerns about regulatory risk, possibly preventing them from investigating the merits of, and developing, such services. A clear distinction should be made between the use of an automated tool and the provision of automated financial advice, and consequently also between MiFID and non-MiFID services.
One alternative would be to address automated financial advice in compliance with the relevant specific regulations: a pure investment service, for example, should be regulated under MiFID, but not other types of services such as comparison websites. This topic should be included in future discussions of automation in financial advice, in order to further clarify the boundaries.
In addition, we would like to recall the importance of making a clear distinction between 'advice' (whether automated or not) and 'the provision of information' to customers. Advice and advisory services are defined in sectoral EU directives such as the Mortgage Credit Directive or MiFID and in national legislation. It could have a negative effect on the development of automated financial services if the rules and regulations concerning advice were applicable to tools that are only intended to inform customers.
In our view, some risks to financial institutions are missing from the discussion paper, namely:

 The risks linked to so-called aggregators (websites which compare similar services or products of financial institutions). There is indeed a risk that the comparison focuses only on prices and ends up comparing products which are not comparable. Insurance and banking products in particular are often multi-layered and complex and are not suitable for comparison on a platform. Moreover, aggregators are not subject to the legal provisions banks are required to follow (e.g. regarding the manner in which information has to be provided to customers).

 Use of automated advice tools over time: while these tools should be useful for providing information consistently over longer periods of time, this extended use brings extended risk. Without a clear definition of liability, this risk could be construed as excessive. For this form of information provision to be embraced fully by the industry, clear boundaries of liability over the longer term are likely to be needed. For instance, it would have to be defined from the outset how the information provided by automated advice systems is meant to be used, so that consumer inertia in diligently acting on the information over time does not lead, when the financial decision does not go as anticipated, to liability being retrospectively assigned to an institution.
 The fragmentation of the process if arrangements are not binding for new players.
 Economic, organisational and reputational risks linked to the adaptation process, which will probably require a significant reduction in the number of branches and advisors.
 Risk of an un-level playing field: establishing adequate requirements for shadow banking, online players and fintechs is necessary. A more limited service lacking human interaction should be subject to the same requirements according to the risks involved; the "same services/risks – same rules" principle should apply.
The principle of “same services/risks, same rules” should apply to all companies regardless of the sector or location.
The banking sector supports a competitive and innovative EU Digital Single Market which safeguards existing consumer protection, trust and security. Fair trade and open competition on the market enable consumers to access a range of new competitive products with higher performance. They also enable companies to offer products to a broader array of customers. To achieve this, the right competitive environment should be established, allowing open and fair competition among market players:
⁻ Regardless of the sector: because products and services will be under high pressure to be supplied instantly, safely and with a guarantee of quality, the principle of "same services/risks, same rules" should apply to all companies offering similar products and services with comparable risks, whatever market/sector they are targeting. All companies should benefit from the same opportunities and flexibility to innovate.
⁻ Regardless of the location (the global dimension should be considered): it is important to eliminate or reduce regulatory divergence with non-EU countries such as the United States, Asian countries, etc.
All service providers will make increasing use of the internet as a distribution channel in the future. The growth potential in this area has by no means been exhausted. Efforts will be focused on offering consumers round-the-clock access to banking services. The main aim will be to further develop distribution processes so that, where possible, no further processing by banks is required, i.e. providing "one and done" banking services. A key aspect of this development will be supporting or even advising clients in their decisions to purchase a banking, investment or insurance service. Only interlocking distribution channels will deliver an optimal consumer experience, however. It is thus conceivable that consumers will start by using a decision-making aid and then switch to video advice, or perhaps arrange an appointment at a branch to ultimately have their financial situation thoroughly checked after all.

We disagree with the implication in section 89 of the Discussion Paper that face-to-face advice in the securities sector may be reserved only for wealthy consumers in the future. While this trend is in fact being encouraged particularly by fee-based advice, commission-based advisory services will continue to allow consumers to make use of face-to-face advice where needed.
As mentioned above, automated advisors generally use algorithms and model portfolios to allocate investments based on clients' objectives and risk tolerance, whereas banks are usually better equipped to provide detailed, personalised financial planning services. We observe a rapid emergence of robot-advisors for wealth management in the US, and the phenomenon is now also spreading to Europe. Certain forms of automation in financial advice are already widely adopted and commonly accepted (e.g. providing online investment advice when a client purchases financial instruments online, having a customer complete a MiFID questionnaire online, or having a customer provide the information needed to apply for a mortgage credit online). Other forms of automation in financial advice are currently less developed, and it remains to be seen if and when they will ever be used and accepted on a large scale (e.g. fully automated asset management or robot-advice). When assessing the benefits and risks of automation in financial advice and the need for potential further action, it is key to avoid imposing "wholesale" solutions that hamper further digitalisation of financial services in areas where prudential and customer risks are already properly regulated, simply because of perceived risks related to more "exotic" forms of digitalisation.
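The model-portfolio approach mentioned above can be illustrated with a minimal sketch. The risk bands and portfolio weights below are invented purely for illustration and do not reflect any actual provider's methodology.

```python
# Illustrative sketch: map a client's risk tolerance (1-10) to one of a
# small set of hypothetical model portfolios and scale the weights to
# the invested amount.

MODEL_PORTFOLIOS = {
    "defensive":  {"bonds": 0.70, "equities": 0.20, "cash": 0.10},
    "balanced":   {"bonds": 0.45, "equities": 0.45, "cash": 0.10},
    "aggressive": {"bonds": 0.15, "equities": 0.80, "cash": 0.05},
}

def allocate(risk_tolerance, amount):
    """Return the risk band and the monetary allocation per asset class."""
    if risk_tolerance <= 3:
        band = "defensive"
    elif risk_tolerance <= 7:
        band = "balanced"
    else:
        band = "aggressive"
    weights = MODEL_PORTFOLIOS[band]
    return band, {asset: round(amount * w, 2) for asset, w in weights.items()}

band, orders = allocate(risk_tolerance=5, amount=10_000)
print(band, orders)  # balanced {'bonds': 4500.0, 'equities': 4500.0, 'cash': 1000.0}
```

The allocation is driven entirely by the client's direct inputs, which is the point made earlier: the output is personalised to the individual profile rather than uniform across all users.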
At this stage, automated tools for investors have not been broadly developed and will in any case have to comply with existing legislation (e.g. MiFID 2, currently being transposed).
In our view, a clear distinction should be made between the use of an automated tool and the provision of automated financial advice: investment services, for example, should be regulated under MiFID, but not other types of services such as comparison websites.
In our view, the discussion paper should clarify the definition of 'consumer'. It is a very broad concept which is not used consistently in European legislation. For example, MiFID uses the term 'non-professional investor' instead of 'consumer'. It is therefore not clear which types of clients fall within the definition of consumer.

Furthermore, we consider it of the utmost importance to maintain a level playing field between all market players offering automated financial advice. The principle of "same services/risks – same rules" should apply.
Noémie Papp, Legal adviser retail and coordinator digital issues