COFACE (Confederation of Family Organisations in the EU)
The experience of innovative uses of consumer data provided below has been gathered from the perspective of consumers/financial services users by COFACE (Confederation of Family Organisations in the EU).
While the EBA’s paper aims at identifying benefits and detriments of innovative uses of data, such analysis poses a number of deeper, philosophical questions. For instance, what is the consumer’s interest? Is it to receive “personalized” offers which entice him/her to consume more of the same? Or is it rather to be exposed to new products he/she might not have thought of? More essentially, does personalization empower consumers to make better choices, or does it encourage and promote unnecessary/frivolous consumption? Use of consumer data to provide personalized budget management will also inevitably be ideologically tinted, promoting either a consumerist or an ascetic lifestyle.
Beyond the debate of benefit and detriment, use of consumer data should first and foremost abide by very strict principles such as proportionality, impact on discrimination, social exclusion, mutualization/socialization of risk and data ethics. Some of these principles are already included in the General Data Protection Regulation recently adopted.
The EBA’s role should be to help establish, in partnership with Data Protection Authorities, guidelines and codes of conduct for all financial service providers to help them assess whether their use of consumer data complies with the principles listed above. In order to provide legal certainty and a level playing field, the EBA should make clear that the guidelines provided will serve as a basis for regulation should financial services fail to comply with them on a voluntary basis.
The EBA should define governance rules for all financial service providers handling consumer data. This includes credit bureaus, but also any potential actor which may provide personal information about a financial services user (for instance, a company selling health sensors sharing data with insurance companies). The governance rules should ensure that decisions about the use of data, and the extent to which risk is mutualized or socialized, remain collective decisions taken by all interested stakeholders (public authorities, financial services providers, consumers, data protection authorities). Such bodies need to decide on the proportionality of using personal data. The impact of Big Data, and the importance of the ethical use of data, however, goes far beyond the scope of financial services alone.
The EBA should promote data analytics solutions with strong privacy and data protection. One such solution would forbid any financial institution from running data analytics without the consent of consumers. Should the consumer seek to purchase a product such as a loan, the consumer him/herself would run a creditworthiness test (which abides by data ethics and proportionality principles) directly on his/her online banking platform and transfer the result to the financial institution. In this manner, at no point in time does the financial service provider handle any personal data or end up in possession of consumer data which can later be reused for other purposes outside of the consumer’s control. Such a suggestion is in line with the GDPR.
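The consumer-side flow described above can be sketched as follows. This is an illustrative sketch only: the function names and the deliberately simple surplus rule are our own assumptions, not an existing platform API.

```python
# Sketch: the creditworthiness test runs on the consumer's own banking
# platform; only the pass/fail outcome is transferred to the lender.
# Positive transaction amounts represent income, negative ones spending.

def run_local_creditworthiness_test(transactions, requested_monthly_repayment):
    """Compute a pass/fail result locally from raw transaction amounts."""
    income = sum(t for t in transactions if t > 0)
    expenses = -sum(t for t in transactions if t < 0)
    surplus = income - expenses
    return surplus >= requested_monthly_repayment

def result_for_lender(passed):
    """Only this outcome leaves the consumer's platform -- never the
    underlying transaction data."""
    return {"creditworthy": passed}
```

In this design the lender never holds raw consumer data that could later be reused for other purposes, which is the property argued for above.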
Transparency about the use of consumer data needs to be enhanced, including access to greater information about algorithms and the result of the data analytics process (for instance, why a consumer has been refused credit).
Algorithms need to be subject to auditing by an independent body to ensure that they abide by the principles set out above.
The EBA should work in close cooperation with Data Protection Authorities in order to develop robust, clear and relevant guidelines, governance rules, transparency, mechanisms for user control and auditing of algorithms and the use of consumer data.
The use very much depends on the type of financial institution. “Traditional” retail banks use internal data about consumers coupled with external data from other sources such as credit bureaus. The data used depends greatly on a number of parameters, including the type of credit bureau (for instance, in some countries like France, only negative data is available to banks, whereas in other countries like the UK, data about mobile phone payments and utility bills is included via private credit bureaus). Fintechs, on the other hand, can resort to a greater variety of data, either collected about users directly via their services (transactions, investment patterns, product purchases…) or via access to other forms of data such as social networking, online behaviour, etc.
The most commonly used data also depends on the product being sold to consumers. When selling a loan, for instance, the most commonly used data is so-called “negative” data (defaults on a previous loan) and “positive” data (current financial commitments in the form of other loans), followed by a range of data depending on the country (see above). For health insurance, the most important data would be lifestyle-related (being a smoker) and current health conditions (diabetes, arthritis, …).
The reliance on data is also linked to the question above. Depending on the financial service provider and the financial product being sold, the data relied upon is the same as listed above.
Consumer data is used predominantly for:
- Market segmentation (creditworthiness assessments, calculating risk premiums in insurance…);
- Marketing and targeted advertising;
- Personalized/customized services (budget management, robo-advice);
- Risk prevention (early intervention, avoiding default) and fraud early intervention (identifying irregular patterns in financial activity);
- Optimizing services (cost reductions, efficiency gains).
Financial institutions are clearly increasing their use of consumer data and there are many potential impacts on the market:
- Further financial exclusion of the most vulnerable via increased market segmentation or creation of a parallel market for the “poor” with lower quality products/services.
- Imbalance between financial institutions which gather/hold a large amount of consumer data (such as “traditional” retail banks and payment/transaction data) and new market entrants.
- Overreliance on data analytics which may encourage increased risk taking (for instance, overreliance on robo-advice/robo-investment, overreliance on new forms of individualized risk based pricing) and therefore, more prudential risk.
- Banks, payment service providers and other financial firms may be tempted to sell their customers’ personal data to third parties, e.g. as ING already envisaged in the Netherlands a few years ago.
The potential benefits identified in the paper deserve closer examination.
43: Increased cost-effectiveness could also be passed on to investors/shareholders or an elite within the financial service provider rather than passed on to consumers. In the investment industry, there have been a number of studies showing that investors/brokers charge high fees regardless of whether their investment strategies are successful or not.
44-48: The fact that consumers are being fed tailored recommendations could also be seen as a detriment, contributing to “lock” consumers in a consumption pattern which may not be in their interest (reinforcing frivolous spending for instance). Furthermore, tailoring products to consumers’ profiles could also be a detriment in some circumstances. For instance, in accessing loans, it could lower the price for the wealthiest and least risky consumers, but greatly increase the price or restrict the access for the poorest/most vulnerable consumers.
45: Granting access to your personal data may be a benefit in the very immediate/short term (for instance, access to a financial product you desperately need) but become a huge detriment in the longer term if such data is re-used or sold on to third parties at a later stage.
46: All studies on over-indebtedness come to similar conclusions: over-indebtedness is mostly linked to life accidents such as illness, divorce or a change in the financial situation of a debtor (loss of employment, stagnating revenues compared to the cost of living), all of which are unpredictable, regardless of the data used. Furthermore, studies carried out in the US by the National Consumer Law Center show that using more data only increases the probability of mistakes and inaccuracies. Finally, using data from non-traditional sources such as social networking might contribute to the proliferation of a market for online reputation management, whose aim would be to tamper with consumers’ online data, including social networking content, in exchange for money in order to improve their credit score.
COFACE supports BEUC’s position on the use of credit data as explicated below.
BEUC and COFACE strongly support the principle of responsible lending that is provided for by the Mortgage Credit Directive, and advocate for the same principle to apply to personal loans. A key question is the following: what information is needed to assess the borrower’s creditworthiness?
Banks, credit unions and other traditional lenders conduct such assessment based on information and documents directly provided by the borrower, plus data registered by third parties (public credit registers, private credit bureaus). More recently, some lenders, especially online providers such as peer-to-peer lending platforms, have been using alternative information sources (web and social media profiles). Recently, Facebook patented a technology that could be used by lenders to determine whether a borrower is a good credit risk, based on the credit scores of his/her Facebook friends. However, the social network giant later opted to limit the amount of information available to third-party services, perhaps following a hint from the US Federal Trade Commission that the company could be regulated as a consumer-reporting agency.
Depending on the EU country, consumer credit data is collected and registered by public credit registers or private credit bureaus. The sector is strongly regulated in some Member States, where only public registers exist and collect minimum data related to consumers’ pending and/or delayed credit commitments (positive and negative data). For example, the Belgian and French credit registers are managed by these countries’ central banks and aim to fight against over-indebtedness. On the other hand, private credit bureaus (also called credit reference agencies) like Experian, Equifax and Creditinfo are present in many EU countries and collect extensive information on consumers’ financial and non-financial commitments, which they sell to lenders and non-financial service providers. Furthermore, in many countries several credit bureaus compete with each other.
Link between the amount of data and responsible lending: Credit bureaus claim that collecting more data on individual consumers contributes to a more accurate creditworthiness assessment by lenders. Yet the reality seems to be different. FSUG (the Financial Services User Group of the European Commission) recently conducted an investigation to assess the role of credit bureaus in responsible lending and the prevention of over-indebtedness. One of the key findings was that “… no clear link exists between the frequency of arrears in the different EU countries and the extent of credit data used. France, Spain, Finland, Portugal, Belgium and Austria have similar frequency of arrear levels with a limited use of credit data. The United Kingdom, the Netherlands and Germany also have comparable frequency of arrear levels with a very high use of different credit data. On the other hand, countries such as Poland have very high arrear levels while the use of credit data is high; Cyprus has very high arrear levels while the use of credit data is relatively low.” The FSUG concludes that the level of arrears is much more dependent on other variables, such as employment, income and social policies, than on the depth and breadth of credit data used.
Link between the amount of data and lower interest rates: The claim that increased and “innovative” use of data for creditworthiness purposes improves access to more affordable credit for consumers is also questionable. For example, a US study carried out by the National Consumer Law Center (NCLC) assessed, inter alia, whether the use of big data actually improves the choice for consumers in the area of credit. The authors tested a number of claims made by big data proponents, such as: multiplying the number of variables will expand access to borrowers with thin credit files; by using a constellation of factors to price credit, the cost of credit will be reduced for low-income borrowers, thus enabling lenders to provide lower-cost small loans as an alternative to payday loans. The study concluded that big data is a big disappointment and does not live up to its big promises. The use of big data in the lending area does not appear to result in more affordable products for low-income consumers. While some loans are marginally better, for the most part, credit products using alternative data are just as expensive as payday loans.
Consumers not in control of their data: There are also issues around how credit registers handle and process consumer data. Frequent problems experienced by consumers include: inaccurate data that adversely affects the consumer’s credit application, difficulty for consumers to access and correct inaccurate data about themselves, opaque credit scoring mechanisms, and automated processes without human intervention. In addition, in some countries consumers are incentivised to borrow more to build their credit score, i.e. potential borrowers are judged on their capacity to borrow and repay, rather than their capacity to save money. A recent awareness campaign by our UK member Which? informed consumers about various aspects of, and their rights related to, credit scores. German consumers complain about the non-transparent scoring mechanism applied by Schufa, Germany’s largest credit bureau, as well as the difficulty of accessing and correcting wrong data. According to BEUC’s German member vzbv, consumers’ ability to repay a loan must be evaluated based on valid criteria, and not depending on whom they are friends with on social networks, what they like to shop for or what apps they install.
BEUC and COFACE consider that most of the above issues could be addressed based on a holistic analysis and appropriate measures by policymakers. We believe it is the right time to conduct an in-depth analysis covering the following questions:
• What data is necessary and sufficient to assess the borrower’s ability to repay a loan?
• What is the role of credit bureaus and do they perform their role appropriately?
• Is it appropriate that consumers’ credit data is controlled and traded by third-party commercial entities?
BEUC and COFACE’s suggestion: In our view, all the information (income and expenses) necessary for creditworthiness assessment can be found on the consumer’s bank/payment account statement. This information is objective and should allow the lender to conclude whether the borrower has a sufficient and stable income, whether the level of the loan-to-income ratio is appropriate, whether the consumer already has other pending mortgage credit or personal loans, including payment arrears, and what other financial and non-financial commitments exist (rent, utility bills, insurance, etc.). We see no reason to add other data elements, which are likely to be subjective and increase the risk of a wrong assessment of the consumer’s financial situation. The information derived from the account statement (covering a sufficiently long period of time, e.g. one year) can be provided directly and quickly by the consumer to the lender, including new types of lenders such as peer-to-peer platforms, in a friendly format – digitalisation and new technologies should help with that. Data belongs to the consumer; it should not be collected, controlled and traded by private credit bureaus, a practice which results in much consumer detriment.
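As a minimal sketch of the assessment suggested above, using only figures readable from an account statement: the function name and the 35% debt-service threshold below are illustrative assumptions of ours, not regulatory values.

```python
# Sketch of a creditworthiness check based solely on account-statement data:
# income, expenses and existing repayments. Thresholds are placeholders.

def assess_from_account_statement(monthly_income, monthly_expenses,
                                  existing_monthly_repayments,
                                  proposed_monthly_repayment,
                                  max_debt_service_ratio=0.35):
    """Approve only if total repayments stay within the chosen share of
    income and the consumer keeps a positive disposable balance."""
    if monthly_income <= 0:
        return False
    total_repayments = existing_monthly_repayments + proposed_monthly_repayment
    ratio = total_repayments / monthly_income  # loan-to-income style check
    disposable = monthly_income - monthly_expenses - total_repayments
    return ratio <= max_debt_service_ratio and disposable > 0
```

The point of the sketch is that every input is objective and already visible on the statement; no behavioural or social data is needed.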
47: Further market segmentation and individual risk-based pricing can only be a “benefit” from the position of the least risky (mostly the wealthiest) consumers; it goes against principles such as the mutualization/socialization of risk.
49: As has been underlined by BEUC, independent financial advice/guidance can only be guaranteed from actors who do not have a financial interest in the advice/guidance being proposed. It therefore cannot be assumed that financial advice from financial service providers, even automated/customized, will be configured in the best interest of the consumer.
As BEUC explains:
Before talking about ‘better advice’, it should be noted that most consumers have never received any financial advice from their financial institutions and are only in contact with pure salespeople.
Today, consumers in the EU are not getting the advice they really need when looking for mortgages or insurance, or when seeking to better invest their savings, and they are too often recommended products they do not need (e.g. an overly sophisticated bank account or bundled products). Especially in the retail investment area, the low quality of advice has been documented widely, both by our members and by public authorities. Third-party commissions or in-house sales incentives tend to steer consumers towards overly complex and expensive retail investment products, often not suitable for their risk profile.
BEUC considers that getting advice in financial services, in all its different forms, will be one of the areas where consumers could potentially benefit a lot from smart technology, if designed well. BEUC’s response to a recent EBA consultation on automated advice set out our views regarding potential benefits and challenges to consumers. For example, one of the key factors determining market outcomes will be the quality of the algorithm guiding consumers through the advice process. Regulatory oversight of the software involved is therefore crucial, as all the features of unsuitable “human” advice (e.g. a bias toward unsuitable products because of commissions) can easily be mimicked by an algorithm.
It should be noted that the quality of the algorithm will become increasingly important in retail finance and influence the consumer experience in general, as traditional providers and fintechs invest heavily into such tools and rely on them to process and interpret massive volumes of data.
More generally, potential benefit B5 as described by the EBA is very similar to B4; more time is therefore required to assess its impact on consumers.
50-51: These are by far the most important benefits for consumers. Prevention and early detection of financial difficulties could greatly help in addressing over-indebtedness, and instant detection of suspicious financial activities could prevent consumers from falling victim to fraud/scams. Nevertheless, early detection/fraud prevention systems need to be configured in such a way as to work effectively and not block the rightful owner from using their payment facility. There have been cases where consumers were unable to make purchases abroad and were left in a delicate financial situation because their payment facilities had been blocked by fraud prevention mechanisms.
57: Selling data to third parties may not have only positive outcomes for banks. Certain retail banks have recently tried to change their terms of service to allow them to sell data to third parties and faced severe backlash from consumers, forcing them to backtrack.
58-70: Consumer consent in the digital environment poses many problems. In order to access a service, consumers often tick boxes without reading the terms and conditions. Even if consumers disagree with the content of the terms and conditions, there is a power imbalance between financial service providers and consumers, as consumers are the ones who “need” access to certain essential services (loans, insurance) and will therefore consent. Furthermore, objecting to terms and conditions simply excludes consumers from accessing a product/service, as these operate under a “take it or leave it” principle.
Additional benefit: Transaction speeds can be greatly increased for consumers.
Legal uncertainty, at the moment, is one of the biggest barriers preventing the use of consumer data. However, with the upcoming enforcement of the GDPR, financial institutions may be able to develop methods for using consumer data in a way which respects important principles such as consumer control over the data generated, the principle of proportionality, etc., and, more broadly, the consideration of data ethics.
The EBA could address such a barrier by publishing guidance on the application of the GDPR and other relevant regulation to financial services.
- Big Data may pose a threat to certain business models. Insurance companies, for instance, need mutualization of risk to make their business model work. Individualizing risk to the extreme may drive the least risky consumers out of the market, as they would not be willing to pay for an insurance policy they perceive they do not need, and simultaneously exclude the riskiest consumers, who would be priced out of the market.
- Novel risk assessment methods may pose prudential issues as these have not been thoroughly tested. Repackaged sub-prime loans combined with CDS is an example of “novel” risk hedging techniques which have gone awfully wrong. The same can hold true for relying on Big Data to assess risk (based on social networking data etc).
- Data sharing may result in data about consumers getting into the wrong hands (predatory lenders).
- Extending data analytics to information generated by users themselves may pose new moral hazard risks. Some consumers may seek to artificially improve their “scores” via either paying online reputation management companies or by tampering with data generated about them directly (for instance, attaching a connected pedometer to a dog to make it look like you are in good shape, tampering with GPS data etc).
- Linked to the risk above, algorithms may directly influence/shape consumer behaviour rather than respond to consumer needs/behaviours. For instance, an algorithm which relies on the successful repayment of previous loans to assess creditworthiness forces a consumer to take out credit frequently in order to “build” a positive credit history, as opposed to using his/her savings to pay for goods/services. Failure to do so may exclude the consumer once he/she seeks to take out an essential loan such as a mortgage. It may also influence other purchasing decisions, such as taking a mobile plan instead of a prepaid card, since in some countries mobile plans are included in the credit data collected by credit bureaus.
- Automated or semi-automated decisions based on data create risks around liability. Who is responsible in case a robo-adviser or robo-investor gives out misleading/incorrect advice or makes bad investment decisions?
- Instant customization of financial products for consumers could also head in a similar direction to air fares: dynamic pricing which does not necessarily aim at providing the “best” price for consumers, but at “optimizing” the money generated from them (looking at past behaviour to check whether a consumer is an impulsive buyer or compares different offers, or at which country he/she lives in to infer his/her purchasing power…).
- Customizing products will also dramatically increase product choice and make it harder for consumers to compare offers, which could also lead to more misselling of products.
- Broadening the use of data could, under certain circumstances, also exclude consumers from accessing financial products on political grounds.
Risks which have materialized:
- The National Consumer Law Center in the US has provided evidence of consumers being excluded from financial services due to mistakes in credit data. Discrimination against the poorest categories of the population has also been examined in the news.
- Equifax has been convicted in the US for selling consumer data to predatory lenders.
- Credit card details have been a frequent target for hackers. Moreover, theft of non-financial data has also been on the rise, such as the recent LinkedIn hack. It is not yet clear how data breaches may be monetized by hackers, but several scenarios can be envisaged: sale of the account details to third parties, sale of the data to third parties that may have an interest in the data set for data mining/analytics (advertising, predatory lending…), extortion (threatening victims with the public release of embarrassing data), and impersonation (attempting to scam close friends and relatives).
- Changing terms of service or user agreement to “force” users into consenting to share their data with third parties for a variety of reasons (advertising…).
- Certain new financial service providers have faced substantial prudential risk:
o Lending Club has faced difficulties.
o Virtual currencies, which were deemed safe thanks to blockchain technology, face a big challenge following the theft of around USD 50 million worth of “ether” (a virtual currency).
- Google has recently gained access to sensitive information of patients (health-related data) in the UK. Sensitive health data is also subject to data theft and/or exploitation by insurance companies.
- Online reputation management services have developed steadily over recent years.
- Various tracking devices (notably health trackers) have collected/shared data about users with insurance companies with the objective of creating individualized risk profiles.