
European Banking Federation

We consider the proper approach is to start with a narrow scope covering all credit institutions that are required to report under the CRR and national frameworks. Starting the project with institutions that share the same reporting requirements will allow it to progress at an adequate speed. Aiming for a broader scope would only add unnecessary complexity to an already complex and ambitious project. Additional layers can certainly be added at later stages, once the integrated approach for credit institutions has been created.
As an additional point, we suggest that, in the case of banking groups composed of less significant or local banks that are smaller and less significant than the holding, a principle of proportionality could be applied whereby some regulatory reporting would be produced only at consolidated level. The current regulatory framework means these banks incur significant IT and HR costs to provide detailed information that could be supplied by the holding and that adds nothing to reports already produced at consolidated level.
The data collections considered in the stocktaking section of the Feasibility Study and, more generally, in all its other sections cover all EU banking authorities, at both EU and national level, as well as all types of data requests, regardless of whether these are for supervisory, statistical or resolution purposes, template-based or granular, structured (ITS) or ad hoc. We therefore consider the Feasibility Study covers most of the reporting requests existing to date for EU credit institutions. We strongly encourage beginning the project with this scope (supervisory, statistics and resolution), which is already very ambitious, and only considering expanding it at later stages.
Although we strongly encourage beginning the project with this already challenging scope (supervisory, statistics and resolution), we consider that attention should also be given to the long term (the dot on the horizon). Using a phased approach and expanding the current scope at a later stage to, for example, ESMA, DGS and securities and derivatives reporting is the way forward to provide the most efficient and effective results for all stakeholders involved in the 'end state'. This is also closely linked and aligned with the European Commission's initiative towards a modern, efficient and effective approach to data collection for supervisory purposes, i.e. the EC Roadmap on supervisory data.
It is particularly important that all relevant parties (authorities and industry) have full clarity about the scope, ensuring full understanding of which reports will be covered. This will also bring more clarity about the impact on banks' current reporting burden, allowing at the same time a better assessment of the long-term benefits of the proposal.
While we consider it is early to determine whether the research work done by the EBA is complete or relevant, we acknowledge the EBA has performed, on a best-effort basis, a thorough analysis of this complex issue towards fulfilling the mandate set out in Article 430c. Having said that, and while we understand the aim of the report is to conclude whether integrated reporting is feasible, we consider the draft report uneven: it lacks concrete details and proposals on key aspects while putting up for consultation many questions demanding lengthy (technical) assessment of aspects that are not the most relevant as starting points.
We also note the EBA has identified efficient engagement with all relevant authorities and stakeholders as a key success factor of the integrated reporting system. The purpose of the integrated reporting project is, among others, to alleviate the reporting burden for the reporting institutions. Consequently, the industry is recognized as a key stakeholder in the project. To this end, we strongly believe the final report should advise the participation of the industry in the Joint Committee, with an enhanced role regarding:
 the effective control of the reporting burden and associated costs: commercial banks should be part of the governance from the start and with a sufficient mandate, e.g. to review and challenge the impact of new or updated reporting requirements;
 the implementation of the integrated reporting system and the technical aspects to be covered in the implementation phase, e.g. the common dictionary, the impact on architectures, and the decommissioning of current reporting frameworks;
 the exchange of views about the systems, tools and technologies to be used in a possible implementation phase; and
 the outlook of possible future actions on the development and implementation plan that may be pursued after the Feasibility Study analysis.
 In addition, the plan to implement integrated reporting must consider that national frameworks are very different; the integration phase of requests by NCAs should take this into due account.
- Training / additional staff (skills): Relevant
- IT changes: Highly relevant
- Changes in processes: Highly relevant
- Changes needed in the context of other counterparties / third-party providers: Somewhat relevant
- Time required to find new solutions: Highly relevant
- Other (please specify): see below
Highly relevant:

- Governance: an end-to-end, industry-driven governance model, founded on a common purpose, ambition and strategy, is a paradigm shift for all stakeholders involved in the reporting chain.
- Feasibility/capacity of exchanging the information, confidentiality, and legal constraints for European banking groups with subsidiaries in third countries
- Harmonisation of reporting requirements based on a non-harmonized legal framework of different supervisory authorities

Relevant:
- Aligning MIS and regulatory data dictionaries
- Selecting appropriate BI infrastructure and tools
We would like to draw your attention to the following points for your consideration:
• EBA’s Cost of Compliance survey (paragraph #57): the survey responses regarding the impact of increasing granularity may have been affected by some resistance to change among the teams that currently produce templates like FINREP and COREP within financial institutions. In our opinion, the benefits of moving to a more granular approach and replacing some of the more detailed current templates would outweigh the costs of more demanding data quality (DQ) requirements and of new coordination mechanisms (feedback loop and anchor values), as it would boost data exploration and reduce ad hoc requests.
• We note the following are not included, or at least not mentioned, in the stocktake:
 The valuation and bail-in datapoints from the Single Resolution Board (SRB)
 The standard OSI (on-site inspections) data points from the ECB.
• It is difficult to understand how ad hoc data requests could be harmonised. While regular data reports have a basis in law, ad hoc data requests are, by nature, irregular and unplanned. The main aim should be to minimise rather than harmonise ad hoc requests. Ad hoc requests should only be made when data are needed for a specific purpose, e.g. for a specific investigation or when an unexpected situation arises such as Covid-19 or Brexit. If the intention is to more structurally reuse data that authorities already have (because they have been reported regularly) for specific investigations or emergencies, banks would very much endorse this principle. We note that authorities could already make more use of data in their possession that have been regularly reported.
- Data Dictionary - Semantic level: Highly agree
- Data Dictionary - Syntactic level: Highly agree
- Data Dictionary - Infrastructure level: Highly agree
- Data collection - Semantic level: Highly agree
- Data collection - Syntactic level: Highly agree
- Data collection - Infrastructure level: Highly agree
- Data transformation - Semantic level: Highly agree
- Data transformation - Syntactic level: Highly agree
- Data transformation - Infrastructure level: Highly agree
- Data exploration - Semantic level: Highly agree
- Data exploration - Syntactic level: Highly agree
- Data exploration - Infrastructure level: Highly agree
We agree with the holistic approach that has been taken, since it makes the reporting process steps and levels of integration easier to understand and allows the analysis to be sliced into more concise pieces.
Regarding the solutions the EBA should investigate to reduce reporting costs, we would like to remark that the current regulatory reporting process is not only costly but also very burdensome: it is inefficient and unsustainable in the long term for the reporting institutions and produces inconsistent and inaccurate results for the authorities that may hamper their duties. An integrated approach for the whole reporting process will undoubtedly be expensive in terms of IT investment and change management in the short term, but it would lead, from an EU perspective, to a more sustainable and less burdensome mechanism and to more accurate and valuable information for the CAs. However, additional considerations apply to the more diversified European banking groups with subsidiaries in third countries and should be taken into account when selecting the option to be developed, given the impact integrated reporting may have at consolidated level without delivering the above-mentioned benefits.
Nevertheless, we would like to discuss in further detail the statements in paragraphs #76 and #83 regarding the decoupling of the semantic and syntactic levels in the data definition step of the reporting process. In our opinion, building up a data dictionary should align the semantic definition (business content or business glossary), the syntactic definition (metadata model and logical data model) and the infrastructure or physical level (database, etc.). Decoupling the business glossary from the syntactic layer and the database would misalign the three main units that must collaborate in the reporting process: the regulatory reporting experts, the data experts, and the IT units.
Besides, such misalignment would prevent the identification of overlapping and gold-plating issues in the underlying regulation, which would otherwise lead to regulation removal or simplification, and of their effects in the metadata model (data fields duplicated in the logical data model with almost the same business content), which would otherwise allow the metadata model to be trimmed.
With the conclusion that all three layers should be present and linked to one another in the dictionary, we appreciate that discussion can take place at a later stage as to whether to start with the semantic or the syntactic layer. From the industry perspective, we consider there are valid arguments for taking the semantic layer as the starting point.
In our opinion the single regulatory dictionary is at once the foundation and the backbone of the whole concept of integrated reporting. The EBA should therefore thoroughly investigate the pros and cons of the two concrete approaches to this dictionary paradigm: the DPM (under the scope of the refit project) and the BIRD. The technical advantages, the usability in a digital reporting environment and the ease of plugging into the CAs' and reporting institutions' reporting processes, among other aspects, should be carefully assessed. We would suggest deploying a technical WG to discuss the matter, with a good combination of experts on reporting, data and IT from the authorities and the industry, to advise the EBA on ways forward that could lead to the integration of the two projects or to the creation of an entirely new dictionary built upon the best practices of the DPM and BIRD.
In the context of the Data Transformation step of the reporting process, we think another way of reducing the burden and easing the feedback-loop problem under the ‘more granular’ approach is for the “transformation of the data” concept, foreseen in the single regulatory dictionary (as a definition of the transformation rules or of an output layer) and in the Central Data Collection Point (as a technical transformation tool/engine), to be used not only by the CAs but also by the reporting institutions as the unique transformation tool to derive templates, whether ITS or ad hoc, from granular datasets.
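To make this idea concrete, the following is a minimal sketch in Python, with purely illustrative field names and a hypothetical template cell identifier (not actual DPM or BIRD syntax), of a transformation rule defined once and executed identically by the institution and by the CA/CDCP:

```python
# Minimal sketch: a transformation rule defined once (as it might be in the
# single regulatory dictionary) and executed identically by the reporting
# institution and by the CA/CDCP to derive one template cell from granular
# data. All field names and the cell id are illustrative assumptions.

loans = [  # granular dataset (one record per loan)
    {"counterparty_sector": "NFC", "performing": True,  "carrying_amount": 120.0},
    {"counterparty_sector": "NFC", "performing": False, "carrying_amount": 30.0},
    {"counterparty_sector": "HH",  "performing": True,  "carrying_amount": 75.0},
]

# The rule is expressed as data, so the same definition can be shared,
# versioned and executed by either party.
rule = {
    "template_cell": "F18.00_r070_c010",   # hypothetical FINREP cell id
    "filter": {"counterparty_sector": "NFC", "performing": False},
    "aggregate": "carrying_amount",
}

def apply_rule(records, rule):
    """Filter granular records and aggregate them into one template value."""
    selected = [r for r in records
                if all(r.get(k) == v for k, v in rule["filter"].items())]
    return sum(r[rule["aggregate"]] for r in selected)

# Both parties running the same rule on the same dataset obtain the same
# value, removing one source of feedback-loop disputes.
print(rule["template_cell"], "=", apply_rule(loans, rule))  # -> 30.0
```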
At this stage, it is difficult to properly assess this aspect given that the draft report lacks concrete details, as mentioned in our response to question 3. A better assessment should take place with a cost-benefit assessment in place.
Banks use multiple dictionaries, usually a combination of external reporting dictionaries from authorities and internal dictionaries. Some banks, for example, have created a '(Global) Data Dictionary' together with implementation model(s) used for internal communication, data exchange and/or an internal data warehouse. Large banks, with their many reporting applications, have created their own data dictionaries based on the different pieces of regulation. BIRD is not used from an operational perspective, as it is not yet complete; however, the dictionary is in certain cases used for definitions in banks' own data dictionaries.
We would like to take the opportunity to clarify that we do not consider the DPM a data dictionary but rather simply a reference for filling in templates. In this context, it is important to also clarify that the DPM is not broadly used as a primary tool for the analysis of reporting templates, let alone as a data dictionary of definitions of the prudential, resolution and statistical data requirements. In fact, based on a survey of EBF members, the DPM is rarely used by banks across Europe and, whenever it is used, it is only for template-analysis purposes and not as a dictionary.
As previously noted in our response to question 6, in our opinion building up a data dictionary should align the semantic definition (business content or business glossary), the syntactic definition (metadata model and logical data model) and the infrastructure or physical level (database, etc.). Decoupling the business glossary from the syntactic layer and the database would misalign the three main units that must collaborate in the reporting process: the regulatory reporting experts, the data experts, and the IT units.
We also consider the definition of financial contract hierarchies to be critical, in the sense that more complex financial contracts should be a combination of simple contracts. A data dictionary should provide details about the business model it depicts. Thus, it should include (see the sketch after this list):
o The data entities
o The data elements (definitions, data types, acceptable values)
o Data entity relations
o Data constraints and validations
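As an illustration of these elements, the following minimal sketch (in Python, with invented names and values) shows a dictionary entry that keeps the semantic, syntactic and infrastructure layers linked and enforces a simple data constraint:

```python
# Minimal sketch of a dictionary entry that keeps the three layers linked:
# semantic (business definition), syntactic (type, acceptable values) and
# infrastructure (physical location). Names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class DataElement:
    name: str
    definition: str                      # semantic layer: business glossary
    data_type: str                       # syntactic layer: logical data model
    acceptable_values: list = field(default_factory=list)
    physical_location: str = ""          # infrastructure layer: table.column

    def validate(self, value) -> bool:
        """Data constraint check derived from the syntactic definition."""
        return not self.acceptable_values or value in self.acceptable_values

perf_status = DataElement(
    name="performing_status",
    definition="Whether the exposure is performing under the EBA NPE definition",
    data_type="enum",
    acceptable_values=["performing", "non_performing"],
    physical_location="dwh.exposures.performing_status",
)

print(perf_status.validate("non_performing"))  # True
print(perf_status.validate("doubtful"))        # False: value outside dictionary
```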
The real added value comes when the data dictionary is broadly defined in terms of data definition, validation, transformations and reporting logic, and is transparent in its creation and implementation in practice. However, we believe the terms used should reflect the language of banks (possibly ready for a human interface) and best match banks' operational systems, not only for reference purposes but as a practically implementable dictionary. For example, supervisory requests such as asking how many rooms a building has are not common practice, since we consider this irrelevant for building valuation.

Furthermore, communication with banks and data deliveries will have to be based on the dictionary (one language). Within the lifecycle of the data ending in the final regulatory reporting we should distinguish three stages:
o Data from sources (including transformations like RWA engine as a source)
o Data in datawarehouse
o Data in reporting environment.
The data dictionary should at least align with the last stage (as referred to in the draft report) but also take the first stage into account, because this is the start of the chain.
A single data dictionary, insofar as it includes all the metrics and attributes used to produce every single regulatory data request, is of the utmost importance for producing the data in a less burdensome and more efficient manner, ensuring a homogeneous interpretation of the requests by the institutions, and addressing the comparability, accuracy and DQ issues that hamper the usability of the data by the CAs. It will also help in the following aspects:
● It can be used as a catalogue to check for redundancies, overlapping and gold plating issues in the current or new data requests.
● It can be used as an inventory of all the metrics and attributes already reported, including their metadata information about refresh rates, last update, etc. that will foster data interoperability.
● Its division into semantic/syntactic/infrastructure layers enhances data traceability and lineage.
● It will facilitate finding the data in the institutions' databases, since every new data request would be univocally defined within the dictionary.
● And in all the following topics as shown in the table under question #11:
o Understanding reporting regulation
o Extracting data from internal databases
o Processing and reconciling data before sending the data to the CA.
o Exchanging data and monitoring regulator´s feedback
o Preparing disclosures reconciled with regulatory data.
● Its role should be to provide a clear and unambiguous description of the reporting requirements and the prevention of multiple interpretations of the same data element.
● It should be the basis for all communication from and to the banks.
● It can support analysis between reports and prevents different definitions for different reports.
● It can also support creating new insights within banks when using uniform definitions.
● It will be the standard, according to which the bank will produce its regulatory reporting.
● It will be the standard with regard to the maximum data requirements a regulator will be able to ask for in the short term.
- Understanding reporting regulation: Significantly
- Extracting data from internal system: Moderately
- Processing data (including data reconciliation before reporting): Moderately
- Exchanging data and monitoring regulators’ feedback: Significantly
- Exploring regulatory data: Moderately
- Preparing regulatory disclosure compliance: Significantly
- Other processes of institutions: Significantly
Please refer to our response (in PDF format) for further details and explanations on each aspect listed above.

Understanding reporting regulation:

 A comprehensive and broad data dictionary will enforce unambiguity at all levels of the data pyramid, with fewer interpretation differences during the reporting process.

 Data definitions provide the business content of data stored in source systems. A common and standard data dictionary can significantly improve the processes of understanding data regulation if it is designed and implemented with a focus on data comparability and on serving all regulatory data chain processes. The data dictionary could then serve as a point of reference. Institutions can improve comprehension of ambiguous or seemingly contradicting data definitions by analyzing their interactions/ relations with concepts that are already well established. This is expected to:

o minimise the need for questions to the authorities and regulators and to free up resources currently employed in the single rulebook Q&A process,
o improve data quality at the level of the individual institution and
o improve comparability of the data received at the level of the authorities.

Extracting data from internal system:

 It helps in selecting and mapping the data to the reporting requirements.
 It reduces the number of alternative data flows (from internal systems to output files) compared with the current multiple and complex aggregations.
 By understanding and determining the data requirements unambiguously, banks will be able to design effective and efficient systems that satisfy both business and regulatory needs. Data extraction will thus become easier, without the need for multiple transformations that increase complexity and operational risk.

Processing data (including data reconciliation before reporting):

 If it fits banks' data strategy, in which reconciliation is well organised, the selection, delivery and reconciliation of data will be relatively simple.
 Reduced number of data transformations and alternative interpretations
 A standardized data dictionary will facilitate the reconciliation and control mechanisms in the data aggregation and processing phases.

Exchanging data and monitoring regulators’ feedback:

 Standardized and well defined data is a precondition for operational excellence.
 Regulators’ feedback would be more specific and traceable. It will be easier to monitor, as it will refer to a mutually intelligible language.

Exploring regulatory data:

 If the ‘define once’ principle applies, regulatory data exploration and reconciliation would become more straightforward.
 It will facilitate BI capabilities within the bank, as it will provide a common, integrated infrastructure with all data elements compliant with a data dictionary understood by everybody.

Preparing regulatory disclosure compliance:

 Again, the ‘define once’ principle will facilitate consistency checks with regulatory requirements.

Other processes of institutions:

 Data quality management
 Business development


Highly important
We consider it of the utmost importance to have a single regulatory dictionary containing every single data point for every regulatory purpose, under every approach (template-driven or granular), and for every competent authority (CA) or central bank, whether at EU or national level. Furthermore, it is crucial to arrange this properly; otherwise the reporting process will continue to require expensive additional reconciliation activities, which will not reduce the reporting burden but increase it.
Highly costly
While we consider this highly costly in absolute terms, the comparison with continuing current practice suggests the implementation costs could be reasonable, assuming the industry is able to decommission the current reporting frameworks and follow the 'one set of data requirements, one definition, one data delivery' principle. We further consider that the investment in moving to a unique regulatory dictionary will pay off by reducing the burden and increasing the quality of the data provided, as well as the feasibility and sustainability of the whole reporting process within all institutions. Moreover, continuing with the current unharmonised and sometimes poorly defined data models, while supervisors require ever more data-driven reporting, is not sustainable and will only result in ever-increasing costs.
High cost reductions
Including the national reporting frameworks in a single integrated approach, such as the one described in this Feasibility Study, will play a major role in reducing costs, since this will eliminate the heterogeneity of the first reporting layer among EU jurisdictions and thus enhance the level playing field and transnational industry competition and consolidation. The cost savings would be potentially large. In particular, for banks operating in more than one European country, with all their specific (additional) requirements and discretions, the potential cost savings would be even more significant. In addition to the dictionary, we also expect a single data delivery model based on the data dictionary to increase the potential for cost reductions.
High cost reductions
Including ad hoc requests in the dictionary will not reduce production costs per se, as ad hoc compliance, given its challenging time-to-delivery and one-off character, does not rely on significant investments. It will, however, reduce the intense use of scarce human resources: when a request is well defined, and the dictionary will play a key role in that, the difficulties of ad hoc reporting are dramatically reduced.
We consider it is missing (i) the benefits of communicating in “one language” and (ii) the “single source of truth” which (iii) can be found in one place. Other benefits could be higher efficiency, better and structured governance within the bank (due to aligned requests from the supervisor).
Under the current BCBS 239 approach there is huge room for improvement in the compliance assessment that the joint supervisory teams (JSTs) perform in the institutions. There is a lack of homogeneous criteria: for some institutions the JST does not consider the ECB Regulatory Reporting Dashboard scoring; for others, compliance is assessed on the capacity to produce management reporting to deal with hypothetical crises, paying no attention to regulatory reporting; and in some cases the only concern is the investment figures the institution will allocate to BCBS 239 in the following years.
In our opinion, a starting point for improving the assessment of BCBS 239 compliance would therefore be a standardisation of criteria across JSTs, so that institutions' efforts produce a real improvement in the reporting processes.
But beyond the assessment, there are other obstacles to real improvement of reporting under the BCBS 239 approach. The lack of a common regulatory data dictionary, the duplication of data requests, gold plating, etc. are major obstacles to creating a maintainable Trusted Data Source (TDS) with sufficient data quality standards and traceability. Only a thoroughly designed reporting process, integrated and centrally maintained, can produce meaningful and sustainable BCBS 239 compliance in institutions' reporting processes. In that sense, the common dictionary proposed in this paper, the reduction in overlaps and gold plating, the homogenisation of an input layer in a central data collection point and the aim of avoiding the first reporting layer are big steps in the right direction.
Currently, there is granular reporting in place (such as AnaCredit, loan tapes, etc.) that is compliant with BCBS 239. In the context of the options presented, the aim is to broaden the granular reporting scope. This will be helpful for capturing data lineage, a BCBS 239 requirement. The greater the granularity, the more consistent reports will be, but data quality requirements might become more stringent and harder to implement.
As a final point, it is important to stress that, while evolving towards more granular reporting, key conditions should remain in force to ensure success:
o Calculations: the calculation of legal ratios, e.g. RWA, capital, LCR, NSFR, etc., as well as of underlying risk metrics, e.g. PD and LGD, should continue to be done only by the banks, since they are legally responsible for these calculations and should remain so, as the results are crucial for their regulatory compliance.
o Transparency: where the regulator makes aggregations, simulations or alternative calculations, banks should be fully informed, by getting full insight into the methods and data used by the authorities and sufficient time to analyse and challenge the results obtained.
o Data protection: as the consequences of data leaks are even more serious for transaction-level data than for aggregated data, the highest possible levels of data protection are necessary.
o Timing of data deliveries: not all granular data becomes available at the same time (calculated data requires more time), so this must be considered by the authorities.
o A cost-benefit study should prove that more granularity, considering the above remarks, is indeed more cost efficient.
  • statistical
  • resolution
  • prudential
We consider that the path of more granularity must be explored for all reporting purposes. For prudential reporting, there are legal constraints that must be carefully assessed, so while the legal ratio templates should probably be kept as they are, there is leeway to replace the drill-down templates with granular collections, thereby enhancing the data exploration capabilities of the CAs.
In the statistics framework, the current design of the IReF includes a fully granular approach, helped of course by the absence of consolidation complications. For the resolution framework, a hybrid solution must be designed: on one side there are legal ratios/figures such as MREL and TLAC, while on the other there are fully granular requests such as the bail-in-able instruments, the BRRD file for derivatives, and the valuation-for-resolution capabilities.
Nevertheless, the decision on which templates/requests can be replaced by granular datasets, the design of the level of granularity and the DQ rules to be applied, must be conducted in a very thoughtful way to avoid trial and error approaches that could lead to unnecessary and wasted investments.
A last point to note is that it is also important to take proper measures ensuring that banks are and remain compliant under all other legislation (e.g. GDPR).
option 2
Option 2 is the most feasible and preferred choice on the way to more granularity. We consider that both the CAs and the industry have to assess not the pros and cons of granularity, which are fairly clear, but how far to go under option 2 in order to reach most of the advantages of this approach without making it infeasible and unmaintainable in terms of burden. Since we appreciate it is early to draw firm conclusions on this point, it is our view that option 2 should be the minimum level to aim for. Continuous reassessment should take place throughout the process, exploring how close we could also get to option 3.

The main challenges of the granularity approach under option 2, in ranked order, are the following:
1. Impact on third countries.
2. Not all the accounting balances can be explained from a loan-by-loan granular approach.
3. DQ rules: good enough agreement.
4. Legal constraints.

#1 For the more diversified European banking groups with subsidiaries in third countries, we would like to flag that although at individual level the approach may be considered appropriate, with the benefits of harmonisation outweighing the costs, this is not the case at consolidated level for these banking groups, as it would force subsidiaries in third countries to prepare excessively granular information exclusively to respond to European regulations/requirements that differ from those required locally in third countries, are defined under different criteria, and are thus very difficult to harmonise.
Therefore, we believe that requesting greater granularity at the consolidated level for third countries would be questionable in terms of:
i. the usefulness that this level of detail could provide to regulators/supervisors;
ii. the feasibility/capacity of moving and managing all such information (we are talking about millions of contracts) from third countries to institutions, regulators and supervisors;
iii. the legal constraints on the exchange of granular data from third countries with EU competent authorities.

#2 As the EBA explains in some paragraphs of the paper, there are some considerations to acknowledge when going granular. The first is that some balance sheet items cannot be explained on a loan-by-loan basis. The three main reasons are accounting-level adjustments due to last-minute decisions at end-of-month; the misalignment between accounting figures and the aggregated amortised cost of loan-by-loan datasets; and accounting figures that are not explained at granular level because they include so many granular items that it is not feasible to drill down. Therefore, to match a granular approach with aggregated-by-nature figures, such as balance sheet accounting figures, two datasets with different levels of granularity are needed: one with a loan-by-loan (or transaction-level, where needed) approach and another with a more aggregated scope, so that the sum of the figures in the two results in the balance sheet figures. The latter must be carefully monitored by the CAs, so that for each institution the percentage of balance sheet figures that cannot be explained from the loan-by-loan dataset at least remains steady.
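A worked toy example, with purely illustrative figures, of how the two datasets would reconcile to the accounting anchor and how the unexplained share could be monitored:

```python
# Minimal sketch (illustrative figures only) of the two-dataset approach: a
# loan-by-loan dataset plus a more aggregated "residual" dataset whose sum
# must reconcile to the accounting balance sheet figure, with the unexplained
# share monitored over time as suggested above.

loan_level_total = 9_400.0        # sum of amortised cost over the granular dataset
aggregated_residual = 550.0       # end-of-month adjustments, non-drillable items
balance_sheet_figure = 10_000.0   # accounting (anchor) value

unexplained = balance_sheet_figure - loan_level_total - aggregated_residual
unexplained_share = unexplained / balance_sheet_figure

print(f"unexplained: {unexplained:.1f} ({unexplained_share:.2%} of the anchor)")
# The CA would track this share per institution and flag it if it drifts upward.
```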

#3 Under a more granular perspective of regulatory reporting, the DQ approach has to be carefully assessed and agreed between the industry and the CAs. DQIs (data quality indicators) must be designed to measure the overall quality of granular datasets, keeping in mind that 100% DQ is aspirational and that pursuing it could make the whole concept of interoperability under the integrated reporting approach infeasible. Reaching 100% DQ would absorb a huge amount of investment from the reporting agent's annual budget, so progress demands a commitment on what level of quality is enough.
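The following minimal sketch, with assumed field names and thresholds, illustrates how DQIs could be measured against agreed "good enough" levels rather than an aspirational 100%:

```python
# Minimal sketch of DQIs measured against agreed thresholds. Field names and
# threshold values are assumptions for illustration only.

records = [
    {"loan_id": "L1", "carrying_amount": 100.0, "reported_on_time": True},
    {"loan_id": "L2", "carrying_amount": None,  "reported_on_time": True},
    {"loan_id": "L3", "carrying_amount": 40.0,  "reported_on_time": False},
]

def completeness(records, attr):
    """Share of records with a non-null value for the given attribute."""
    return sum(r[attr] is not None for r in records) / len(records)

def timeliness(records):
    """Share of records delivered on time."""
    return sum(r["reported_on_time"] for r in records) / len(records)

# "Good enough" levels agreed between industry and CAs, per indicator.
thresholds = {"completeness": 0.98, "timeliness": 0.95}

dqi = {"completeness": completeness(records, "carrying_amount"),
       "timeliness": timeliness(records)}

for name, value in dqi.items():
    status = "OK" if value >= thresholds[name] else "remediation plan needed"
    print(f"{name}: {value:.2%} (threshold {thresholds[name]:.0%}) -> {status}")
```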

#4 As the EBA states in some paragraphs of the paper, there is a legal constraint in that responsibility for the prudential templates and legal ratio information resides with the reporting institutions. To allow some of the templates to be replaced by, and derived from, granular datasets, these legal constraints must be removed and, of course, a new governance framework for the derived templates must be established.
We consider it too early to answer such a question. First of all, both the CAs and the industry have to discuss where to start on the way to granularity. For instance, the ECB has started to design a new granular approach to all its template-driven collections for statistical purposes under the IReF project; in the same way, the remaining CAs should define which of the current templates are candidates to be replaced by granular datasets and then deploy a collaborative working group with the industry to define the level of granularity and the layout of the input layer of the data collection. Besides, the design of that data collection layout, level of granularity, etc. must comply with the single data dictionary and with the principle of scalability, which could lead to a flexible design in which new data fields can easily be included to extend the granularity concept to other templates and other reporting frameworks.
- Collection/compilation of the granular data: Highly
- Additional aggregate calculations due to feedback loops and anchor values: Low
- Costs of setting up a common set of transformations: Highly
- Costs of executing the common set of transformations: Medium
- Costs of maintaining a common set of transformations: Highly
- IT resources: Highly
- Human resources: Highly
- Complexity of the regulatory reporting requirements: Medium
- Data duplication: Medium
- Other (please specify): Medium
  - Anchor values and misalignment between granular and accounting figures
  - Complexity of DQ issues
  - Homogenisation issues (from NGAAP to IFRS)
- Reducing the number of resubmissions: Low
- Less additional national reporting requests: Highly
- Further cross-country harmonisation and standardisation: Highly
- Level playing field in the application of the requirements: Highly
- Simplification of the internal reporting process: Low
- Reduce data duplications: Medium
- Complexity of the reporting requirements: Medium
- Other (please specify): Highly
  - Fewer ad hoc requests
  - More stability of the requests, as granular datasets are in general more stable than templates due to interoperability
As discussed in our answer to question 6, we would like to remark that current regulatory reporting processes are not only costly but very burdensome, and that assessing the costs and benefits in monetary terms is beyond the capacity of the institutions. Nevertheless, as we agree that granularity is the strategic approach to relieve institutions of some of the more burdensome requests, such as ad hoc ones, we think a stepwise design of the transition from the current template-driven approach to the data-driven approach is key to avoiding the high-cost rework that could result from a poorly designed or hasty race for granularity.
If existing reporting frameworks are not withdrawn, or gold plating is still allowed, it will be harder for banks to comply with other legislation; moreover, the costs will exceed the benefits and the change will not be worth implementing.
Authorities and reporting institutions jointly
As discussed in the previous question, our vision is that the definition should be made jointly, the responsibility should remain with the reporting institution, and the execution could be done by the CAs and reporting agents in the CDCP.
From that perspective, and keeping in mind that the transformations affected would be only those that create templates under current legal frameworks or an evolution of them (e.g. the current FINREP templates could be split into those kept as templates and those replaced by a granular dataset; for the latter the transformation governance framework would apply, as it would for all new FINREP releases (ITS)), the most challenging issues depicted in the paper would be:
● Legal issues, such as the need to modify current regulation to replace some templates with granular collections, and to create new regulation setting the governance responsibilities for definition, execution, correctness, etc.
● Memoranda of Understanding with third-country CAs, as discussed previously, enforcing the encryption of sensitive data where needed.
● Consolidation and NGAAP issues, which could be largely reduced in most cases with a request based on IFRS-homogenised data that already exists within the institutions (used to create the current consolidated FINREP templates) and a few attributes to univocally identify intra-group transactions (to be eliminated when building the highest-level consolidation templates), as sketched below.
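A minimal sketch, under assumed field names, of how such attributes would let the consolidation step eliminate intra-group transactions:

```python
# Minimal sketch: a few attributes that univocally identify intra-group
# transactions (here, the counterparty LEI checked against the consolidation
# scope) let the consolidation step drop them when building the highest-level
# consolidated figures. All identifiers and amounts are illustrative.

group_entities = {"LEI-PARENT", "LEI-SUB-EU", "LEI-SUB-3RD"}   # consolidation scope

transactions = [
    {"id": "T1", "counterparty_lei": "LEI-SUB-3RD", "amount": 200.0},  # intra-group
    {"id": "T2", "counterparty_lei": "LEI-EXT-001", "amount": 500.0},  # external
]

def consolidate(transactions, scope):
    """Keep only transactions with counterparties outside the group scope."""
    return [t for t in transactions if t["counterparty_lei"] not in scope]

consolidated = consolidate(transactions, group_entities)
print(sum(t["amount"] for t in consolidated))  # 500.0: intra-group T1 eliminated
```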
Creating a partnership in which authorities and reporting institutions jointly develop this further, together with a good governance framework.

• Manual Adjustments:
o defining adequate rules
Creating a partnership in which authorities and reporting institutions jointly develop this further, together with a good governance framework.

• Consolidated/individual figures:
o Alignment of data collection on prudential, statistical and resolution legislation
o Other supervisor scope should be preferred.
Creating a partnership in which authorities and reporting institutions jointly develop this further, together with a good governance framework.

• Different Valuations:
o Harmonization of valuation methods such as fair value, amortized costs etc.
o A unique valuation should be assured.
Creating a partnership in which authorities and reporting institutions jointly develop this further, together with a good governance framework.

• Principle Based Rules:
o Harmonization of the rules across all EU
Creating a partnership in which authorities and reporting institutions jointly develop this further, together with a good governance framework.

• Legal Aspects:
o Alignment within EU territory, e.g. of confidentiality and data privacy legislation, etc.
o Legal aspects should be addressed at national level and taken into consideration at European level definitions.
The feedback loop approach should be closely linked to a comprehensive DQ framework. The feedback loop should consist of two layers or lines. The first should consist of agreeing with every reporting institution that its derived templates make sense and that no material mistakes were made during the transformation process. The second should be the dialogue about the DQIs for that institution: completeness, accuracy, timeliness, outliers and all remaining quality metrics should be discussed, and remediation plans agreed where needed.
The feedback loop should also cover all transformed data based on the central processing and used to generate reports, together with the commonly agreed metrics. Statistical data and aggregated prudential/resolution data should be part of the feedback loop.
The analysis regarding granularity and transformation rules is comprehensive and we do not think that any other areas concerning those issues should be investigated at this stage.
Yes
Yes
o Gold plating, asking for more data
o Different (data) delivery models
o Other reporting timelines
o Different accounting standards
o Different definitions
o Overlapping data requests
o Local differences in regulation
o Different technical formats
o Different consolidation scopes
o Different frequency of the data requests
o Mix of aggregated and granular data
o As information is delivered on different timelines, differences sometimes occur for the same reference date.
o Multiple efforts to generate different reports covering the same subject/scope: reporting to different authorities increases institutions’ reporting burden, mainly due to the differentiation of frameworks and data definitions. Moreover, there are often multiple reporting requests from different authorities for similar information, consuming additional time and resources.
o The obstacles relate to divergent requirements between the various national authorities and relative to the harmonised requirements. This significantly increases the cost of data gathering and of reporting in general.
Multiple dictionaries
As a result of different primary reporting definitions and different technical input layers across EU countries, banks use different semantic and syntactic data definitions, and different semantic and syntactic definitions for the data collection.
Others:
 Country by country or by reporting framework.
 An internal data dictionary.
 Some banks use one conceptual data dictionary with several technical implementations; each system has its own implementation.
Different formats
Banks have different data dictionaries in the various countries, but also one “data glossary” in which they indicate the links between the terms in the different dictionaries.
Very important
It would be of the utmost importance, as the CDCP would streamline most of the major reporting frameworks under one single dictionary and a homogeneously defined collection layer, regardless of the jurisdiction, thus avoiding national primary reporting collection layers and enhancing the scalability and reusability of data and IT regulatory projects for all EU subsidiaries and branches. Besides, as discussed earlier in this document, the CDCP would enhance the interoperability of reported data and eventually reduce ad hoc requests by enhancing data access for all CAs under a common data model.
It is also very important if it means that we can access the calculated data. The regulator and the bank should be looking at the same dataset, with the same definitions, to make analyses and the answering of questions easier. This will prevent communication issues.
o Unique dictionary for all the data collections, regardless of their final purpose, the requesting authority, and the national vs EU nature of the request.
o Single collection layer, avoiding the primary/secondary reporting layers.
o Proper governance to reuse and share already existing data.
o Centralized transformation rules to create derived data/templates.
o Encryption facilities to protect sensitive data from EU and third-country jurisdictions.
o Consistent interfaces for data collection across all jurisdictions and all types of reports
o Same protocols and formats for data exchange between the institutions and the Authorities
o Common roles and access control rules
o Quality assurance
o Quality controls and control framework
o A common clearly defined Data Dictionary covering the data definitions as well as data quality management principles and rules. Reduction of complex data transformation.
Challenges:
 Harmonizing all the national requirements into a single system/data model will prove technically challenging; transferring, managing and using that vast amount of data will also be challenging and will require a well-designed platform.

Costs:
 Switching all existing reporting to that new CDCP will no doubt come at a very high one-off cost and will require a long roadmap to achieve.

Benefits:
 A CDCP where all data requirements are harmonized will allow for a significant cost reduction in setting up and maintaining our internal data flows. It should also allow vendors of COTS banking products to offer an integration to the CDCP in their software (or at least a mapping to the correct data model).
 We also see important benefits in OPEX (operational excellence) related savings for both the regulators and the institutions. We note we would need detailed insight into the setup of the CDCP, i.e. how the shared data is processed and secured by the authorities or their sub-service providers, so we can 1) have adequate insight into how banks' data is protected by the authorities and/or their platform partners, 2) be able to contractually put banks' schedules and policies in place, 3) have the right to audit the system, or 4) be able to obtain another form of assurance.

Other aspects:
 Alignment of Data Definitions across the banks
Costs:
Implementation cost

Benefits:
Clearly Defined Data

Challenges:
Providing the appropriate/aligned information independently of national/local diversifications; standardisation of definitions

 Set up a simple/ plain data transformation model
Costs:
Implementation cost

Benefits:
Minimization of Data transformation

Challenges:
Adopting a model that is common and easily adoptable for the banks and at the same time serves the needs of the European authorities

 Data Model Maintenance
Costs:
Maintenance Cost

Benefits:
A single point of maintenance

Challenges:
Data Model Maintenance scalability/ additions/ requests
While we consider this aspect could be discussed at a later stage of the project, we share some initial views for your consideration:
 A starting point should be a commonly used single granular data model in the European industry.
 Also, both the proposed "centralized system" (5.2.10) and "distributed system" (5.2.11) offer identical benefits to the reporting institutions, while we do consider the centralized system to have a lower TCO. Moreover, if the "distributed system" would again allow for national divergence of data requirements, this solution would be significantly worse than the centralized system from the perspective of the reporting institutions.
Yes, to a large extent
We reply to this question on the assumption that it refers to internal costs. Yes, the industry is prepared to bear the necessary costs of changing its internal systems to embrace integrated reporting, as we consider it the right approach to streamline all the requests and reduce the reporting burden. Nevertheless, we have to keep in mind that this is a significant project that has to be carefully deployed in a stepwise manner and with the participation of all the stakeholders involved to ensure its success.
As to whether the industry would be willing to share the costs of other stakeholders: the industry is ready and keen to provide human resources, i.e. data experts, for the success of the project, but does not foresee bearing costs for the authorities.
- Data definition – Involvement: Highly valuable
- Data definition – Cost contribution: Highly valuable
- Data collection – Involvement: Valuable
- Data collection – Cost contribution: Valuable
- Data transformation – Involvement: Valuable to a degree
- Data transformation – Cost contribution: Valuable
- Data exploration – Involvement: Valuable
- Data exploration – Cost contribution: Highly valuable
- Data dictionary – Involvement: Highly valuable
- Data dictionary – Cost contribution: Highly valuable
- Granularity – Involvement: Highly valuable
- Granularity – Cost contribution: Highly valuable
- Architectures – Involvement: Highly valuable
- Architectures – Cost contribution: Highly valuable
- Governance – Involvement: Highly valuable
- Governance – Cost contribution: Highly valuable
- Other – Involvement: (not rated)
- Other – Cost contribution: (not rated)
Institutions’ contribution to costs should be limited to the engagement of human resources in each of the different elements of the IRF.
The experience of CA and industry collaboration in reporting projects such as BIRD, IReF and loan tape standardisation has shown that the involvement of industry experts provides a wide range of benefits, such as increasing the speed of definition and creating simplifications and shortcuts, and, in more general terms, completes the 360-degree data vision of the projects, avoiding the bias of analysing the problems only from the side of the CAs.
Consequently, the involvement of industry experts in the data definition, data collection, data transformation, data dictionary, granularity and governance issues is of the utmost importance for reaching that holistic approach in the new integrated reporting project, if we want it to be feasible and usable both for authorities and institutions, and if we want it to fulfil the purpose of reducing the regulatory reporting burden.
The part related to cost contribution is not fully clear. Banks expect no new reporting requirements as a result of this initiative, but rather harmonisation of definitions and of the way of submission, i.e. less complexity compared with the current way of working. Banks are keen to contribute to the development of the data dictionary (as specified in 11.3.4).

Further comments on each specific element:

Data definition: Clear definitions help in the implementation and increase data quality
Data collection: To-the-point data collection with a higher value-added purpose
Data transformation: Technical transparency
Data exploration: (i) Transparency. (ii) Data quality improvement
Data dictionary: Please see “governance” and “granularity”
Granularity: Institutions can provide suggestions on the feasibility of increasing the granularity of specific reports and data elements without inducing significant costs to the industry
Architectures: (i) Institutions can provide valuable input in the identifications of options with minimum disruption for the industry, by pointing out their current set up for report generation and dissemination. (ii) Choosing one standard architecture is important.
Governance: (i) Institutions, by being the recipients of all data requests within scope of the IRF and by having identified their similarities during the process of the analysis preceding the report generation, can provide valuable input towards the elimination of duplications in data requests and data transformations and facilitate reusability of data. (ii) Alignment between stakeholders. (iii) Access to the relevant institution data (privacy).

The cost estimate depends on the chosen scenario, varying from full integration to optimising the current DPM and granular reporting frameworks while leaving room for national discretions, etc. Furthermore, it is very institution-specific and requires elaborate analysis; it is also not clear whether implementation or regular costs are being asked about, one-off or recurring, which depends on the decisions on granularity and transformations discussed above.
A push approach
Advantages of the push approach:
o It gives the institution, responsible for the data, end-to-end control of the data process and control of the version history for every cut-off date. The institution thus keeps track of changes between versions and of which issues are fixed with the newest version of the data.
o It gives the reporting institution responsibility for punctuality, as it starts the delivery once all internal checks have been run and the internal control thresholds have been complied with. Institutions thus keep the ‘ready to go’ button.
o In general, the advantages of pull seem overstated in the draft report. While it is true that the authorities would be able to consult the data stored at the reporting institution at any time, it is strange to assume that data would be available faster in a "pull" concept than in a "push" concept. The authorities would still set a deadline by which the data needs to be available, and the reporting institutions would use that time to verify the data before publishing it to the system from which the authorities can "pull" it. So data would not be available significantly faster to the authorities in a pull than in a push concept. Additionally, pull would pose a significant technical challenge: would the authorities require that the reporting institutions' data repositories be built in a certain technology (or perhaps a limited list of accepted technologies), or would they maintain interfaces with all commercially available technologies? How would expected downtime be managed (which differs per institution), or would the institutions be required to guarantee 24/7 uptime of these systems? It is stated that in a pull concept it is easier to send granular data, but sending large amounts of data through an API/web service seems more challenging than a managed file transfer, which is typically used for a push. We consider it feasible for a significant set of granular data to be sent to the authorities in a push concept, as is done with AnaCredit now.
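To illustrate the ‘ready to go’ button in the push concept, a minimal sketch follows; the checks and the transfer call are placeholders, as the actual DQ rules and transport mechanism are out of scope here:

```python
# Minimal sketch of the push concept described above: the institution runs its
# internal checks first and only then presses the "ready to go" button,
# keeping end-to-end control and version history. The transfer step is a
# placeholder for whatever managed file transfer mechanism would be used.

def internal_checks(dataset) -> bool:
    """Placeholder for the institution's own DQ checks and control thresholds."""
    return all(r.get("carrying_amount") is not None for r in dataset)

def push_submission(dataset, version: int, deadline_met: bool) -> str:
    if not internal_checks(dataset):
        return f"v{version}: held back, internal checks failed"
    if not deadline_met:
        return f"v{version}: late, escalate internally"
    # managed_file_transfer(dataset)  # placeholder: actual transport out of scope
    return f"v{version}: pushed to the collection point"

data = [{"loan_id": "L1", "carrying_amount": 100.0}]
print(push_submission(data, version=1, deadline_met=True))
```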
Obstacles/challenges:
The main challenge relates to the minimum harmonization principle, which should ideally be removed completely from national/EU reporting regulation. The NCAs and the NCBs would probably oppose any trimming of their national particularities.
Obstacles/challenges:
- GDPR
- In this context, we have to keep in mind that third countries’ data protection regulations must also be taken into consideration if we move towards a more granular approach.

Possible solutions:
- Either report aggregated data for natural persons, or report data using anonymization or pseudonymization techniques that do not permit the identification of data subjects (see the sketch after this list).
- The third-country issue could be addressed through bilateral negotiation between EU authorities and third countries’ NCAs, ensuring the protection of sensitive data in the CDCP via a reciprocity agreement or the use of encryption procedures.
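As a purely illustrative sketch of the pseudonymization technique mentioned in the first bullet, the Python fragment below replaces a natural-person identifier with a keyed hash before granular data leaves the institution. Because the secret key stays with the institution, the recipient cannot re-identify the data subject, while linkability across reports is preserved; the key, field names and values are all hypothetical.

```python
# Hypothetical pseudonymization of a natural-person identifier with a keyed hash.
import hashlib
import hmac

SECRET_KEY = b"institution-held secret, never shared"  # placeholder key


def pseudonymize(identifier: str) -> str:
    """Deterministic pseudonym: the same person always maps to the same token,
    so records remain linkable across reports without revealing the identity."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


record = {"debtor_id": "NL-123456789", "exposure_eur": 250_000}
record["debtor_id"] = pseudonymize(record["debtor_id"])  # token replaces the raw ID
```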

Obstacles/challenges:
In our opinion, the transformation rules from granular data to templates should first be agreed between the industry and the CAs; the execution of the transformation rules could then be conducted in the CDCP; and the outcome should be distributed to the CAs and the reporting institution at the same time. Only with these three steps in the transformation process would the reporting institution agree to keep responsibility for the data derived from its granular datasets. A minimal sketch of this process follows below.
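The following minimal Python sketch, with hypothetical rule IDs, field names and figures, illustrates the three steps described above: a transformation rule agreed and versioned in advance, executed centrally on the granular records, and the same outcome distributed to the CAs and the reporting institution so the institution can verify what it remains responsible for.

```python
# Hypothetical agreed transformation rule: granular records -> one template cell.
from dataclasses import dataclass


@dataclass
class GranularExposure:
    counterparty_sector: str
    carrying_amount_eur: float


RULE_ID = "TPL-01.00/R010-C010/v1.0"  # industry and CAs sign off on this exact version


def run_rule(exposures: list[GranularExposure]) -> dict:
    """Agreed rule: the template cell is the sum of carrying amounts
    for household counterparties."""
    cell_value = sum(
        e.carrying_amount_eur
        for e in exposures
        if e.counterparty_sector == "household"
    )
    # The identical result would be distributed to the CAs and to the
    # reporting institution at the same time.
    return {"rule_id": RULE_ID, "cell_value_eur": cell_value}


portfolio = [
    GranularExposure("household", 120_000.0),
    GranularExposure("non_financial_corporation", 500_000.0),
    GranularExposure("household", 80_000.0),
]
print(run_rule(portfolio))  # cell_value_eur: 200000.0
```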
Obstacles/challenges:

In addition to the items already mentioned in the CP, in general terms:
o Harmonization of reporting requirements is attempted on top of the non-harmonized legal frameworks of the different banking legislators created by the European Commission in response to the great financial crisis and the euro crisis, i.e. the ESA framework as well as the SSM and SRB. Each legislator has set its own reporting requirements and standards, which frustrates the creation of an integrated reporting framework. Furthermore, DGS frameworks are still member-state specific and, although the ITS for FINREP contains a link with ESCB statistical reporting, the reporting frameworks and reporting scope are different. In addition, we note that certain reports are template-driven while more and more supervisory requests are data-driven. When requesting loan tapes, supervisors do not appreciate these differences in scope and frequency, with the result that banks cannot deliver what is requested.

o Furthermore, the level one text of EU legislation does not always sufficiently reflect issues related to data-driven reporting, such as the use of consistent definitions across legal acts. This should either be adjusted in the level one text itself or be delegated to the ESAs so that a data-driven reporting framework can be designed, issued and managed by them. We understand from fintech companies that are tagging the CRR that inconsistencies are also being discovered in the legal framework itself, now that they are able to look at this regulation at a very low level of detail.

o Central data collection point
A college agreement is needed between the banking supervisors on the central data collection point and on who has access to it. It should not be difficult to arrange this once the European Commission sees the benefits.
Potentially, additional legislation needs to be set up to govern the handling of personal data by supervisors. As data should fit the purpose, personal data should normally not be a reporting requirement.

o Access to granular data by banking supervisors
As Europe is already in a data-driven world, we do not foresee an issue with data access barriers as long as personal data is not an object of storage. The access criteria should be well defined. It might also be necessary to provide access to national statistics bureaus, which are an important source for assessing economic trends.

o Responsibility of institutions
Institutions are responsible for providing correct, accurate and complete data to supervisors, so the compilation of the data file is the responsibility of the institution. Once the institution has handed the data file over to the supervisor, the data belongs to the supervisor: after the handover, the institution has no control over what operations are performed on the data, unless those operations are explicitly agreed between the institution and the supervisor.

o We foresee legal constraints regarding data of foreign subsidiaries, in particular from outside the Eurozone (which covers the ECB (SSM) and the SRB) or the EU (which covers the ECB as central bank and the non-euro NCAs and NRAs). We welcome the attention to GDPR compliance but note that we would expect the data collection to exclude personal data, amongst others because the authorities' powers to request personal data on a regular basis are very limited, and personal data regulations require requests from public authorities to always be in writing, reasoned and occasional, and not to concern the entirety of a filing system or lead to the interconnection of filing systems (recital 31 GDPR). We expect the European Data Protection Supervisor to be involved in this initiative at an early stage if personal data are included in the collection.
In our opinion there are many benefits in the ‘Agile Coordination Mechanism’ as proposed in the draft report, but the prerequisites for it to be feasible are the common dictionary and the CDCP. We further welcome the central coordination mechanism for data requests and consider it a significant improvement over the current situation. We find that the sequential approach has so far worked quite well but is reaching the end of its lifetime, as supervisors are requesting more and more data from institutions.
As stated in the draft report, consistency of definitions is key in an integrated framework. Each new element has to be added to the dictionary following the same consistent approach and selection, so indeed centrally managed: we favour a centralised approach. Even though we understand that competent authorities and banks have made investments in the past, we consider setting up a centralized framework as a new foundation preferable, even where this requires investments, as we are of the view that all will benefit in the future and that data-driven reporting requires centralisation to be consistent and efficient. We are therefore not in favour of a hybrid approach. Also for national and ad-hoc requests, the hurdle for deviating from the centralised dictionary should be high; a simple cost assessment does, in our view, not fit the concept. When the framework reaches maturity, more and more definitions will be included in the centralised framework, meaning that less need should arise for national and ad-hoc requests. When new definitions emerge, they should go through the centralised framework. The only exception, in our view, is a crisis situation as described in paragraph 398. Continuing this line of thinking, it will be understood that we are not in favour of paragraph 404 BS 406. In our view, an authority should not have the option not to include existing national data requirements in the integrated reporting framework. In case the EBA is of the view that this is inevitable, clear examples should be provided. If authorities were allowed, without strict criteria, to work around the central data hub, this would create inefficiencies and would be inconsistent with a data-driven reporting concept. Please note that banks, too, will have to earn back the investments made to facilitate the highly appreciated concept of a central data hub; if national competent authorities can easily circumvent this concept, we fear that banks will not benefit sufficiently.
We further need to ensure that Category 4 type B requests (non-recurring / not present in the Data dictionary / not collected through the CDCP) are kept to a minimum, or else we run the risk of a parallel, segregated reporting environment that will eventually reduce the effectiveness of the IRF.
In the meantime, we consider that discussions should take place on some kind of coordination committee for new data requests, in order to realise some of the better-governance benefits: cross-checking newly requested data against data already existing in any CA database, creating tactical mechanisms for data sharing among CAs, and obtaining advice from the industry regarding the feasibility of a request, the timeframe for delivery (to avoid requests overlapping with peaks in the reporting process), simplification strategies based on management data, etc.
This advisory role for the industry should be included in the ‘Agile Coordination Mechanism’ as a mutually beneficial aspect for both the CAs and the industry.
We further agree that the Joint Committee should be a forum of authorities involved in the efficient development and implementation of an integrated reporting system. This should be an EU-wide governance body with involvement of the industry, ensuring the new system is feasible for both authorities and industry. Discussion should certainly take place about the specific role of the industry, for which we are ready to propose possible approaches.
Given that the formal establishment of this Joint Committee by the European Parliament and Council may take time, we suggest informally establishing a Joint Committee in the meantime to kick off the work and help steer the main aspects of the integrated reporting system. This could also take the form of an informal coordination mechanism, as noted in the Discussion Paper. Either way, the industry should have a role.
For years, the EBF has interacted with different stakeholders within Europe, offering the industry’s reporting expertise with the aim of jointly developing the future of reporting. In recent years this close cooperation has, at least from our perspective, been very useful and has already produced fruitful results. Also, as the BIRD concept has shown, banks can add real value to the process, bringing with them the operational knowledge of collecting the data for the different supervisory reports.
We agree with how the EBA has described the ‘Agile Coordination Mechanism’, as it represents a simple and efficient way to achieve better governance of new data requests and to use the full capabilities of the CDCP and the common dictionary. It could, for example, start by defining the data definitions per competent authority, as the objectives of data requests can differ, and then identify and eliminate any overlap. Using a unified code for each data definition will help (and will facilitate machine readability). When an authority wants to add a new definition, it should clearly state why, in its opinion, this definition is not already in the central dictionary. A board of supervisors for prudential and statistical reporting should then judge the new requests and, when they are approved, ensure that the data element is added to the central framework.
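As a purely illustrative sketch of this gatekeeping step, the Python fragment below models a central dictionary keyed by unified codes, where a proposed new definition is checked against existing entries before being forwarded for approval. The codes, definitions and the naive string-equality overlap check are hypothetical simplifications; a real process would involve semantic review by the board of supervisors.

```python
# Hypothetical central data dictionary keyed by unified definition codes.
DICTIONARY: dict[str, str] = {
    "EBA.DD.0001": "carrying amount of the exposure, gross of provisions",
    "EBA.DD.0002": "counterparty institutional sector (ESA 2010)",
}


def propose_definition(code: str, definition: str, justification: str) -> bool:
    """An authority must state why its definition is not already present;
    overlapping proposals are rejected and mapped to the existing code."""
    for existing_code, existing_def in DICTIONARY.items():
        if definition.strip().lower() == existing_def.strip().lower():
            print(f"Overlap with {existing_code}: reuse the existing code.")
            return False
    # At this point the board of supervisors for prudential and statistical
    # reporting would judge the request before the element is added.
    print(f"Forwarded for approval: {code} ({justification})")
    DICTIONARY[code] = definition
    return True
```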
Yes. For instance, the SSM Covid-19 templates are recurrent but not indefinite and do not stem from regulations. Such ‘recurrent ad hoc’ data requests should be included in this category.
Also, there is another category not included in ‘Annex I: Classification of reporting requests’: granular datasets that the reporting institution has to produce and keep, but which are collected by the CAs only on demand or in a fire-drill exercise. Examples of this category are the valuation-for-resolution dataset for the SRB and the granular datasets of protected deposits under Deposit Guarantee Schemes.
Furthermore, we consider some flexibility is required. Therefore, instead of one category 2, we propose two:
- Category 2a: SSM requirements
- Category 2b: no SSM requirements
In our opinion it is quite difficult to fit every single data request into a set-in-stone categorization, so it would be wise to be flexible, scrutinising each ad hoc request to determine whether it fits the integrated reporting approach or not.
Also, non-recurring (ad-hoc) data requests should be avoided. We consider that sufficiently strong safeguards should be put in place to prevent the misuse of category 4 items.
In our opinion, this matter of discussion should be addressed at a later stage.
Other (please explain)
The use of RegTech varies among banks: some make no use of it, some make limited use of it to support data collection and data transformation, and others make substantial use of it at each step of the reporting process.
Data transformation
Banks see data transformation as the main part of the reporting process in which RegTech could provide benefits. That said, bringing all components together in a reporting service is also seen as potentially beneficial.
Yes, we agree with the conclusions.
Regarding further challenges, and in particular concerning the future of the project, we strongly advise that serious consideration be given to using a qualified third party, e.g. a consulting firm with proven experience in developing regulatory reporting solutions. We are convinced that such a step would substantially speed up the process. Having a qualified party play the role of facilitator, especially on the technical aspects, would give this complex project, which involves many stakeholders, the significant boost it needs to become more concrete and realistic. Otherwise, there is a risk that slow progress jeopardizes the future of the project.
We also see the standardised definitions from the regulator (requiring the data dictionary in the broad sense, as specified in 11.3.4), supported by an integrated EU-wide governance, as another relevant challenge.
in-house
We expect larger institutions to opt for in-house development for all steps of the regulatory reporting process. It should also be taken into account that the RegTech vendors' role will be limited once the integrated, standardized reporting architecture is implemented. They could retain a role in data exploration, allowing institutions to monitor data and answer questions from authorities. For smaller institutions, RegTech could play a role as a (full) service provider for (integrated) reporting purposes.
We consider the only parties that can do this are the competent authorities, by providing an integrated and well-defined data model. RegTechs could provide support through tools that can easily assess the consistency of data elements and tag them to definitions in the level 1 or level 2 text.
Yes
We agree that a harmonized data model is a base requirement and consequently the necessary first step for using RegTech. In fact, we consider data standardization under one common dictionary to be the cornerstone of any step forward in the integrated reporting project.
Francisco Saravia
E