
a) National Association of German Cooperative Banks / b) Fiducia & GAD IT AG on behalf of the German Cooperative Banks

We consider that the proper approach is to start with a narrow scope that includes all institutions required to report supervisory, statistical and resolution data.
In light of recent proportionality discussions and the preliminary results of the Cost of Compliance study, cost-reducing measures should be considered for small and non-complex institutions.
The feasibility study should focus on supervisory, statistical and resolution data at the European and national levels. The aim must be to define once and report once with regard to all data required by the authorities.
Given the scope and depth of issues to be considered, it is impossible to determine whether the paper is complete.
The paper summarizes topics to be considered, but it lacks detailed proposals. A quantified evaluation is simply not possible without an in-depth analysis and concrete processes. Furthermore, the technical aspects are, at this point, not the most relevant issues when discussing the implementation of an integrated reporting system. Key aspects comprise, among others, a single, unique and common dictionary and the process design of reporting (layer format, data access at an institutional or supervisory level, data storage, contact partners, submission dates, etc.) as well as possible intersections with the IReF and BIRD initiatives.
Given the current timeline for an IReF implementation by 2024, a certain basic BIRD approach is needed. The results of both IReF and BIRD need to be acknowledged in the future analysis of the feasibility of an integrated reporting system. Furthermore, taking those results into account, business cases could be set up and a valid cost-benefit analysis by the industry achieved.
In addition, the aspect of proportionality is not fully considered. The discussion paper mentions challenges to ensure proportionality and the need to define proportionality in the context of an integrated reporting system with more granular data requirements.
A possible approach could include the following aspects:
The feedback loops and required anchor values mentioned later, in combination with a pull or push approach, leave some room to explore proportionality.
Example:
The number of required anchor values could be reduced for SNCIs, which would entail fewer transformations and less data validation. Furthermore, easily monitorable thresholds could be implemented, as sketched below.
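To make this concrete, a minimal sketch follows. The anchor names, the EUR 5bn limit and the SNCI flag are our own illustrative assumptions, not definitions taken from the discussion paper.

```python
# Minimal sketch: proportionality via a reduced anchor-value set plus an easily
# monitorable threshold. All names and figures below are illustrative assumptions.

FULL_ANCHORS = {"cet1_ratio", "lcr", "nsfr", "leverage_ratio", "total_assets"}
SNCI_ANCHORS = {"cet1_ratio", "total_assets"}  # fewer values -> fewer transformations

def required_anchors(is_snci: bool) -> set:
    """Return the anchor values an institution must compute and validate."""
    return SNCI_ANCHORS if is_snci else FULL_ANCHORS

def breaches_threshold(total_assets_eur: float, limit_eur: float = 5e9) -> bool:
    """Easily monitorable threshold: flag once balance-sheet size exceeds the limit."""
    return total_assets_eur > limit_eur

print(required_anchors(is_snci=True))  # reduced set: less validation for SNCIs
print(breaches_threshold(3.2e9))       # False
```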
Overall, we consider this paper a starting point for further discussions in which the industry should be strongly involved.
(Scale: Not relevant / Somewhat relevant / Relevant / Highly relevant)
Training / additional staff (skills): Somewhat relevant
IT changes: Highly relevant
Changes in processes: Relevant
Changes needed in the context of other counterparties / third-party providers: Highly relevant
Time required to find new solutions: Somewhat relevant
Other (please specify): Highly relevant
As described in our answer to question 3, an evaluation at this stage may not be indicative. Without knowing any process details, we can only roughly estimate the impact of an integrated reporting system.
Besides proportionality aspects (see our comments on question 3), it is difficult to identify further obstacles in detail given the nature of this paper. The paper mentions very important aspects, but it lacks a clear idea regarding processes. In order to quantify costs and benefits, more detailed information is needed on how the data retrieval process is supposed to work.
Take, for example, the push and pull models. Does the pull model entail a real-time data pull? Will there be submission dates? Do institutions have the opportunity to approve the data before it is pulled? Is the data stored at the institution or in a middle layer?
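A schematic sketch of the two retrieval patterns may help locate these questions; the class and field names below are hypothetical, and each open question appears as a comment at the step it concerns.

```python
# Hypothetical sketch of the push vs. pull retrieval patterns; the open
# questions from our answer are marked as comments at the relevant step.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Report:
    data: dict
    approved: bool = False  # open: may the institution approve data before it is pulled?

@dataclass
class Authority:
    received: List[Report] = field(default_factory=list)

    def pull(self, reports: List[Report]) -> List[Report]:
        # Open: real-time or scheduled pull? Is the data stored at the
        # institution or in a middle layer?
        return [r for r in reports if r.approved]

@dataclass
class Institution:
    reports: List[Report] = field(default_factory=list)

    def push(self, authority: Authority, report: Report) -> None:
        # Push: the institution controls timing, i.e. submission dates apply here.
        authority.received.append(report)
```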
The paper correctly includes the evidence from the Cost of Compliance study when describing “overlaps between (EBA/standardised, regular) reporting requirements and reporting requirements of non-standardised/non-regular nature (ad hoc requests) as heavy contributors to the cost of reporting”.
We fully agree and also support the view that increasing granularity does not automatically lead to a reduction in the number of ad hoc requests (e.g. COVID reporting) or of national data requirements. A further analysis of the reduction of ad hoc requests is needed to shed light on the possible benefits of more granularity, especially for smaller institutions.
(Scale: Highly agree / Agree / Somewhat agree / Don’t agree)
Data Dictionary - Semantic level: Highly agree
Data Dictionary - Syntactic level: Somewhat agree
Data Dictionary - Infrastructure level: Somewhat agree
Data collection - Semantic level: Somewhat agree
Data collection - Syntactic level: Somewhat agree
Data collection - Infrastructure level: Highly agree
Data transformation - Semantic level: Somewhat agree
Data transformation - Syntactic level: Somewhat agree
Data transformation - Infrastructure level: Somewhat agree
Data exploration - Semantic level: Somewhat agree
Data exploration - Syntactic level: Somewhat agree
Data exploration - Infrastructure level: Somewhat agree
We agree with the holistic approach that has been described.
The assessment made, on the other hand, leaves us with further questions. How would a non-alignment of all levels deliver a common understanding within an institution? A non-alignment might lead to misunderstandings and further internal transformations of data and hence higher costs. That being said, an alignment and a start at the semantic level seem opportune.
In addition, we want to emphasize the paper's assessment that “it is of utmost importance that prudential and resolution reporting requirements stay fully aligned with the underlying regulations and accounting standards.” Although this sentence appears twice in the paper, an analysis of how to achieve this alignment is missing.
Furthermore, any results from parallel initiatives (e.g. the BIRD survey or IReF) should be taken into account.
As mentioned in our answers above, a specific cost-benefit analysis is not possible given the scope of the discussion paper.
Nevertheless, we anticipate higher costs due to testing phases (parallel data storage and transformation) and cross-validation.
More: EBA DPM (i.e. ITS), ECB (AnaCredit), definitions by national authorities, accounting frameworks.
A data dictionary defines the basic data requirements (e.g. the definition of a loan, accounting value, etc.) and, in addition, the associated aggregates (transformations). It should align the semantic, syntactic and infrastructure definitions so that institutions can implement coherent data process chains.
The dictionary should be the basis for communication between authorities and institutions (one-language approach), at both the national and European level. Furthermore, it should be fully aligned with national accounting standards and not interfere with them.
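As an illustration of what such alignment could mean in practice, the sketch below shows a single dictionary entry carrying all three levels in one record; the field names and the example entry are our own invention, not a proposed standard.

```python
# Illustrative sketch of a single-dictionary entry that aligns the semantic,
# syntactic and infrastructure levels in one record (all names invented).

from dataclasses import dataclass

@dataclass(frozen=True)
class DictionaryEntry:
    concept: str     # semantic level: one definition shared by all frameworks
    definition: str
    datatype: str    # syntactic level: exchange type and unit
    source: str      # infrastructure level: where institutions hold the data

loan_carrying_amount = DictionaryEntry(
    concept="loan_carrying_amount",
    definition="Carrying amount of a loan under the applicable accounting framework",
    datatype="decimal(18,2), EUR",
    source="core_banking.loans",
)
```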
A single and common data dictionary could have a cost-reducing impact if overall regulatory requirements are defined clearly and redundancies are eliminated. In turn, data quality could be improved.
For example, a harmonization of definitions should be analyzed: first, deviations need to be identified and examined; second, the added value of differing requirements should be evaluated; third, data requirements without added value should be deleted.
Given the preliminary results of the Cost of Compliance study, a single and common dictionary might lead to a better understanding of the regulatory requirements and hence might improve data extraction, transformation and monitoring.
Furthermore, comparability and transparency across supervisory and statistical reporting could be improved (including data validation).
(Scale: Significantly / Moderately / Low)
Understanding reporting regulation: Significantly
Extracting data from internal system: Significantly
Processing data (including data reconciliation before reporting): Significantly
Exchanging data and monitoring regulators’ feedback: Moderately
Exploring regulatory data: Moderately
Preparing regulatory disclosure compliance: Moderately
Other processes of institutions: Significantly
A standard data dictionary would reduce the time needed for data validation across different reporting requirements (national vs. European level) and enhance data transparency.
As the preliminary results of the Cost of Compliance study show, understanding the regulation is an important cost driver. Furthermore, a better understanding leads to clearer processes for extracting and processing data.
Highly important
Please see our answers to questions 9, 10 and 11.
Highly costly
Presumably highly costly; no concrete estimate is possible. See our answers to questions 3, 4, 5 and 6.
High cost reductions
Added value can only be generated if European requirements match national requirements, so that data points can be reduced.

If the reporting requirements are not reduced, we expect a rather moderate cost reduction. However, without any further details, no concrete estimate is possible. See our answers to questions 3, 4, 5 and 6.
No estimate possible. See our answers to questions 3, 4, 5 and 6.
According to the preliminary findings of the Cost of Compliance study, ad hoc requests entail high costs for institutions. Hence, integrating those requests might reduce the costs that arise from identifying the required ad hoc data sets. Nevertheless, integration is only feasible if ad hoc requirements are identified in advance, which is rarely the case (e.g. COVID-19 reporting).
We agree to some extent, but at this stage no concrete cost-benefit analysis is possible.
Another benefit: once a unique data dictionary is implemented, national requirements could be reduced if NCAs accept its definitions and extract data of national interest from EU reports.
Further costs: a unique data dictionary needs to comprise all requirements, including those of national accounting frameworks; otherwise, additional compliance costs would arise.
In the banking process, BCBS 239-optimised processes allow individual institutions and groups to report data stemming from a single data warehouse. In the case of aggregation on the supervisory side (option 2), it is unclear how technical consolidation can be carried out in order to compile a group report. There is a concern that group and individual-institution reports will be produced in different applications. This contravenes, for example, Principle 2, paragraph 33 of the BCBS 239 paper: “A bank should establish integrated data taxonomies and architecture across the banking group, which includes information on the characteristics of the data (metadata), as well as use of single identifiers and/or unified naming conventions for data including legal entities, counterparties, customers and accounts.”
Furthermore, under Principle 3, the requirements of paragraphs 36c and 36d could no longer be optimally met: today’s aggregation in the banking process uses, for example, the same technical rules for the balance-sheet statistics that are also used for accounting purposes. This architecture minimises the reconciliation with accounting data required by paragraph 36c.
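The following sketch, with invented entities and figures, illustrates why the single-warehouse architecture matters: solo and group figures derive from the same records with the same aggregation rule, so they reconcile by construction.

```python
# Invented example: solo and group reports from one data set with one rule,
# so reconciliation (cf. BCBS 239 paragraph 36c) holds by construction.

records = [
    {"entity": "Bank A", "position": "loans", "amount": 100.0},
    {"entity": "Bank B", "position": "loans", "amount": 250.0},
]

def solo_report(entity: str) -> float:
    # Individual-institution figure, using the same rule as the group figure.
    return sum(r["amount"] for r in records if r["entity"] == entity)

def group_report() -> float:
    # Group consolidation from the identical data set, not a second application.
    return sum(r["amount"] for r in records)

assert solo_report("Bank A") + solo_report("Bank B") == group_report()
```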
  • statistical
  • resolution
  • prudential
For all reporting areas, the use of granular data can offer a solution for achieving the aim to define once and report once. In other words, a granular approach across all reporting areas is the only way to achieve the objective of reporting once.
Furthermore, cost reduction must be an overarching condition for any granular requirement (see the Cost of Compliance study).
option 2
To identify the main challenges and possible solutions, we need more information on the data processes and requirements. How many anchor values are required, and how would feedback loops be implemented? Who is in charge of the data collection layers, and what data needs to be transformed? The level of granularity needs to be defined.
With option 2, institutions would have to report granular as well as aggregated data and would hence need to keep using aggregation tools. Furthermore, aggregated data in banks might still differ from aggregated data at the authority level, since the aggregations take place in different systems.
In addition, proportionality is not fully defined in this discussion paper. For further aspects regarding proportionality, please see our answer to question 3.
We consider it too early to answer this question. It is important that one common layer comprises all three pillars and that redundancies are avoided.
(Scale: Highly (1) / Medium (2) / Low (3) / No costs (4))
Collection/compilation of the granular data: Highly (1)
Additional aggregate calculations due to feedback loops and anchor values: Highly (1)
Costs of setting up a common set of transformations*: Low (3)
Costs of executing the common set of transformations**: Medium (2)
Costs of maintaining a common set of transformations: Medium (2)
IT resources: Highly (1)
Human resources: Medium (2)
Complexity of the regulatory reporting requirements: Medium (2)
Data duplication: Medium (2)
Other (please specify): Highly (1)
(Scale: Highly (1) / Medium (2) / Low (3) / No benefits (4))
Reducing the number of resubmissions: Low (3)
Less additional national reporting requests: Medium (2)
Further cross-country harmonisation and standardisation: Low (3)
Level playing field in the application of the requirements: Low (3)
Simplification of the internal reporting process: Medium (2)
Reduce data duplications: Highly (1)
Complexity of the reporting requirements: Highly (1)
Other (please specify): Highly (1)
The costs could exceed the benefits as follows:

- Lack of proportionality: banks would likely have to validate more data than before.
- Various reporting processes: implementation costs for the infrastructure supporting push and pull data requests.
- A high number of feedback loops and anchor values.
The authorities
Harmonised and standardised, ready to be implemented by digital processes (fixed)
Authorities should be responsible for defining transformations in any scenario. A further cost-benefit analysis is not possible at this stage.
These are mainly due to differing definitions or regulatory amendments made at short notice. With a single, common dictionary, the number of manual adjustments could diminish if redundancies are eliminated.
Aligning the scope of consolidation and data collection across prudential, statistical and resolution legislation might be a solution.
A harmonization of valuation definitions should be analyzed, but national accounting frameworks must be considered: first, deviations need to be identified and examined; second, the added value of differing requirements should be evaluated; third, data requirements without added value should be deleted.
A harmonization of valuation definitions should be analyzed, but national accounting frameworks must be considered: first, deviations need to be identified and examined; second, the added value of differing requirements should be evaluated; third, data requirements without added value should be deleted.
Anonymised data transformation on the data hub might be a solution, and data should only be shared with interested parties.
Generally, feedback loops could increase costs, since institutions would need to transform granular data in order to compare it with the aggregate data sets produced by the NCA. Feedback loops should focus on aggregates that are already reported and implemented internally by banks (e.g. capital ratios, LCR, NSFR), as sketched below.
To enhance proportionality, SNCIs should face lower data requirements with regard to feedback loops (see also our answer to question 3).
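The sketch below illustrates the feedback-loop mechanism under our own simplifying assumptions (a plain sum as the transformation, an invented tolerance): bank and authority run the same transformation on the same granular data, and the loop only fires on a mismatch.

```python
# Simplified sketch of a feedback loop on one anchor value; the transformation
# (sum), the figures and the tolerance are illustrative assumptions.

GRANULAR_EXPOSURES = [120.0, 80.0, 50.0]  # granular data reported by the bank

def bank_anchor(values):
    # Aggregate already implemented internally (cf. capital ratios, LCR, NSFR).
    return sum(values)

def authority_anchor(values):
    # The same transformation, executed in the authority's system.
    return sum(values)

def needs_feedback(tolerance: float = 0.01) -> bool:
    """Trigger the feedback loop only if the two aggregates diverge."""
    return abs(bank_anchor(GRANULAR_EXPOSURES)
               - authority_anchor(GRANULAR_EXPOSURES)) > tolerance

print(needs_feedback())  # False: identical rules on identical data reconcile
```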
The paper correctly includes the evidence from the Cost of Compliance study when describing “overlaps between (EBA/standardized, regular) reporting requirements and reporting requirements of non-standardized/non-regular nature (ad hoc requests) as heavy contributors to the cost of reporting”; in our case, this mainly concerns national ad hoc requests.
We fully agree and also support the view that increasing granularity does not automatically lead to a reduction in the number of ad hoc requests (e.g. COVID reporting). A further analysis of the reduction of ad hoc requests (especially national ad hoc requests) is needed to shed light on the possible benefits of more granularity, especially for smaller institutions.
In addition, please see our comments regarding proportionality (questions 3, 22 and 27).
At this point, it is difficult to properly evaluate the topics of granularity and transformation because the data processes are not explicitly described. In order to give more feedback, specifics on how data is retrieved, pulled or pushed are needed.

No
No
Given our institutional protection scheme (IPS) structure, a report for the IPS group has to be submitted to another “authority”, our deposit guarantee scheme (DGS). A solution is needed for IPS group reports.
Multiple dictionaries
Currently, there are only dictionaries on the software provider’s side, which are not visible to the banks. It is planned to make the dictionary visible to banks in the following stages of development.
Different formats
Somewhat important
At this stage, the implementation process is not yet clearly described, which is why a valid evaluation is not possible.
Nevertheless, institutions need to be able to access reported data for transparency and validation reasons.
Furthermore, it is important to have contact partners at the national level to avoid language barriers. Additionally, German NCAs have more experience with IPS structures.
At this stage, the implementation process is not yet clearly described, which is why a valid evaluation is not possible (see also question 33).
Further details on data quality management are needed: how will data be validated? Via cross-validation or feedback loops?
At this stage, this question cannot be answered. It depends on which parts of integrated reporting are to be financed by the industry.
(Scale: not valuable at all / valuable to a degree / valuable / highly valuable; no ratings selected)
Data definition – Involvement
Data definition – Cost contribution
Data collection – Involvement
Data collection – Cost contribution
Data transformation – Involvement
Data transformation – Cost contribution
Data exploration – Involvement
Data exploration – Cost contribution
Data dictionary – Involvement
Data dictionary – Cost contribution
Granularity – Involvement
Granularity – Cost contribution
Architectures – Involvement
Architectures – Cost contribution
Governance – Involvement
Governance – Cost contribution
Other – Involvement
Other – Cost contribution
At this stage, the implementation process is not yet clearly described, which is why a valid evaluation is not possible.
A mixed (pull and push) approach
A mixed approach would, on the one hand, enable banks to validate pushed data (e.g. anchor values) beforehand and hence could reduce feedback loops. Pulled granular data, aggregated at EU level, on the other hand, could reduce reporting processes and hence costs.
Most important for an evaluation, however, is the process framework (layer format, data access at an institutional or supervisory level, data storage, contact partners, submission dates, etc.). See also our answer to question 3.
The description of the push and pull procedures above covers the technical issues; no comments are made on the procedural issues and the requirements arising from them.
It follows that in the push process, the data centre and the associated banks keep matters such as delivery dates, frequency of delivery, delivery period, corrective notifications, capacity, resource use, quality assurance processes, etc. in their own hands, whereas in the pull process these issues appear to be largely outsourced or require intensive coordination and clear guidance.
Other (please explain)
Since 2016, we have been developing an all-banking platform with state-of-the-art technologies to fully meet banks’ needs in the medium term. Therefore, we are not dependent on RegTechs.
a) Frank Bouillon / b) Dr. Thomas Ester
a