All entities considered within the regulatory scope of consolidation for any purpose of the prudential authorities, and all stakeholders in the areas of statistics, prudential supervision and resolution
As in Answer 1, we prefer a comprehensive approach covering the requirements of European and national prudential authorities (EBA, SRB, SSM, ESCB statistics, BIS, national central banks and supervisory authorities, etc.). National data requirements (without an opt-out option) need to be included
The scope is comprehensive and can be used as a starting point for a discussion with the industry at European level, with priority given to the data dictionary and granularity.
In general, many key aspects have been addressed, although the considerations for the procedural handling of a more granular reporting system, including data validation/corrections and reporting follow-up processes (in this context, the considerations on feedback loops), could be further elaborated in terms of their substance.
Equally, the options for taking proportionality into account in a granular reporting system should be addressed in greater detail.
Training / additional staff (skills)
Changes in processes
Changes needed in the context of other counterparties / third-party providers
Time required to find new solutions
Other (please specify)
Data Dictionary - Semantic level
Data Dictionary - Syntactic level
Data Dictionary - Infrastructure level
Data collection - Semantic level
Data collection - Syntactic level
Data collection - Infrastructure level
Data transformation - Semantic level
Data transformation - Syntactic level
Data transformation - Infrastructure level
Data exploration - Semantic level
Data exploration - Syntactic level
Data exploration - Infrastructure level
Necessary validation prior to implementation (negative example: procedure in the case of definitions for AnaCredit purposes)
In the case of a transition to more granular reports, a long phase running in parallel to the current system/process is needed
ESBG members use the EBA data point model (DPM) and internally developed data dictionaries based on regulatory and statistical requirements. Some members have to use a predefined data model from the National Bank for external reporting. The respective data model from the National Bank also includes a reporting layer.
The data dictionary should at least cover the input and output parameters, including the transformation logic, to ensure data lineage. It is important that the characteristics are complete and, above all, redundancy-free and unambiguously defined.
For integration, it is very desirable for the national reporting requirements to also be included in the standardised data dictionary, or for the standardised data dictionary developed at European level to be extended on a country-specific basis to include national requirements.
- Business dictionary/definition of terms in business language
- Promotes an understanding of and reconciliation between different reporting requirements
- Defines the data (input interface) for the reporting system
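The lineage requirement described above could be sketched as a minimal data-dictionary entry that records, for each output data point, its source attributes and the transformation that derives it. All attribute names and the example values below are purely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """One entry of a standardised data dictionary (illustrative sketch).

    Capturing input attributes, transformation logic and the output
    attribute together makes the lineage of a reported figure traceable.
    """
    output_attribute: str        # reported data point
    input_attributes: list       # source attributes in the bank's systems
    transformation: str          # human-readable transformation logic
    definition: str              # unambiguous business definition

# Hypothetical example: a total exposure derived from two source attributes.
entry = DictionaryEntry(
    output_attribute="total_exposure",
    input_attributes=["on_balance_exposure", "off_balance_exposure"],
    transformation="total_exposure = on_balance_exposure + off_balance_exposure",
    definition="Sum of on- and off-balance-sheet exposure per client",
)

# A redundancy-free dictionary maps each output attribute to exactly one entry.
dictionary = {entry.output_attribute: entry}
```

Keying the dictionary by output attribute enforces the redundancy-free property mechanically: a second, conflicting definition of the same data point would simply overwrite (or be rejected by) the existing entry.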
Understanding reporting regulation
Extracting data from internal system
Processing data (including data reconciliation before reporting)
Exchanging data and monitoring regulators’ feedback
Exploring regulatory data
Preparing regulatory disclosure compliance.
Other processes of institutions
Understanding reporting regulation - If semantically distinct
Exploring regulatory data - Exploring requests (including any ad hoc requests) is made easier for the supervisor
Other processes of institutions - Risk management in the institution
Moderate cost reductions
Moderate cost reductions
It depends on the lifetime of existing IT systems in the bank; in general, they are understandable
Irrespective of the granularity (depth/scope) of data to be reported, BCBS 239 is complied with. Our response is therefore neutral.
Statistical - In principle, yes (in individual areas); however, granular data should only be reported where necessary, as the more granular the data is, the more cost-intensive the reporting becomes.
Resolution - In principle, yes (in individual areas)
Prudential - In principle, yes (in individual areas); problematic with regard to process-related requirements (requirements highlighted in the discussion paper for data quality assurance via feedback loops and anchor values); preservation of proportionality
Option 2 is preferable under certain conditions; the design (e.g. scope and depth of granularity/associated costs) is still too vague, and only if the current template-based reporting system is eliminated.
Avoidance of multiple reporting processes, i.e. elimination of template-based reporting with summary sheet reports containing aggregate values (as is the case today) alongside an additional integrated, more granular report that ultimately serves only to verify the summary sheet report; otherwise the institutions face an x-fold burden and multiple effort. Process/design of the feedback loops and the required (partial) replication of the transformation rules (if implemented centrally), resulting in a risk of resource costs that remain permanently high compared with the status quo, or possibly rise even higher. Preservation of proportionate design
More likely not feasible (e.g. in terms of the granularity of the FINREP report)
Granular data can be raw data (e.g. name, first name) or derived data (e.g. client classification), as well as partially offset/computed values (e.g. offset collateral, total exposure). In our view, aggregation at client level or at the level of a “group of connected clients” is also still feasible. Any aggregation beyond this, e.g. at institution or segment level, is no longer expedient in our view.
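The aggregation level considered still feasible above can be illustrated with a minimal sketch: granular exposure records are rolled up to the level of a “group of connected clients”, and no further. All client and group identifiers and the exposure figures are hypothetical:

```python
from collections import defaultdict

# Hypothetical granular records: (client, group of connected clients, exposure).
records = [
    ("client_a", "group_1", 100.0),
    ("client_b", "group_1", 50.0),
    ("client_c", "group_2", 75.0),
]

# Aggregation at the level of a "group of connected clients" is still feasible;
# anything beyond it (institution or segment level) would discard the
# granularity that the integrated report is meant to preserve.
exposure_by_group = defaultdict(float)
for _client, group, exposure in records:
    exposure_by_group[group] += exposure

print(dict(exposure_by_group))  # {'group_1': 150.0, 'group_2': 75.0}
```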
No costs (4)
Collection/compilation of the granular data
Additional aggregate calculations due to feedback loops and anchor values
Costs of setting up a common set of transformations*
Costs of executing the common set of transformations**
Costs of maintaining a common set of transformations
Complexity of the regulatory reporting requirements
Other: please specify
- Risk of losing proportionate design; this must still be ensured.
- Process/design of the feedback loops and mandatory (partial) replication of the transformation rules (if performed centrally), resulting in the risk of permanently high and possibly even increased resource expenditure compared to the status quo
No benefits (4)
Reducing the number of resubmissions
Less additional national reporting requests
Further cross-country harmonisation and standardisation
Level playing field in the application of the requirements
Simplification of the internal reporting process
Reduce data duplications
Complexity of the reporting requirements
Other: please specify
Proposed response geared towards Option 2
- Multiple reporting processes if integrated, more granular reporting were required in addition to the summary sheet reports (as is the case today); i.e. the (extensive) elimination of template-based reporting is necessary.
- Process-related costs in connection with feedback loops and anchor values, coupled with additional implementation of transformations by the institutions (also to maintain information symmetry with the supervisor)
- Elimination of proportionate design
- Possible strategic considerations in connection with increased transparency / data sovereignty
- Increased reporting frequencies in connection with procedural handling (defined time frames for reporting submissions, the release process for quality-assured data by the institutions and limited deadlines for queries must be maintained regardless of the technical way of data collection (push/pull))
Harmonised and standardised, ready to be implemented by digital processes (fixed)
Authorities Costs - Risk of additional financial burdens for institutions due to cost allocations, involvement of the industry in an advisory capacity
Authorities Benefits - Binding nature, standardised, understanding of supervisory considerations, information symmetry
Authorities Challenges - Speed of development and continuous adjustments
Authorities Design options/solutions - Adequate implementation periods
Open as regards nGAAP
More mathematical, unambiguous definitions, fewer legal definitions
The principle of confirming anchor values should be followed. Feedback loops are generally problematic, as they will presumably involve very high process-related effort. Anchor values – which are critical for the supervisory assessment – must be adequate, in conjunction with the introduction of tolerance thresholds.
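The interplay of anchor values and tolerance thresholds described above can be sketched as a simple check: an aggregate recomputed from the granular data confirms the institution's reported anchor value if the two agree within a relative tolerance, so that immaterial differences do not trigger a feedback loop. The figures and the 1% threshold are purely hypothetical:

```python
def anchor_value_confirmed(reported, recomputed, tolerance=0.01):
    """Return True if the reported anchor value matches the value
    recomputed from granular data within a relative tolerance.

    A tolerance threshold avoids triggering a feedback loop for
    immaterial differences (e.g. rounding or timing effects).
    """
    if recomputed == 0:
        return reported == 0
    return abs(reported - recomputed) / abs(recomputed) <= tolerance

# Hypothetical granular exposures and the aggregate the institution reported.
granular_exposures = [120.0, 80.5, 49.7]
reported_total = 250.0

recomputed_total = sum(granular_exposures)  # 250.2
# Relative deviation 0.2 / 250.2 ≈ 0.08% < 1%: no feedback loop is triggered.
print(anchor_value_confirmed(reported_total, recomputed_total))  # True
```

A streamlined set of such checks (few anchor values, explicit thresholds) is what would keep the process-related effort of the feedback loops manageable.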
Proportionality approaches in a granular integrated reporting system:
Feasible approaches would be a streamlined set of anchor values, streamlined feedback loops, the introduction of tolerance thresholds, the elimination of entire reports or of individual reports within a reporting area depending on the size of the institution and/or the business model, reporting frequencies, and data topicality requirements
Some of our members report to other authorities in host countries. Where they do so, different local specifics/definitions cause differences that are challenged by the authorities.
- Simplification of information exchange (data sharing/data coordination) between authorities in the sense of “report once”
- In the case of a CDCP, mandatory preservation of communication via national supervisory authority (language barrier in the case of centralised contact, high level of understanding of the savings banks, their business and network structure at national level)
- Incorporation of national reporting requirements into a central data register
- Information security, high availability, access to all data submitted by the institution concerned
Improve governance so as to intensify data sharing/data coordination by adapting the legal conditions for data sharing between the relevant supervisory authorities; setting up a central data register is therefore adequate from the perspective of the savings banks (rather than a centralised data collection). It will be necessary to examine whether it is appropriate to interpose the NCAs between the CDCP and the institutions, so that the national reporting system can also be covered in a single architecture and a single process
not valuable at all
valuable to a degree
Data definition – Involvement
Data definition – Cost contribution
Data collection – Involvement
Data collection – Cost contribution
Data transformation – Involvement
Data transformation – Cost contribution
Data exploration – Involvement
Data exploration – Cost contribution
Data dictionary – Involvement
Data dictionary – Cost contribution
Granularity – Involvement
Granularity – Cost contribution
Architectures – Involvement
Architectures – Cost contribution
Governance – Involvement
Governance – Cost contribution
Other – Involvement
Other – Cost contribution
Data dictionary - In our view, specification of the data definition
Granularity - In our view, specification of the data definition
A push approach
The technical way to supply data as such (push and/or pull) is not decisive from an institution's point of view (although some of the ESBG members would support the push approach); it depends on the respective process-related framework conditions, which are, however, too unspecific in the discussion paper. Hence question 41 is also left without an answer.
For the procedural handling, the following are particularly essential:
- retention of defined time windows for report submissions,
- retention of the release process for quality-assured data by the institutions, and
- definition of limited deadlines for queries/validations.
We would like to answer:
b. Data collection
c. Data transformation
d. Data exploration
Partially; data processing centre clients do not want to invest in reporting at this time
via a service provider
By having data transformations and connecting data points with the underlying regulatory requirements. On data quality, see above