
UniCredit S.p.A.

Banks (credit institutions) subject to the CRR and national requirements. Priority should be given to larger banks (e.g. cross-border institutions) that could benefit from a more harmonised reporting framework.
All regulatory reporting except reports submitted exclusively to NCAs, where harmonisation is hard to achieve and data volumes are insufficient to obtain economies of scale.
The items covered are complete; it could be added under the Data dictionary topic that greater standardisation would facilitate comparability of data at a deeper level, as well as managerial data exploration through the reuse of regulatory data.
(Scale: Not relevant / Somewhat relevant / Relevant / Highly relevant)
Training / additional staff (skills): Somewhat relevant
IT changes: Highly relevant
Changes in processes: Highly relevant
Changes needed in the context of other counterparties / third-party providers: Somewhat relevant
Time required to find new solutions: Highly relevant
Other (please specify): Not relevant
The main obstacles to be overcome are: 1) the adoption of a single, shared taxonomy (to be provided by the authorities), including to the maximum extent the same degree of granularity; 2) the analysis of local peculiarities in terms of business and regulatory requirements (including legal entities outside the EU); 3) change-management issues (i.e. resistance to change); 4) legal barriers (e.g. different data/banking-secrecy frameworks, which could affect in particular the collection of granular data); 5) ICT investments and related costs due to the possible decommissioning of existing solutions.
1) Different solutions in the data model and data transmission (DPM vs ERM): DPM is less usable for data analysis; it is very rigid, and its maintenance costs are significant and recur frequently. Moreover, DPM is prone to redundancy (e.g. Large Exposures: exposures towards a central administration are reported as many times as there are connected groups of clients).
2) For the SRB, only EBA harmonised reporting is mentioned, not the valuation dataset, which is another relevant granular data collection.
3) Standard OSI templates (e.g. the credit risk loan tape) are also missing.
(Scale: Highly agree / Agree / Somewhat agree / Don’t agree)
Data Dictionary - Semantic level: Agree
Data Dictionary - Syntactic level: Highly agree
Data Dictionary - Infrastructure level: Highly agree
Data collection - Semantic level: Highly agree
Data collection - Syntactic level: Highly agree
Data collection - Infrastructure level: Highly agree
Data transformation - Semantic level: Highly agree
Data transformation - Syntactic level: Highly agree
Data transformation - Infrastructure level: Highly agree
Data exploration - Semantic level: Highly agree
Data exploration - Syntactic level: Highly agree
Data exploration - Infrastructure level: Don’t agree
We deem it appropriate to keep, to the maximum extent, the alignment between the semantic and syntactic layers.
The infrastructure-only scenario is not realistic, since it would require an unfeasible aggregation of heterogeneous data.
Reporting costs should be reduced by cutting requests for aggregated information that can be derived from (statistical) granular data, while avoiding micromanagement in the form of full reconciliation and instead foreseeing thresholds below which the data can be considered sound.
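The threshold idea above can be made concrete with a small sketch: instead of demanding full (cent-level) reconciliation, an aggregate recomputed from granular records is accepted when it deviates from the reported aggregate by less than a tolerance. The function name and the 0.5% default threshold are illustrative assumptions, not part of this response or of any regulatory standard.

```python
# Illustrative sketch (assumed names and threshold): accept an aggregate
# derived from granular data when it reconciles with the reported figure
# within a relative tolerance, rather than requiring exact equality.

def reconciles(granular_amounts, reported_aggregate, rel_threshold=0.005):
    """Return True if the aggregate recomputed from granular records
    deviates from the reported aggregate by less than rel_threshold."""
    recomputed = sum(granular_amounts)
    if reported_aggregate == 0:
        return recomputed == 0
    deviation = abs(recomputed - reported_aggregate) / abs(reported_aggregate)
    return deviation < rel_threshold

# Example: loan-level exposures vs. the reported portfolio total.
loans = [1_000_000.0, 250_000.0, 499_000.0]
print(reconciles(loans, 1_750_000.0))  # deviation of about 0.06% -> True
```

In practice the threshold would be set by the authorities per data point or report, which is exactly the "thresholds below which the data can be considered sound" mentioned above.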
This is a stepwise approach: to cover the entire end-to-end chain of the data life cycle, the framework embraces continuous improvement, enabling data dictionaries to grow and improve. We use homogeneous data dictionaries/taxonomies as much as possible. At the same time, to deal with the heterogeneous requests coming from different regulators, our data-lineage practice allows several data dictionaries to be managed in a consistent way.
Very often they are strictly tied to the various applications/layers involved in the reporting framework, and in some cases they are provided by external vendors.
The (syntactic) data dictionary should be a central catalog containing high-level business concepts, as well as their characteristics, which can be easily identified, understood, integrated and updated. A catalog accessible and appropriately updated by any licensed/authorised business expert would avoid data duplication and help build a robust, comprehensive dictionary. Once a certain level of maturity is achieved, a higher level of granularity could be contemplated, and it might be useful to also include the common transformation rules involved in defining a high-level business concept. From a high-level perspective, it should include all framework-specific semantic definitions it supports.
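A minimal sketch of such a catalog entry might look as follows: a business concept with its definition and characteristics, the frameworks that use it, and (at a later maturity stage) the transformation rules that reference it. All field and concept names here are assumptions chosen for illustration, not part of any actual dictionary.

```python
# Illustrative sketch of a central data-dictionary catalog entry.
# Field names, concepts and framework labels are assumptions.

from dataclasses import dataclass, field

@dataclass
class DictionaryEntry:
    concept: str                 # high-level business concept (unique key)
    definition: str              # semantic definition
    data_type: str               # syntactic characteristic
    frameworks: list = field(default_factory=list)       # frameworks that reuse it
    transformations: list = field(default_factory=list)  # optional common rules

catalog = {}

def register(entry: DictionaryEntry) -> None:
    """Add or update an entry; keying by concept avoids duplicated definitions."""
    catalog[entry.concept] = entry

register(DictionaryEntry(
    concept="carrying_amount",
    definition="Amount at which an asset is recognised in the balance sheet",
    data_type="monetary",
    frameworks=["FINREP", "AnaCredit"],
))
print(catalog["carrying_amount"].frameworks)  # ['FINREP', 'AnaCredit']
```

The design point is the single keyed catalog: every framework-specific request references one shared definition instead of redefining it.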
The data dictionary has a central role. It is the starting point for guaranteeing sound usability of data across the different strategic options. It should be the basis for all communication from and to the banks, and could serve as a reference for assessing the feasibility of new data requirements, at least in the short term.
(Scale: Significantly / Moderately / Low)
Understanding reporting regulation: Significantly
Extracting data from internal systems: Moderately
Processing data (including data reconciliation before reporting): Significantly
Exchanging data and monitoring regulators’ feedback: Significantly
Exploring regulatory data: Significantly
Preparing regulatory disclosure compliance: Significantly
Other processes of institutions: Significantly
Understanding reporting regulation: a single, shared data dictionary would strengthen clarity at all levels of the data reporting process, with fewer interpretation differences.
Extracting data: the functional analysis to define extraction rules could start from the data dictionary, with savings, for example, in the capacity required to produce them.
Processing data: the dictionary would facilitate reconciliation and DQ checks throughout the data aggregation and processing phases.
Exchanging and monitoring: regulators’ feedback would be more specific, and DQ could be enhanced.
Highly important
Highly costly
Despite a possible increase in costs in the initial phase, related to implementing the unique data dictionary metamodel and updating all existing systems, we expect a significant cost reduction once the unique data dictionary is consolidated and adopted. The initial data-analysis phase, typically one of the most expensive phases of a project, would be greatly facilitated by a centralised data repository.
Finally, this activity should be seen in a medium-to-long-term scenario, in which the substantial investment costs should be weighed against current (equally high) costs and reporting burden.
High cost reductions
This choice would reduce the current heterogeneity among jurisdictions in the EU and thus enhance the level playing field, in particular for cross-border banks.
High cost reductions
Integrating ad hoc regulatory reporting with harmonised regulation into a unique data dictionary would resolve discrepancies in transformations and in the meaning of business concepts. It would allow, on the one hand, a common understanding and usage of the transformation rules for reporting frameworks and, on the other, the reuse of existing assets as building blocks for future transformations at different levels of granularity.
The dictionary would facilitate the detection of “single source of truth” data flows and strengthen governance across banks of the same group.
Granular data should be required only in a context of strong homogenisation at both the semantic and syntactic levels; otherwise it may generate additional effort.
The definition of a granular dictionary and the related transformations will enhance the aggregation capabilities of reporting institutions.
  • statistical
  • resolution
  • prudential
It may help across the different reporting areas. In terms of priority we propose (i) statistical, (ii) resolution, (iii) prudential, as capital ratios would be included in the set of aggregated data to be calculated both by the regulator and by the institutions.
Option 2
Granularity allows more flexible use of data and helps prevent duplication; that said, a higher level of granularity could be less feasible initially, and Option 1 would be the most feasible at an early stage, when the maturity level is lower and the costs more significant.
Nevertheless, a gradual, carefully defined approach towards Option 2 makes sense and is preferable in the longer term, e.g. gradually removing data duplication and complexity where the benefit/cost ratio is highest.
There are many challenges to be faced with granular data reporting:
- items that cannot be depicted with an instrument approach (loan by loan): e.g. other assets, consolidation adjustments
- DQ issues (DQ indicators to be defined)
- legal constraints (e.g. GDPR, banking secrecy, third countries legislation).
We believe the backbone of Option 2 should be granular data underlying FINREP, since it is crucial for consistency/completeness purposes, less "transformation-intensive", audit-relevant, and a common denominator for many other reporting streams. Other reporting streams could be handled as gradual add-ons to this backbone, aiming to add only data specific to each reporting purpose and to avoid or remove existing data redundancies. Institutions should be given ample room for shaping the onboarding roadmap.
Finally, the ESCB's IReF project, which has wide coverage of banking products/events, could be a good benchmark for future solutions.
(Scale: Highly (1) / Medium (2) / Low (3) / No costs (4))
Collection/compilation of the granular data: Highly
Additional aggregate calculations due to feedback loops and anchor values: Low
Costs of setting up a common set of transformations*: Highly
Costs of executing the common set of transformations**: Medium
Costs of maintaining a common set of transformations: Highly
IT resources: Highly
Human resources: Medium
Complexity of the regulatory reporting requirements: Medium
Data duplication: Medium
Other: please specify: Highly
- DQ issues and management of several accounting principles
(Scale: Highly (1) / Medium (2) / Low (3) / No benefits (4))
Reducing the number of resubmissions: Medium
Less additional national reporting requests: Highly
Further cross-country harmonisation and standardisation: Highly
Level playing field in the application of the requirements: Highly
Simplification of the internal reporting process: Medium
Reduce data duplications: Medium
Complexity of the reporting requirements: Highly
Other: please specify: Highly
More stability in regulatory requests and fewer ad hoc requests
A key aspect, and a potentially significant cost/complexity driver, is avoiding both an excessive period of coexistence between the old and new reporting paradigms and a lack of modularity in the transition approach, which should allow individual banks different options for adopting the new framework.
Authorities and reporting institutions jointly
Harmonised and standardised, ready to be implemented by digital processes (fixed)
The definition should be made jointly, but responsibility for the output data remains with the banks.
There are several challenges mainly related to legal issues:
(i) new regulations with new governance
(ii) MoU with third countries authorities
(iii) data protection management
Higher acceptance thresholds are considered useful.
Manual adjustments should be reduced as much as possible. In this perspective, the use of end-user computing applications (IDP) should be discouraged.
Some consolidated data (portions of templates or entities) could be left to direct contribution by banks (output data), without tracking the transformations from individual data.
Harmonization of metrics required
Harmonization of regulations
Extensive harmonization in the EU
Yes
Yes
Different taxonomies, different rules (semantic, syntactic and technical protocols).
Multiple dictionaries
We are feeding a Group-wide data dictionary with unique business terms (the semantic layer), while the syntactic and technical implementations are disseminated across the legal entities' applications.
Different formats
There are significant variations, both between reporting types and between countries. Nevertheless, we are working to guarantee data lineage between the different layers.
Very important
1) It would overcome the current primary reporting collection layers and enhance the scalability and reusability of regulatory data and IT projects for all EU legal entities.
2) It would foster dialogue between regulators and banks.
3) Ad hoc requests are expected to become less frequent, since data sharing between authorities would be enhanced.
Competent authorities should establish the right governance rules, both to remove possible discrepancies and redundancies over time and to help define a clear, common and consistent evolutionary path for managing changes, aspects which are clearly made difficult by multiple data collection points. Some flexibility should be maintained in order to handle the transition smoothly (in terms of changed cut-off times/frequencies) for the various reporting types/countries.
Among others, the following items are worth mentioning:
- Clear governance for data usage and sharing
- The same protocols and formats for data exchange between banks and authorities
- An agreed DQ framework
- Data encryption



The main costs would fall on competent authorities: a significant change in terms of homogeneous semantics and syntax, and clear governance rules for managing the CDCP and defining new requests. At institution level, we see only benefits after the initial one-off costs of adapting to the new set-up.
Yes, to a large extent
A clear, long-term roadmap should be defined, and this statement of course depends on the complexity of the final scenario. In our Group we are already moving towards the integration of assets and processes.
(Scale: not valuable at all / valuable to a degree / valuable / highly valuable)
Data definition – Involvement: highly valuable
Data definition – Cost contribution: highly valuable
Data collection – Involvement: valuable
Data collection – Cost contribution: highly valuable
Data transformation – Involvement: (no answer)
Data transformation – Cost contribution: highly valuable
Data exploration – Involvement: (no answer)
Data exploration – Cost contribution: highly valuable
Data dictionary – Involvement: highly valuable
Data dictionary – Cost contribution: (no answer)
Granularity – Involvement: (no answer)
Granularity – Cost contribution: highly valuable
Architectures – Involvement: (no answer)
Architectures – Cost contribution: highly valuable
Governance – Involvement: valuable
Governance – Cost contribution: (no answer)
Other – Involvement: (no answer)
Other – Cost contribution: (no answer)
Based on internal projects we are running to integrate architectures, we can estimate at least 150 million euros.
A push approach
Our absolute preference remains a PUSH approach, also considering the high heterogeneity both between different group entities and between reporting processes.
We would not change with respect to a push approach; we see only drawbacks (process, data quality, governance) in a pull approach.
The delivery of granular data for retail customers, given local regulations (e.g. Austria's) that make it impossible to centrally receive granular data for the retail perimeter.
Other (please explain)
Data definition: only partially, with a higher degree for some countries (e.g. Italy) or processes (e.g. AnaCredit).
Data collection and data transformation: only partially, with a higher degree for some countries (e.g. Italy).
Partial/no use is explained by a variety of factors, depending on the country/regulatory process: not available at all (for some countries), not fully developed (for some countries), or available but not useful for our needs (e.g. due to integration complexity).
We believe RegTech development could be useful to increasingly industrialise all four process steps, with the following remarks:
a) data definition (e.g. thanks to common metadata, shared among operators and regulators, and made available "real time" to the market through common tools)
b) data collection (e.g. through common repositories/granular data models, to be used for mapping from internal systems)
c) data transformation (e.g. to commoditize all "deterministic calculations" common across all banks, and subject to periodic updates from regulatory authorities)
d) data exploration (less important for RegTech, since more frequently needed in conjunction with internal data, but potentially useful e.g. to boost data quality checks on granular data)
in-house
Yes, we definitely see this as an interesting option, although currently with a preference for solutions allowing on-premise installation. Our Group already has a preference for RegTech/standard solutions wherever possible to address regulatory requirements, especially when the requirements are of European/cross-border relevance (given the wide geographical scope of the Group, both within the ECB perimeter and beyond).
See question 51.
If data integration refers to the "last mile" protocol vis-à-vis regulatory authorities, we believe this should be a feature of the available market tools.
Yes
Luca Guarinoni
U