Response to consultation on revised Guidelines on stress tests of deposit guarantee schemes


Question 1: Do you have any comments on the proposals to address the issues of the definitions, the test formats, the use of granular tests and the arrangements to ensure objectivity?

We appreciate these proposals, although we fear that the overall amount of work involved in these tests will be very cost-intensive and time-consuming, without a considerable gain in results, real "objectivity" and therefore knowledge.

We therefore still strongly recommend a "leaner" approach.

Question 2: Do you have any comments on the proposals to increase the complexity of stress tests (as provided in Guideline 4.109), retest the aspects impacted by significant changes in DGS’s systems and processes (Guideline 3.4), and require stress testing all the legally mandated functions of a DGS (Guideline 3.2)?

The approach to increase the complexity and sophistication of the design of the stress tests is initially sensible, but at some point this requirement will result in a much larger burden than gain in terms of insights for the DGS as well as the EBA. The focus should be on realistic scenarios and on designing tests as close to reality as possible, in order to achieve meaningful results.

In particular for SCV file tests, for the sake of comparability of results over time (for one individual DGS), a standard setup would be even more useful than constantly changing parameters.

It seems necessary to consider stretching the timeframe of the test cycle to five years and to reconsider how many SCV file tests are really necessary. Another possibility would be to scale the number of SCV file tests in proportion to the number of cross-border partners of one's system, as those tests are part of the cross-border tests.

Question 3: Do you have any comments on the grading of the outcomes (as proposed in Guideline 5), the deadline for the next report submission (Guideline 6.2), the reporting of the area not required by the Guidelines and the background information on the DGS to fill in the revised reporting template?

The declared aim of the GL remains to improve comparability of test results, besides greater preparedness and reliability of DGS stress tests. However, the reporting template has become more complex, still leaves a lot of room for individual answers that will presumably vary strongly in content and depth, and will lead to an even larger amount of unstructured data to be evaluated in a peer review; it therefore seems unrealistic to achieve an actual comparison of the results received. The answer/scoring mode for the indicators cannot in many cases provide comparable results. Even for numeric (quantitative) results in the same units of measure, a qualitative rating can only provide meaningful information (with a view to comparing it) if it is done according to the same standards/reference framework by all DGSs. Otherwise the same quantitative result can be transformed into potentially any qualitative rating.

Comparability should therefore not be considered one of the main objectives of the revised Guidelines; it should have only a subordinate role. The absolute main goal should be that the DGSs' functions work well, so that the respective DGS is resilient and able to fulfil the legal requirements entrusted to it in accordance with Directives 2014/49/EU and 2014/59/EU. Assessing this should be the goal of the peer review.

The reporting deadlines should be published in the most recent peer review only, in order to avoid confusion. The term "other means" is not expedient (it is difficult to monitor).

The definitions of the qualitative grading categories are somewhat "hidden"; it would be useful to add them to the template's instruction sheet for reference.

We support a longer timeframe for the stress-testing cycle (at least for the first cycle after adoption of the new GL) to avoid a mere working off of (significantly increased) duties to meet the deadline. This is of particular relevance where DGSs have functions beyond payout and contribution to resolution mandated to them, considering that a carefully planned comprehensive simulation takes several months of preparation. Moreover, the increased data requirements and the changes in procedural and technical aspects of stress testing will first need to be translated into a corresponding stress test programme by all DGSs, which will further shorten the timeframe actually available to conduct the required stress tests until 2024.

2.27: The reporting to the designated authority should not be confined to the template provided in the GL. It should be possible for the DGS to use its own templates already developed; these will likely be better suited to the purpose, as they already contain the information that is of relevance to the designated authority.

Taking into account the potentially sensitive nature of information contained in the stress test report, the dissemination, publication or leakage of which might pose dangers to financial stability, the sharing of stress test outcomes with other authorities should be restricted.

E.g.: "They can decide to share the outcome or parts of the outcome of the stress tests, in the form of the reporting template or in another form, with relevant authorities, upon their request and subject to applicable confidentiality provisions, and provided the authority has a legitimate interest."

Question 4: Do you have any comments on the proposals to address the issues related to the design, execution and reporting of SCV files stress tests, in particular the differentiation into two categories of SCV files stress tests (as proposed in Guideline 4.7), stress testing the SCV files of all relevant institutions (Guideline 4.9), and the definition of substandard entries (Guideline 4.19)?

With regard to the proposal made in 4.12 to require DGSs to inform credit institutions how well they performed in relation to the industry average, it should be left to the DGS if and how it wants to incentivise further improvement. Apart from that, as the test assesses the fulfilment of legal duties, it is less a matter of incentive than of how institutions failing to fulfil them are dealt with, which depends on the respective legal/regulatory framework the institution and DGS are operating in and is not within the control of the DGS. This should therefore not be an aspect of DGS stress testing.

3.10 regarding testing the repayment function requires the DGS to assess the quality of its internal processes to collect and analyse the SCV files and to liaise with the relevant credit institution to request further or corrective data if needed. In this context, the requirement and subsequent assessment of i4 makes sense. However, in the context of the SCV file test, the indicator seems to make less sense, because it does not provide relevant information on the ability of the DGS to perform the related tasks in a payout case. The interaction with the members during a system-wide SCV file test is necessarily different from a real payout case, and therefore very limited insights can be gained from it for the purpose of assessing the resilience of the DGS.

Question 5: Do you have any comments on the proposals on the assessment of the IT safety (as proposed in Guideline 32), stress tests of reimbursing depositors in complex cases (Guideline 48 and 49) and selection of DGS partner when stress testing cross-border cooperation (Guideline 4.51)? Do you have any further comments on the design, execution and reporting of operational capability tests?

The likelihood of having to cooperate with certain partner DGSs depends in large part on the number of cross-border branches in the jurisdiction concerned. Therefore, also considering that more information might not be available to the DGS, this seems a sensible and valid risk-based approach in the majority of cases. If more information is available and leads to a different result, it can of course additionally be used to make a decision.

Question 6: Do you have any comments on the proposals on stress testing the adequacy of the funding means for each affiliated credit institutions (as provided in Guideline 4.65), stress testing the access to all of DGS’s funding sources (Guideline 4.72) and stress testing the governance framework of the funding means (Guideline 4.70)?

4.65-4.66: The desktop exercise to assess the adequacy of the ex-ante funds, ex-post contributions and alternative funding arrangements available for a DGS intervention for all member institutions not likely to be subject to a resolution action should be restricted to a calculation exercise providing general information on the adequacy of funding for the different member institutions, based on data actually available to the DGS. By including further information that is not based on actual facts and figures but on assumptions (e.g. related to potential deferrals), the results obtained by the respective DGS will lose comparability and objectivity, at least to a certain degree.

A more detailed picture of the individual funding arrangements in place can be drawn in the context of stress-testing access to funding, where further assumptions on e.g. the timing of the failure, the type of failed institution, possible cases of deferrals etc. can and should be factored in.

Indicator i31: In our opinion this indicator has rather operational aspects, so we propose moving it to the section that deals with the analysis of the operational processes of the cross-border cooperation test.

Question 7: Do you have any comments on the proposals to address the issues related to the design, execution and reporting of stress tests related to contribute to resolution, failure prevention and contribution to insolvency proceedings (as provided in Guideline 3.27, 4.84 and 4.92)?

No comment

Question 8: Do you have any other comments on the proposed revised guidelines on stress tests of DGSs?

No comment

Name of the organization

Die Deutsche Kreditwirtschaft / German Banking Industry Committee