
European Federation of Building Societies

Identifying the nature of the economic downturn requires a long history of internal default and loss data (at least 20 years), broken down by model component and exposure type. This is a point of concern, given that such a long internal data history is still not available.

The same is true for external macroeconomic time series. It is almost impossible to identify sufficiently long, appropriate and representative external time series, especially for specialised internal portfolios (specific retail segments, specialised lending, etc.). Using other data series carries the risk of a lack of representativeness. Moreover, determining adequate margins of conservatism is not trivial and will introduce a significant amount of undesired volatility into the results.

Furthermore, it is not clear whether including a large number of macroeconomic factors yields better results than a simpler approach.

The implementation costs of such an approach are extremely high, not only for the initial development phase but also for ongoing model maintenance. This is due to the requirement for extremely long internal and external data histories, the need to assign observations to a time of realisation (which requires adjustments to the underlying databases) and the role of the panel of experts. Furthermore, implementing this approach requires developing a number of independent macroeconomic models, which significantly increases the complexity of the overall model. It is not clear whether such modelling leads to better predictions. The expected added value is low and does not justify this very large effort.
The term “type of exposure” is not clear. Does it refer to regulatory exposure classes, or to the portfolio segmentation used for estimating LGD and CCF?
In our opinion, the concept is clearly described in the draft RTS.

On the other hand, the proposed alternative approach for determining the model components (using a list of predefined potential drivers for multimodal shapes, such as the cure rate, time in default, recovery rate relating to the outstanding amount, etc.) is not clear: how is the term “relating to” in expressions such as “recovery rate relating to collateral value” to be interpreted? Does this refer to a segmentation by collateral value and the corresponding recovery rate, or to a ratio? This should be made clear.

Furthermore, an approach using predefined model components would not always be applicable and would be very complex to implement. In our opinion, consistency between the general model set-up, the definition of the model components and the available internal data must be guaranteed. This is not always possible with predefined model components: an adequate definition of the components depends heavily on the general model set-up and on the availability of the related internal data. To identify predefined model components, it could be necessary to artificially build sub-segments along alternative risk drivers that might not even be included in the internal model or data framework. As an example, in a model of the type LGD = w_cure * LGD_cure + (1 − w_cure) * LGD_Liquidations, an additional sub-segmentation along the risk driver “collateral value” would have to be introduced into the model set-up in order to identify the component “recovery rate relating to collateral value”; the sketch below makes this structure concrete.
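For illustration only, the following minimal Python sketch shows the structure of such a two-component model. All names and parameter values are our own invented examples, not taken from the draft RTS.

```python
def lgd_two_component(w_cure: float, lgd_cure: float, lgd_liquidation: float) -> float:
    """LGD = w_cure * LGD_cure + (1 - w_cure) * LGD_Liquidations."""
    return w_cure * lgd_cure + (1.0 - w_cure) * lgd_liquidation

# Example: 60% of defaults cure with a small residual loss; the rest go
# to liquidation with a higher loss.
print(lgd_two_component(w_cure=0.6, lgd_cure=0.05, lgd_liquidation=0.45))  # ~0.21

# Note: no risk driver "collateral value" appears anywhere in this set-up,
# which is why an artificial sub-segmentation would be needed to identify
# the component "recovery rate relating to collateral value".
```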
As set out in our answer above, the implementation costs of such an approach are extremely high, both for the initial development phase and for ongoing model maintenance, and the expected added value is low and does not justify the effort. The alternative reference value approach seems more appropriate here.
This approach too is very complex and the knowledge one can expect to gain is often only minimal.

There are examples where the point in time of realisation is not clear. For example, for the component “probability of a cure”, the realisation of the component occurs either at the completion date of a loss-free recovery process (if the exposure cures) or at the time losses are written off (if it does not cure). Here, the time of default or the completion date of the recovery process might be the most appropriate point in time; the sketch below illustrates the rule in question.
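A minimal Python sketch of the realisation-time rule just described; the record fields and dates are hypothetical illustrations, not prescribed by the draft RTS.

```python
from datetime import date
from typing import Optional

def realisation_date(cured: bool,
                     recovery_completed: Optional[date],
                     written_off: Optional[date]) -> Optional[date]:
    # If the exposure cures, realisation is the loss-free completion of
    # the recovery process; otherwise it is the write-off date.
    return recovery_completed if cured else written_off

print(realisation_date(True, date(2019, 6, 30), None))   # 2019-06-30
print(realisation_date(False, None, date(2020, 2, 15)))  # 2020-02-15
```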
No.
The Basel framework has only been in place since 2006/2007, so internal data histories collected under it are considerably shorter than the 20 years required. The major problem with this approach is therefore the availability of sufficiently long and representative internal and external time series. In particular, identifying suitable external data sources covering such a long period is a problem, given that the external macroeconomic data history should be adequate and representative for the institution’s specific portfolio. This holds especially for specific portfolios such as certain retail portfolios or specialised lending.
It is unclear on the basis of which indicators it can be assessed whether a sufficiently severe economic downturn has occurred within the 20-year period required for the data history.

Moreover, the term “for the future” should be defined more clearly; the time period covered by “the future” is especially relevant. Are you referring to a one-year horizon?
No. Freedom of methodological choices should be granted.

In our view, the practical application procedure described is far too complex. Further detailing of the application description would tend to complicate implementation even more.
Yes. Using the long-run average values in a given downturn scenario for those components whose explanatory macroeconomic variables do not assume their historical worst values in that particular scenario might not be an appropriate choice.

Taking the example from the explanatory box of Article 6: in Downturn Scenario A in 2001, only GDP growth assumes its historical worst realisation. Nevertheless, it is possible that interest rates, for example, also assumed a “stressed” value in that year, even though it was not their historical worst. In this situation, the long-run average of model component 2 might be insufficiently conservative, given that model component 2 was also “stressed” in crisis A. Calculating LGD_A with the long-run average for component 2 might therefore not be appropriate, as the illustrative calculation below shows.
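A purely illustrative Python calculation of this point; the yearly component-2 values are invented and are not taken from the explanatory box of Article 6.

```python
component_2_by_year = {
    1999: 0.30,
    2000: 0.32,
    2001: 0.41,  # Downturn Scenario A: stressed, but not the worst year
    2002: 0.44,  # historical worst realisation
    2003: 0.33,
    2004: 0.31,
}

long_run_average = sum(component_2_by_year.values()) / len(component_2_by_year)

# The 2001 value lies clearly above the long-run average, so using the
# long-run average for component 2 when calculating LGD_A understates
# the stress that component 2 actually experienced in crisis A.
print(f"long-run average:        {long_run_average:.3f}")           # 0.352
print(f"component 2 in 2001 (A): {component_2_by_year[2001]:.3f}")  # 0.410
```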

In this sense, it is important not to prescribe a ready-made approach in the Article. Methodological freedom should be respected here too. A benchmarking requirement could then help to validate the methodology chosen by the institution.
Extrapolating internal time series using external data sources and estimating the corresponding relationship is a standard methodological approach (sketched below). Nevertheless, the problem of finding adequate and representative external time series, especially for more sophisticated or specialised portfolios, will always exist. Internal effects, such as the individual portfolio composition, market decisions or modifications to the recovery process, cannot be captured by such an extrapolation.
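A minimal Python sketch of this standard extrapolation approach: regress the short internal loss series on an external macro series over the overlapping years, then backcast the earlier years. All figures are invented for illustration.

```python
import numpy as np

gdp_growth   = np.array([1.2, -0.8, 2.1, 0.5, 1.8, -1.5, 0.9, 2.3])  # external, 8 years
internal_lgd = np.array([0.21, 0.34, 0.25, 0.18])                    # internal, last 4 years

# Fit internal_lgd ~ a + b * gdp_growth on the overlapping window.
b, a = np.polyfit(gdp_growth[-4:], internal_lgd, 1)

# Backcast the first four years from the external series alone. Internal
# effects (portfolio composition, changes to the recovery process, etc.)
# are invisible to this extrapolation.
backcast = a + b * gdp_growth[:4]
print(np.round(backcast, 3))
```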

Furthermore, the development of such models will multiply the complexity of the LGD model set-up.

Adding further margins of conservatism to compensate for requirements that are very difficult to fulfil does not improve the quality of the downturn estimates.
For CF estimations, simpler approaches are particularly important because such estimations are, as a rule, based on more volatile data and shorter data histories, which are strongly influenced by changes in business strategies and processes.
No. Freedom of methodological choices should be allowed.
We agree with the rationale behind the alternative reference value approach (see also, for example, our answers to questions 1, 3 and 10). It is essential that reference values can be defined at institution level, given that institutions’ portfolios are often composed very differently. This may concern the product mix or regional concentration, or may result from the institution’s customer acquisition channels.

The alternative distribution approach is simple and works well. It takes the institution’s specific portfolio and recovery processes into account, and a significant amount of effort and cost can be saved.

It is not clear how the proposed add-ons to the discount rates are justified.
In general, simplifications are absolutely essential. In addition, exceptions from strict procedures should be possible in duly justified cases, because the prescribed methodology cannot be applied appropriately to all portfolios (see also our answer to question 14).
Bausparkassen are using different approaches.
Kathrin Holler