Frontiers of Risk Management

Key Issues and Solutions, Volume II

Edited by

Dennis Cox

Frontiers of Risk Management: Key Issues and Solutions, Volume II

Copyright © Business Expert Press, LLC, 2018.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means—electronic, mechanical, photocopy, recording, or any other, except for brief quotations, not to exceed 400 words, without the prior permission of the publisher.

First published in 2018 by

Business Expert Press, LLC

222 East 46th Street, New York, NY 10017

www.businessexpertpress.com

ISBN-13: 978-1-94709-848-0 (paperback)

ISBN-13: 978-1-94709-849-7 (e-book)

Business Expert Press Finance and Financial Management Collection

Collection ISSN: 2331-0049 (print)

Collection ISSN: 2331-0057 (electronic)

Cover and interior design by Exeter Premedia Services Private Ltd., Chennai, India

First edition: 2018

10 9 8 7 6 5 4 3 2 1

Printed in the United States of America.

Frontiers of Risk Management was developed as a text to look at how risk management would develop in the light of Basel II. With an objective of being 10 years ahead of its time, the contributors have actually had even greater foresight. What is clear is that risk management still faces the same challenges as it did 10 years ago. With a series of experts considering financial services risk management in each of its key areas, this book enables the reader to appreciate a practitioner's view of the challenges that are faced in practice, identifying suitable opportunities where appropriate. As editor, I have only made changes in the interests of changing regulations but generally have enabled the original text to remain unaltered, since it remains as valid today as when originally published.

Keywords

Basel II, credit risk, enterprise risk management, insurance risk, loss data, market risk, operational risk, outsourcing, risk appetite, risk management

The Use of Credit Rating Agencies and Their Impact on the IRB Approach

Markus Krebsz, Gary van Vuuren, and Krishnan Ramadurai

Operational Risk

Frontiers of Operational Risk Management

Ralph Nash and Ioanna Panayiotidou

The Issues Relating to the Use of Operational Loss Data—Internal and External

Insurance and Risk Management

Anthony Smith and Dennis Cox

Developments in Pension Fund Risk

Paul Sweeting

Bibliography

Index

Introduction: The IRB Approach—Cornerstone of Basel II

This chapter was originally drafted when Basel II was new. Basel III in its various manifestations does not make any major change to Basel II in this regard. IFRS 9, which requires a general provision to be introduced for any facility, essentially builds upon the IRB framework discussed in this chapter, which remains as valid today as it was when originally drafted.

The IRB approach is a cornerstone in the Basel II capital framework and a critical innovation in the regulatory capital treatment of credit risk. Indeed, much of the work of the Committee since June 1999 has focused on building and refining the IRB framework, including the form and calibration of the capital formulas, the operational standards and risk management practices that qualifying banks must follow, and the treatment of different types of assets and business activities. While this represents a new path in banking regulation, however, the concepts and elements underlying the IRB approach are based largely on the credit risk measurement techniques that are used increasingly by larger, more sophisticated banks in their economic models. The IRB approach is, at heart, a credit risk model—but one that is designed by regulators to meet their prudential objectives.

The building blocks of the IRB capital requirements are the statistical measures of an individual asset that reflect its credit risk, including:

probability of default (PD), or the likelihood that the borrower defaults over a specified time horizon;

loss given default (LGD), or the amount of losses the bank expects to incur on each defaulted asset;

remaining maturity (M), given that an instrument with a longer tenor has a greater likelihood of experiencing an adverse credit event; and

exposure at default (EAD), which, for example, reflects the forecast amount that a borrower will draw on a commitment or other type of credit facility.

Under the most sophisticated or advanced version of the IRB approach, banks are permitted to calculate their capital requirements using their own internal estimates of these variables (PD, LGD, M, and EAD), derived from both historical data and specific information about each asset. More specifically, these internal bank estimates are converted or translated into a capital charge for each asset through a predetermined supervisory formula. Essentially, banks provide the inputs and Basel II provides the mathematics.
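For reference, the corporate IRB risk-weight function published in the June 2004 Basel II text takes the following form (a sketch for orientation; the chapter itself does not print the formula):

    K = LGD \cdot \left[ N\!\left( \frac{G(PD) + \sqrt{\rho}\, G(0.999)}{\sqrt{1-\rho}} \right) - PD \right] \cdot \frac{1 + (M - 2.5)\, b}{1 - 1.5\, b},
    \qquad b = (0.11852 - 0.05478 \ln PD)^2

Here N is the standard normal distribution function, G its inverse, and ρ the supervisory asset correlation discussed under "Critical Elements of IRB" below.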

As a credit risk model, the IRB formula has been designed to generate the minimum amount of capital that, in the minds of regulators, is needed to cover the economic losses for a portfolio of assets. Therefore, the amount of required capital is based on a statistical distribution of potential losses for a credit portfolio and is measured over a given period and within a specified confidence level. The IRB formula is calculated based on a 99.9 percent confidence level and a one-year horizon, which essentially means that there is a 99.9 percent probability that the minimum amount of regulatory capital held by the bank will cover its economic losses over the next year. In other words, there is a one in 1,000 chance that the bank's losses would wipe out its capital base, if equal to the regulatory minimum.

The economic losses covered by the final IRB capital charges represent the bank's UL (unexpected losses), as distinguished from losses that the bank can reasonably anticipate will occur, or EL (expected losses). Banks that are able to estimate EL typically cover this exposure through either reserves or pricing. In statistical terms, the EL is represented by the amount of loss equal to the mean of the distribution, while UL is the difference between this mean loss and the potential loss represented by the assumed confidence interval of 99.9 percent. As seen in Exhibit 1.1, the credit risk on an asset, reflected both in the UL and the EL, increases as the default probability increases. Likewise, the level of credit risk also increases with higher loss severities, longer maturities, and larger exposures at default.
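In symbols (a standard decomposition consistent with the text; no formula is printed in the original):

    EL = PD \cdot LGD \cdot EAD, \qquad UL_{99.9\%} = q_{0.999}(L) - EL

where L is the one-year portfolio loss and q_{0.999} its 99.9th percentile.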

Exhibit 1.1 Corporates

Source: Fitch Ratings.

In addition (see Exhibit 1.1), EL contributes a relatively small proportion of the capital charge for high-quality (or low-PD) borrowers, but an increasingly greater proportion as the borrowers move down the credit quality spectrum. For example, for a loan to a very strong (or low-PD) borrower, the bank anticipates that the asset will perform well and is unlikely to experience credit-related problems. Therefore, any severe credit deterioration or loss that might occur on the loan to the borrower would differ from the bank's expectation and, thus, be explained primarily by UL.

By contrast, for a loan to a weaker (or high-PD) borrower, the probability of some credit loss is much greater, enabling the bank to build this expectation of loss into its pricing and reserving strategies. Therefore, at the lower end of the credit quality spectrum, EL is a larger component of the credit risk facing the bank than at the higher end of the quality spectrum.

Of course, the amount of economic loss that an asset might incur depends on the type or structure of the asset. For example, is the exposure to a major corporation or to an individual borrower? Is it secured by collateral? How does the borrower generate funds for repaying the bank? What is the typical life or tenor of the asset? How is its value affected by market downturns? Different credit products can behave quite differently, given, for example, their contractual features, cash-flow patterns, and sensitivity to economic conditions. Basel II recognizes the importance of product type in explaining an asset's credit profile and provides a unique regulatory capital formula for each of the major asset classes, including corporates, banks, commercial real estate (CRE), and retail.

Critical Elements of IRB

A critical element of the IRB framework and a key driver of the capital charges are the assumptions around correlation and the correlation values used in the formulas. Basel II does not recognize full credit risk modeling and does not permit banks to generate their own internal estimates of correlation, in light of both the technical challenges involved in reliably deriving and validating these estimates for specific asset classes and the desire for tractability.

In generating a portfolio view of the amount of capital needed to cover a bank's credit risk, Basel II captures correlation through a single, systematic risk factor. More specifically, the IRB framework is based on an asymptotic, single-risk factor model, with the assumption that changes in asset values are all correlated with changes in a single, systematic risk factor. While not defined under Basel II, this systematic risk factor could represent general economic conditions or other financial market forces that broadly affect the performance of all companies.

In summary, a low correlation implies that borrowers largely experience credit problems independently of each other, due to unique problems faced by particular borrowers. On the other hand, higher asset correlations indicate that credit difficulties occur simultaneously among borrowers in response to a systematic risk factor, such as general economic conditions.

Correlation Assumptions

Under Basel II, the degree to which an asset is correlated to broader market events depends, in certain cases, on the underlying credit quality of the borrower. Based on an empirical study conducted by the Committee, the performance of higher-quality assets tends to be more sensitive to—and more correlated with—market events. Although this finding might at first seem counterintuitive, it is consistent with financial theory, which states that a larger proportion of economic loss on high-quality exposures is driven by systematic risk. By contrast, the economic loss on lower-quality exposures is driven mainly by idiosyncratic, or company-specific, factors and relatively less so by systematic risk. This reasoning suggests that the performance of lower-quality assets tends to be less correlated with market events and, therefore, the biggest driver of credit risk is the high PD value of the borrower or, more broadly, the lower intrinsic credit quality of the borrower.

The IRB approach distinguishes between three types of retail assets—credit cards (known formally as qualifying revolving retail exposures [QRRE]), residential mortgages, and consumer lending (classified under other retail). Basel II has calibrated the three retail capital curves to reflect the unique loss attributes of each of these different products, as seen in Exhibit 1.2. The IRB formulas for the three retail product types are identical except for the underlying correlation assumption, a key driver of the shape and structure of the capital requirements. Additionally, the Basel II charges are sensitive to the underlying LGD estimate, which in practice can vary substantially across the different types of retail assets. For example, loss severities tend to be much higher for credit card assets than for residential mortgage lending.

Exhibit 1.2 Retail

Source: Fitch Ratings.

The decision, first announced in July 2002, to treat credit cards as a separate asset class under Basel II was an important step in recognizing the typically lower risk profile of general-purpose credit cards, particularly to prime borrowers. Since that decision, the Committee has continued to refine its treatment of credit cards to reflect the unique loss attributes of this asset class.

The move under Basel II to a UL-only capital charge implicitly acknowledges the sophistication and reliability of banks in measuring and managing their EL exposure. For retail products—and credit cards in particular—the development of sophisticated risk measurement models has enabled many banks to estimate EL and incorporate it into risk-based pricing and reserving practices. For banks with less sophisticated internal models, the discipline of preparing for the IRB approach will help them to develop more refined EL-based pricing and reserving. The move to a UL-only framework included eliminating future margin income (FMI) from the capital calculations. Fitch supports this change, having previously expressed concern over the inclusion of FMI as an offset to regulatory capital charges. The recognition of FMI would have unnecessarily clouded the regulatory capital base as, in our view, the loss absorption of FMI is not sufficiently reliable to warrant treatment as capital. As FMI is a statistical generation of potential future income ability that fluctuates with interest rates, as well as the economic cycle, FMI could be affected by market dynamics. Competitive pricing could also negatively affect the ability of banks to fully realize their estimates of FMI. Fitch takes a conservative view of FMI within the credit-rating process, allowing no capital recognition in rating financial institutions and permitting limited recognition in rating certain more junior classes of credit card asset-backed securities (ABS).

Another critical change to the Basel II framework, and a flashpoint for the industry, has been the level of the correlation estimate used in the IRB formula for credit cards. More specifically, Basel II applies a fixed 4 percent correlation across all PD levels, rather than calibrating correlation as a function of borrower quality (correlation was previously set to range from 11 percent for high-quality borrowers to 2 percent for low-quality borrowers). The intuition behind the previous treatment of setting the correlation higher for high-quality (or low-PD) assets than for low-quality (or high-PD) assets was the assumption that a larger proportion of the economic risk on high-quality exposures is driven by systematic (as opposed to idiosyncratic or borrower-specific) risk factors. While this conceptual reasoning is sound, the higher correlations applied to assets at the lower PD levels appeared to result in fairly onerous capital charges on these assets, at least according to industry estimates.

While correlation could theoretically vary within a credit score band, the adopted 4 percent correlation factor is significantly lower than the 11 percent peak and results in lower capital charges on high-quality credit card assets. For example, as illustrated in Exhibit 1.3, a pool of credit cards with a PD of 2 percent and an assumed LGD of 85 percent would have required regulatory capital of 5.5 percent based on the ranging correlation of 11 percent–2 percent (assuming a UL-only calibration). Using instead the fixed correlation of 4 percent, the regulatory capital requirements on this same pool would decline to about 4.5 percent, or a 100 basis-point reduction in the charge at the 2 percent PD level. The fixed 4 percent correlation only provides a capital break on higher-quality assets (i.e., those with PDs of 3 percent or below). Therefore, banks holding lower-quality credit card assets do not appear to benefit from the new 4 percent correlation assumption.
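A minimal Python sketch of the published retail IRB function (UL-only, fixed correlation) broadly reproduces the figure quoted here; the exact shape of the earlier 11 percent–2 percent ranging correlation is not reproduced, so only the fixed 4 percent case is computed.

    from math import sqrt
    from statistics import NormalDist

    N = NormalDist()  # standard normal distribution

    def retail_irb_capital(pd_, lgd, rho):
        # UL-only retail charge per unit of EAD:
        # K = LGD * N((G(PD) + sqrt(rho) * G(0.999)) / sqrt(1 - rho)) - PD * LGD
        cond_pd = N.cdf((N.inv_cdf(pd_) + sqrt(rho) * N.inv_cdf(0.999)) / sqrt(1 - rho))
        return lgd * cond_pd - pd_ * lgd  # subtracting EL leaves the UL-only charge

    # Pool from the text: PD = 2 percent, LGD = 85 percent, fixed 4 percent correlation
    print(round(retail_irb_capital(0.02, 0.85, 0.04), 4))  # ~0.044, i.e., about 4.5 percent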

Exhibit 1.3 Credit cards*

Source: Fitch Ratings.

In evaluating Basel II's changes to the credit card correlation assumptions, the broader issue to explore is whether the new correlation value results in more appropriate regulatory capital charges that better reflect the underlying economic risk of the assets. Given the parameters of the credit model created under Basel II, adjusting the correlation value is one of the primary policy levers that the Committee has at its disposal to alter and modify the shape and structure of the IRB capital curves. The decision to move to a 4 percent correlation assumption reflects not just an effort to identify a correlation estimate more reflective of industry experience, but also the Committee's wider mission of calibrating the overall charges on credit cards to be more reflective of the economic risk of these assets (particularly for higher-quality borrowers) and achieving other prudential and regulatory objectives.

In this regard, Basel II's adoption of a fixed 4 percent correlation estimate appears, on balance, to be a positive change that will move the overall charges more generally in line with the underlying economic risk on credit cards. Lowering the correlation assumption from a peak value of 11 percent to a fixed 4 percent on higher-quality credit card exposures seems to be more consistent with the typical loss characteristics and risk profile of these assets, which have experienced low loss volatility and generally stable, predictable loss patterns for prime borrowers historically. Likewise, the increase in correlation values from a low value of 2 percent to a fixed 4 percent for lower-quality credit card assets (and the resultant higher capital charges) is also more appropriate, given the more volatile performance of the subprime market. Nonetheless, banks with a heavy mix of subprime credit card activity will need to ensure that the capital charges rendered by Basel II cover the greater volatility and higher risk profile of these borrowers.

Concentration Risk

A critical theoretical assumption underlying the IRB capital framework is that the underlying portfolio of assets held by the bank is highly granular and well diversified. Of course, in practice, some banks will have concentrated exposures to single borrowers or particular markets, geographic regions, or industries that, all else being equal, can increase significantly the economic risk facing the bank. Therefore, in evaluating a bank's corporate lending portfolio, it is important to gain a sense of the various types of concentration risk to which the bank might be exposed. The Basel II capital formulas do not directly capture risk concentrations, meaning that they do not distinguish between a well-diversified bank and one with concentrated exposure to a few individual borrowers, geographic regions, and business sectors. Supervisors view concentration as an important risk and have other tools to address risk concentrations. For example, many supervisors have adopted legal lending limits, which restrict banks from providing credit to an individual borrower beyond a certain defined threshold (often expressed as a percentage of their capital base). Additionally, Basel II identifies concentration risk as one of the critical elements that supervisors are expected to monitor closely in their review of banks' capital adequacy (under Pillar 2, the Supervisory Review Process, of the Basel II framework). Basel II notes that "risk concentrations are arguably the single most important cause of major problems in banks."

To gain a better sense of how single-borrower concentration might affect a bank's measure of credit risk, Fitch has resurrected and graphed the Basel II granularity adjustment, which was previously proposed but then subsequently dropped by the Committee in response to general industry concerns about the complexity of the capital framework. The granularity adjustment essentially was an overlay to the IRB capital formula, increasing the charges if a bank's portfolio has larger single-borrower concentrations than the industry average (and reducing the charges if a bank's portfolio is better diversified than average).

In the granularity analysis, Fitch first composed a typical portfolio of nonretail assets held by a hypothetical bank, consisting of 50 percent corporate loans (with the 10 largest borrowers contributing 20 percent of the corporate book), 30 percent loans to small and medium-sized enterprises (SMEs; with the 10 largest borrowers contributing 20 percent of the SME book), and 20 percent CRE loans (with the five largest borrowers contributing 10 percent of the CRE book). Additionally, given the role of both legal lending limits and economic capital modeling in limiting borrower concentration, exposures to single borrowers generally do not exceed 2 percent of a bank's total assets. Therefore, Fitch assumed that no single exposure within this typical portfolio would exceed 2 percent of the book for a given asset class.
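The dropped granularity adjustment itself is not reproduced here, but a simple Herfindahl-style diagnostic conveys what such a concentration overlay broadly responds to (a sketch; spreading the remainder of the corporate book over 400 smaller names is an illustrative assumption, not from the text):

    def herfindahl(exposures):
        # Herfindahl index of the book; 1/HHI is the "effective" number of
        # equal-sized exposures, which falls as single-name concentration rises.
        total = sum(exposures)
        return sum((e / total) ** 2 for e in exposures)

    # Corporate book from the text: 10 largest names hold 20 percent of the
    # book (2 percent each); assume the remaining 80 percent sits with 400
    # smaller names of equal size.
    book = [0.02] * 10 + [0.80 / 400] * 400
    hhi = herfindahl(book)
    print(hhi, 1 / hhi)  # ~0.0056 -> roughly 180 effective exposures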

As seen in Exhibit 1.4, the previously proposed granularity adjustment for this typical portfolio does not alter the capital requirements materially. This is because the IRB charges have been roughly calibrated to reflect the average degree of borrower concentration typically found in the industry.

Exhibit 1.4 Single borrower concentration: Impact of previously proposed "granularity adjustment"

Source: Fitch Ratings.

In the attempt to construct a scenario in which the granularity adjustment would have a material impact on the IRB charges, Fitch needed to introduce fairly strong assumptions about the level of borrower concentration within the portfolio. In one such scenario, Fitch now assumes some exposures represent up to 4 percent of the particular book or, more generally, 2 percent of the bank's total assets, still consistent with lending limit regulations. Therefore, the hypothetical bank's portfolio has the 20 largest corporate borrowers constituting 80 percent of the corporate book, the 20 largest SME borrowers constituting 80 percent of the SME book, and the 20 largest CRE exposures constituting 80 percent of the CRE book. As Exhibit 1.4 illustrates, this scenario results in a moderate increase in capital requirements based on the granularity adjustment, suggesting that the final Basel II framework (which does not include a granularity adjustment) might, in certain cases, lead to an understatement of the capital needed to support a bank's borrower concentration, although this adjustment appears to have a second-order effect on the overall IRB charges.

However, there are other important sources of concentration affecting a bank's credit risk profile, such as geographic and industry concentrations, that even the granularity adjustment would not have picked up and that are not directly reflected within the IRB framework. While supervisors will monitor credit risk concentration as part of their responsibilities under Pillar 2, it will nonetheless be important for market analysts to differentiate between banks that have more pronounced risk concentrations. For example, regional banks could potentially have higher concentrations in specific markets or sectors relative to larger, well-diversified institutions. Analytically, it is important to determine how well the bank is evaluating and measuring the several forms of potential concentration it may face (single borrower, geographic, and business sector, among others), how well it is able to aggregate these concentrations, and its strategy for managing and mitigating this risk.

By not allowing banks to internally estimate portfolio correlation (e.g., pair-wise correlation among individual borrowers and across asset categories), the Basel II ratios are insensitive to changes in concentration risk. For example, in cases where a significant portion of a bank's credit portfolio is concentrated in a particular geographic market, the underlying correlation among these assets is likely to be higher than the correlation values provided by Basel II. Therefore, in this instance, the underlying risk of the bank's portfolio is not fully reflected in the IRB charges. Basel II's predetermined estimates of correlation are important in assessing regulatory capital ratios, not only to understand differences in the IRB formulas across different asset classes, but also to assess potential concentration risks not captured in the calculations.

One can leverage the Basel II ratios as part of an analysis of bank capital adequacy in the institutions being rated. However, there are several key areas that can be analyzed closely, as assumptions and practical considerations embedded in the IRB ratios could, in certain instances, lead to understating risk exposure. For example, a key area Fitch will evaluate is the IRB assumption that the bank's portfolio is reasonably well diversified. Analysts will assess how the bank identifies, aggregates, and manages concentration risk and allocates capital against it. Concentration risk is a fundamental part of Fitch's capital analysis, particularly in evaluating more regionally focused institutions. Fitch also will look closely at the historical data the bank uses to generate its risk estimates. Fitch believes that for certain asset classes with longer market cycles, a longer data history than the minimum requirements established under Basel II might help to reflect a more complete range of loss events and show that more capital is needed to cover the risk.

Pillar 3’s Impact on Market Discipline and Disclosure

Overview

Pillar 3 is, in many ways, one of the most groundbreaking aspects of Basel II. The purpose of this part of the new capital framework is to communicate to the market much of the risk information assembled for capital adequacy purposes. Pillar 3 reflects the Committee's belief that market participants using this information will reward those that manage risk well and shun those that do not. Nothing is quite as effective as the prospect of the loss of business or investor confidence in motivating an errant management team to mend its ways. In this way, Pillar 3 should help to reinforce the type of behavior and the risk management discipline that are envisioned in the other two pillars of the Basel II framework.

To accomplish this goal, Pillar 3 sets out robust disclosure requirements. Relative to current requirements in most countries today, Basel II mandates much more extensive disclosure about the distribution of risk within banks' various portfolios and businesses. It also requires discussion of the underlying policies and valuation techniques used to measure risk. The quantitative data requirements are broad and are expected to give considerably greater detail of a bank's portfolio and risk appetite than the currently required disclosure.

The increased disclosure in and of itself will be extremely useful to market analysts, and Fitch intends to leverage this information in its analysis. To use the new information most effectively and discern the nuances between banks, analysts will need to understand how Basel II operates and, more importantly, to appreciate the nature of the internal rating systems that each bank uses and the assumptions that are used in those systems. The Basel II requirements leave sufficient room for banks to disclose information in a way that works well with the bank's own management information systems.

Inherently, common disclosure standards promote greater comparability from one institution to another. However, if not interpreted carefully, they may lull investors into a false sense of uniformity. Behind the numbers produced by the new disclosure standards are still different approaches to risk rating and measurement. This is generally viewed favorably by Fitch, as a system that is too prescriptive will probably inhibit innovation and improvement. Yet, it is important to get behind the numbers to appreciate the nuances in risk profiles across various financial institutions.

Lessons from the World of Market Risk

In assessing the types of challenges that analysts are likely to face, it is helpful to look at the evolution of value-at-risk (VaR) modeling as an analytical and regulatory tool, as its use in measuring market risk over the decade prior to the publication of this book provides some broad parallels to the implementation of Pillar 3 for credit risk.

An important lesson of the evolution of VaR is that by providing a common methodological and disclosure framework, regulation can help to enable the broad assessment and comparison of risk exposure across institutions. Initially, disclosure of VaR reflected a variety of approaches and implementation techniques, making it difficult for both analysts and supervisors to differentiate the level of market risk that each institution faced. The Committee, under the 1996 Market Risk Amendment, promoted greater harmonization in methodology and disclosure by establishing a common framework for calculating VaR and market risk reporting. Banks were required to use a minimum 99 percent confidence level, derive loss estimates based on at least a one-year observation of market data, cover losses over a 10-day period (or a one-day VaR scaled up to 10 days), and encompass the different forms of market risk (e.g., equity, interest rate, and foreign exchange, among others). Currently, thanks in part to the Basel II regulatory parameters, most of the large banks base their VaR measures around these standards.
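As a sketch, this quantitative floor can be expressed in a few lines of Python, assuming a parametric normal model and the usual square-root-of-time reading of "a one-day VaR scaled up to 10 days" (both are assumptions; banks may use other admissible models):

    from math import sqrt
    from statistics import NormalDist, stdev

    def regulatory_var(daily_returns, confidence=0.99, horizon_days=10):
        # One-day parametric VaR at the 99 percent confidence level,
        # scaled to the 10-day regulatory horizon by sqrt(time).
        sigma = stdev(daily_returns)  # needs at least one year of daily observations
        one_day = NormalDist().inv_cdf(confidence) * sigma
        return one_day * sqrt(horizon_days)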

At the same time, banks have continued to push forward in their measurement approach as they manage risk on an economic basis and as market pressures encourage further innovation in practices. For example, a bank's internal market risk model might make use of volatility and correlation calculations that place greater weight on more recent market movements to better capture the relative importance of these events. This exponentially weighted, moving-average technique contrasts with the market risk regulatory measure that is based on equally weighted market-movement data over a given observation period. Therefore, to understand a bank's market risk profile, it is important to understand the differences in assumptions between its internal economic models and the calculated regulatory measures, in particular any adjustments or innovations that the bank makes when looking at risk internally.
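A short sketch of the contrast (the decay value λ = 0.94 is a common choice, an assumption rather than anything specified in the text):

    def ewma_volatility(returns, lam=0.94):
        # Exponentially weighted moving-average variance: recent squared
        # returns carry more weight than older ones, unlike the equally
        # weighted estimate underlying the regulatory measure.
        var_ = returns[0] ** 2  # seed with the first squared return
        for r in returns[1:]:
            var_ = lam * var_ + (1 - lam) * r ** 2
        return var_ ** 0.5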

In modeling risk, the role of stress analysis is critical. In a period of low historical volatility, a bank could generate a lower VaR measure that might lead to understatement of the potential risks. Risk managers, however, should not assume that the future is a perfect, or even accurate, reflection of the recent past. If a bank increases its exposure primarily on the basis of generating lower VaR estimates, then the bank's plans for or anticipation of potential market disruptions need to be assessed, based either on specific historical (and perhaps forgotten) episodes of pronounced volatility or on plausibly constructed forecasts of market movements. This type of scenario analysis provides greater insight into the bank's risk exposure under more extreme market conditions.

There are important factors and assumptions underlying the calculation of VaR that are critical to understanding the bank's market risk exposure. These require analysts to dig beneath the data and ask penetrating questions that truly assess the market risk profile of the institution. Piercing through disclosure data to differentiate among bank practices is critical, given the variation in banks' risk measurement methodologies and the way in which their risk profiles are portrayed. Much of the meaning emerges not just from the final regulatory or economic capital measures, but from understanding how banks think about and manage their risk profiles.

Challenges of Basel II

As with the evolution of VaR models for market risk, Basel II pushes the boundaries of credit risk measurement and disclosure and provides new opportunities, as well as new challenges, for analysts and investors to better understand a bank's risk profile and capital allocation approach. In leveraging these new disclosures, some critical issues for analysts to explore include: the bank's use of historical data and statistical information; the underlying ratings philosophy and approach to internal ratings; the bank's capital allocation strategy over the course of the business cycle, particularly during a volatile market; important differences across different countries and markets and how these can affect risk estimates; and, for more sophisticated organizations, how the Basel II measures compare to the bank's economic capital models.

Historical Data and Statistical Information

To understand a bank's internal risk rating systems and credit risk measurement approach, the bank's use of underlying data analysis to derive loss estimates for each rating grade needs to be assessed. Comparing loss estimates from one bank to another will require an appreciation for the similarities and differences between companies' use of historical data.

The economic period covered by the data history is a crucial factor in evaluating the robustness of the bank's loss estimates. If the data cover a period of relative calm in markets, the bank's estimation of PD or LGD might not capture the potential for future volatility in the asset's performance. For example, assuming a bank is using its own internal rating system on CRE loans, incorporating both a derived PD and loss severity based on its own historical experience and a historical seven-year data span (between 1997 and 2004), the amount of capital dictated by the model for these CRE loans is likely to be very different, and less conservative, than in another bank's model that spans a longer horizon and incorporates the more pronounced loss experience in these markets from 1990 to 1992.

Another consideration is whether the historical data used are relevant to the bank's current business strategy and asset mix. For example, under Basel II, banks entering a new business activity will need to obtain data that are appropriate to that product; however, how the data are deemed to be relevant, particularly for a relatively untested or new product, becomes an issue. In other cases, banks exiting a particularly troublesome type of lending might determine that historical loss data from that activity should be excluded from the calculation of their reserves or capital. Therefore, cases where management is pursuing new business activities or taking a deliberate departure from historical data are of interest.

Rating Philosophies

Another critical factor in understanding a bank's measure of credit risk under Basel II is its internal rating philosophy. These philosophies vary considerably and play a crucial role in credit risk measurement. Some banks choose to rate by taking into consideration possible stresses through a business cycle (a through-the-cycle approach), while others tend to take more of a point-in-time approach, recognizing the business cycle through frequent and aggressive rating changes.

A bank's rating philosophy affects the volatility of ratings, how credits are distributed among rating grades at a given time, and what the underlying PD estimates are for those grades. A bank that follows a point-in-time philosophy will have considerably more rating volatility incorporated into its internal rating systems; the bank's equivalent of a BBB-rated credit today could fall to a BB or B if that particular obligor or segment of the economy weakens, even slightly. Therefore, the PDs for that bank's portfolio may be very different from those for a bank that rates the same credit a BB right from the beginning and holds the rating through the business cycle.

Banks using more of a point-in-time approach will reflect market shocks more quickly and are much more likely to move ratings more than one notch at a time. These ratings, however, might also pick up short-term noise that can lead to overstatement of the risk during periods of market stress. If a move, particularly a downward move, leads to overstatement of the risk, banks typically just reverse the rating action. Analysts also need to remember that rating philosophies can change over time. Ratings that were assigned much farther in the past might not be comparable to those assigned today. For example, is a particular bank's BBB equivalent today exactly comparable to its BBB in 1998 or 2000, or has management become more conservative or more liberal in its rating approach?

Basel II appears to offer room for banks to follow either type of rating approach. On the one hand, banks are expected to estimate the default risk over a one-year horizon, which would encompass only a portion of an economic cycle and thus suggest more of a point-in-time approach. On the other hand, banks must use longer data histories (i.e., five years of PD and either five or seven years of LGD, depending on the asset) and, according to Basel II, reflect long-term experience in generating risk estimates, which suggests more of a through-the-cycle approach. How this plays out in practice will become clearer during the implementation process and as regulators further develop their views on banks' rating approaches.

Stress Testing

Closely related to banks' rating philosophies is the tendency of the Basel II capital ratios, in more closely reflecting the underlying credit risk exposure of banks, to move pro-cyclically. In a strong economic environment, a bank's credit risk measures will tend to decline and, in turn, its capital ratios will improve, potentially leading the bank to shed capital. However, if the economy deteriorates, the bank's risk measures will probably worsen, resulting in weaker Basel II ratios.

Analysts and investors should look for signs that a bank is thinking carefully about the amount of capital it needs to hold to weather future market distress. In this regard, as with the evolution of VaR modeling, the role of stress testing is critical. Banks need to assess carefully both historical examples of more severe credit problems and possible future scenarios of credit disruption. Therefore, how banks incorporate such stress assessments into their capital allocation process will be an important area for analysts to review and one that Fitch considers in its rating process.

Robust stress testing is particularly relevant during stronger economic times, when the more recent underlying data used to generate the Basel II risk estimates (i.e., PD and LGD) might not appropriately reflect potential risks ahead. During a market boom, some banks might respond to their improving Basel II ratios by repurchasing shares or otherwise lowering their capital base. To the extent that a reduction in the level of a bank's capital is driven principally by an improvement in its Basel II ratios, Fitch will be looking closely at the bank's capital strategy, in particular how stress testing is used to assess the impact of more severe credit problems.

Transparency in the bank's evaluation of stress scenarios and management of capital based on them is critical. Although Fitch recognizes that certain aspects of a bank's capital allocation strategy and process are proprietary, it is important for the bank to communicate the rationale and analysis behind moves to reduce its level of capitalization. Fitch supports the more risk-sensitive Basel II capital requirements and, more generally, the movement by several banks to manage their capital levels based on internal economic risk assessments. At the same time, from a rating perspective, Fitch believes that banks should seek to explain how well their capital base allows them to navigate the full array of risks that can arise over the course of an economic cycle.

Differences Across International Markets, Jurisdictions, and Models

In comparing the Basel II ratios and Pillar 3 disclosures across banks globally, an understanding of the differences across markets that can affect banks' loss estimates is essential. For example, two banks operating in different countries might have markedly different LGD estimates for the same type of asset. This difference does not necessarily mean that one bank is wrong and the other is right, or that the bank with the higher LGD estimate has a more conservative risk measurement approach than the other. Rather, analysts and investors need to explore the root causes of this difference. For example, different bankruptcy regimes or collateral practices affect a bank's ability to obtain and liquidate collateral on a defaulted exposure. In some countries, the laws lean more or less favorably toward banks when a borrower defaults.

Real-estate lending is a good illustration of these issues. For instance, in the UK, it is often possible for a bank to obtain possession of real-estate collateral quite quickly following a borrower default, which allows the bank subsequently to liquidate the collateral and to achieve recovery in a fairly short time. This tends to help preserve the value of the property, as bankrupt property owners generally do not have the resources to maintain a property properly. In contrast, a U.S. bank lending on property in the state of New Jersey, for example, will encounter a very complicated legal process that leans heavily toward the borrower. It can take years for a bank to obtain legal possession of a property once a borrower defaults. Therefore, the cost of carry is higher and the value of the property may be considerably lower once the bank obtains possession, increasing the bank's LGD.

Differences in market structure or legal practices can result in legitimate differences in a bank's risk estimates and do not necessarily mean that a bank with, for example, a higher LGD estimate is either more conservative in its risk measurement or has a higher risk appetite.

The Basel II capital framework is based on some of the same risk measurement concepts as the economic capital models that more sophisticated banks use internally. However, Basel II, in its goal of achieving tractability and uniformity, embeds a number of supervisory parameters and simplifying assumptions—for example, regarding portfolio diversification levels—that will inevitably differ from the internal structure of banks' economic capital models.

Much like the evolution of VaR modeling, how the Basel II regulatory measure compares with the bank's management of credit risk on an internal economic basis needs to be examined. In making such a comparison, key areas of departure between the two and how they affect the risk measures should be assessed. It is also important to look for cases in which the Basel II measures are more conservative than, and hence are binding over, the bank's economic model, which might create potential incentives for banks to engage in new forms of regulatory capital arbitrage.

It is also important to explore the bank's assumptions regarding correlation within and across different portfolios, given that these can be a key driver of the amount of capital generated. For example, what is the impact of recognizing the risk-reducing benefits of portfolio diversification on the bank's overall capital levels, and does the size of the reduction seem reasonable? What kinds of empirical work has the bank done to validate these estimates? A related issue to consider is the bank's approach to reflecting potential risks posed by concentrations in risk exposure. For example, what types of processes does the bank use to identify, measure, and aggregate different forms of concentrations across its various portfolios? How well does the bank capture more subtle forms of concentration risk, for example, caused by having credit exposures to CDOs of CDOs?

To make this comparison, data and information are critical. Basel II pushes the frontier in the types of risk-related disclosures banks will need to provide around their credit risk rating systems and measurement of regulatory capital. Some banks currently provide high-quality disclosure about their credit risk exposure and approach to economic capital, which, coupled with the heightened risk transparency under Basel II, hopefully could motivate an increasing number of banks to provide more meaningful information about their economic capital models. Such information will be particularly useful, given the valuable insights that can be generated by a comparison between a bank's Basel II and economic capital measures.

Fitch has reviewed the existing level of credit risk-related disclosures across a sample of banks internationally and has found varying degrees of disclosure quality across different markets. Quality can vary quite a bit, even within markets, with a very small number of banks having emerged to date as clear thought leaders in providing robust and insightful risk disclosure. Pillar 3 will certainly provide more information to analysts and investors than ever before.

Looking Ahead

Looking ahead, Fitch will leverage both the enhanced disclosure framework and the greater risk sensitivity of the Basel II capital ratios, which are helpful tools in comparing the broad risk profiles of banks. In assessing a bank's capital, one of several factors included in the rating process, Fitch looks to the level of capital relative to the bank's risk exposure, its approach to capital planning, and the quality of the bank's risk management practices. For example, Fitch's analysis addresses a wide range of issues, including how well the institution is positioned to withstand adverse market events, how its capital planning ties into its overall business strategy (e.g., future acquisition plans or new product development), and the bank's ability to access new capital or grow its capital base. In addition, as banks continue to develop better and more robust internal measures of economic risk, an even greater portion of Fitch's analysis will focus on the rigor and assumptions behind each bank's economic capital modeling and its use of stress testing or scenario analysis to forecast the capital impact of potential risks. All of these factors help to shape Fitch's overall view on the capital strength and, more broadly, the credit quality of the bank.

PART I

Operational Risk

Introduction: What Is Operational Risk Management?

The emergence of operational risk management as a separate discipline is somewhat murky. Its origins can be traced to a mixture of "operating risk" (back-office operations, payment systems), audit-style risk assessment, and specific risk management capabilities (business continuity planning (BCP), fraud risk management). Several financial institutions started using the term in the 1990s, but it was given a huge impetus by the emergence of capital requirements for this nebulous group of risks under Basel II and subsequent EU legislation. Even under Basel II, however, the term "operational risk" emerged slowly and for reasons that are not entirely clear. Between the first and second consultation papers on the new capital framework, "other risks" (based on an open-ended definition of everything except credit, market, and liquidity risk) metamorphosed into operational risk (with a closed definition) that slowly developed into the current, fairly all-encompassing "risk of loss resulting from inadequate or failed internal processes, people and systems or from external events." This definition includes legal risk, but excludes certain other risks, including liquidity, strategic, and reputational risk.

This regulatory wrapper covers a range of routine, low-severity issues and combines them with major losses and potentially systemic, or at least solvency-threatening, risks. But how does this definition match the day-to-day management of operational risk that happens across all staff and departments in a firm, and what should operational risk managers do as a result? This chapter aims to explore these issues by taking stock of the implementation of operational risk in a post-Basel II, pre-Solvency II context and by considering how operational risk functions and resources may be best deployed to add value to the firm.

What Does Operational Risk Look Like in the Current Environment?

As noted in the preceding paragraphs, operational risk became a buzzword around banks as the growing pains of Basel II led to a range of approaches for the assessment of capital. There can be no doubt that the threat or promise of a capital charge for operational risk focused the mind of senior management across the banking industry, and insurers are now in the same position with Solvency II on the horizon. The development of the operational risk discipline in the shadow of the emerging Basel II requirements meant the focus of effort was on certain aspects of risk management, particularly around data collection and the measurement of operational risk. A vicious or virtuous circle between the banks and supervisory agencies emerged, in which a focus on capital meant that effort focused on the components of capital assessment, perhaps to the detriment of "real" risk management. Furthermore, a few leading banks promised a lot in terms of the ability to measure capital and the supervisors responded with the birth of the concept of the advanced measurement approach (AMA).

The AMA is principle-based and allows a degree of flexibility that is not apparent under the credit equivalent in Basel II, the advanced internal ratings-based approach (AIRBA). This flexibility had some unexpected and possibly undesirable consequences; initially, there were many factions in the banking sector stressing the importance of different ways of assessing operational risk, including loss data (loss distribution approach) versus self-assessment (risk driver approach) versus scenario analysis (scenario-based approach). Much discussion followed, but the ultimate outcome was that all data inputs have some kind of relevance. The arch quantifiers admitted that qualitative data could fill gaps in their distributions, while the qualitative factions saw the need to have some "factual" loss data to "validate" their findings. As a result, most AMAs now incorporate a range of data sources (internal and external loss data, scenario analysis, and risk and control self-assessment) in either the construction or validation of their capital numbers. More importantly, it seems that a number of banks have seen the need to revisit their approaches to AMA and to consider how data collection and analysis informs and is informed by "real" day-to-day risk management. It seems that following the rush to quantification there is now a pause for breath to consider what value, over and above a regulatory tick and a potential reduction in regulatory capital (not necessarily a binding constraint), is derived from the operational risk functions.
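As an illustration of the loss distribution approach named above, the core calculation can be sketched in a few lines of Python (all parameter values are illustrative assumptions; real AMAs fit frequency and severity distributions to internal and external loss data):

    import numpy as np

    def lda_capital(freq=25.0, mu=9.0, sigma=2.0, q=0.999, trials=200_000, seed=7):
        # Monte Carlo loss distribution approach: Poisson annual loss counts,
        # lognormal severities; capital is read off the 99.9 percent quantile
        # of the simulated annual aggregate loss, less the expected loss.
        rng = np.random.default_rng(seed)
        counts = rng.poisson(freq, size=trials)
        totals = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in counts])
        return np.quantile(totals, q) - totals.mean()

    print(lda_capital())  # UL-style operational risk capital for the assumed book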

It is, however, all too easy to be critical of the attempts of firms to measure operational risk. There were few credible alternatives to the AMA-type approach, and certainly the simpler regulatory approaches under Basel II (the basic indicator approach and the standardized approach), while generating a number for operational risk capital, do not in any way, shape, or form "measure" operational risk. They are a top-down assessment based on the unproven, but intuitive, assumption that income or assets and operational risk are in some way directionally aligned. Banks using these approaches should be wary of placing weight on the capital numbers generated; they may not even be an upper threshold, let alone the "right" number. The further anomaly with the simpler approaches is that the entry criteria do not relate exclusively to the ability to perform that capital assessment, but rather to generic operational risk management standards.

The focus on operational risk measurement also has important implications for the requirement that the operational risk framework is embedded in day-to-day management and that the capital inputs and outputs are used in practice. This creates a number of issues for firms that have existing risk assessment models in place and in use. Either the regulatory-based assessment, which may be based on confidence intervals and definitions that the firm does not recognize, takes precedence and supersedes the existing internal measurement approach, or there is the need to juggle two sets of capital books. It is difficult to use the capital outputs in such a case. This issue is nothing, however, compared to those for firms that do not manage on a risk-adjusted capital basis, where the outputs of the AMA are simply unrecognizable. The best that can be hoped for here is that the AMA inputs are the same as those that are used for management purposes and so the use test is achieved.

Lessons

The supervisory push toward a capital charge for operational risk and the emergence of the AMA have been an important learning curve for banks and the authorities alike. There are some important lessons for Solvency II as a result, and the EU, national authorities, and individual firms may be able to save time, money, and effort by considering the recent experience of Basel II. These issues can be grouped into four categories.

Simplicity. A range of three approaches to the assessment of operational risk capital is unnecessary and overcomplicated. A simple metric to generate a capital charge (along the lines of the basic indicator approach) and a framework for the recognition of internal operational risk models (along the lines of the AMA) are sufficient to allow innovation and development. The apparent stepping stone of the standardized approach need not be replicated in Solvency II. This avoids the scope for numerous (fairly fruitless) debates around partial use, arbitrage, and entry criteria. With a simple, permanent, partial-use regime between a basic approach and an advanced approach, firms could plot a reasonable trajectory for rolling out an advanced approach without undue burden and on a reasonable cost-benefit basis.

Spurious precision. The breadth of risks captured in operational risk, ranging from small, routine processing losses to systemic marketwide events, means that pinpoint accuracy in measurement is a false objective. Much debate has centered on the basis for a 1/1,000-year capital requirement. In practice, such debate is a likely recipe for business disengagement with operational risk, and a more realistic time horizon might be beneficial in supporting operational risk teams in obtaining business buy-in and adding value. Similarly, regulatory obsession with gaming of capital numbers via dependency and correlation analysis has led to a protracted debate around the connection between different event types in different firm functions or locations. Again, a commonsense approach in recognizing the intuitive disconnections between the bundle of events that comprise operational risk would be a big win for firms and supervisors. With the backstop of other supervisory interventions and the sanction of requiring additional capital, fears of capital draining from the system as a result of advanced operational risk approaches can be abated and a long and fruitless debate on this matter avoided. Insurers' experience in managing risk diversification should be recognized in Solvency II.

Inputs. As referred to in the preceding text, there was a long debate in banks around the requirement to use different inputs to assess operational risk. It is evident that a range of data inputs is necessary to assess operational risk effectively, especially given the range of events covered under the operational risk umbrella and the differing severities and frequencies with which they occur. The main inputs are: internal loss data, external loss data, scenario analysis, and risk and control self-assessment. Internal loss data are a key component of any robust operational risk assessment approach. In the absence of such data, it is impossible to answer basic questions such as the current annual cost of operational risk (and hence how much resource it is worth expending on risk mitigation or transfer) or to learn from issues across an organization. Depending on the type of operational risk event concerned, data collection thresholds might sensibly vary. For instance, firms might have a strong interest in capturing fraud losses at a low level. Solvency II should copy Basel II's flexibility in terms of thresholds, but should avoid the spurious pursuit of accuracy by "reconciling" losses to the general ledger.

External loss data fall into three main types: consortia data (shared losses between member firms, typically on an anonymous basis, e.g., ORX, ORiC, BBA GOLD); publicly known data (an enhanced press-cutting service in which public data are classified, packaged, and sold, e.g., Fitch FIRST); and inquiry data (detailed analysis of a particular loss event, typically by a quasi-governmental body, e.g., the Bank of England's report on Barings, or the Ludwig report on AIB). The use of these data varies, but consortia data may be used for risk modeling or assessing the completeness of internal loss data, while publicly known data or inquiry reports might be used for "what if" analysis or to design and validate scenario analysis.

Scenario analysis is typically a structured opinion about rare but plausible events and can be used to identify control weaknesses or dependencies and to assess capital for rare events by creating “synthetic” loss data. Finally, risk and control self-assessment data are the most common form of information available and are useful in gauging current weaknesses and in flexing loss histories.
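The creation of “synthetic” loss data from a scenario opinion can be sketched as follows; the 1-in-20-years frequency, the impact figures, and the Poisson/lognormal parameterization are all illustrative assumptions, not a prescribed method.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative scenario opinion: a 1-in-20-years event with a typical
# (median) impact of 5m and a severe case (90th percentile) of 20m.
freq_per_year = 1 / 20
median, p90 = 5e6, 20e6

# Back out lognormal severity parameters from the two opinion points.
mu = np.log(median)
sigma = (np.log(p90) - mu) / 1.2816  # 1.2816 = 90th percentile of N(0, 1)

# Generate synthetic annual losses for the scenario.
years = 10_000
n_events = rng.poisson(freq_per_year, size=years)  # events per simulated year
synthetic = np.array([rng.lognormal(mu, sigma, size=n).sum() for n in n_events])
print(f"Mean annual scenario loss: {synthetic.mean():,.0f}")
```

Loss data generated in this way can then be blended with internal and external losses when rare, high-severity events are otherwise absent from the database.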

It will be apparent from this quick review of data that the different sources of data inform different aspects of a comprehensive operational risk analysis, and Solvency II should continue this focus on a range of data sources. Insurance firms should learn from the experience of banks and, rather than waste energy on a fruitless debate about the ascendancy of particular data types, should make preparations now for the forthcoming Solvency II requirements across a range of data sources, building on existing business management information that is used and recognized.

Focus on risk management. While the focus of Basel II and Solvency II is understandably on capital, supervisors and especially firms should not omit to ensure that sufficient focus and funding is applied to “real” risk management. This may be manifested through the resourcing and prioritization of fraud risk management programmes, prioritization of IT improvement and (information) security initiatives, robust business continuity planning, disaster recovery (DR) planning and testing, and effective insurance purchasing. While the data collection and analysis inherent in the advanced approaches to operational risk assessment provide a context for this work, and should provide some parameters for resource allocation to mitigation initiatives, at times there can be a disconnection between the “risk measurement” driven under Basel II and the real risk management initiatives happening elsewhere in the business and operational areas of a firm. It is this “use test” linking risk measurement and mitigation that Solvency II should encourage and that firms themselves have an interest in pursuing.

In summary, it is clear that the overall direction of AMA is one that should be copied in Solvency II. There are, however, some key improvements that could be adopted to make the design and implementation smoother, while firms themselves should end the debate about the need to measure operational risk and the detailed way of doing it, and rather focus on linking risk measurement and risk mitigation into a coherent whole, cognizant of, but not driven by, the regulatory agenda.

Adding Value?

As discussed previously, many firms have typically focused on the development of operational risk practices in response to capital and regulatory pressure while enjoying different degrees of success in embedding them into the day-to-day operations of the business. The constant challenge around operational risk is how, over and above regulatory compliance, it adds value to an organization and, in particular, how this is manifested in bottom-line results.

In order for organizations to improve financial results they need to minimize operational inefficiencies and maximize the use of available resources, that is, people, processes, systems, and assets. Although compliance with capital regulations has been a key driver for the development of operational risk frameworks, organizations are now beginning to experience and focus on how sound operational risk practices can deliver direct bottom-line value benefits. For example, better operational risk capital management and supervisory capital relief can be achieved through the operation of an effective mechanism for the identification, measurement, reporting, and management of operational risks. This allows organizations to allocate capital more effectively across the business and, at the same time, may create additional value for shareholders or re-investment opportunities by either releasing operational risk capital held or minimizing the prospect of regulatory sanctions.


Operational risk promotes the structured consideration of the potential adverse impact as well as the opportunities inherent in all business decisions. By requiring management to consider the potential downside of their actions, operational risk focuses attention on the existing resource management practices and control environment effectiveness of an organization. This provides management with a comprehensive view of operational exposures across the organization, thereby allowing for integrated responses to undesirable risk exposures and process inefficiencies. Cost savings can, therefore, be achieved through process improvement and fewer unexpected operational failures and losses.

This environment of enhanced decision making also facilitates better management of projects or programmes across an organization to ensure the delivery of anticipated benefits. Project and programme management is a key business activity that is, however, often promoted by isolated departmental drivers and financial considerations. Operational risk ensures that the benefits of projects or programmes across an organization are balanced against the cost of implementation and considers the impact of any project outputs on the business-as-usual environment. Therefore, the risk of under-delivery, unexpected costs, or failures is limited.

An operational risk focus area that has delivered considerable, quantifiable benefits to organizations is internal and external fraud risk management. The increased sophistication of fraudulent activity targeted at or existing within an organization requires the detailed understanding of the causes of fraud and its implications. It also requires careful analysis of the effectiveness of the control environment that operates to minimize exposure. Organizations that have established fraud risk investigation programmes have been able to experience the direct benefits of operational risk in terms of cost savings delivered through targeted action planning and control process improvements.

Similarly, operational risk can facilitate the improvement of processes that operate to mitigate risk exposure, which typically focus on the consideration of financial management information, such as an organization’s insurance programme, product pricing methods, and investment decisions. Carefully monitoring and reporting changes in the operational risk profile of the business can provide such specialist teams with an enhanced understanding of risk exposures, thereby allowing the delivery of cost efficiencies.

Another important area of business activity where operational risk has provided direct benefits is third party management. Traditionally, the cost saving potential of such arrangements has driven management action. As such arrangements in the financial services industry increased in complexity and number, it became apparent that the identification and management of operational uncertainties such as appropriate contractual and service level agreements, relationship management, and people and systems considerations are as critical to the success of the operation as financial considerations.

The importance of managing nonfinancial risk exposures and related controls has been highlighted by a number of corporate failures both in banking and insurance. Operational risk provides a framework for the consideration of nonfinancial factors present in every business activity, thereby promoting informed decision making and optimal resource management that can drive improvements in established practices such as insurance programmes, product pricing, and investment decisions. As operational risk management develops in sophistication and the quality and quantity of the data necessary for the meaningful analysis and reporting of operational risk exposures become increasingly available, the direct and indirect benefits of operational risk are becoming more apparent to organizations.

Specialism Versus Generalism?

Most financial services organizations have established a centralized operational risk management function with a degree of oversight and reporting responsibilities. At the same time, operational risk management remains the responsibility of business unit management, where risk is often mitigated through a number of well-established specialist functions or teams, such as fraud management, BCP, insurance, security, IT, human resources, and finance. In addition, a number of management committees and fora, whose focus may not exclusively or principally be operational risk management, consider and act upon operational risk exposures on a day-to-day basis. Although this typical governance model for operational risk has developed primarily in response to regulatory requirements, alignment between the central risk functions and risk at business level is not always achieved, limiting the value-adding potential of effective operational risk management. In the following text, we explore briefly how different governance models operated by firms affect the potential of operational risk management to add value to the business.

Some firms operate a governance model for operational risk where risk managers are integrated within the business and risk management forms part of the day-to-day management. A benefit of this approach is that management is directly engaged in the management of operational risk and considers it as part of business planning, thereby removing one of the most significant obstacles of other governance models where policy and/or management of operational risk are the responsibility of specialist teams outside the business unit. It allows greater efficiency in operational risk reporting, escalation, and action planning, while ensuring that business unit or product-related knowledge is inherent in and informs every part of the risk management process. As each business unit is able to manage operational risk to meet its own needs and reporting requirements, however, this model facilitates the creation of disparate and potentially misaligned operational risk management practices across a firm. These may operate effectively at local level, but can impede risk aggregation at a cross-functional level and result in the inefficient allocation of resources and capital across a firm’s operations. The decentralized model may operate successfully in organizations where the board and/or board’s risk committee and senior executives set a strong “tone from the top” and a risk-aware culture is practised. It implies, however, that the same resources that set operational risk management standards also implement these and are responsible for business-as-usual management activities. This may prevent independence between standard-setting and the day-to-day management of operational risk in the business, which is a cornerstone of industry good practice and a regulatory requirement.

Other firms operate a centralized model for the management of operational risk where a specialist risk team is responsible both for developing the operational risk management framework and policies and for framework implementation in the business. The risk team has direct control over specialist functions, such as fraud, BCP, and so on, and over business staff with risk responsibilities. This approach ensures that a common set of standards is developed by operational risk specialists and is applied consistently across the firm. It allows for improved understanding of the principles of operational risk management through the delivery of appropriate training to the organization. Operational risk is identified, measured, and reported by a dedicated resource using a defined set of methodologies and tools, thus facilitating risk exposure aggregation and the development of an accurate risk profile. This knowledge aids the efficient allocation of resources, and can inform strategic planning and executive decision making as well as capital allocation across the business.

Additionally, as the risk team has direct access to specialist knowledge, this model allows for timely and accurate identification and reporting of changes to the risk profile, efficient action planning, and the development of a system of internal control that is informed by business needs and is applied uniformly across the organization. The effective operation of this approach in larger firms requires the availability of a significant number of operational risk resources and a risk-aware culture both at senior executive management and functional levels. Without these, it may result in the isolation of the operational risk team from the day-to-day running of the business and create a negative perception in terms of the business benefits of operational risk management.

In response to regulatory and capital requirements, as mentioned previously, and partially due to the shortcomings related to each of the governance models discussed previously, many firms have implemented a hybrid model, where a specialist risk function provides oversight and guidance for the management of operational risk while risk is managed in the business by specialist functions (such as BCP, fraud, etc.) and business managers. This approach ensures that while responsibility for the management of operational risk lies with the business, the approach to the management of operational risk is set by a specialist team and is therefore consistent across the organization. Operational risk is consistently identified, measured, and reported, allowing for aggregation and informed decision making with regard to risk mitigation and strategic planning. The hybrid governance model should combine the best elements of the former two models and limit their shortcomings.

Considering the operating environment complexity of most large financial institutions and the corresponding multitude of reporting lines and potentially misaligned business objectives of different business units and individuals, however, the operational risk framework and associated reporting requirements may not be consistently implemented by the business. This is especially likely in governance models where the central risk team only has indirect control of risk management resources in the business. As a result, aggregation and analysis of risk exposures, capital allocation, the ability to agree and implement control environment improvements and, ultimately, strategic action planning may be impeded. Additionally, the boundaries and crossover between the operational risk team and other risk management and assurance functions may not be clear, resulting in duplication of effort and frustration in the business. Therefore, as with the models described previously, senior executive and business management commitment and the practice of a risk-aware culture that ensures any inherent weaknesses are removed by day-to-day practices are fundamental to the successful operation of this approach.

The Rebirth of Operational Risk

Firms’ experience in the management of operational risk has demonstrated that risk culture and senior management commitment are essential for successful implementation. Where culture and “tone from the top” are right, however, does operational risk need to exist? Is it simply a regulatory wrapper to a series of disparate and unconnected functions that gained credence under a particular capital regime, but from a management point of view do not necessarily connect?

As long as communication and working practices between operational risk teams, other risk management functions, and the business do not facilitate knowledge sharing, effort leveraging, or delivery of tangible business benefits, management will continue to view operational risk as a regulatory requirement that adds little value to the running of the business. On the other hand, where synergies and benefits are demonstrated through mutually beneficial working practices, a successful partnership may be created that allows the development of an approach to operational risk management that is appropriate for the business and operates to assist firms to achieve their strategic objectives in a well-controlled operating environment.

Alternatively, a genuine risk-shared service, where senior management relies on an operational risk service provider, might be beneficial. An emerging role of the group operational risk function is to provide assurance that processes are implemented and followed, but is the expertise typically in place to allow this? Most group operational risk functions are resource-constrained and therefore have to rely on subjective assessments of implementation effectiveness. As firms progress from the development of operational risk approaches to their practical implementation, these considerations require immediate management attention. Firms need to consider and clarify the objectives of their risk management programmes in their operating environment and their strategic direction in order to ensure that their investment in operational risk processes is not only a regulatory and capital tick-box exercise, but also a business enabler that gives rise to demonstrable benefits.

Basel III in its current version will increase the capital charge for operational risk based on the size of the institution. This is to some extent counterintuitive in that larger firms are able to introduce better systems and hire higher-quality staff, reducing loss incidence. However, these changes have put operational risk back at the forefront of the risk agenda, which renders consideration of the issues raised by this chapter of key importance. It is perhaps disappointing that so many firms have worked so hard in these areas but achieved so little value. Generally, we see this as being due to a failure to fully embed operational risk into the business and also to get business units to realize that it is their responsibility.


CHAPTER 3

The Issues Relating to the Use of Operational Loss Data—Internal and External

David Breden
HSBC Operational Risk Consultancy1

Introduction

In January 2001, the Basel Committee for Banking Supervision published its first consultative paper on the reform of capital adequacy regulations for internationally active banks,2 popularly known as Basel II, and ushered in a new era in operational risk management.

Because Basel II also seeks to make capital adequacy assessments risk-sensitive, and espouses the theory that those institutions that are less risky should hold less capital, it also becomes necessary to develop methodologies for quantifying or measuring operational risk exposures. Drawing on the experience of market and credit risk management, the financial services industry rapidly concluded that it would need to systematically collect data relating to operational loss events if it was to build models to quantify operational risk exposures.

Coupled with the desire to quantify operational risk for the purpose of calculating regulatory capital was the logical desire to minimize operational losses. In this case, the construction of a robust operational loss database had already been seen as a key element in enhancing the quality of operational risk management before Basel II was published. It is simple common sense for business managers to study operational losses to discover their causes and to seek to introduce measures to ensure that such losses do not recur. The systematic collection of losses in a formal database structure allows firms to build up this knowledge and monitor the completion of appropriate actions to avoid escalating losses due to a known operational weakness.

The Characteristics of Operational Risk

The study of operational losses and the developing understanding of the nature of operational risk have tended to influence the way in which we work with operational loss data. It is important to understand certain fundamental differences between operational risk on the one hand and credit and market risk on the other. It is worth exploring these differences before embarking on the consideration of the database itself.3


Context Dependency. Operational loss events occur as a result of the failings of people, processes, and systems and as a result of external threats. As such, they are highly dependent on the internal arrangements of an individual organization. It is evident that a firm with a manual settlement process will be subject to the risk that its staff make mistakes in carrying out their tasks. Such mistakes will tend to be relatively frequent, but it is to be hoped that such errors will generally also be characterized as being low impact. Contrast this to a second firm, operating in the same market area, but using an automated settlement process linked to their dealing system. This second firm will have eradicated the risk that staff members make routine mistakes when settling trades, but will be exposed to the risk of a systems failure that may prevent the firm settling transactions at all. Such a risk will probably be infrequent, but may have a much higher impact than the routine human errors suffered by the first firm—particularly if the automation has been followed by a reduction in staff numbers that prevents a simple reversion to manual working.

This feature of operational risk not only means that firms face issues of relevance when considering loss events that have occurred in other companies, but also means that their own loss data may lose relevance as the firm itself evolves and changes. This is a significant factor, because well-managed firms, after suffering an operational loss, will put in place additional controls and safeguards to prevent repetition, thereby changing the future context of the business and impairing the usefulness of the loss event as a signpost to future trends.

Absence of defined portfolio size. While banks have a clearly defined portfolio of credit and market risk and can describe the maximum size of this portfolio, no such defined portfolio size exists for operational risk. If we consider, for example, the case of a rogue trader, it will rapidly become clear that it is usually impossible to forecast the full potential effect of a future loss. The size of the exposure will depend on factors such as underlying market volatility, the length of time required to discover and close out the exposure, and so on. This can be seen clearly in the Barings case, where the Kobe earthquake and the subsequent sharp fall of the Nikkei index directly impacted the size of the loss. It is clearly impossible to forecast with any certainty the potential combinations of events that may influence a loss, to the extent that the amount of the loss can be considered as almost arbitrary, effectively the result of a random combination of contributory elements.

Scarce and incomplete data. When examining the contents of an operational loss database, it will be evident that some categories, such as credit card fraud, will be extensively populated, whilst other categories will have few, if any, events present. It is fortunate that many of these empty categories relate to potentially high-impact events. For example, it is a fact that all companies in today’s market are exposed to diversity and discrimination claims, but individual banks may have few events in their database to illustrate the extent of this potential exposure. This absence does not, however, mean that the exposure does not exist or is minimized in any way, given the evolution of anti-discrimination legislation in recent years. To exacerbate this problem, new risks are emerging continually as banks introduce new products, systems, and processes for which they will have no data until such time as loss is experienced.

These factors all impact on the effectiveness of historic loss data as a wholly reliable indicator of future exposures and will, therefore, influence the ways in which loss data must be used.

The Regulatory Requirements

Basel II has the force of guidance for internationally active banks, but the content of Basel is now being incorporated into local legislation in Europe and elsewhere. Whilst limiting myself to considering the documents published by the Basel Committee for Banking Supervision at the Bank for International Settlements, I must, however, repeat the warning that local regulatory authorities may have changed the emphasis contained in the Basel papers for the local market that they regulate.

The main body of the Basel documentation4 concentrates on the different formulas to be used to calculate minimum regulatory capital figures. Various approaches for calculating capital are proposed, ranging from the simple, entry-level basic indicator approach (BIA) through the standardized approach (STA) and on to the advanced measurement approaches (AMA). In all of these approaches, the BIS stresses the importance of tracking operational loss events.

For the BIA, no specific requirements for qualitative standards can be defined as this is an entry-level approach that financial institutions will use if they are unable to move to a more advanced approach. All institutions are, however, “encouraged to comply”5 with the Basel Committee Sound Practices Paper.6 This paper contains a series of 10 principles designed to assist financial institutions in building a robust operational risk management framework and in the risk management activities that should form part of the framework. The fifth principle in the list refers to the need to monitor and report on exposures to material losses:

Principle 5: Banks should implement a process to regularly monitor operational risk profiles and material exposures to losses. There should be regular reporting of pertinent information to senior management and the board of directors that supports the proactive management of operational risk.7

The aim of monitoring exposures is explained at length in the body of the paper.8 The objective is firmly aimed at the quality of operational risk management. Prompt detection and appropriate action will reduce the potential frequency and the severity of a loss event. Indicators should be developed on the basis of loss events in an effort to develop a forward-looking early warning system for future losses, and senior management and the board should receive information on events and exposures to ensure that appropriate action is taken.


Under the standardized approach, the requirement is more specific:

As part of the bank’s internal operational risk assessment system, the bank must systematicallytrack relevant operational risk data including material losses by business line.9

And again:

There must be regular reporting of operational risk exposures, including material operational losses, to business unit management, senior management, and to the board of directors. The bank must have procedures for taking appropriate action according to the information within the management reports.10

We see here that the BIS is once again stressing the qualitative aspects of operational risk management by insisting that action be taken to correct deficiencies revealed by loss events, unless, of course, the firm decides that it is prepared to accept the risk reported.

The advanced measurement approaches, whereby specific modeling activities are proposed (as opposed to the use of a proxy to develop a capital exposure figure in the BIA or STA), are insistent on the importance of the loss database.

The requirement for qualitative action to correct deficiencies identified by loss events is repeated,11 but this is reinforced with an obligation to use loss data in the creation of the model for development of a minimum regulatory capital figure:

Any operational risk measurement system must have certain key features to meet the supervisory soundness standard set out in this section. These elements must include the use of internal data, relevant external data, scenario analysis, and factors reflecting the business environment and internal control systems.12

The different components must be weighted and combined in such a way as to enable the bank to establish that it has explored the total extent of its exposures to the full range of operational risks to a 99.9 percent confidence level. This will imply a clear understanding of the different factors that drive loss in the organization and in the industry. The challenge is to gaze into an uncertain fog of potential exposures and select a level sufficient to ensure that the bank will always hold adequate regulatory capital to allow it to withstand all but the most catastrophic loss.
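One common way of giving effect to a 99.9 percent soundness standard is a loss distribution approach: simulate an annual aggregate loss from a frequency and a severity distribution and read off the 99.9th percentile. The sketch below uses illustrative Poisson/lognormal parameters for a single unit of measure; a real AMA model would also weight in external data, scenario analysis, and business environment and internal control factors, and would address dependency between units of measure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters; in practice these would be fitted from
# internal and external loss data and informed by scenario analysis.
lam = 25.0             # expected number of losses per year (Poisson)
mu, sigma = 9.0, 2.0   # lognormal severity parameters

# Simulate the distribution of annual aggregate losses.
n_years = 100_000
annual = np.array([rng.lognormal(mu, sigma, size=n).sum()
                   for n in rng.poisson(lam, size=n_years)])

op_var = np.quantile(annual, 0.999)  # loss exceeded once in 1,000 years
print(f"99.9% OpVaR: {op_var:,.0f} (mean annual loss: {annual.mean():,.0f})")
```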

Internal Loss Data

The tracking of internal loss event data is an essential prerequisite to the development and functioning of a credible operational risk measurement system.13

Financial institutions are reminded of the importance that data should remain relevant, tying the use of loss data to their current range of activities. Data should be collected over a five-year observation period (reduced to three years when a bank first moves to AMA) and it should be mapped to the business line and loss data structure outlined in the Basel II papers.14

Loss data must be comprehensive above a threshold (suggested to be €10,000) and details gathered should include a wide range of data with both qualitative and quantitative objectives.

Policies must be created to deal with losses in centralized functions (such as IT, etc.) or those that span business lines or time periods. Operational losses that have historically been included in credit risk capital assessments (incomplete or inadequate documentation leading to loss) will be identified in the operational risk database for qualitative action. They will continue, however, to be included in the credit risk model and will, therefore, be excluded from the operational risk model. Losses related to market risk will now be included in the operational risk database and model.15

Building the Internal Loss Database

Basel II indicates that any financial institution needs an operational loss database. Before we can proceed to build the database, however, we need to answer a basic question—what do we want to use the loss data for?

From the BIS papers, it can be seen that the BIA and STA focus on a largely qualitative use of data where the focus is on correcting flaws in order to prevent unnecessary repetition of loss events. The AMA, however, also requires an increased concentration on the use of data in a capital adequacy model, while ensuring that the output of the model is actively used in the management of operational risk.

We need to decide where our focus lies. Do we only wish to use the data to improve the way in which we carry out our business, or do we just intend to use the data to develop an AMA model for calculation of minimum operational risk regulatory capital? Is it the case that we wish to meet both objectives? The answer to this very basic question will determine what data we collect and how we choose to use it.

Once this decision has been made, we will also be in a good position to conclude what information we wish to extract from the database and what reports need to be submitted to the business unit, to senior management, and to the board. This will determine the structure of the database and ensure that management receives the information it needs.
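A minimal sketch of a loss-event record is shown below; the field names and structure are illustrative assumptions rather than a prescribed Basel II schema, but they reflect the kinds of attributes discussed in this chapter.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class OperationalLossEvent:
    """One row in the internal loss database (illustrative fields)."""
    event_id: str
    business_line: str           # mapped to the Basel II business lines
    event_type: str              # mapped to the Basel II event types
    occurrence_date: date
    discovery_date: date
    gross_loss: float
    recoveries: float = 0.0      # e.g., insurance recoveries
    indirect_costs: float = 0.0  # "soft" costs: overtime, remediation
    near_miss: bool = False      # controls failed but no direct loss
    cause: str = ""              # people / process / systems / external
    actions: list = field(default_factory=list)  # remediation tracking

    @property
    def net_loss(self) -> float:
        return self.gross_loss - self.recoveries
```

Reports to business units, senior management, and the board can then be built as aggregations over such records, for example net loss by business line and event type per quarter.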

The Contents of the Internal Loss Database

Losses

It will seem evident that we need to include losses in our database. At first sight, this appears to be an easy concept to grasp and comply with, but in fact we will need to be careful to draw clear guidelines as to what should be reported and included. This will require a definition of the term “loss” and an indication of the characteristics of the event to be reported. Some of the questions that we will need to consider relate to the level of any threshold, the relevance of the loss, and the nature of the impact.

(a) Threshold levels

A business whose prime aim is to enhance the quality of business operations will wish to examine all losses, however small, on the basis that a small loss in one specific context can become a major event in another situation.

If we take the example of “fat finger” syndrome, where a trader or dealer using the numerical section of a standard keyboard hits the 0 and the adjacent 00 key at the same time, we will understand this argument better. If a dealer buys or sells 1,000 shares instead of 10, the impact will generally be small. If, however, the same dealer buys or sells 1,000,000 instead of 10,000, then the loss becomes magnified and much more significant, as has been seen in markets in Japan and London. In this example, an identical error has been made and the effects of the larger error can be easily extrapolated from the smaller event. The amount of the loss will vary with the context surrounding the insertion of the additional 00. From the qualitative point of view, if the problem is addressed when a small error occurs then it is possible that corrective action may be taken before the large loss can occur. This may entail fitting plastic screens to the keyboard or removing the 00 key to prevent it being accidentally struck, or even the introduction of systems enhancements to query transactions above a limit before execution. A quantitative analysis might involve the exploration of potential loss scenarios to explore the full extent of the risk to ensure that the appropriate weight is attached to the issue.

The argument against reporting all losses from a qualitative point of view is one of volume. It is impossible to examine every fraudulent transaction in a credit card unit, so it is important to decide from the outset the policy on threshold levels for reporting. All losses will certainly be reported locally, but it is normal that only losses over a threshold are reported centrally and included in the modeling exercise. The local business unit should, however, be encouraged to report any events that can be considered unusual or indicative of a potential for increasingly frequent or severe losses.

The debate from the quantitative point of view is slightly different. The modeling of many, very small events that by their nature cannot develop into large events is time-consuming and will not significantly affect capital calculations. As a consequence, many firms choose to impose a threshold for practical reasons, on the basis that the small losses are budgeted for and can, therefore, either be excluded from the capital calculation or possibly be reflected in a scaling operation to reflect a constant bedrock of small losses. It has to be recognized that such a decision to truncate data will have to be taken into account when conducting a frequency analysis of loss events because of the understatement of the number of low-value losses in the data sample, which may skew a modeler’s frequency distribution.
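The effect of a reporting threshold on modeling can be illustrated with a truncation-aware maximum likelihood fit. The sketch below assumes lognormal severities and a 10,000 reporting threshold, both illustrative; a naive fit to the observed (truncated) sample gives biased parameters, while renormalizing the likelihood by the exceedance probability recovers estimates close to the true ones.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
threshold = 10_000.0

# Simulate a "true" loss process, then discard losses below the
# reporting threshold, as a collection policy would.
true_mu, true_sigma = 8.0, 1.5
all_losses = rng.lognormal(true_mu, true_sigma, size=5_000)
observed = all_losses[all_losses >= threshold]

def neg_log_lik(params):
    mu, sigma = params[0], abs(params[1])  # keep sigma positive
    # Left-truncated likelihood: f(x) / (1 - F(threshold)).
    tail = stats.lognorm.sf(threshold, s=sigma, scale=np.exp(mu))
    logpdf = stats.lognorm.logpdf(observed, s=sigma, scale=np.exp(mu))
    return -(logpdf - np.log(tail)).sum()

res = optimize.minimize(neg_log_lik, x0=[9.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], abs(res.x[1])
print("Naive fit       :", np.log(observed).mean(), np.log(observed).std())
print("Truncation-aware:", mu_hat, sigma_hat)  # close to (8.0, 1.5)
```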

(b) Relevance


We have seen from the regulatory documents that there is an insistence on the need for relevance when including loss events in a set of data that are to be modeled. The importance of the question can best be illustrated by the example of a firm that suffers a lengthy series of small losses due to reconciliation failures in a manual reconciliation process. If the firm, tired of such losses, decides to replace the manual process with an automated system, then the firm has changed its risk profile and the loss history is no longer relevant.

Equally, if a subsidiary is sold, the view may be taken that losses incurred by that business should be excluded from the database, as they are no longer relevant. A contrary view states that these losses can remain in the database as a proxy for new activities where losses have yet to be suffered.

It is probably most appropriate to examine the losses to decide whether these are events that are inextricably linked to the activity that has been disposed of or whether they reflect exposures that are linked to the remaining parts of the organization. Again, it will be for the central operational risk function to set out a policy for the organization to comply with.

(c) Nature of losses

It is tempting to think of losses as being monetary in nature and easily traceable to entries in the general ledger account. This, however, is not always the case. Operational events will often have little or no financial impact on the firm, but their effect will be felt by the staff or by customers, or will impact on the firm’s image, leading to loss of future business.

If we consider the example of a major failure of an ATM network over a holiday weekend, it is easy to see that the direct financial impact to the firm may be negligible. This statement disregards the inconvenience to clients who are unable to withdraw cash and whose dissatisfaction will be reflected in complaints that will require response resources, negative press reports, and potential loss of business. Another example would be the failure of a system where tangible loss is avoided by staff working long hours of overtime to ensure that transactions are properly completed.

These are unquestionably loss events that have an impact beyond the financial and, for the sake of completeness, a procedure should be devised to capture the indirect or soft costs associated with the event.

Near Misses and Fortuitous Profits

A near miss is probably best defined as an operational loss event where all established controls have failed, but loss has been prevented by an extraneous intervention. An example is the case where funds are sent in error to the wrong client who, when the error is discovered, returns the funds without discussion. The error has been made and funds paid away, but the honesty of the recipient has avoided the loss.

Such events should be recorded and the lessons connected to the event learnt. Analysis of the event will reveal that a loss could have been suffered, and the elements that would have contributed to such a loss can be analyzed and used to create loss scenarios. Such scenarios will ensure that the proper
