LO 45.2: Describe extreme value theory (EVT) and its use in risk management.
Extreme value theory (EVT) is a branch of applied statistics that has been developed to address problems associated with extreme outcomes. EVT focuses on the unique aspects of extreme values and is different from central tendency statistics, in which the central-limit theorem plays an important role. Extreme value theorems provide a template for estimating the parameters used to describe extreme movements.
Page 96
2018 Kaplan, Inc.
Topic 45 Cross Reference to GARP Assigned Reading – Dowd, Chapter 7
One approach for estimating parameters is the Fisher-Tippett theorem (1928). According to this theorem, as the sample size n gets large, the distribution of extremes, denoted Mn, converges to the following distribution known as the generalized extreme value (GEV) distribution:

F(X) = exp{-[1 + ξ(x - μ)/σ]^(-1/ξ)}   if ξ ≠ 0

F(X) = exp{-exp[-(x - μ)/σ]}           if ξ = 0

For these formulas, the following restriction holds for random variable X: 1 + ξ(x - μ)/σ > 0.
The parameters μ and σ are the location parameter and scale parameter, respectively, of the limiting distribution. Although related to the mean and variance, they are not the same. The symbol ξ is the tail index and indicates the shape (or heaviness) of the tail of the limiting distribution. There are three general cases of the GEV distribution:
1. ξ > 0: the GEV becomes a Frechet distribution, and the tails are heavy, as is the case for the t-distribution and Pareto distributions.
2. ξ = 0: the GEV becomes the Gumbel distribution, and the tails are light, as is the case for the normal and lognormal distributions.
3. ξ < 0: the GEV becomes the Weibull distribution, and the tails are lighter than a normal distribution.
Distributions with ξ < 0 are rarely encountered in financial data, so the realistic choice is between ξ > 0 and ξ = 0. Therefore, one practical consideration the researcher faces is whether to assume either ξ > 0 or ξ = 0 and apply the respective Frechet or Gumbel distributions and their corresponding estimation procedures. There are three basic ways of making this choice:
1. The researcher is confident of the parent distribution. If the researcher is confident it is a t-distribution, for example, then the researcher should assume ξ > 0.
2. The researcher applies a statistical test and cannot reject the hypothesis that ξ = 0. In this case, the researcher uses the assumption ξ = 0.
3. The researcher may wish to be conservative and assume ξ > 0 to avoid model risk.
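To make the two GEV cases concrete, the sketch below evaluates the CDF formulas above in Python (the parameter values are purely illustrative, and the function name is ours). It shows that the Frechet case (ξ > 0) assigns more probability to a large observation than the Gumbel case (ξ = 0), which is what "heavy tails" means in practice:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """Generalized extreme value (GEV) CDF F(x; mu, sigma, xi).

    mu: location, sigma: scale (> 0), xi: tail index.
    For xi != 0 the support requires 1 + xi*(x - mu)/sigma > 0.
    """
    z = (x - mu) / sigma
    if xi == 0.0:                        # Gumbel case (light tails)
        return math.exp(-math.exp(-z))
    t = 1.0 + xi * z
    if t <= 0.0:
        raise ValueError("x outside support: need 1 + xi*(x - mu)/sigma > 0")
    return math.exp(-t ** (-1.0 / xi))   # Frechet (xi > 0) or Weibull (xi < 0)

# Tail probability of exceeding x = 3 under each case (mu = 0, sigma = 1)
p_frechet = 1 - gev_cdf(3.0, 0.0, 1.0, 0.3)  # heavy tail
p_gumbel  = 1 - gev_cdf(3.0, 0.0, 1.0, 0.0)  # light tail
```

As expected, `p_frechet` exceeds `p_gumbel`, illustrating why assuming ξ > 0 is the conservative choice.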
Peaks-Over-Threshold
LO 45.1: Explain the importance and challenges of extreme values in risk management.
The occurrence of extreme events is rare; however, it is crucial to identify these extreme events for risk management since they can prove to be very costly. Extreme values are the result of large market declines or crashes, the failure of major institutions, the outbreak of financial or political crises, or natural catastrophes. The challenge of analyzing and modeling extreme values is that there are only a few observations for which to build a model, and there are ranges of extreme values that have yet to occur.
To meet the challenge, researchers must assume a certain distribution. The assumed distribution will probably not be identical to the true distribution; therefore, some degree of error will be present. Researchers usually choose distributions based on measures of central tendency, which misses the issue of trying to incorporate extreme values. Researchers need approaches that specifically deal with extreme value estimation. Incidentally, researchers in many fields other than finance face similar problems. In flood control, for example, analysts have to model the highest possible flood line when building a dam, and this estimation would most likely require a height above observed levels of flooding to date.
Extreme Value Theory
LO 44.3: Describe general and specific criteria recommended by the Basel Committee for the identification, collection, and treatment of operational loss data.
Banks that incorporate the loss component into the SMA calculation must follow these general criteria:
- Documented processes and procedures must be in place for the identification, collection, and treatment of internal loss data.
- A bank must maintain information on each operational risk event, including gross loss amounts, the date of occurrence (when the event first began or happened), the date of discovery (when the bank became aware of the event), the date of accounting (when the reserve, loss, or loss provision was first recognized in the bank's income statement), any gross loss amount recoveries, and the drivers of the loss event itself.
- Specific criteria must exist for loss data assignments stemming from centralized function events and related events over time (considered grouped losses).
For the purposes of calculating minimum regulatory capital per the SMA framework, operational risk losses tied to credit risk are excluded from the calculation, while operational risk losses tied to market risk are included in the SMA calculation.
A bank has to be able to document any criteria used to allocate losses to specific event types. In addition, a bank must be able to categorize historical internal loss data into the appropriate Level 1 supervisory categories per the Basel II Accord (Annex 9) and be prepared to provide this to supervisors when requested.
An observation period of 10 years must be used as a basis for internally generated loss data calculations. On an exception basis, and as long as good-quality data is not available for more than a five-year period, a bank first moving to the SMA can use a five-year observation period. Internal loss data must be comprehensive in nature and capture all material exposures and activities across all geographic locations and subsystems. When a bank first moves to the SMA, a €20,000 de minimis gross loss threshold is acceptable. Afterward, this threshold is lowered to €10,000.
Topic 44 Cross Reference to GARP Assigned Reading – Basel Committee on Banking Supervision
In addition to the general criteria noted previously, the following specific criteria must also be followed:
- A policy must exist for each bank that sets the criteria for when an operational risk event or loss (recorded in the internal loss event database) is included in the loss data set for calculating the SMA regulatory capital amount (i.e., the SMA loss data set).
- For all operational loss events, banks must be able to specifically identify gross loss amounts, insurance recoveries, and non-insurance recoveries. A gross loss is a loss before any recoveries, while a net loss takes into account the impact of recoveries. The SMA loss data cannot include losses net of insurance recoveries.

In calculating the gross loss for the SMA loss data set, the following components must be included:
- External expenses (legal fees, advisor fees, vendor costs, etc.) directly tied to the operational risk event itself, and any repair/replacement costs needed to restore the bank to the position it was in before the event occurred.
- Settlements, impairments, write-downs, and any other direct charges to the bank's income statement as a result of the operational risk event.
- Any reserves or provisions tied to the potential operational loss impact and booked to the income statement.
- Losses (tied to operational risk events) that are definitive in terms of financial impact but remain as pending losses because they are in transition or suspense accounts not reflected on the income statement. Materiality will dictate whether the loss is included in the data set.
- Timing losses booked in the current financial accounting period that are material in nature and are due to events that give rise to legal risk and cross more than one financial accounting period.

In calculating the gross loss for the SMA loss data set, the following components must be excluded:
- The total cost of improvements, upgrades, and risk assessment enhancements and initiatives that are incurred after the risk event occurs.
- Insurance premiums.
- The costs associated with general maintenance contracts on property, plant, and equipment (PP&E).
For every reporting year of the SMA regulatory capital, the gross losses included in the loss data set must incorporate any financial adjustments (additional losses, settlements, provision changes) made within the year for risk events with reference dates up to 10 years before that reporting year. The operational loss amount after adjustments must then be identified and compared to the €10 million and €100 million thresholds.
The only two dates a bank can use to build its SMA loss data set are the date of discovery or the date of accounting. For any legal loss events, the date of accounting (when the legal reserve representing the probable estimated loss is first recognized) is the latest date that can be used for the loss data set.
Any losses that are related to a common operational risk event or are related by
operational risk events over time are considered grouped losses and must be entered as a single loss into the SMA loss data set.
The circumstances, data types, and methodology for grouping data should be defined
with criteria found in the individual bank's internal loss data policy. In instances where individual judgment is needed to apply the criteria, this must be clarified and documented.
Key Concepts
LO 44.1 The standardized measurement approach (SMA) includes both a business indicator (BI) component accounting for operational risk exposure and an internal loss multiplier and loss component accounting for operational losses unique to an individual bank. While the BI component is factored into the SMA for banks of all sizes, the impact it has on the SMA calculation will vary depending on where the bank is classified within buckets 1-5. The loss component is factored in for all banks classified in buckets 2-5.
LO 44.2 The older advanced measurement approach (AMA) allowed banks to use a vast range of models that were inherently more flexible for individual banks but prevented valuable comparisons among banks. From this, the SMA was created as a non-model-based approach used to assess operational risk using both financial statement measures and loss data unique to individual banks.
LO 44.3 For identifying, collecting, and accounting for operational loss data, the Basel Committee has outlined several general and specific criteria that should be used. Key general criteria include processes and procedures, documentation needed, thresholds for capturing losses, and appropriate periods. Specific criteria include how to calculate gross losses (what is included versus what is excluded), key dates used to capture the losses, how to quantify grouped losses, and policies needed.
Concept Checkers
1. The business indicator (BI) component in the standardized measurement approach (SMA) calculation for a bank with a BI of €13 billion will be closest to:
A. €1.43 billion.
B. €1.91 billion.
C. €2.43 billion.
D. €13.00 billion.
2. Which of the following items from the profit & loss (P&L) statement should be included in the BI component calculation?
A. Administrative expenses.
B. Insurance premiums paid.
C. Depreciation related to capitalized equipment.
D. Provision reversals related to operational loss events.
3. Which of the following components within the BI calculation takes into account a bank's trading and banking book P&L results?
A. Loss component.
B. Services component.
C. Financial component.
D. Interest, lease, dividend component.
4. Which of the following statements best describes a difference between the SMA and the older operational risk capital approaches?
A. The standardized approach (TSA) and the alternative standardized approach (ASA) were variations of the SMA.
B. The advanced measurement approach (AMA) was more flexible in its application than the SMA.
C. The SMA accounts for internal loss experiences that were not factored into the AMA.
D. The SMA uses a model-based methodology, while the AMA was more flexible and principles-based.
5. In deriving the SMA loss data set for an individual bank, each of the following items will most likely be included in the gross loss calculation except:
A. legal fees of €900,000 associated with an unusual risk event.
B. a €2 million settlement tied to a recent operational risk event.
C. a €1.4 million reserve booked to the income statement to cover a potential operational loss.
D. €1.73 million spent on maintenance contracts tied to the bank's property, plant, and equipment (PP&E).
Concept Checker Answers
1. C A bank with a BI of €13 billion will fall into bucket 4, which covers a BI range of €10 billion to €30 billion. With the BI component formula of €1.74 billion + 0.23(BI - €10 billion) for bucket 4 banks, the BI component for this bank will be equal to €1.74 billion + 0.23(€13 billion - €10 billion) = €2.43 billion.
2. D A provision reversal would normally be excluded except when it relates to operational loss
events. Each of the other three choices represents a P&L item that should be excluded from the BI component calculation.
3. C The formula for the financial component of the BI calculation is:

financial component (FC) = abs(net P&L TBavg) + abs(net P&L BBavg)

with TB representing the trading book and BB representing the banking book.
4. B Because banks were able to use a wide range of models for calculating the AMA, there was more flexibility to these approaches than under the new SMA. TSA and ASA were older approaches rather than variations of the SMA. AMA did account for internal losses. The SMA is non-model-based, whereas the AMA did incorporate bank-specific models.
5. D The costs associated with maintenance contracts for PP&E are outlined in the specific
criteria for collecting operational loss data as excluded for the purposes of calculating the gross loss for the SMA loss data set.
The following is a review of the Operational and Integrated Risk Management principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
Parametric Approaches (II): Extreme Value
Topic 45
Exam Focus
Extreme values are important for risk management because they are associated with catastrophic events such as the failure of large institutions and market crashes. Since they are rare, modeling such events is a challenging task. In this topic, we will address the generalized extreme value (GEV) distribution and the peaks-over-threshold approach, as well as discuss how the peaks-over-threshold approach converges to the generalized Pareto distribution.
Managing Extreme Values
LO 44.2: Compare the SMA to earlier methods of calculating operational risk capital, including the Alternative Measurement Approaches (AMA), and explain the rationale for the proposal to replace them.
Before the development of the SMA, banks were using either the advanced measurement approach (AMA), the standardized approach (TSA), or its variation, the alternative standardized approach (ASA), to assess operational risk. The advanced measurement
approach, which was introduced as part of the Basel II framework in 2006, allowed for the estimation of regulatory capital based on a range of internal modeling practices. This approach was a principles-based framework allowing for significant flexibility. Although the hope of the Basel Committee was for best practices to emerge as flexibility declined, this never happened and challenges associated with comparability among banks (due to a wide range of modeling practices) and overly complex calculations remained.
Given these challenges, the Basel Committee set a goal of creating a new measure to allow for greater comparability and less complexity relative to prior methods. The SMA was created as this measure, with the intent of providing a means of assessing operational risk that would include both a standardized measure of operational risk and bank-specific loss data. Unlike AMA, the SMA is a single, non-model-based method used to estimate operational risk capital that combines financial statement information with the internal loss experience of a specific bank. The SMA is to be applied to internationally active banks on a consolidated basis, whereas it is optional for non-internationally active institutions. Although it is a relatively new measure, the SMA combines key elements of the standardized approach along with an internal loss experience component that was central to older approaches.
Identification, Collection, and Treatment of Operational Loss Data
LO 44.1: Explain the elements of the proposed Standardized Measurement Approach (SMA), including the business indicator, internal loss multiplier and loss component, and calculate the operational risk capital requirement for a bank using the SMA.
The standardized measurement approach (SMA) represents the combination of a financial statement operational risk exposure proxy (termed the business indicator, or BI) and operational loss data specific to an individual bank. Because using only a financial statement proxy such as the BI would not fully account for the often significant differences in risk profiles between medium to large banks, the historical loss component was added to the SMA to account for future operational risk loss exposure. As such, the loss component serves both to enhance the SMA's sensitivity to risk and to offer an incentive for a bank to improve its operational risk management practices. A bank will be required to hold less in operational risk regulatory capital with fewer operational risk losses and a more effective risk management system.
The Business Indicator
The business indicator (BI) incorporates most of the same income statement components that are found in the calculation of gross income (GI). A few differences include:
- Positive values are used in the BI (versus some components incorporating negative values into the GI).
- The BI includes some items that tie to operational risk but are netted or omitted from the GI calculation.
The SMA calculation has evolved over time, as there were several issues with the first calculation that have since been remedied in the latest version. These changes include:
- Modifying the services component to equal max(fee income, fee expense) + max(other operating income, other operating expense). This change still allowed banks with large service business volumes to be treated differently from banks with small service businesses, while also reducing the inherent penalty applied to banks with both high fee income and high fee expenses.
- Including dividend income in the interest component, which alleviated the differing treatment among institutions as to where dividend income is accounted for on their income statements.
- Adjusting the interest component by the ratio of the net interest margin (NIM) cap (set at 3.5%) to the actual NIM. Before this adjustment, banks with high NIMs (calculated as net interest income divided by interest-earning assets) were penalized with high regulatory capital requirements relative to their true operational risk levels.
- For banks with high fee components (those with shares of fees in excess of 50% of the unadjusted BI), modifying the BI such that only 10% of the fees in excess of the unadjusted BI are counted.
- Netting and incorporating all financial and operating lease income and expenses into the interest component as an absolute value to alleviate inconsistent treatment of leases.
Business Indicator Calculation
The BI is calculated as the most recent three-year average for each of the following three components:
BI = ILDCavg + SCavg + FCavg

where:
ILDC = interest, lease, dividend component
SC = services component
FC = financial component
The three individual components are calculated as follows, using three years of average data:
interest, lease, dividend component (ILDC) = min[abs(IIavg - IEavg), 0.035 x IEAavg] + abs(LIavg - LEavg) + DIavg

where:
abs = absolute value
II = interest income (excluding operating and finance leases)
IE = interest expenses (excluding operating and finance leases)
IEA = interest-earning assets
LI = lease income
LE = lease expenses
DI = dividend income
services component (SC) = max(OOIavg, OOEavg) + max{abs(FIavg - FEavg), min[max(FIavg, FEavg), 0.5 x uBI + 0.1 x (max(FIavg, FEavg) - 0.5 x uBI)]}

where:
OOI = other operating income
OOE = other operating expenses
FI = fee income
FE = fee expenses
uBI = unadjusted business indicator = ILDCavg + max(OOIavg, OOEavg) + max(FIavg, FEavg) + FCavg
financial component (FC) = abs(net P&L TBavg) + abs(net P&L BBavg)

where:
P&L = profit & loss statement line item
TB = trading book
BB = banking book
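Pulling the three component formulas together, a minimal sketch of the BI calculation in Python might look like the following. The function and argument names are illustrative (not from the Basel text), and every input is assumed to already be a three-year average:

```python
def business_indicator(ii, ie, iea, li, le, di,   # interest/lease/dividend items
                       ooi, ooe, fi, fe,          # services items
                       net_pl_tb, net_pl_bb):     # financial items
    """BI = ILDC + SC + FC from three-year-average P&L inputs.

    ii/ie: interest income/expense (excl. leases); iea: interest-earning assets
    li/le: lease income/expense; di: dividend income
    ooi/ooe: other operating income/expense; fi/fe: fee income/expense
    net_pl_tb/net_pl_bb: net P&L of the trading and banking books
    """
    ildc = min(abs(ii - ie), 0.035 * iea) + abs(li - le) + di
    fc = abs(net_pl_tb) + abs(net_pl_bb)
    ubi = ildc + max(ooi, ooe) + max(fi, fe) + fc      # unadjusted BI
    fee_cap = 0.5 * ubi + 0.1 * (max(fi, fe) - 0.5 * ubi)  # high-fee adjustment
    sc = max(ooi, ooe) + max(abs(fi - fe), min(max(fi, fe), fee_cap))
    return ildc + sc + fc

# Hypothetical inputs (same currency units throughout)
bi = business_indicator(100.0, 40.0, 2000.0, 10.0, 5.0, 2.0,
                        15.0, 8.0, 50.0, 30.0, 20.0, -10.0)  # -> 162.0
```

Note that the 0.035 factor applies the 3.5% NIM cap described earlier, and the fee adjustment only bites when fees are large relative to the unadjusted BI.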
For the purposes of calculating the SMA, banks (based on their size for the BI component) are divided into five buckets as shown in Figure 1.
Figure 1: BI Buckets

Bucket   BI Range                      BI Component
1        €0 to €1 billion              0.11 x BI
2        €1 billion to €3 billion      €110 million + 0.15(BI - €1 billion)
3        €3 billion to €10 billion     €410 million + 0.19(BI - €3 billion)
4        €10 billion to €30 billion    €1.74 billion + 0.23(BI - €10 billion)
5        €30 billion and above         €6.34 billion + 0.29(BI - €30 billion)
While a bank's internal losses are not factored in for the bucket 1 group, internal losses are factored in for banks in buckets 2-5 to the extent that they allow for differentiation among banks with different risk profiles. As is evident from Figure 1, there is both a linear increase in the BI component within a given bucket and an increase in the marginal impact (i.e., 0.11 for bucket 1, 0.15 for bucket 2, etc.) of the BI for banks in higher versus lower buckets.
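The piecewise bucket schedule is straightforward to code; a sketch (amounts expressed in billions of euros, with the function name ours):

```python
def bi_component(bi):
    """BI component by bucket; bi and result in billions of euros."""
    if bi <= 1:
        return 0.11 * bi
    if bi <= 3:
        return 0.110 + 0.15 * (bi - 1)
    if bi <= 10:
        return 0.410 + 0.19 * (bi - 3)
    if bi <= 30:
        return 1.74 + 0.23 * (bi - 10)
    return 6.34 + 0.29 * (bi - 30)

bic = bi_component(13.0)  # bucket 4: 1.74 + 0.23 x 3, approximately 2.43
```

Notice that the schedule is continuous at each bucket boundary (e.g., both the bucket 1 and bucket 2 formulas give 0.110 at a BI of 1 billion), so the linear increase within buckets joins up across buckets.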
The BI component calculation should exclude all of the following P&L items: administrative expenses, recovery of administrative expenses, impairments and impairment reversals, provisions and reversals of provisions (unless they relate to operational loss events), fixed asset and premises expenses (unless they relate to operational loss events), depreciation and amortization of assets (unless it relates to operating lease assets), expenses tied to share capital repayable on demand, income/expenses from insurance or reinsurance businesses, premiums paid and reimbursements/payments received from insurance or reinsurance policies, goodwill changes, and corporate income tax.
Internal Loss Multiplier Calculation
Through the addition of a loss component, the SMA becomes more sensitive to risk than it would be with just the BI component alone. As highlighted previously, internal losses become a relevant factor for banks in buckets 2-5. Internal losses are factored into the SMA calculation via the internal loss multiplier, which is calculated as follows:
internal loss multiplier = ln[e - 1 + (loss component / BI component)]

where:
loss component = 7 x average total annual loss + 7 x average total annual loss only including loss events above €10 million + 5 x average total annual loss only including loss events above €100 million

The loss component serves to reflect the operational loss exposure based on a bank's internal loss experiences. To differentiate between banks with similar average loss totals but differing loss distributions, the loss component distinguishes between smaller loss events and those above €10 million and €100 million. The logarithmic function contained within the internal loss multiplier means the multiplier increases at a decreasing rate (with the loss component) and has a lower bound equal to ln(e - 1) ≈ 0.541.
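A short sketch of these two formulas in Python (the function names are ours, and the loss averages are assumed to be euro amounts over the observation period):

```python
import math

def loss_component(avg_total, avg_above_10m, avg_above_100m):
    """Loss component: weighted sum of average total annual losses,
    with extra weight on losses above the 10M and 100M thresholds."""
    return 7 * avg_total + 7 * avg_above_10m + 5 * avg_above_100m

def internal_loss_multiplier(lc, bic):
    """ILM = ln(e - 1 + LC/BIC): equals 1 when LC == BIC, grows at a
    decreasing rate with LC, lower bound ln(e - 1) ~ 0.541 as LC -> 0."""
    return math.log(math.e - 1 + lc / bic)
```

For example, `internal_loss_multiplier(5.0, 5.0)` returns 1.0, matching the statement below that an average-risk bank (loss component equal to its BI component) has a multiplier of one.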
Ideally, a bank will have 10 years of quality data to calculate the averages that go into the loss component calculation. If 10 years are not available, then during the transition to the SMA calculation, banks may use 5 years and add more years as time progresses until they reach the 10-year requirement. If a bank does not have 5 years of data, then the BI component becomes the only component of the SMA calculation.
A bank whose exposure is considered average relative to its industry will have a loss component equivalent to its BI component; this implies an internal loss multiplier equal to one and an SMA capital requirement equal to its BI component. If a bank's loss experience is greater (less) than the industry average, its loss component will be above (below) the BI component and its SMA capital will be above (below) the BI component.
SMA Capital Requirement Calculation
The SMA is used to determine the operational risk capital requirement and is calculated as follows:
For BI bucket 1 banks:
SMA capital = BI component
For BI bucket 2-5 banks:

SMA capital = €110 million + (BI component - €110 million) x internal loss multiplier
The amounts used in the BI component, which are bucket-dependent, will follow the equations shown in the BI component column of Figure 1. The internal loss multiplier is calculated per the previous section.
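Combining the bucket formula with the internal loss multiplier, the capital requirement could be sketched as follows (euro amounts; the function name and inputs are illustrative). An average-risk bank with a loss component equal to its BI component ends up holding exactly its BI component:

```python
import math

def sma_capital(bic, lc, bucket):
    """SMA capital requirement in euros.

    bic: BI component, lc: loss component, bucket: 1-5.
    Bucket 1 banks hold the BI component; buckets 2-5 blend in the
    internal loss multiplier ln(e - 1 + LC/BIC) around a 110M base.
    """
    if bucket == 1:
        return bic
    ilm = math.log(math.e - 1 + lc / bic)
    return 110e6 + (bic - 110e6) * ilm

# LC == BIC gives a multiplier of 1, so capital equals the BI component
avg_bank = sma_capital(2.43e9, 2.43e9, bucket=4)
```

A bank with below-average losses (LC below BIC) gets a multiplier below one and holds less than its BI component, which is the risk-management incentive the loss component is meant to create.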
For banks that are part of a consolidated entity, the SMA calculations will incorporate fully consolidated BI amounts (netting all intragroup income and expenses). At a subconsolidated level, the SMA uses BI amounts for the banks that are consolidated at that particular level. At the subsidiary level, the SMA calculations will use the BI amounts from the specific subsidiary. If the BI amounts for a subsidiary or subconsolidated level reach the bucket 2 level, these banks must incorporate their own loss experiences (not those of other members of the group). If a subsidiary of a bank in buckets 2-5 does not meet the qualitative standards associated with using the loss component, the SMA capital requirement is calculated using 100% of the BI component.
It is possible that the Committee will consider an alternative to the calculation of the internal loss multiplier shown earlier, which would replace the logarithmic function with a maximum multiple for the loss component. The formula for the internal loss multiplier would then be updated as:
internal loss multiplier = [m x LC + (m - 1) x BIC] / [LC + (2m - 2) x BIC]

where:
m = factor to be calibrated
LC = loss component
BIC = business indicator component
Example: Computing the SMA Capital Requirement
PS Bank Inc. has a BI of €18.48 million for the current fiscal year. Calculate PS Bank's capital requirement using the standardized measurement approach.
Answer:
PS Bank is a bucket 1 bank because its BI falls within the range of €0 to €1 billion. For bucket 1 banks, the only component of the SMA calculation is the BI component, and the calculation is: 0.11 x €18.48 million = €2.03 million.
SMA vs. Earlier Operational Risk Capital Approaches
LO 43.6: Explain the use of scenario analysis and the hybrid approach in modeling operational risk capital.
Scenario analysis data is designed to identify fat-tail events, which is useful when calculating the appropriate amount of operational risk capital. The advantage of using scenario analysis is that its data reflects the future through a process designed to consider "what if" scenarios, in contrast to the LDA, which only considers the past. The major disadvantage of scenario analysis is that the data is highly subjective, and it only produces a few data points. As a result, complex techniques must be applied to model the full loss distribution,
Topic 43 Cross Reference to GARP Assigned Reading – Girling, Chapter 12
as the lack of data output in scenario analysis can make the fitting of distributions difficult. In addition, small changes in assumptions can lead to widely different results.
There are many different approaches to scenario analysis, but whichever method is used, a scarcity of data points is likely. This makes pure scenario analysis a difficult approach to defend in estimating risk capital. Also, the more reliance there is on scenario analysis, the more robust the program must be because sometimes there is little or no loss data available and a model may need to rely purely on scenario analysis for a particular risk category. Consequently, it is acceptable to have different modeling techniques for various risk categories as long as the differences are justified. While some scenario-based models have been approved in Europe, U.S. regulators generally do not accept them.
In the hybrid approach, loss data and scenario analysis output are both used to calculate operational risk capital. Some firms combine the LDA and scenario analysis by stitching together two distributions. For example, the LDA may be used to model expected losses, and scenario analysis may be used to model unexpected losses. Another approach combines scenario analysis data points with actual loss data when developing frequency and severity distributions.
Insurance
Banks have the option to insure against the occurrence of operational risks. The important considerations are how much insurance to buy and which operational risks to insure. Insurance companies offer policies on everything from losses related to fire to losses related to a rogue trader. A bank using the AMA for calculating operational risk capital requirements can use insurance to reduce its capital charge. However, the recognition of insurance mitigation is limited to 20% of the total operational risk capital required.
The LDA allows for a risk profiling of an institution that can include the risk-reducing effect of insurance, which then alters the aggregate loss distribution. Typically, this is done by reducing the severity of the losses that exceed a given deductible in the insurance policy. In other words, insurance typically lowers the severity but not the frequency of losses.
Operational risk capital may need to be billions of dollars, so it can be worthwhile to pursue insurance as a means to reduce the amount of capital needed. Insurance companies are attempting to accommodate industry needs through new insurance products that meet Basel requirements.
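The severity effect described above can be illustrated with a simple policy layer defined by a deductible and a policy limit. This helper function is a hypothetical sketch, not a standard formula from the reading:

```python
def insured_severity(loss, deductible, policy_limit):
    """Net loss to the bank after an insurance recovery.

    The insurer covers the layer between the deductible and
    deductible + policy_limit; the bank retains everything else.
    Event frequency is unchanged; only severity is reduced.
    """
    recovered = min(max(loss - deductible, 0.0), policy_limit)
    return loss - recovered

# A 5M loss under a policy with a 1M deductible and a 3M limit:
# the insurer pays 3M, so the bank retains 2M
net = insured_severity(5_000_000, 1_000_000, 3_000_000)  # -> 2000000.0
```

Applying this transformation to each simulated loss above the deductible is one way the aggregate loss distribution in an LDA model can be adjusted for insurance.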
Key Concepts
LO 43.1 The three methods for calculating operational risk capital requirements are (1) the basic indicator approach (BIA), (2) the standardized approach (TSA), and (3) the advanced measurement approach (AMA). Large banks are encouraged to move from TSA to the AMA in an effort to reduce capital requirements.
LO 43.2 The first requirement to use the AMA is that the bank must hold sufficient capital to cover all operational risk losses for one year with a certainty of 99.9%. The second requirement is that internal loss data, external loss data, scenario analysis, and business environment internal control factors must be included in the model. The third requirement is that there must be a method for allocating capital that incentivizes good behavior.
LO 43.3 The loss distribution approach (LDA) relies on internal losses as the basis of its design. It uses internal losses as direct inputs, with the remaining data elements being used for stressing or allocation purposes. However, regardless of its model design, a bank must have at least five years of internal loss data (three years are permitted when a bank first moves to the AMA). The advantage of the LDA model is that it is based on historical data relevant to the firm. The disadvantage is that the data collection period is likely to be relatively short and may not capture all fat-tail events.
LO 43.4 When developing a model of expected operational risk losses, the first step is to determine the likely frequency of events on an annual basis. The most popular distribution for modeling frequency is the Poisson distribution. In a Poisson distribution, there is only one parameter, λ, which represents the average number of events in a given year. The next step in modeling expected operational risk losses is to determine the severity of an event. The most common and least complex approach is to use a lognormal distribution.
LO 43.5 Once the frequency and severity distributions have been established, the next step is to use them to generate data points to better estimate the capital required at a 99.9% confidence level. Monte Carlo simulation is a method for combining frequency and severity distributions to produce additional data points that have the same characteristics as observed data points.
LO 43.6 Scenario analysis data is designed to identify fat-tail events and is useful in calculating the appropriate amount of operational risk capital. In the hybrid approach, loss data and scenario analysis output are both used to calculate operational risk capital.
Concept Checkers
1. Under the basic indicator approach (BIA), what is Alpha Bank's capital charge if it has revenues of $100 million, $150 million, and $200 million in the first three years?
A. $22.0 million.
B. $22.5 million.
C. $23.0 million.
D. $23.5 million.

2. Which of the following statements is not a requirement to apply the advanced measurement approach (AMA)?
A. The model must hold capital to cover all operational risk losses for one year with a certainty of 99.9%.
B. Internal loss data, external loss data, scenario analysis, and business environment internal control factors must be included in the model.
C. Capital must be allocated to minimize risk.
D. There must be a method for allocating capital that incentivizes good behavior.

3. Which of the following reasons is not a disadvantage of the loss distribution approach (LDA) to modeling operational risk capital requirements?
A. The LDA is based on historical data.
B. Most firms have limited historical data.
C. Fat-tail events may not be captured by modeling.
D. Historical data is not reflective of the future.

4. When modeling risk frequency, it is common to:
A. use a Poisson distribution.
B. assume that risks are highly correlated.
C. assume risk frequency and severity are the same.
D. use a straight-line projection from the most recent loss data.

5. Extreme losses in the tail of the operational risk loss distribution most likely follow which type of process/distribution?
A. Generalized Pareto distribution.
B. Historical simulation method.
C. Poisson distribution.
D. Extreme value theory.
Concept Checker Answers
1. B The BIA is based on 15% of the bank's average annual gross income over a three-year period and is computed as follows:

K_BIA = [(100 + 150 + 200) × 0.15] / 3 = $22.5 million
2. C There is no specific requirement under the AMA to minimize risk.
3. A An advantage of the LDA model is that it is based on historical data relevant to the firm.
4. A It is common to use a Poisson distribution to model loss frequency. A Poisson distribution has a single parameter, λ, which can be varied to accurately describe loss data.
5. A Extreme losses in the tail of the operational risk loss distribution are most likely to follow a Generalized Pareto distribution. Although the most common and least complex approach for modeling severity overall is a lognormal distribution, low frequency, high severity losses may be a better fit to distributions such as Generalized Gamma, Transformed Beta, Generalized Pareto, or Weibull.
The following is a review of the Operational and Integrated Risk Management principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
Standardized Measurement Approach for Operational Risk
Topic 44
Exam Focus
The focus of this topic is on the calculation of the standardized measurement approach (SMA). In particular, candidates should understand how the business indicator (BI) is derived and how buckets are used to group banks by size such that the BI will have a different impact on the SMA given a bank's bucket. Candidates should also know how to calculate the internal loss multiplier and the loss component, along with understanding how this component impacts the SMA given a bank's bucket classification. The SMA has evolved over time from earlier approaches that were more model-based and allowed too much flexibility. Candidates should also be familiar with the Basel Committee's outline of general and specific criteria applicable to operational loss data.
The Standardized Measurement Approach
LO 43.5: Explain how Monte Carlo simulation can be used to generate additional data points to estimate the 99.9th percentile of an operational loss distribution.
Once the frequency and severity distributions have been established, the next step is to combine them to generate data points that better estimate the capital required. This is done to ensure that likely losses for the next year will be covered at the 99.9% confidence level. Monte Carlo simulation can be used to combine frequency and severity distributions (a process known as convolution) in order to produce additional data points with the same characteristics as the observed data points.
With this process, we make random draws from the loss frequency data and then draw those events from the loss severity data. Each combination of frequency and severity becomes a potential loss event in our loss distribution. This process is continued several thousand times to create the potential loss distribution. To find the 99.9% confidence level, with a million observations for example, we would select the 1,000th item in an ordered list (from largest to smallest loss) to represent the maximum loss that will be experienced in a single year with 99.9% certainty.
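The convolution process described above can be sketched with NumPy. The parameter values (λ = 5 events per year, lognormal severity with μ = 1.0 and σ = 1.2) are illustrative assumptions, not figures from the reading:

```python
import numpy as np

rng = np.random.default_rng(7)

lam = 5.0             # assumed Poisson frequency: average events per year
mu, sigma = 1.0, 1.2  # assumed lognormal severity parameters (log scale)
n_years = 100_000     # number of simulated years

# For each simulated year, draw an event count from the frequency
# distribution, then draw that many severities and sum them. This
# combination of the two distributions is the convolution step.
counts = rng.poisson(lam, size=n_years)
annual_losses = np.array(
    [rng.lognormal(mu, sigma, size=n).sum() for n in counts]
)

# The capital estimate is the 99.9th percentile of the simulated annual
# loss distribution (roughly the 100th largest of 100,000 years).
capital = np.quantile(annual_losses, 0.999)
print(f"99.9% annual loss estimate: {capital:.1f}")
```

Selecting the 1,000th largest of one million simulated years, as in the text, is the same operation at a larger sample size.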
Scenario Analysis
LO 43.4: Explain how frequency and severity distributions of operational losses are obtained, including commonly used distributions and suitability guidelines for probability distributions.
Modeling Frequency
When developing a model of expected operational risk losses, the first step is to determine the likely frequency of events on an annual basis. The most common distribution for
modeling frequency is the Poisson distribution. This distribution uses only one parameter, λ, which represents the average number of events in a given year, as well as the distribution's mean and variance. In an LDA model, λ can be obtained by observing the historical number of internal loss events per year and then calculating the average.

The Poisson distribution represents the probability of a certain number of events occurring in a single year. As shown in Figure 2, lower values of λ produce more skewed and leptokurtic annual loss distributions than higher values of λ.
Figure 2: Comparing Poisson Distributions
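The effect of λ on the shape of the frequency distribution can be checked directly, since Poisson skewness equals 1/sqrt(λ). The λ values below are illustrative, not from the reading:

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Skewness of a Poisson distribution is 1/sqrt(lam): a small lam (rare
# events) gives a strongly right-skewed count distribution, while a
# large lam looks closer to a symmetric bell curve.
for lam in (0.5, 2.0, 10.0):
    skew = 1.0 / math.sqrt(lam)
    peak = max(poisson_pmf(k, lam) for k in range(40))
    print(f"lam={lam:>4}: skewness={skew:.2f}, peak probability={peak:.3f}")
```

As λ grows, the skewness falls and the probability mass spreads out, matching the comparison in Figure 2.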
Modeling Severity
The next step in modeling expected operational risk losses is to determine the likely size (i.e., severity) of an event. The most common and least complex approach is to use a lognormal distribution. However, low frequency losses may be a better fit to distributions such as Generalized Gamma, Transformed Beta, Generalized Pareto, or Weibull. Regulators are interested in the selected distribution's goodness of fit.
Regardless of the distribution selected, the probability density function must exhibit fat tails. Events that are more than three standard deviations from the mean are more likely to occur than in a normal distribution; thus, the distribution will be skewed as seen in Figure 3.
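Fitting a lognormal severity distribution amounts to fitting a normal distribution to the logarithms of the observed losses. The loss amounts below are hypothetical and exist only to show the mechanics:

```python
import math

# Hypothetical historical loss severities (in $ thousands).
losses = [12.0, 45.0, 8.5, 230.0, 19.0, 75.0, 1500.0, 33.0, 60.0, 410.0]

# Lognormal maximum-likelihood fit: take logs, then estimate the
# normal mean and standard deviation of the logged data.
logs = [math.log(x) for x in losses]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))

# The fitted lognormal's mean exceeds its median, reflecting the
# right skew (fat upper tail) regulators expect severity to show.
median = math.exp(mu)
mean = math.exp(mu + sigma**2 / 2)
print(f"mu={mu:.2f}, sigma={sigma:.2f}, median={median:.1f}, mean={mean:.1f}")
```

The gap between the fitted mean and median is one quick diagnostic of how heavy the upper tail is.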
Figure 3: Example Severity Probability Distribution
Monte Carlo Simulation
LO 43.3: Describe the loss distribution approach to modeling operational risk capital.
The loss distribution approach (LDA) relies on internal losses as the basis of its design. A simple LDA model uses internal losses as direct inputs with the remaining three data elements being used for stressing or allocation purposes. However, according to Basel II, a bank must have at least five years of internal loss data regardless of its model design but can use three years of data when it first moves to the AMA.
The advantage of the LDA is that it is based on historical data relevant to the firm. The disadvantage is that the data collection period is likely to be relatively short and may not capture fat-tail events. For example, no firm can produce 1,000 years of data, but the model is supposed to provide a 99.9% confidence level. Also, some firms find that they have insufficient loss data to build a model, even if they have more than five years of data. Additionally, banks need to keep in mind that historical data is not necessarily reflective of the future because firms change products, processes, and controls over time.
LO 43.2: Describe the modeling requirements for a bank to use the Advanced Measurement Approach (AMA).
[Table: annual gross income by business line multiplied by the corresponding beta factors, Corporate Finance (beta = 18%) and Retail Banking (beta = 12%), over Years 1 through 3, with some negative gross income amounts. Only fragments are recoverable from the original, including 10 x 18% = 1.8, 5 x 12% = 0.6, and negative yearly totals of -1.2 and -2.1.]
Unanticipated Results from Negative Gross Income
The BIA and TSA capital charge methodologies can produce inappropriate results when accounting for negative gross income. For example, consider the following gross income amounts multiplied by the corresponding beta factors (in $100 millions):
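The distortion can be illustrated with a short calculation. All figures below are hypothetical, and the treatment shown (BIA excludes years with negative total gross income; TSA lets negative business-line charges offset positive ones within a year, floored at zero) is a common reading of the Basel II rules, not the reading's own example:

```python
ALPHA = 0.15
BETAS = {"corporate_finance": 0.18, "retail_banking": 0.12}

# Hypothetical gross income per business line over three years ($ millions).
gi = {
    "corporate_finance": [100.0, 120.0, -80.0],
    "retail_banking":    [50.0, -60.0, 40.0],
}
years = range(3)

# BIA: 15% of average gross income, counting only years where the
# bank's total gross income is positive.
totals = [sum(gi[line][y] for line in gi) for y in years]
positive = [t for t in totals if t > 0]
k_bia = ALPHA * sum(positive) / len(positive)

# TSA: within each year, a negative business-line charge offsets the
# positive ones (floored at zero for the year); then average the years.
yearly_tsa = [max(sum(BETAS[line] * gi[line][y] for line in gi), 0.0)
              for y in years]
k_tsa = sum(yearly_tsa) / len(yearly_tsa)

print(f"K_BIA = {k_bia:.2f}, K_TSA = {k_tsa:.2f}")
```

Note how the negative retail banking income in Year 2 directly reduces the TSA charge for that year, even though the underlying operational risk has not fallen, which is the inappropriate result the text describes.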
The advanced measurement approach (AMA) allows banks to construct their own models for calculating operational risk capital. Although the Basel Committee allows significant flexibility in the use of the AMA, there are three main requirements. A bank must:
- Demonstrate an ability to capture potentially severe fat-tail losses (banks must use 99.9th percentile events with a one-year time horizon).
- Include internal loss data, external loss data, scenario analysis, and business environment internal control factors (i.e., the four data elements).
- Allocate capital in a way that incentivizes good behavior (i.e., create incentives to improve business line operational risk management).
Under the AMA, capital requirements should be made for all seven risk categories specified by Basel II. Some firms calculate operational risk capital at the firm level and then allocate down to the business lines, while others calculate capital at the business line level. Capital
calculations are typically performed by constructing a business line/event type matrix, where capital is allocated based on loss data for each matrix cell.
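One simple way to perform the cell-by-cell allocation is to distribute firm-level capital in proportion to each cell's observed losses. The matrix entries and capital figure below are hypothetical:

```python
# Hypothetical loss totals ($ millions) in a business line / event type matrix.
matrix = {
    ("retail_banking", "external_fraud"): 40.0,
    ("retail_banking", "execution_errors"): 10.0,
    ("corporate_finance", "external_fraud"): 25.0,
    ("corporate_finance", "clients_products"): 25.0,
}
firm_capital = 500.0  # capital computed at the firm level

# Allocate firm-level capital to each cell in proportion to its
# share of total observed losses.
total = sum(matrix.values())
allocation = {cell: firm_capital * loss / total
              for cell, loss in matrix.items()}

# The allocation is exhaustive: the cells sum back to the firm total.
assert abs(sum(allocation.values()) - firm_capital) < 1e-9
for cell, cap in sorted(allocation.items()):
    print(cell, f"{cap:.1f}")
```

Proportional allocation is only one choice; whatever rule is used should reward business lines that reduce their losses, consistent with the incentive requirement above.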
Additional quantitative requirements under the AMA include:
- The approach must capture all expected and unexpected losses and may only exclude expected losses under certain criteria as stated in Basel II.
- The approach must provide sufficient detail to ensure that fat-tail events are captured.
- The bank must sum all calculated cells in the business line/event type matrix and be able to defend any correlation assumptions made in its AMA model.
- All four data elements must be included in the model, including the use of internal and external data, scenario analysis, and business environment factors.
- The bank must use appropriate weights for the four data elements when determining operational risk capital.
While the four data elements must be considered in the capital calculations, many banks use some of these elements only to allocate capital or perform stress tests, and then adjust their models, rather than using them as direct inputs into capital calculations. Regulators have accepted many different types of AMA models, such as the loss distribution approach, given the rapid development of modeling operational risk capital.
Loss Distribution Approach