LO 77.2: Explain the effects of forced deleveraging and the failure of covered interest rate parity.
Effects of Forced Deleveraging
Forced deleveraging refers to the reduction in leverage by a borrower following capital market events that necessitate deleveraging. A good gauge of leverage is the haircut in the repurchase agreement (repo) market. The haircut is the difference between the value of the collateral pledged and the amount borrowed. For example, a 2% haircut implies that a bank is able to borrow $98 for every $100 in securities pledged. The lower the haircut, the higher the leverage implied in the transaction. The 2% haircut implies a leverage factor (or leverage ratio) of 50 for the bank (computed as total assets over equity).
Prior to the financial crisis of 2007–2009, leverage factors of 50 were not uncommon, with correspondingly very low haircut values. However, such high leverage left many banks exposed to potential forced deleveraging. Indeed, in the period immediately following the financial crisis, the leverage factor dropped to around 25 in the U.S. securities broker-dealer sector, with the haircut increasing from 2% to 4%. In general, such a large change has material consequences. Assuming a bank's equity isn't impacted, the bank would have to cut its assets in half. If the bank also suffers losses, the impact is even worse.
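The arithmetic behind this forced shrinkage can be verified with a short calculation. This is a minimal sketch only; the 2% and 4% haircuts come from the discussion above, while the equity figure is an illustrative assumption.

```python
# Illustrative only: how a haircut increase forces balance sheet shrinkage.
def max_assets(equity, haircut):
    """Maximum assets supportable: leverage factor = 1 / haircut."""
    return equity / haircut

equity = 100.0                           # assumed equity, held constant
assets_pre = max_assets(equity, 0.02)    # 2% haircut -> leverage factor of 50
assets_post = max_assets(equity, 0.04)   # 4% haircut -> leverage factor of 25

print(assets_pre, assets_post)           # 5000.0 2500.0
print(assets_post / assets_pre)          # 0.5 -> assets must be cut in half
```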
VIX as a Gauge of Leverage
Up until the onset of the financial crisis, the volatility index (VIX) represented a good gauge of the appetite for leverage in the markets. The VIX measures implied volatility from stock option (call and put) prices. Prior to the financial crisis, a low VIX implied low fear and therefore high leverage. The VIX was able to adequately capture the risk appetite within the financial system. Given that banks typically borrow in order to lend funds, easy conditions for borrowing also created easy conditions for lending, creating a circular series of events that led to ever easier borrowing and liquidity as well as higher leverage.
However, the VIX's reliability as a gauge of leverage shifted dramatically following the financial crisis. The previous relationships of high VIX/low leverage and low VIX/high leverage ceased to hold, and the VIX lost its explanatory power for leverage. While there remains considerable risk appetite for stocks, as witnessed by high stock valuations and low volatility, the banking sector has not fared comparatively well, with low market-to-book value ratios.
So what has changed? One explanation is that monetary easing has calmed markets and compressed credit spreads, although this explanation generally holds best when policy rates are positive. Another explanation could be the role of regulation, which impacts bank behavior and may constrain leverage. A counterargument to the role of regulation is that the financial crisis was not brought on by regulatory change (although regulatory change certainly followed in the post-crisis period). Capitalization also plays a role in the financial health of the banking sector. Better capitalized banks weathered the crisis better and have fared well compared to their more weakly capitalized counterparts.
Failure of Covered Interest Arbitrage
Covered interest parity (CIP) is a parity condition stating that the interest rates implied in foreign exchange markets should be consistent with the money market rate for each currency. In other words, the interest rate implied between the forward and spot rates on a U.S. dollar forward or swap (one side borrows U.S. dollars and lends another currency) should be the same as the money market interest rate on the dollar. If the relationship does not hold, an arbitrage opportunity would exist: borrow cheaply in one currency, lend the funds out at a higher rate in another currency, and concurrently fully hedge the currency risk.
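A hedged numerical sketch of the parity condition follows; all rates and the spot/forward quotes are hypothetical. The dollar rate implied by the FX forward should match the dollar money market rate, and any gap is the deviation from CIP.

```python
# Covered interest parity (illustrative numbers only).
# F/S = (1 + r_usd) / (1 + r_fx)  =>  implied r_usd = (F/S) * (1 + r_fx) - 1
spot = 1.1000          # hypothetical USD per EUR spot rate
forward = 1.1120       # hypothetical 1-year forward rate
r_eur = 0.010          # hypothetical 1-year EUR money market rate
r_usd_market = 0.025   # hypothetical 1-year USD money market rate

r_usd_implied = (forward / spot) * (1 + r_eur) - 1
cip_gap = r_usd_market - r_usd_implied   # zero if CIP holds exactly

print(round(r_usd_implied, 4), round(cip_gap, 4))
```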
CIP held up reasonably well before the financial crisis. However, the relationship no longer worked well in the post-crisis period, and a gap between CIP-implied rates and observed rates has persisted. The primary reason for the difference is that CIP is a theoretical concept based on certain simplifying assumptions, including the ability to take on any position in any currency at prevailing market prices. In reality, borrowers and lenders need to transact through banks, which may not have sufficient capital to enter into these transactions or may find the spreads to be uneconomical. Capital may be insufficient partly due to regulation, although banks typically hold capital well above regulatory requirements.
U.S. Dollar as a Gauge of Leverage

LO 77.1: Describe the links between banks and capital markets.
Market finance has been an influential force in the financial system. Market finance can either connect borrowers and investors directly, or it can connect them through a bank intermediary in the wholesale market. A well-known example of how market finance can affect financial institutions is the case of Northern Rock. At the onset of the most recent financial crisis, Northern Rock, a British bank, collapsed in 2007 following a run on the bank by depositors. The run was preceded by a forced deleveraging by wholesale creditors in the capital markets. The key takeaway is that the link between banks and capital markets is now global.
Forced Deleveraging and Covered Interest Parity

LO 76.5: Compare and assess methods a CCP can use to help recover capital when a member defaults or when a liquidity crisis occurs.
There are two methods a CCP can use to help recover capital (in the event a member defaults or a liquidity crisis occurs): default fund assessments and variation margin haircuts (VMGH).
With default fund assessments, the CCP could ask all non-defaulted members for a supplementary contribution that is proportional to their prior contribution and capped at that prior contribution amount. However, assuming that the largest clearing members have defaulted, there is a reasonable risk that some of the non-defaulted members have been subjected to the same losses. As a result, the non-defaulted members may have insufficient liquid resources to cover the assessment. Or if they do have sufficient resources, they may simply choose to avoid the assessment by closing out their positions or moving them to another CCP. Therefore, the shortfall in the default fund demonstrates wrong-way risk, whereby the probability of non-payment is positively related to the default events that would lead to an assessment.
A clearing member may accumulate a large amount of losses over time and ultimately default. Prior to the default, that defaulting member would have already made a correspondingly large amount of variation margin payments to other members. With variation margin haircuts (VMGH), the CCP collects the full variation margin payment from the member with the loss, and the CCP discounts the payment (on a pro-rata basis) to the corresponding member with the gain. The difference is held by the CCP to enhance the CCP's liquidity. The liquidity is financed by the members, but if clearing members are already subject to liquidity constraints in weaker market conditions, a haircut on the variation margin payment could exacerbate those constraints.
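A minimal sketch of these two recovery tools follows, using invented member balances. The cap-at-prior-contribution rule for assessments and the pro-rata haircut come from the description above; all the numbers are hypothetical.

```python
# Illustrative recovery tools (hypothetical figures).

def default_fund_assessment(prior_contributions, shortfall):
    """Pro-rata call on non-defaulted members, capped at each prior contribution."""
    total = sum(prior_contributions.values())
    return {m: min(c, shortfall * c / total) for m, c in prior_contributions.items()}

def vmgh(gains, liquidity_needed):
    """Haircut the variation margin owed to members with gains, pro rata."""
    total_gains = sum(gains.values())
    h = min(1.0, liquidity_needed / total_gains)    # fraction withheld by the CCP
    return {m: g * (1 - h) for m, g in gains.items()}, h

prior = {"A": 50.0, "B": 30.0, "C": 20.0}           # prior default fund contributions
print(default_fund_assessment(prior, shortfall=60.0))

gains = {"A": 40.0, "B": 10.0}                      # variation margin owed to winners
print(vmgh(gains, liquidity_needed=5.0))            # 10% haircut applied pro rata
```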
CCPs clearly benefit from having assessments and recovery options but they impose significant liquidity demands on non-defaulted members in weak market conditions. Such demands could ultimately cause those members to eventually default.
Key Concepts
LO 76.1
Key advantages of central clearing include:
• Halting a potential domino effect of defaults in a market downturn.
• More clarity regarding the need for collateral.
• Lower operational risk.
• Better price discovery.
• More regulatory transparency in OTC markets.
• Better risk management.
LO 76.2 With central clearing, OTC exposures net of collateral between key banks have fallen to a small percentage of bank equity. The initial conclusion is that there is reduced insolvency and contagion risk because CCPs have removed counterparty risk.
Central clearing has enhanced financial stability and reduced systemic risk, but it has not completely eliminated systemic risk.
LO 76.3 The central clearing requirements do not change the clearing member's overall balance sheet value (assets or equity), so there is no solvency impact. However, there is a reclassification of assets between liquid and non-liquid, so there is a liquidity impact. Therefore, the clearing member is giving up counterparty risk and accepting liquidity risk.
LO 76.4 A CCP has the following liquidity resources to cover potential losses should a clearing member default (in the following sequence):
1. Initial margin.
2. Default fund contribution of defaulting member.
3. Mutualization of large losses.
4. Recovery.
5. Failure resolution.
LO 76.5 CCPs clearly benefit from having assessments and recovery options, but these impose significant liquidity demands on non-defaulted members in weak market conditions. Such demands could ultimately cause those members to default. There are two methods to consider in this situation: (1) default fund assessments and (2) variation margin haircuts (VMGH).
Concept Checkers
1. Which of the following items is not a key advantage of central clearing?
A. Lower liquidity risk.
B. Lower operational risk.
C. Greater price discovery.
D. Greater clarity with regard to the need for collateral.

2. Which of the following statements with regard to central clearing, liquidity, and solvency is correct?
A. Central clearing has enhanced financial stability and eliminated counterparty risk.
B. An illiquid firm has total liquid asset values that are lower than total liability values.
C. A firm that becomes insolvent will immediately impact its ability to continue operating.
D. The use of short-term repurchase agreements and borrowing against the value of assets is used to protect against insolvency.

3. Within a central counterparty's (CCP) role to absorb losses, which of the following risks is most relevant to the CCP?
A. Insolvency risk.
B. Liquidity risk.
C. Market risk.
D. Operational risk.

4. Which of the following items has the greatest impact on a clearing member's balance sheet?
A. Initial margin.
B. Variation margin.
C. Default fund contribution.
D. Skin-in-the-game contribution.

5. Initial margin requirements for clearing members are based on market risk with computations most likely at a confidence level of:
A. 90%.
B. 93%.
C. 99%.
D. 100%.
Concept Checker Answers
1. A Central clearing changes counterparty risk to liquidity risk, so the impact is greater liquidity risk, which is a disadvantage of central clearing. The other items are all advantages of central clearing.
2. A Central clearing has eliminated counterparty risk; however, it has now introduced liquidity risk. An illiquid firm has total liquid asset values that are lower than total short-term liability values. Even if a firm is insolvent, if the firm has sufficient liquid assets to cover its short-term liabilities, then the insolvency will not immediately impact the firm's ability to continue operating. The use of short-term repurchase agreements and borrowing against the value of assets is used to protect against illiquidity.
3. B Losses that arise due to the default of a clearing member will only impact a CCP to the extent that the CCP must make payments to the counterparties of the defaulted member. Therefore, such cash payments represent a potential liquidity risk for CCPs, which should be their primary concern. A CCP's assets are subject to market risk and insolvency risk, but because most of the CCP's assets are low risk and highly liquid, both risks are of little consequence. Operational risk is present in all entities, but it does not have special prominence within a CCP.
4. C Default fund contributions are subject to a 2% capital charge, so there is an impact on the clearing member's balance sheet. Initial margin and variation margin have no impact on the value of the clearing member's assets since the clearing member still owns the cash collateral that is posted as margin. A skin-in-the-game contribution is made by the CCP, not the clearing member.
5. C Market risk measures such as value at risk or expected shortfall are usually computed at a confidence level between 99% and 99.75%.
The following is a review of the Current Issues in Financial Markets principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
The Bank/Capital Markets Nexus Goes Global
Topic 77
Exam Focus
In this topic, we examine the interrelationship between banks and capital markets. As the dollar replaced the volatility index (VIX) as the gauge of deleveraging in the capital markets, the role of the dollar became increasingly more important. For the exam, understand the causes of deleveraging and the role of covered interest parity (CIP) prior to the financial crisis, and why the CIP relationship failed in the post-crisis period. In addition, it is important that you understand not only why the dollar is now considered a better estimator of leverage than the VIX, but also the impacts of a strengthening dollar on banks' lending and hedging activities. A stronger dollar typically raises the cost of dollar borrowing and reduces banks' lending and hedging activities.
Banks and Capital Markets

LO 76.4: Explain how liquidity of clearing members and liquidity resources of CCPs affect risk management and financial stability.
CCPs hold mostly liquid and low-risk assets on their balance sheets, thereby requiring very little capital to guard against insolvency risk. The corresponding liabilities are short-term and mainly represent margin balances owed to clearing members. Assuming no member defaults, CCPs receive margin and default fund contributions from members. The CCP's role is to redistribute variation margin payments from members with negative balances to those with positive balances; the net impact to the CCP should be zero. Losses that arise due to the default of a clearing member will only impact a CCP to the extent that the CCP must make payments to the counterparties of the defaulted member. Therefore, such cash payments represent a potential liquidity risk for CCPs, which should be their primary concern from a risk management perspective. In that regard, insolvency risk and capital sufficiency for CCPs are far less relevant.
CCP Loss Sequence
A CCP has the following liquidity resources to cover potential losses should a clearing member default (in the following sequence):
1. Initial margin: The initial margin paid to the CCP by each clearing member is used to cover only the direct losses incurred from that member's default.
2. Default fund contribution of defaulting member: Losses greater than the initial margin may be covered by the defaulting member's default fund contribution.
3. Mutualization of large losses: Losses greater than #1 and #2 combined are first covered by a capped contribution from the CCP itself (its skin-in-the-game). If the skin-in-the-game is not enough, the remaining losses are covered by the other members' default fund contributions.
4. Recovery: Should the entire default fund be insufficient to cover the losses, the CCP could request additional default fund contributions from non-defaulting members, usually limited to the amount of the initial contribution to the default fund. Another source of funds for CCPs is variation margin haircutting (VMGH), which involves the CCP collecting variation margin payments from members with negative balances but keeping a specified percentage to boost its liquidity resources and transferring only the remaining amount to the counterparties.
5. Failure resolution: May occur if the CCP is unable to recover sufficient funds or if the CCP or its members do not attempt to go through the recovery provisions.
The loss allocation process (or loss waterfall) is illustrated in Figure 2.
Figure 2: Process of Loss Allocation Upon Default of a Clearing Member. The chart depicts the loss waterfall: liquidation costs in excess of the defaulting member's margin and default fund contribution are absorbed first by the CCP's skin-in-the-game contribution, then mutualized across the default fund contributions of non-defaulted clearing members, then met through recovery provisions (default fund assessments, VMGH), and finally through failure resolution.
Source: Chart 3: Loss Waterfall: Allocation of Losses in the Event of a Clearing Member Default. Reprinted from Rama Cont, Central Clearing and Risk Transformation, Norges Bank Research, March 2017, 9.
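The sequencing can also be summarized in a short sketch that walks a hypothetical loss down the waterfall. The ordering follows the five steps listed above; the layer sizes and the loss amount are invented.

```python
# Hypothetical loss waterfall (illustrative layer sizes only).
def allocate_loss(loss, layers):
    """Consume the loss layer by layer in order; return the amount drawn from each."""
    drawn = {}
    for name, size in layers:
        take = min(loss, size)
        drawn[name] = take
        loss -= take
    drawn["unallocated (failure resolution)"] = loss
    return drawn

layers = [
    ("initial margin of defaulter", 100.0),
    ("default fund contribution of defaulter", 40.0),
    ("CCP skin-in-the-game", 20.0),
    ("default fund of non-defaulted members", 200.0),
    ("recovery (assessments, VMGH)", 200.0),
]
print(allocate_loss(480.0, layers))
```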
Margin Requirements and Liquidation Costs
Initial margins cover losses as the first step of the sequence just listed. Initial margins are likely calculated based on market risk measures such as standard deviation (SD), value at risk (VaR), or expected shortfall (ES) at a 99% to 99.73% confidence level. The calculation makes use of either: (1) historical data, (2) scenario analysis, or (3) simulation using specific assumptions on relevant risk factors. The risk horizon can be described as the amount of time needed to liquidate a defaulting members positions. That can be anywhere from one day to a few days and is computed based on the asset class being cleared, not the actual portfolio or position.
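A hedged sketch of a historical-simulation margin calculation at a 99% confidence level is shown below. The P&L series is simulated for illustration; an actual CCP would apply its own risk factor data, horizon, and methodology.

```python
import numpy as np

# Simulated one-day P&L history for a cleared portfolio (illustrative only).
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1_000_000.0, size=2_000)

conf = 0.99
var_99 = -np.quantile(pnl, 1 - conf)                      # historical VaR at 99%
es_99 = -pnl[pnl <= np.quantile(pnl, 1 - conf)].mean()    # expected shortfall beyond VaR

print(f"99% VaR: {var_99:,.0f}")
print(f"99% ES:  {es_99:,.0f}")   # initial margin could be set at either measure
```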
The CCP may incur a loss on the clearing member's portfolio when the clearing member defaults. Because variation margin would have been provided prior to default, the CCP is only subject to the (incremental) liquidation cost, which is the decline in portfolio value between the time of default and the time the portfolio is liquidated. Market risk measures are not effective at capturing liquidation costs because they ignore liquidity, market depth, and differences in bid-ask spreads across financial instruments. Additionally, market risk is based on the net position size, while liquidation costs are more closely related to gross notional size.
Liquidation costs can be significant for large or concentrated positions. A proper disposition of an unusually large position would often take additional time beyond the stated risk horizon to achieve, therefore resulting in a liquidation horizon that is greater than the risk horizon. As a result, there is a nonlinear relationship between liquidation costs and portfolio size. A typical risk measure such as SD, VaR, or ES has a 1:1 linear relationship to portfolio notional size (N). In contrast, liquidation cost is proportional to N × N^(1/2), or N^(3/2). For example, if N is doubled, SD, VaR, and ES would double as well, but liquidation costs would increase by 2.83 times (2^(3/2)).
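The nonlinear scaling can be checked with simple arithmetic; the proportionality constants are arbitrary in this sketch, only the exponents matter.

```python
# VaR/ES/SD assumed proportional to N; liquidation cost proportional to N^(3/2).
def scale_factors(base_n, new_n):
    linear = new_n / base_n                  # how SD, VaR, and ES scale
    liquidation = (new_n / base_n) ** 1.5    # how liquidation cost scales
    return linear, liquidation

print(scale_factors(1, 2))   # (2.0, 2.828...) -> doubling N doubles VaR but ~2.83x liquidation cost
```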
Proper risk management of CCPs would incorporate a liquidity charge in margin calculations to cover the implied extra costs the CCP would be responsible for when liquidating a defaulted position. The charge would increase for larger position sizes and illiquid assets (illiquid assets are being cleared more frequently nowadays). An accurate liquidity charge may encourage clearing members not to build up concentrated and/or illiquid positions so as to reduce their liquidity risk.
When determining the amount of a CCP's default fund, liquidation costs should be included. The largest clearing members pose the greatest risk to the CCP, given that they would likely engage in transactions that are more difficult to liquidate. Computing the CCP's default exposure should be more detailed and consider increased bid-ask spreads and liquidation costs. Liquidation costs are particularly relevant for large clearing members because liquidation costs are proportional to gross, not net, positions.
CCP Methods for Recovering Capital

LO 76.3: Describe the transformation of counterparty risk into liquidity risk.
In the absence of margin requirements for a bilateral OTC trade, the two counterparties would simply mark to market (MTM) their position each day. Such MTM gains/losses are unrealized in nature so they do not have any cash flow impact (i.e., no liquidity impact). However, they do impact asset values and reported income so there is an impact on solvency.
In contrast, the same trade with a CCP has three distinct cash flow impacts:
• An initial margin from each counterparty must be paid up front.
• MTM gains/losses between the CCP and clearing members must be settled on a cash basis each day or even more often (i.e., variation margin).
• Clearing members could be required to contribute to a default/guaranty fund to cover member defaults.
From an overall balance sheet and solvency perspective:
• The initial margin deposit by the clearing member to the CCP is simply a deposit and is not a transfer of (cash) assets. The clearing member maintains the asset on its balance sheet, so there is virtually no impact on solvency.
• The variation margin deposits (if applicable) would have been previously accounted for as a MTM loss. The actual cash payment to the CCP is treated similarly to the initial margin deposit in that there is no transfer of assets to the CCP. There is simply a transfer from the clearing member's liquid to non-liquid assets (i.e., a classification change).
• The default fund contributions are treated similarly to the initial and variation margin deposits. However, the clearing member is subject to a 2% capital charge for the default fund contributions.
From a liquidity perspective:
• Initial and variation margins must be deposited as liquid assets (i.e., cash), so there is a noted reduction in liquidity.
In summary, the central clearing requirements do not change the clearing members overall balance sheet value (assets or equity) so there is no solvency impact. However, there is a reclassification of assets between liquid and non-liquid so there is a liquidity impact. Therefore, the clearing member is giving up counterparty risk and accepting liquidity risk.
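A minimal sketch of this balance sheet effect follows. All amounts are hypothetical; the 2% risk weight on default fund contributions is the figure cited above, and treating posted margin as a non-liquid receivable is a simplification.

```python
# Hypothetical clearing member balance sheet impact of central clearing.
liquid_assets = 500.0
other_assets = 1500.0

initial_margin = 50.0
variation_margin = 10.0
default_fund = 20.0

posted = initial_margin + variation_margin + default_fund
liquid_assets -= posted        # cash is posted to the CCP...
other_assets += posted         # ...but remains an asset of the member (reclassified)

total_assets = liquid_assets + other_assets     # unchanged -> no solvency impact
capital_charge = 0.02 * default_fund            # 2% charge on the default fund contribution

print(total_assets, liquid_assets, capital_charge)   # 2000.0 420.0 0.4
```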
CCP Liquidity Resources

LO 76.2: Assess whether central clearing has enhanced financial stability and reduced systemic risk.
With central clearing, OTC exposures net of collateral between key banks have fallen to a small percentage of bank equity. The initial conclusion is that there is reduced insolvency and contagion risk, because CCPs have removed counterparty risk.
However, there needs to be a consideration of unrealized (non-cash) losses impacting solvency versus realized (cash) losses impacting liquidity. Liquid assets would consist of cash or securities that are easily converted to cash. In short, there are some situations whereby a firm may be solvent but illiquid or insolvent but liquid. A solvent (insolvent) firm simply has total asset values that are higher (lower) than total liability values. A liquid (illiquid) firm has total liquid asset values that are higher (lower) than total short-term liability values.
A firm may become insolvent due to a default by a large counterparty, whereby the decline in asset value is greater than the equity value. However, if that firm has sufficient liquid assets to cover its short-term liabilities, then the insolvency will not immediately impact the firm's ability to continue operating. In contrast, if a firm is in default on any payment required in the short term (i.e., one day for a margin call), then there is a liquidity problem.
In practice, capital requirements to protect against declines in asset values are used to protect against insolvency. The use of short-term repurchase agreements (repos) and borrowing against the value of assets is used to protect against illiquidity. Unfortunately, it is still possible for lenders to call existing loans or abstain from future lending (resulting in a liquidity problem known as a bank run) even if the firm has excess capital and is considered solvent.
In short, central clearing has certainly enhanced financial stability and reduced systemic risk, but it has not completely eliminated systemic risk.
Transforming Counterparty Risk to Liquidity Risk

LO 76.1: Examine how the clearing of over-the-counter transactions through central counterparties has affected risks in the financial system.
Using central counterparties (CCPs) to clear over-the-counter (OTC) transactions has boosted the importance of CCPs in the financial system. CCPs essentially eliminate the counterparty risk inherent in bilateral transactions by making the CCP the counterparty to each side of the trade so that virtually no default risk remains.
Key advantages of central clearing include:
• Halting a potential domino effect of defaults in a market downturn.
• More clarity regarding the need for collateral.
• Lower operational risk.
• Better price discovery.
• More regulatory transparency in OTC markets.
• Better risk management.
With mandatory payments of initial and variation margins, the negative impact of a counterparty default is reduced or eliminated. Normally, a default by one counterparty (clearing member) means a loss for other counterparties. With the introduction of CCPs, clearing members are now exposed to the CCP, and a default by one clearing member no longer results in a loss for the other clearing members. In case of default, assuming the CCP has the available funds, the CCP will pay the variation margins that are owed to the non-defaulting members.
Figure 1 demonstrates numerically how multilateral netting reduces counterparty exposures.
Figure 1: Reduction of Risk Exposure Through Multilateral Netting. The chart compares counterparty exposures under bilateral netting with the smaller exposures that remain after multilateral netting through the CCP (numerical detail not reproduced).
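Because the numbers in Figure 1 are not reproduced here, the following sketch illustrates the same idea with invented exposures: gross bilateral claims are replaced by each member's single net position against the CCP.

```python
# Illustrative multilateral netting (invented exposures).
# bilateral[i][j] = amount member j owes member i
bilateral = {
    "A": {"B": 10.0, "C": 0.0},
    "B": {"A": 0.0, "C": 8.0},
    "C": {"A": 6.0, "B": 0.0},
}
members = list(bilateral)

# Without a CCP: each member's exposure is the gross amount owed to it.
gross_exposure = {m: sum(bilateral[m].values()) for m in members}

# With a CCP: each member's exposure is only its net claim on the CCP (if positive).
net_claim = {
    m: sum(bilateral[m].values()) - sum(bilateral[o].get(m, 0.0) for o in members)
    for m in members
}
ccp_exposure = {m: max(net_claim[m], 0.0) for m in members}

print(gross_exposure)   # {'A': 10.0, 'B': 8.0, 'C': 6.0} -> total 24
print(ccp_exposure)     # {'A': 4.0, 'B': 0.0, 'C': 0.0}  -> much smaller total
```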
Impact of Central Clearing

LO 75.3: Analyze the application of machine learning in three use cases:
• Credit risk and revenue modeling.
• Fraud.
• Surveillance of conduct and market abuse in trading.
Credit Risk and Revenue Modeling
Financial institutions have recently moved to combine machine learning methods with traditional models in order to improve their ability to predict financial risk. In turn, they have moved away from less complex, traditional linear credit risk regression models.
However, machine learning models are often unfit to be incorporated into the ongoing risk monitoring of financial institutions. Machine learning models can be overly complex and prone to overfitting the data. Their (often extreme) complexity makes it difficult to apply jurisdictionally consistent definitions of data, and the models are too complex for regulatory purposes, including internal models in the Basel internal ratings-based (IRB) approach, because it is very difficult for auditors to understand them.
Despite their disadvantages, machine learning models can be successfully used in optimizing existing models with regulatory functions. For example, both linear and less complex nonlinear machine learning models can be applied to existing regulatory and revenue forecasting models.
Fraud
Banks have successfully used machine learning in the detection of credit card fraud. Models are used to detect fraudulent transactions, which can then be blocked in real time. Credit card fraud can incorporate machine learning more usefully than other risk areas because of the very large number of credit card transactions that are needed for the training, backtesting, and validation of models. The models then predetermine the key features of a fraudulent transaction and are able to distinguish them from normal transactions. Models can also be successfully used in anti-money laundering or combating the financing of terrorism (AML/CFT) activities through unsupervised learning methods, such as clustering. Clustering identifies outliers that do not have strong connections with the rest of the data. In this way, financial institutions can detect anomalies and reduce the number of false positives.
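A hedged sketch of the unsupervised, clustering-based idea described here follows, using scikit-learn on synthetic transaction features. The feature set, cluster count, and flagging rule are illustrative assumptions, not the methodology of any particular bank.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic transaction features: [amount, transactions per day] (illustrative only).
rng = np.random.default_rng(1)
normal = rng.normal(loc=[100.0, 3.0], scale=[30.0, 1.0], size=(1_000, 2))
unusual = rng.normal(loc=[5_000.0, 40.0], scale=[500.0, 5.0], size=(10, 2))
X = np.vstack([normal, unusual])

# Unsupervised clustering: observations falling into very small clusters are
# weakly connected to the rest of the data and get flagged for review.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
sizes = np.bincount(labels)
flagged = np.isin(labels, np.where(sizes < 0.02 * len(X))[0])

print(flagged.sum(), "transactions flagged for review")
```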
Many banks still rely on traditional fraud detection through identifying individual transactions or simple patterns, but these systems lead to a large number of false positives and lack the predictive capabilities of the more sophisticated machine learning models. In addition, the traditional models still require significant human involvement to filter the false positives from suspicious activities. Data sharing, data usage, and entrenched regulatory frameworks can also hinder the successful use of machine learning.
Other factors also make the use of machine learning more difficult. Money laundering is difficult to define, and banks do not receive adequate feedback from law enforcement agencies on which transactions were truly fraudulent. As a result, it is difficult to use only historical data to teach money-laundering detection algorithms to detect fraudulent activity.
Surveillance of Conduct and Market Abuse in Trading
Surveillance of trader conduct breaches is another growing area in which machine learning is being increasingly used to detect rogue trading, insider trading, and benchmark rigging activities. Financial institutions find early detection of these violations important because they can cause material financial and reputational damage to the institution.
Early monitoring techniques tended to rely on monitoring trading behavior and assessing single trades. With machine learning, monitoring techniques were enhanced to evaluate entire trading portfolios and connect information to other activities of the trader, including emails, calendar items, phone calls, and check-in and check-out times. The trader's behavior could then be compared to other traders' normal behavior. The system detects any deviation from the normal pattern and alerts the financial institution's compliance team.
One of the challenges facing financial institutions in successfully applying machine learning is the legal complexity of sharing past breach information with developers. Also, systems need to be auditable, but because machine learning models are designed to continuously learn from the data, it can be difficult to explain to a compliance officer why a certain behavior set off an alert. As a remedy to these problems, systems can be designed to combine machine learning with human decisions. Incorporating human decisions allows the system's data to be used to assemble a comprehensive set of information about a trader and creates a system that is less complex and more suitable for audit and regulatory purposes.
Key Concepts
LO 75.1 Machine learning uses algorithms that allow computers to learn without programming. Supervised machine learning predicts outcomes based on specific inputs, whereas unsupervised machine learning analyzes data to identify patterns without estimating a dependent variable.
Three broad classes of statistical problems include regression, classification, and clustering. Regression problems make predictions on quantitative, continuous variables, including inflation and GDP growth. Classification problems make predictions on discrete, dependent variables. Clustering observes input variables without including a dependent variable.
Overfitting is a problem in nonparametric, nonlinear models, which tend to be complex by nature. Boosting overweights less frequent observations to train the model to detect them more easily. Bagging involves running a very large number of models on different subsets of the data to improve predictive ability.
Deep learning differs from classical learning models in that it applies many layers of algorithms into the learning process to identify complex patterns.
LO 75.2 Machine learning is a powerful tool for financial institutions because it allows them to adequately structure, analyze, and interpret a very large set of data they collect, and improve the quality of their supervisory data.
Financial institutions can use both conventional machine learning techniques to analyze high-quality, structured data, and use deep learning techniques to analyze low-quality, high frequency data.
LO 75.3 Three use cases of machine learning are (1) credit risk and revenue modeling, (2) fraud detection, and (3) surveillance of conduct and market abuse in trading.
In credit risk and revenue modeling, machine learning models, despite the disadvantages stemming from their complexity and tendency to overfit, have been successfully used to optimize existing models with regulatory functions. Both linear and less complex nonlinear machine learning models can be paired with existing regulatory and revenue forecasting models.
Traditional fraud detection systems identify individual transactions or simple patterns, leading to a large number of false positives and requiring significant human involvement to filter the false positives from suspicious activities. Machine learning systems can help financial institutions detect fraudulent transactions and block them in real time. Clustering refers to identifying outliers that do not have strong connections with the rest of the data.
Drawbacks of machine learning include difficulty identifying money laundering, and lack of adequate feedback from law enforcement agencies.
Surveillance of trader conduct breaches through machine learning allows monitoring techniques to evaluate entire trading portfolios, connect information to other activities of the trader, and compare this information to other traders' normal behavior.
Concept Checkers
1. Which of the following classes of statistical problems typically cannot be solved through supervised machine learning?
A. Regression problems.
B. Penalized regression.
C. Classification problems.
D. Clustering.

2. Which of the following concepts best identifies the problem where a highly complex model describes random error or noise rather than true underlying relationships in the data?
A. Bagging.
B. Boosting.
C. Overfitting.
D. Deep learning.

3. Which data type is most characteristic of big data?
A. High-quality data.
B. Low frequency data.
C. Structured supervisory data.
D. Low-quality, unstructured data.

4. Which of the following factors does not explain why machine learning systems have been less widespread in the anti-money laundering (AML) space?
A. Existence of unsupervised learning methods.
B. Lack of a universal definition of money laundering.
C. Inadequate feedback from law enforcement agencies.
D. Inadequacy of historical data for money laundering detection algorithms.
5. A credit analyst makes the following statements:
Statement 1: Financial institutions face barriers in applying machine learning systems because supervisory learning approaches are difficult to apply.
Statement 2: Combining machine learning with human decisions tends to produce inferior model results.
The analyst is accurate with respect to:
A. Statement 1 only.
B. Statement 2 only.
C. Both statements.
D. Neither statement.
Concept Checker Answers
1. D Clustering typically involves applying unsupervised learning to a dataset. It involves observing input variables without knowing which dependent variable corresponds to them (e.g., detecting fraud without knowing which transactions are fraudulent). Regression problems, including penalized regression, and classification problems involve predictions around a dependent variable. These statistical problems can be solved through supervised machine learning.
2. C Overfitting is a concern where highly complex models describe noise or random error rather than true underlying relationships in the model. Overfitting is a particular concern in nonparametric, nonlinear models. Boosting overweights less frequent observations to train the model to detect them more easily. Bagging involves running a very large number of models on different subsets of the data to improve predictive ability. Deep learning differs from classical learning models in that it applies many layers of algorithms in the learning process to identify complex patterns.
3. D Big data consists of large volumes of low-quality, high-frequency, unstructured data.
4. A Unsupervised learning methods can be used in AML detection to identify and learn relevant patterns in client activity. Money laundering is difficult to define, and financial institutions do not receive adequate feedback from law enforcement agencies on which transactions were truly fraudulent. As a result, it is difficult to use only historical data to teach money-laundering detection algorithms to detect fraudulent activity.
5. A Incorporating human decisions with machine learning can improve results, because the system's data can be used to identify a comprehensive set of information about a trader and create a system that is less complex and more suitable for audit and regulatory purposes. Financial institutions have difficulty in successfully applying machine learning because of the legal complexities of sharing past breach information with developers.
The following is a review of the Current Issues in Financial Markets principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
Central Clearing and Risk Transformation
Topic 76
Exam Focus
This topic emphasizes liquidity risk, as opposed to solvency risk and counterparty risk, as being the primary concern for central counterparties (CCPs) and clearing members. For the exam, focus on the advantages of central clearing as well as the sequencing involved in the five-step CCP loss waterfall, primarily the details of the third and fourth steps. A good understanding of liquidation costs is also essential.
Central Clearing of OTC Transactions

LO 75.2: Describe the application of machine learning approaches within the financial services sector and the types of problems to which they can be applied.
Financial institutions deal with an increasingly large volume of data that they need to analyze, which requires complex analytical tools. In response to new regulations and compliance measures following the 2007–2009 financial crisis, financial institutions have been required to report more comprehensive details on balance sheet metrics and business models. These include stress tests and reporting on liquidity measures, capital, and collateral.
As a result, financial institutions need to be able to adequately structure, analyze, and interpret the data they collect. Various regulatory standards were introduced on data delivery with the aim of improving the quality of supervisory data, including the Basel Committee's Principles for Risk Data Aggregation (BCBS 239) and IFRS 9.
Financial institutions are also faced with an exceptionally large amount of low-quality, unstructured data, called big data, from the output of consumer apps, social media feeds, and various systems metadata. It has become increasingly more important that institutions are able to effectively analyze this high volume of data, including using conventional machine learning techniques as well as more complex deep learning techniques.
Financial institutions should use conventional machine learning techniques for mining high-quality, structured supervisory data. Deep learning and neural networks should be used for low-quality, high-frequency, big data type sources.

LO 75.1: Describe the process of machine learning and compare machine learning approaches.
Machine learning is a field of artificial intelligence (AI) that uses algorithms which allow computers to learn without programming. There are two forms of machine learning: supervised and unsupervised. In supervised machine learning, a statistical model is built in order to predict outcomes based on specific inputs (e.g., predicting GDP growth based on inputs of various macroeconomic variables). In unsupervised machine learning, data analysis is performed to identify patterns without estimating a dependent variable.
Machine learning is important because it can analyze data samples in order to identify patterns and relationships in the data and can make out-of-sample predictions. Models are then run thousands or millions of times so that their predictive capability improves. In this respect, machine learning is closely tied to the big data revolution. Supervised machine learning can also fit nonparametric and nonlinear relationships to any given dataset and make inferences about the dependent and independent variables.
Machine Learning Approaches
Although many machine learning approaches exist, they can be applied to three broad classes of statistical problems: regression, classification, and clustering. Both regression and classification can be addressed through supervised machine learning, while clustering follows an unsupervised approach.
1. Regression problems make predictions on quantitative, continuous variables, including inflation and GDP growth. Regressions can involve both linear (e.g., partial least squares) and nonlinear (e.g., penalized regression, in which complexity is penalized to improve predictability) learning methods.
2. Classification problems make predictions on discrete, dependent variables, such as filtering spam email or classifying blood types, where the variable can take on values in a class. Observations may be classified by support vector machines.
3. Clustering involves observing input variables without including a dependent variable. Examples include anti-money laundering (AML) analysis to detect fraud without knowing which transactions are fraudulent. Data can be grouped into clusters, and outputs from unsupervised learning can be used as inputs for supervised learning methods.
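A compact scikit-learn sketch of the three problem classes on synthetic data follows; the datasets and model choices are illustrative assumptions rather than the specific techniques named in the reading.

```python
from sklearn.datasets import make_regression, make_classification, make_blobs
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.cluster import KMeans

# 1. Regression: predict a continuous target (e.g., GDP growth).
Xr, yr = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)
reg = LinearRegression().fit(Xr, yr)

# 2. Classification: predict a discrete label (e.g., spam vs. not spam).
Xc, yc = make_classification(n_samples=200, n_features=5, random_state=0)
clf = LogisticRegression(max_iter=1_000).fit(Xc, yc)

# 3. Clustering: group observations with no dependent variable (unsupervised).
Xu, _ = make_blobs(n_samples=200, centers=3, random_state=0)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xu)

print(reg.score(Xr, yr), clf.score(Xc, yc), labels[:10])
```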
As mentioned, machine learning can be used to make out-of-sample predictions, for example, predicting borrowers' ability to repay their obligations and the likelihood of borrower default. However, a good predictive model does not need to also be good at explaining or inferring performance. For example, a credit scoring model will make inferences as to why borrowers default, whereas a good predictive model only needs to identify which indicators lead to borrower default.
Other Concepts in Machine Learning
Models that are very complex may describe noise or random error rather than true underlying relationships in the model. This is called overfitting. Overfitting is a particular concern in nonparametric, nonlinear models, which tend to be complex by nature. Models that describe noise will only fit that specific dataset and will not perform well in out-of-sample datasets.
Boosting refers to overweighting scarcer observations to train the model to detect them more easily. For example, overweighting scarcer fraudulent transactions in a dataset can train the model to better detect them. Bagging describes the process of running several hundreds of thousands of models on different subsets of the data to improve predictive ability. These models may also be combined with other machine learning models, in what is called an ensemble, to further improve their out-of-sample predictive capabilities.
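A hedged sketch of the two ideas on an imbalanced synthetic dataset follows; the 5% "fraud" rate, the weight of 10 on the scarce class, and the model choices are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, BaggingClassifier

# Imbalanced data: roughly 5% of observations belong to the scarce (e.g., fraudulent) class.
X, y = make_classification(n_samples=2_000, weights=[0.95, 0.05], random_state=0)

# Boosting, with the scarce class overweighted so the model learns to detect it.
weights = np.where(y == 1, 10.0, 1.0)
boosted = GradientBoostingClassifier(random_state=0).fit(X, y, sample_weight=weights)

# Bagging: fit many models on resampled subsets of the data and combine their predictions.
bagged = BaggingClassifier(n_estimators=200, random_state=0).fit(X, y)

print(boosted.score(X, y), bagged.score(X, y))
```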
Machine learning uses past, in-sample data to make predictions about future, out-of-sample data. As a result, it has been criticized at times for being backward looking and for making predictions without truly understanding the underlying relationships.
Deep Learning
Deep learning approaches move away from the classic model approaches we have been discussing until now. Whereas classic models focus on well-defined and structured datasets, deep learning essentially mimics the human brain by applying several layers of algorithms to the learning process and converting raw data in order to identify complex patterns. Each algorithm focuses on a particular feature of the data (called representations), and the layering of these representations allows the model to incorporate a wide range of inputs, including low-quality or unstructured data. Importantly, the layers are not designed by engineers but are instead learned by the model from the data.
For example, deep learning has been used in face-recognition and natural language learning models. Models have been complex enough to classify not only the topics of a discussion, but also the emotions of the people involved. However, deep learning models are extremely complex, often requiring several million to hundreds of millions of data points.
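As a loose illustration of the layered idea only (not a production deep learning model), a small multi-layer network can be fit with scikit-learn; real deep learning systems use far larger architectures and far more data, and the dataset and layer sizes below are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Synthetic classification data standing in for a structured training set.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)

# Several stacked layers, each learning a successively more abstract representation.
net = MLPClassifier(hidden_layer_sizes=(64, 32, 16), max_iter=1_000, random_state=0)
net.fit(X, y)

print(net.score(X, y))
```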
The Application of Machine Learning