LO 74.3: Examine the areas for collaboration between econometrics and machine learning.
There are several areas where useful collaboration could exist between econometrics and machine learning. Most machine learning assumes that data is independently and identically distributed, and most machine learning datasets are cross-sectional. In practice, time series analysis may be more useful. Econometrics offers tools, such as Bayesian structural time series models, for forecasting time series data.
Perhaps the most important opportunity for collaboration relates to causal inference, which can be a natural by-product of big data. Correlation does not always indicate causation. Traditionally, machine learning has been most concerned with pure prediction, but econometricians have developed numerous tools to reveal cause and effect relationships. Combining these tools with machine learning could prove to be a very meaningful collaboration.
Consider a basic causation-correlation example. Police precincts with a larger number of police officers usually also have higher crime rates. There is a correlation, but having more police does not necessarily cause higher crime rates. A strong historical relationship does exist, but
it is not really useful for predicting the causal outcome of adding more police to a precinct. One idea to solve this problem is to use econometrics to forecast what would have happened if no additional police were added and then contrast this with what actually did happen.8
This same concept can be applied in many different disciplines. Consider a standard problem in marketing where a firm wants to gauge the effectiveness of an advertising campaign. It could run the new ad campaign in one region and not run it in another region, then contrast the outcomes. There are two big problems with this approach. First, the firm may lose revenue in the control region while the test is ongoing. Second, the contrast could be driven by an external factor like weather or demographic differences. To avoid these problems, the firm could use econometrics to forecast the expected sales outcome in a region without additional advertising, then run the ads and measure the contrast between the predicted and the actual outcomes. A good model for prediction can be better than a random control group.
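The counterfactual approach just described can be sketched in a few lines of code. The example below is a minimal illustration only: it assumes weekly sales held in a pandas Series, and it substitutes a simple trend regression for the richer time series models (such as Bayesian structural time series) mentioned earlier; the data, the function name, and the campaign timing are all hypothetical.

```python
# Minimal sketch: use a pre-period forecast as the "control" for an ad test.
# The simple trend regression stands in for the richer time series models
# (e.g., Bayesian structural time series) discussed in the text.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def estimate_ad_lift(sales: pd.Series, campaign_start: int) -> float:
    """Forecast what sales would have been without the campaign and compare
    that counterfactual with actual sales during the campaign period."""
    t = np.arange(len(sales)).reshape(-1, 1)
    model = LinearRegression().fit(t[:campaign_start], sales.values[:campaign_start])
    counterfactual = model.predict(t[campaign_start:])   # forecast with no ads
    actual = sales.values[campaign_start:]
    return float((actual - counterfactual).sum())        # estimated incremental sales

# Hypothetical data: 40 weeks of trending sales with a bump once ads start in week 30.
weekly_sales = pd.Series(100 + 2 * np.arange(40) + np.r_[np.zeros(30), np.full(10, 15.0)])
print(estimate_ad_lift(weekly_sales, campaign_start=30))  # about 150 incremental units
```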
8. Donald B. Rubin, Estimating Causal Effects of Treatments in Randomized and Nonrandomized Studies, Journal of Educational Psychology 66, no. 5 (1974): 688.
Key Concepts
LO 74.1 Large datasets require tools that are far more advanced than simple spreadsheet analysis. Overfitting and variable selection are two ongoing challenges that big data presents.
LO 74.2 To solve inherent issues like spurious correlations and overfitting, researchers have applied more creative tools to analyzing large datasets. The tools include classification and regression trees, cross-validation, conditional inference trees, random forests, and penalized regression.
LO 74.3 There are several ways in which the field of econometrics can assist the world of machine learning. One way is to apply the time series forecasting tools commonly used in econometrics to big data, which has traditionally featured mostly cross-sectional data. Another potential collaboration is to better distinguish correlation from causation.
Concept Checkers
1. Which of the following statements is not a problem common to the contemporary world of big data?
A. A researcher might find a strong in-sample prediction that does not produce good out-of-sample results.
B. Traditional spreadsheet analysis is not robust enough to capture relationships with multiple interactions and millions of data points.
C. Access to data is difficult.
D. The periodic presence of spurious correlations requires active variable selection.
2. Which of the following statements is not involved in conducting a 10-fold cross validation?
A. Test your prediction on an out-of-sample dataset to validate accuracy.
B. Rotate which fold is the testing set.
C. Conduct at least 10 different tests and average the testing results.
D. Break a large dataset into 10 smaller subsets of data.
3. Which of the following statements most accurately describes the process of growing a random forest?
A. Select a bootstrapped sample from a large dataset and grow a tree with random variables that were selected using a lambda (λ) tuning parameter. Average the results from a large number of trees that fill out the random forest.
B. Break the full dataset into 10 identifiable subsets and build 10 different trees, each having the same variables that were selected using a lambda (λ) tuning parameter.
C. Break the full dataset into a random number of small unique datasets. Grow trees and average the results.
D. Select a bootstrapped sample (with replacement) from a large dataset and grow a tree with random variables and no pruning. Average the results from a large number of trees that fill out the random forest.
4. Which of the following statements is least likely related to conditional inference trees (ctrees)?
A. A ctree can help to better understand if a relationship truly exists between variables.
B. A ctree involves creating multiple trees to test for accuracy.
C. A ctree involves splitting variables into the smallest possible factor that can be isolated for testing.
D. A ctree will isolate predictors into the most specific terms possible.
5. The fields of econometrics and machine learning have much that can be shared. Which of the following statements is incorrect concerning the collaboration between these two disciplines?
A. Collaboration can be sought to better explore the blurred lines between correlation and cross-sectional prediction.
B. More collaboration can be done to better understand time series data.
C. Collaboration can be sought to better explore the blurred lines between correlation and causation.
D. Combining econometric tools with machine learning could prove to be a very meaningful collaboration.
Concept Checker Answers
1. C Access to data is not a common problem: our modern world is filled with computerized commerce, which has created a seemingly endless stream of information that can be dissected using machine learning. Overfitting and spurious correlations are two clear issues, and traditional spreadsheet analysis is simply not robust enough to capture the interactions in very large pools of data.
2. A Cross-validation is used to conduct testing within a dataset by creating virtual out-of-sample subsets that are actually still in-sample. In this example, the large dataset is broken into 10 folds and 1 fold is selected as the testing set. Parameters estimated from the other (training) folds are tested against the testing set, and the testing set is rotated so that each fold gets a turn as the testing set. Parameters from each test are then averaged to get a population parameter used for prediction.
3. D Growing a random forest involves a bootstrapped sample (with replacement) from a larger dataset. Researchers will then grow a tree from this sample. They will construct a large number of trees using computerized assistance and average the results to find the population parameters.
4. B A ctree is only one tree. A random forest is the analysis that constructs multiple trees. A ctree helps to understand relationships more deeply, and it all starts with splitting variables into the smallest identifiable factor that can be isolated. The main idea of a ctree is to isolate predictors into the most specific terms possible.
5. A Current machine learning already has a fairly developed understanding of cross-sectional prediction. The most likely areas for collaboration with the field of econometrics include prediction with time series data and better understanding the blurred lines between correlation and causation. Combining econometric tools with machine learning could prove to be a very meaningful collaboration.
The following is a review of the Current Issues in Financial Markets principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
Machine Learning: A Revolution in Risk Management and Compliance?
Topic 75
Exam Focus
Financial institutions have been increasingly looking to complement traditional, less complex regulatory systems and models with more complex models that allow them to better identify risks and risk patterns. This topic focuses on machine learning within artificial intelligence models that have been successfully used in credit risk modeling, fraud detection, and trading surveillance. For the exam, understand the various forms of models, including supervised and unsupervised machine learning, and the three broad classes of statistical problems: regression, classification, and clustering. While machine learning can provide tremendous benefits to financial institutions in combatting risks, these highly complex models have considerable limitations and can be too complex to be relied upon from an audit or regulatory perspective.
The Process of Machine Learning

LO 74.2: Explain and assess different tools and techniques for manipulating and analyzing big data.
Using big data to make predictions is the focus of machine learning. If a linear relationship is present, this science may utilize regression. If a nonlinear relationship exists, machine learning might deploy tools such as classification and regression trees, cross-validation, conditional inference trees, random forests, and penalized regression.
Classification can be thought of as a binomial decision tree. For example, someone either survived the tragedy of the Titanic or they did not. This can be organized as a discrete variable regression where the values are either 0 or 1. This is essentially a logit regression and the output is shown in Figure 1.
Figure 1: Logistic Regression of Titanic Survival vs. Age2

Coefficients    Estimate    Standard Error    t-Stat     P-value
Intercept       0.465       0.035             13.291     0.000
Age             -0.002      0.001             -1.796     0.072
The logit regression results in Figure 1 show that age was not a significant factor in determining survival of Titanic passengers. Perlich, Provost, and Simonoff (2003) find that while logit regression can work very well for smaller datasets, larger pools of data require classification and regression tree (CART) analysis.3 Figure 2 shows a CART for the Titanic using two factors (age and cabin class), and Figure 3 shows the rules used in developing this CART.
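For readers who want to see how a regression like the one in Figure 1 might be produced, the sketch below fits a logit model of survival on age with statsmodels. It is a minimal illustration, not the exact specification behind Figure 1: the file name titanic.csv and the survived and age column names are assumptions, and statsmodels reports z-statistics rather than t-statistics.

```python
# A minimal sketch of a logit regression of survival on age, assuming a CSV
# with 0/1 "survived" and numeric "age" columns (hypothetical file name).
import pandas as pd
import statsmodels.api as sm

titanic = pd.read_csv("titanic.csv").dropna(subset=["age"])
y = titanic["survived"]                      # 1 = survived, 0 = died
X = sm.add_constant(titanic[["age"]])        # intercept plus the single predictor

logit_model = sm.Logit(y, X).fit()
print(logit_model.summary())                 # coefficients, standard errors, p-values
```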
2. Hal R. Varian, Big Data: New Tricks for Econometrics, Journal of Economic Perspectives 28, no. 2: 3-28, www.aeaweb.org/articles?id=10.1257/jep.28.2.3.
3. Claudia Perlich, Foster Provost, and Jeffrey S. Simonoff, Tree Induction vs. Logistic Regression: A Learning-Curve Analysis, Journal of Machine Learning Research (June 2003): 211-255, www.jmlr.org/papers/volume4/perlich03a/perlich03a.pdf.
Figure 2: A Classification Tree for Survivors of the Titanic4
Figure 3: Titanic Tree Model in Rule Form4

Features                        Predicted    Actual/Total
Class 3                         Died         370/501
Class 1-2, younger than 16      Lived        34/36
Class 2, older than 16          Died         145/233
Class 1, older than 16          Lived        174/276
Classification and regression trees can be very useful in explaining complex and nonlinear relationships. In the case of the Titanic, CART analysis shows that both age and cabin classification were good predictors of survival rates. This can be further dissected in Figure 4, which shows the fraction of those who survived organized into age bins.
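A classification tree along these lines can be grown with an off-the-shelf CART implementation. The sketch below uses scikit-learn's DecisionTreeClassifier on the same hypothetical titanic.csv, with cabin class coded numerically (1, 2, 3); the depth limit is an illustrative choice, not part of the assigned reading.

```python
# A minimal CART sketch using the hypothetical titanic.csv with "survived",
# "age", and numerically coded "class" columns.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

titanic = pd.read_csv("titanic.csv").dropna(subset=["age", "class"])
X = titanic[["age", "class"]]
y = titanic["survived"]

# A small max_depth keeps the tree readable; in practice the depth (or pruning)
# would be tuned to limit the overfitting discussed in the next paragraphs.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["age", "class"]))   # rule-form output like Figure 3
```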
4. Hal R. Varian, Big Data: New Tricks for Econometrics, Journal of Economic Perspectives 28, no. 2: 3-28, www.aeaweb.org/articles?id=10.1257/jep.28.2.3.
Figure 4: Titanic Survival Rates by Age Bin5
Figure 4 clearly shows that those in the lowest age bracket (children) had the highest survival rates, and that those in their 70s had the lowest. For those in between these age markers, their attained age did not really impact their survival rates. Raw age mattered less than whether a person was either a child or elderly. This process enables researchers to think dynamically about relationships in large datasets.
One concern with using this process is that trees tend to overfit the data, meaning that out-of-sample predictions are not as reliable as those that are in-sample. One potential solution for overfitting is cross-validation. In a k-fold cross validation, the larger dataset is broken up into k subsets (also called folds). A large dataset might be broken up into 10 smaller pools of data.
This process starts with fold 1 being a testing set and folds 2-10 being training sets. Researchers would look for statistical relationships in all training sets and then use fold 1 to test the output to see if it has predictive use. They would then repeat this process k times such that each fold takes a turn being the testing set. The results are ultimately averaged from all tests to find a common relationship. In this way, researchers can test their predictions on an out-of-sample dataset that is actually a part of the larger dataset.
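This 10-fold procedure maps directly onto standard library support. The sketch below is a minimal illustration using scikit-learn's KFold and cross_val_score with the hypothetical Titanic features (X, y) from the earlier sketches; the averaged score plays the role of the out-of-sample estimate described above.

```python
# A minimal 10-fold cross-validation sketch: each fold takes one turn as the
# testing set and the out-of-sample scores are averaged. X and y are the
# hypothetical Titanic features and labels from the earlier sketches.
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

kfold = KFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(DecisionTreeClassifier(max_depth=3, random_state=0),
                         X, y, cv=kfold)
print(scores.mean())    # average out-of-sample accuracy across the 10 folds
```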
Another step that could be taken is to prune the tree by incorporating a tuning parameter (λ) that reduces the complexity in the data and ultimately minimizes out-of-sample errors.
5. Hal R. Varian, Big Data: New Tricks for Econometrics, Journal of Economic Perspectives 28, no. 2: 3-28, www.aeaweb.org/articles?id=10.1257/jep.28.2.3.
However, building a conditional inference tree (ctree) is an option that does not require pruning with tuning parameters. The ctree process involves the following steps:
1. Test if any independent variables are correlated with the dependent (response) variable, and choose the variable with the strongest correlation.
2. Split the variable (a binary split) into two data subsets.
3. Repeat this process until the variables have been isolated into enough unique components (each one is called either a node or a leaf on the ctree) that correlations have fallen below pre-defined levels of statistical significance.
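The three steps above can be caricatured in a short recursive function. The sketch below only illustrates the stopping rule (keep splitting while some predictor is significantly associated with the response); it is not the actual ctree algorithm from the R party package, and the correlation test, median split, and node-size threshold are simplifying assumptions.

```python
# Illustrative sketch of the ctree stopping rule described in the three steps
# above, assuming numeric predictors and a numeric (0/1) response. Not the
# actual conditional inference tree algorithm.
import numpy as np
from scipy import stats

def grow_ctree(X, y, names, alpha=0.05, min_node=20):
    # Step 1: test each predictor's association with the response.
    pvals = []
    for j in range(X.shape[1]):
        if np.std(X[:, j]) == 0 or np.std(y) == 0:
            pvals.append(1.0)                    # constant column carries no information
        else:
            pvals.append(stats.pearsonr(X[:, j], y)[1])
    best = int(np.argmin(pvals))

    # Step 3: stop once no predictor is significant (or the node is small).
    if pvals[best] > alpha or len(y) < min_node:
        return {"leaf": True, "prediction": float(np.mean(y))}

    # Step 2: binary split on the most significant predictor (at its median).
    threshold = float(np.median(X[:, best]))
    left = X[:, best] <= threshold
    if left.all() or (~left).all():              # degenerate split: stop here
        return {"leaf": True, "prediction": float(np.mean(y))}
    return {"leaf": False, "variable": names[best], "threshold": threshold,
            "left": grow_ctree(X[left], y[left], names, alpha, min_node),
            "right": grow_ctree(X[~left], y[~left], names, alpha, min_node)}
```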
The main idea of a ctree is to isolate predictors into the most specific terms possible. Consider research conducted by Munnell, Tootell, Browne, and McEneaney (1996) that studies mortgage lending in Boston to test whether ethnicity plays a role in mortgage application success.6 Their logistic regression finds a statistically significant relationship between being declined for a mortgage and being African American. When this data is analyzed using a ctree, as shown in Figure 5, it becomes more apparent that the true cause of mortgage application failure in this dataset is being denied mortgage insurance (dmi in Figure 5), not simply being African American (black in Figure 5). A separate test would be useful to see if being denied mortgage insurance is correlated with ethnicity.
Figure 5: Ctree for Mortgage Application Success in Boston7
Constructing random forests is also a way to improve predictions from large datasets. This method uses bootstrapping to grow multiple trees from a large dataset. Using random forests to average many small models produces very good out-of-sample fits even when dealing with nonlinear data. Computers have made this method much more viable, as sometimes thousands of trees can be grown in a random forest.
6. Alicia H. Munnell et al., Mortgage Lending in Boston: Interpreting HMDA Data, The American Economic Review 86, no. 1 (March 1996): 25-53, www.jstor.org.ezaccess.libraries.psu.edu/stable/pdf/2118254.pdf.
7. Hal R. Varian, Big Data: New Tricks for Econometrics, Journal of Economic Perspectives 28, no. 2: 3-28, www.aeaweb.org/articles?id=10.1257/jep.28.2.3.
There are four steps to creating random forests:
1. Select a bootstrapped sample (with replacement) out of the full dataset and grow a tree.
2. At each node on the tree, select a random sample of predictors for decision-making. No pruning is needed in this process.
3. Repeat this process multiple times to grow a forest of trees.
4. Use each tree to classify a new observation and choose the ultimate classification based on a majority vote from the forest.
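In practice these four steps are usually delegated to a library. The sketch below shows a minimal scikit-learn version using the hypothetical Titanic features (X, y) from the earlier sketches; the number of trees and the other settings are illustrative choices.

```python
# A minimal random forest sketch: the library handles the bootstrapping,
# the random predictor selection at each node, and the majority vote.
from sklearn.ensemble import RandomForestClassifier

forest = RandomForestClassifier(n_estimators=1000,    # grow a large number of trees
                                max_features="sqrt",  # random subset of predictors per split
                                bootstrap=True,       # sample with replacement
                                random_state=0).fit(X, y)
print(forest.predict(X.iloc[:5]))                     # class chosen by majority vote
```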
Researchers might also use penalized regression, where a penalty term (λ) is applied to adjust the regression results. Consider a multivariate regression where we predict y as a linear function of a constant, b0, and P predictor variables:

y = b0 + b1x1 + b2x2 + ... + bPxP

Rather than simply minimizing the sum of squared residuals, penalized regression minimizes the sum of squared residuals plus a penalty equal to λ multiplied by the sum of the absolute values of the coefficients, |b1| + |b2| + ... + |bP|.
This form of penalized regression is known as LASSO (least absolute shrinkage and selection operator) regression. The LASSO process improves upon OLS regression by using the penalty term (λ) to limit the sum of the absolute values of the model parameters. As lambda (λ) increases, some of the regression coefficients are driven to zero and drop out of consideration. This penalizing process enables researchers to focus on the variables that are most likely to be strong predictors. If lambda is zero, the result is simply OLS regression, but as lambda increases, model variance decreases.
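The variable-selection effect of the penalty can be demonstrated on simulated data. The sketch below is illustrative only: scikit-learn calls the penalty alpha rather than lambda, and the simulated dataset (two real predictors plus eight noise predictors) is invented for the example.

```python
# A minimal LASSO sketch: as the penalty grows, more coefficients are driven
# exactly to zero, leaving only the strongest predictors in the model.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X_sim = rng.normal(size=(200, 10))
y_sim = 3 * X_sim[:, 0] - 2 * X_sim[:, 1] + rng.normal(size=200)  # only 2 real predictors

for alpha in [0.01, 0.1, 1.0]:                 # alpha plays the role of lambda in the text
    coefs = Lasso(alpha=alpha).fit(X_sim, y_sim).coef_
    print(alpha, int((coefs != 0).sum()), "nonzero coefficients")
```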
Collaboration between Econometrics and Machine Learning

LO 74.1: Describe the issues unique to big datasets.
Researchers often use a spreadsheet to organize and understand datasets. However, when the spreadsheet expands to a million or more rows, a more robust and relational database is needed. Structured Query Language (SQL) databases are used for the smaller of the large datasets, but customized systems that expand upon SQL are needed for the largest pools of data. According to Sullivan (2012)1, Google answers 100 billion search queries every month and crawls 20 billion URLs every day. This is one example of a significantly large dataset that needs customized databases to properly understand the inherent relationships involved. A system like this would be operated not on a single computer, but rather on a large cluster of computers like the type that can be rented from vendors such as Amazon, Google, and Microsoft.
Professor's Note: Using big data to make predictions is precisely what Amazon is trying to do when it makes recommendations for additional purchases based on the current product search, previous purchases from the same customer, and alternative purchases made by other customers.
Another potential issue in dealing with a large dataset is known as the overfitting problem. This is encountered when a linear regression captures a solid relationship within the dataset but has very poor out-of-sample predictive ability. Two common ways to address this problem are to use less complex models and to break the large dataset into small samples to test and validate whether overfitting exists.
1. Danny Sullivan, Google: 100 Billion Searches per Month, Search to Integrate Gmail, Launching Enhanced Search App for iOS, Search Engine Land, August 8, 2012, https://searchengineland.com/google-search-press-129925.
In practice, researchers work with independently distributed, cross-sectional samples of a larger dataset. This enables them to focus on summarization, prediction, and estimation with a more manageable pool of information. Basic summarization often takes the form of (linear) regression analysis, while prediction seeks to use various tools to predict a value for the dependent variable, y, given a new value of the independent variable, x. This process seeks to minimize a loss function (e.g., the sum of squared residuals) associated with new out-of-sample observations of x.
Methods are also being deployed to screen variables to find the ones that add the most value to the prediction process. Active variable selection can also help to mitigate spurious correlations and potentially help to decrease overfitting in a world where more and more data becomes available with every internet search and purchase at a retail store.
Tools and Techniques for Analyzing Big Data

LO 73.4: Examine the impact on the financial system posed by the standards.
The IASB standard would cause a dramatic rise in loss provisions at the start of an economic downturn, specifically because of the increase in provisioned amounts between stage 1 (12-month ECL) and stage 2 (lifetime ECL). One argument for a more proactive stance on recording losses is that it restates balance sheet assets at more conservative levels to make way for possible future recoveries.
In one sense, the standards would have no impact for banks that have established sufficiently large capital buffers that could withstand the impact of the increased loan provisions.
The provisioning requirements of the standards could end up smoothing the issuance of loans throughout the economic cycle (i.e., slowing the growth of loans in a strong economy while preventing the slowing of growth of loans in a weak economy). That is because the prior provisions already taken on loans should prevent the capital cost of lending from increasing when the economy weakens. In a simulation exercise involving the earlier provisioning for loans, it was found that bad debts were lower (higher) in years when loan loss provisions were high (low). With earlier provisioning taken from capital, there would be reduced levels of lending prior to an economic downturn or crisis.
Key Concepts
LO 73.1 Key reasons to provision for expected credit losses (ECL) include:
1. Determining a more accurate cost of lending.
2. Reducing the procyclicality of bank lending (by provisioning earlier for loan losses).
3. Reporting earnings in a more conservative manner, which may be useful to financial statement users.
LO 73.2 There are two main differences between the IASB and FASB standards:
1. FASB requires ECL to be computed over the term of a loan commencing right from the start, while IASB requires a series of three stages.
2. IASB permits the recording of accrued interest income on delinquent loans. FASB requires the use of the cash basis and/or the cost recovery method.
LO 73.3 Based on surveys conducted with banks regarding the implementation of IFRS 9, overall it appears that only minimal progress has been made as of 2016.
For many banks, there are currently weaknesses in general data quality and the computation of lifetime default probabilities for loans. In addition, many banks reported that they had insufficient technical resources to complete the implementation.
LO 73.4 The IASB standard would cause a dramatic rise in loss provisions at the start of an economic downturn, specifically because of the increase in provisioned amounts between stage 1 (12-month ECL) and stage 2 (lifetime ECL).
The provisioning requirements of the standards could end up smoothing the issuance of loans throughout the economic cycle (i.e., slowing the growth of loans in a strong economy while preventing the slowing of growth of loans in a weak economy).
Concept Checkers
1. Forward-looking provisions for credit losses should be made:
A. before loan origination.
B. at the same time as loan origination.
C. 3 months after loan origination.
D. 12 months after loan origination.
2. For which measurement basis does IASB (IFRS 9) permit the recording of interest income on delinquent loans?
A. Accrual.
B. Cash.
C. Cost recovery.
D. A combination of cash and cost recovery.
3. Both the FASB and IASB standards measure loss given default (LGD) and exposure at default (EAD) as:
A. backward-looking estimates.
B. downturn estimates.
C. forward-looking estimates.
D. neutral estimates.
4. For banks that have been able to make estimates of loan loss provision increases, the percentages are, on average, closest to which of the following amounts?
A. 3%.
B. 20%.
C. 33%.
D. 50%.
5. Under the IASB standard, in which stage(s) would the impact of the provisioning rules be the greatest on a bank's income statement?
A. Stage 1.
B. Stage 2.
C. Stage 3.
D. Stages 2 and 3.
Concept Checker Answers
1. B The nature of forward-looking provisions is that they should be made in the current period in anticipation of losses to occur in the future. Therefore, they should be made at the same time as loan origination. Provisions cannot be made before loan origination because there is no information available about the likelihood of default until the loan is actually originated.
2. A IASB permits the recording of accrued interest income on delinquent loans, regardless of whether loan payments are being received. FASB requires the use of the cash basis (no interest income accrual), cost recovery method (payments applied to principal first, and once principal is repaid, the excess is recorded as interest income), or a combination of both in order to provide a more conservative and reliable method for income recognition on delinquent loans.
3. D Both standards measure loss given default and exposure at default as neutral estimates.
4. B For banks that have been able to make estimates, the loan loss provision increases are an average of 20%, with the range typically being between 10% and 30%.
5. B There is a change from the one-year expected loss in stage 1 to a lifetime loss in stage 2, which is a very dramatic change on the income statement. There is not much of a change on the income statement between stages 2 and 3 because lifetime ECL is reported in both stages, but interest revenue is reduced in stage 3 because it is calculated only on the carrying amount less the loss allowance.
The following is a review of the Current Issues in Financial Markets principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
Big Data: New Tricks for Econometrics
Topic 74
Exam Focus
This topic focuses on ways to view the explosion in data that has resulted from the growth in economic transactions that involve computers. Large amounts of data are being captured daily and this trend will only increase over time. For the exam, understand that large datasets require tools beyond ordinary least squares (OLS) regression to properly understand the inherent relationships. Machine learning offers tools like classification and regression trees, cross-validation, conditional inference trees, random forests, and penalized regression. Further opportunities exist for the field of econometrics to bring time series forecasting tools to the world of big data.
Issues Involved with Big Datasets

LO 73.3: Assess the progress banks have made in the implementation of the standards.
The IASB standard is effective as of January 1, 2018 (although early adoption is allowed) and the FASB standard as of January 1, 2020 for public companies and January 1, 2021 for all other applicable entities.
Based on surveys conducted with banks regarding the implementation of IFRS 9, overall it appears that only minimal progress has been made as of 2016. For example, a significant number of banks were unable to quantify the impact of the new standard. For the banks that were able to make estimates, loan loss provision increases were estimated at an average of 20%, with the range typically being between 10% and 30%. A large portion of the increase is due to the recording of lifetime ECL for stage 2 loans. The amounts are not as significant for the related capital decreases [i.e., 50bp to 73bp decrease in Common Equity Tier 1 (CET1) spread and total capital ratio], but the key issue here is that the banks are generally unaware of how regulators will ultimately revise the regulatory capital amounts.
The Enhanced Disclosure Task Force (EDTF) has recommended specific risk disclosure by banks in the transition period prior to implementation of IFRS 9. The disclosures are qualitative (i.e., differences from current approach, implementation strategy, capital planning impact), but also include quantitative assessments of the impact of using the ECL approach. From the surveys, about 40% of the banks had no intention to make any quantitative disclosures prior to 2018.
A takeaway from the surveys is that for many banks, there are currently weaknesses in general data quality and the computation of lifetime default probabilities for loans. Improvements in the modelling processes are required as well. To date, there has been limited spending by most banks regarding IFRS 9 implementation. Specifically, many banks reported that they had insufficient technical resources to complete the implementation. Furthermore, to date there appears to have been limited involvement of key personnel such as the board of directors and senior credit risk staff.
Impact of IASB and FASB Standards

LO 73.2: Compare and contrast the key aspects of the IASB (IFRS 9) and FASB (CECL) standards.
The IASB and FASB standards are similar in that ECL must be initially recorded at the outset of all loans and updated at the end of each reporting period, taking into account any changes in credit risks of their loan assets. In addition, the standards do not require any specific catalyst to occur in order to report a credit loss. Finally, the standards mandate the use of reliable historical, current, and forecast information (including macroeconomic factors) in computing ECL. For example, both standards measure probability of default (PD) at a point in time (rather than in context of the economic cycle) and measure loss given default (LGD) and exposure at default (EAD) as neutral estimates (rather than downturn estimates).
There are two main differences between the IASB and FASB standards:
1. FASB requires ECL to be computed over the term of a loan commencing right from the start, while IASB requires a series of three stages. This difference will be discussed in more detail shortly.
2. IASB permits the recording of accrued interest income on delinquent loans, regardless of whether loan payments are being received. FASB requires the use of the cash basis (no interest income accrual), cost recovery method (payments applied to principal first, and once principal is repaid, the excess is recorded as interest income), or a combination of both in order to provide a more conservative and reliable method for income recognition on delinquent loans.
International Accounting Standards Board (IASB)
Under IFRS 9, ECL is reported in three stages to represent the deterioration of assets: stage 1 (performing), stage 2 (underperforming), and stage 3 (impaired). Upon loan purchase or origination, stage 1 begins and the 12-month ECL is recorded (as an expense on the income statement and a contra-asset on the balance sheet). However, interest revenue is computed on the original loan amount, not the amount net of the ECL. The 12-month ECL is computed as the expected lifetime credit loss on the loan asset multiplied by the probability of default within the 12 months after the reporting date.
A loan asset moves to stage 2 when its credit quality has deteriorated severely enough to require classification into a high credit risk category, which IFRS 9 presumes to occur once the loan is 30 days past due. The entire lifetime ECL is now recorded (based on the present value of losses due to future defaults), which is likely a large increase in amount from stage 1. The difference in computation of 12-month and lifetime ECL can be explained primarily by the maturity of the loan together with the movement of default risks and recovery values during the term of the loan. Note that the interest revenue computation in stage 2 remains the same as in stage 1.
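A simple numeric illustration of the jump from stage 1 to stage 2 provisioning is sketched below. It uses the generic PD × LGD × EAD decomposition implied by the risk parameters discussed earlier in this topic, ignores discounting, and all of the figures are invented for illustration.

```python
# Invented figures illustrating the stage 1 -> stage 2 jump in provisions.
exposure_at_default = 1_000_000   # EAD: loan balance expected at default
loss_given_default = 0.40         # LGD: 40% of the exposure is not recovered
pd_12_month = 0.02                # probability of default over the next 12 months
pd_lifetime = 0.10                # cumulative probability of default over the loan's life

ecl_stage1 = pd_12_month * loss_given_default * exposure_at_default   # 8,000
ecl_stage2 = pd_lifetime * loss_given_default * exposure_at_default   # 40,000
print(ecl_stage1, ecl_stage2)     # here the stage 2 provision is five times larger
```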
Stage 3 involves loan assets that are credit-impaired or generating credit losses. The entire lifetime ECL continues to be recorded but the interest revenue is now computed on the original loan amount less the loss allowance.
Financial Accounting Standards Board (FASB)
In contrast to IASB, FASB requires the entire lifetime ECL to be recorded as a provision from the outset instead of dealing with stages. As a result, the FASB standard will result in earlier and larger recognition of losses (whereas there is some delay in IASB for loans classified in stage 1). The two standards are the same when dealing with loans that have considerable credit deterioration (i.e., IASB stages 2 and 3).
Implementation of IASB and FASB Standards

LO 73.1: Describe the reasons to provision for expected credit losses.
Historical evidence suggests that loan interest rates were determined in unstable market conditions and, therefore, did not always account for all credit risks. As a result, forward-looking provisions should be made at the same time as loan origination.
The requirement for banks to set aside funds as capital reserves is unlikely to reduce a bank's lending activities during strong economic periods. The result may be excessive lending by banks. Therefore, by provisioning for expected credit losses (ECL), a more accurate cost of lending may be determined (which may ultimately control the amount of lending).
The concept of procyclicality refers to being positively correlated with the overall state of the economy. Reducing the procyclicality of bank lending is likely to occur with earlier provisioning for loan losses. Increased (decreased) regulatory requirements pertaining to provisions tend to reduce (increase) the level of bank lending.
The use of forward-looking provisions essentially results in the earlier recording of loan losses, which may be beneficial to financial statement users from the perspective of conservatism in a bank's reporting of earnings.
IASB and FASB Standards

LO 72.7: Describe elements that can be included as part of a due diligence questionnaire.
Properly designed due diligence questionnaires that are thoroughly and honestly answered by respondents can yield valuable information to a potential investor and may provide a list of concerns that need further assessment. The questionnaire should make the following inquiries:
1. Inquiry into general information on the manager provides a starting point in the due diligence process. Examples of such information include:
• Confirmation of proper registration with regulatory authorities.
• Determination of ownership form (e.g., corporation) and structure.
• Identification of key shareholders.
• Reference checks.
• Business contact information.
• Information on past performance.
• Fees.
2. Inquiry into general information on the fund also is critical. Examples of general information that should be collected include:
• Lockup periods.
• Redemption policies.
• Primary broker.
• Fund director.
• Administrator.
• Compliance: auditor and legal advisor.
• Financial: assets under administration, investment capacity, and historical performance (also see financial statements).
• Historical drawdown levels.
3. Inquiry into execution and trading as well as service providers may provide some insight on the speed and accuracy of transaction processing and the existence of related-party service providers, the latter of which may raise red flags with potential investors as discussed earlier.
4. Inquiry regarding the firm's third-party research policy may be useful to determine a fund's sources of research information, thereby allowing the assessment of the extent and quality of the due diligence performed by the fund in its investment process.
5. Inquiry regarding compliance processes, the existence and degree of involvement of in-house legal counsel, and the existence of anti-money laundering policy and procedures may help provide comfort that the fund and its managers have a desire to operate in an ethical manner and/or within the boundaries of the law.
6. Inquiry into the existence of information regarding disaster recovery and business continuity plans as well as insurance coverage and key person provisions may provide some assurance regarding the stability of the firm and, therefore, the safety of any invested funds.
7. Inquiry into the investment process and portfolio construction provides the potential investor with information required to make an informed decision as to whether the overall risk and return profile of the fund is consistent with the investor's investment objectives.
8. Inquiry into risk controls such as leverage, liquidity, asset concentrations, portfolio diversification, and market risk factors gives the investor a more complete picture of the investment risks and how the managers attempt to manage and mitigate them.
The existence of financial statements, especially if audited with an unqualified opinion, provides objective and historical financial information on the fund that can be used to assess performance. Information on the composition of the invested assets may also be helpful to the potential investor. Finally, interim statements (not necessarily audited) may provide more timely information to make a more current assessment of the fund by the potential investor.
Key Concepts
LO 72.1 Past fund failures can be attributed to poor investment decisions, fraud, extreme events, excess leverage, lack of liquidity, poor controls, insufficient questioning, and insufficient attention to returns.
LO 72.2 The due diligence process for assessing investment managers should include information on the investment background and reputation of the managers and past performance. In addition, there should be an assessment of the fund's investment process, risk controls, operations, and business model.
LO 72.3 In evaluating a manager, investors should consider four broad themes including strategy (e.g., evolution, risk management, quantification, types of investments), ownership, track record (e.g., comparison with peers, independent verification of results), and investment management (e.g., manager interviews, reference checks, background checks).
LO 72.4 Criteria that could be used in assessing a fund's risk management process include risk (e.g., types, culture, quantification/models), security valuation, portfolio leverage and liquidity, tail risk exposure, risk reports, and consistency of the fund terms with the investment strategy.
LO 72.5 Performing due diligence on a fund's operating environment focuses on:
• Internal control assessment (i.e., qualifications and attitude of personnel, written policies and procedures, compliance system, counterparty risk, effectiveness of governance).
• Documents and disclosure (i.e., confirmations with the fund's legal counsel regarding fund documents, corroborating terms of the offering memorandum, conflicts of interest, disclosure of risks, manager's authority, manager's reporting duties to investors, financial statements, fees paid to the manager, and net contributions/withdrawals by the general partner).
• Service provider evaluation.
LO 72.6 Business model risk can be assessed by considering revenues and expenses (detailed examination), sufficiency of working capital, existence of budgets, computation of breakeven points, ability to increase investment asset base, existence of key person insurance, and existence of a succession plan.
Fraud risk can be assessed by considering the existence of related-party transactions, illiquidity, litigation, unreasonably high (stated) investment returns, personal trading by the manager of the same or similar securities as those held by the fund, and shorting transactions.
LO 72.7 Items to include as part of the due diligence questionnaire include general information on the manager and the fund, execution and trading, service providers, third-party research policy, compliance processes, existence and degree of involvement of in-house legal counsel, existence of anti-money laundering policy and procedures, existence of information regarding disaster recovery and business continuity plans, insurance coverage, key person provisions, details of the investment process and portfolio construction, risk controls, and information contained in the fund's financial statements.
Concept Checkers
1. Based on historical evidence, which of the following factors is least likely to result in the eventual failure of a hedge fund?
A. Excessive controls in place.
B. Taking on more systematic risk.
C. Making decisions in a committee setting.
D. Materially misstated financial statements.
2. In performing due diligence on a potential investment manager, which of the following factors is the least important for the investor to consider?
A. Risk controls.
B. Business model.
C. Past performance.
D. Investment process.
3. Which of the following items is least likely to be included as requested information on a due diligence questionnaire?
A. Insurance coverage.
B. Returns attribution analysis.
C. Disaster recovery procedures.
D. Anti-money laundering policy.
4. Which of the following statements regarding the assessment of a fund's risk management process is correct?
A. The periodic valuation of a fund's securities is best performed by the fund manager.
B. The existence of written policies and procedures for internal controls is useful in measuring and monitoring risk.
C. The risk reports received by investors are preferably prepared by a third-party risk provider instead of by the fund itself.
D. The key requirement for information technology resources used to quantify the risks is that they measure items consistently.
5. Lisa Tahara, FRM, is considering an institutional investment in a hedge fund that has experienced volatile and generally positive returns in the past. Which of the following considerations about the fund's track record is least relevant for consideration in her investment decision?
A. Size of investment assets.
B. Absolute level of past returns.
C. Verification of returns by a third party.
D. Employment continuity of the investment team.
Concept Checker Answers
1. B If a fund takes on more systematic risk (i.e., regular market risk), it is less likely to result in a failure unless there is a significant market downturn. Taking on more unsystematic risk, however, is more likely to result in a failure. Excessive controls to reduce operational risk may be a good idea but may also result in excessive expenses and insufficient returns, thereby leading to a possible failure of the fund.
In a committee-style decision-making process, there may be a dominant member who sways the decision and/or members who are afraid to voice any valid concerns. Materially misstated financial statements are a form of accounting fraud, which significantly increases the risk of the eventual failure of a fund.
2. C Investors should assess potential managers and their investment strategies with an objective and unbiased mind. They should not be unduly concerned with a manager's past successes given that past performance is not always indicative of future performance. Risk controls, the business model, and the investment process are all fundamental parts of the due diligence process.
3. B A returns attribution analysis could be performed to determine how a funds returns were generated. Return attributions are not generally part of a due diligence questionnaire but such an analysis could subsequently be performed based on some of the information received from the questionnaire. The other items (insurance coverage, disaster recovery procedures, and anti-money laundering policy) are all standard items that would be found in most, if not all, due diligence questionnaires.
4. D It is very important for the information technology resources used to quantify risks to measure items consistently. Securities valuation is an important and potentially subjective task; therefore, independence and objectivity are critical. Policies and procedures tend to be general and only demonstrate the intention to have a strong control environment. Their existence alone provides little assurance that they are properly measuring and monitoring risk. In general, the reporting of risk measures is a more objective task and, as a result, there is little or no preference for the reporting to be done internally or externally.
5. B The absolute level of past returns is least relevant here given the volatile returns in the past. Also, past returns are not an assurance of similar returns in the future. The relative level of returns is more important than the absolute level. Verification of returns by a third party provides assurance that the return calculations were computed fairly and accurately by the fund. It is relevant to ascertain whether most or all of the staff on the investment team that generated the past results are still currently employed by the fund. It provides some (but not absolute) assurance that similar returns may be generated in the future.
The following is a review of the Current Issues in Financial Markets principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
The New Era of Expected Credit Loss Provisioning
Topic 73
Exam Focus
This topic looks at how the accounting rules are changing to require banks and financial institutions to account for loans using expected credit losses (ECL) from the time of origination rather than waiting for specific events that suggest an ensuing high probability of losses. For the exam, understand the possible interaction between earlier provisions for loans and the impact on lending. Also, be able to compare and contrast the International Accounting Standards Board (IASB) and the U.S. Financial Accounting Standards Board (FASB).
Provision for Expected Credit Losses

LO 72.6: Explain how a fund's business model risk and its fraud risk can be assessed.
In addition to the previous due diligence, potential investors need to closely examine the fund to ensure that the risks associated with its business model and potential fraud are not excessive.
Business Model Risk
Evaluating business model risk requires assessing whether managers know how to operate the business as well as generate high returns. Typical risks, potentially leading to failure and closure of the fund, include a lack of cash and working capital, a lack of a succession plan, and excessive redemptions in a short period of time.
A fund's business model risk can be assessed by performing the following tasks:
• Examining the nature of the revenues and expenses. For example, are revenue items stable, recurring, or one-time? Can costs be reduced, or are they increasing uncontrollably?
• Calculating the percentage of revenues derived from variable incentive or performance fees (that may not materialize in market downturns).
• Assessing the significance of the gap between management fees (revenue) and operating expenses.
• Considering the sufficiency of the amount of working capital (especially cash) in place to cover revenue shortfalls and/or expense overages for a reasonable period of time.
• Determining how frequently budgets are created and for what period of time.
• Determining the fund's breakeven points in terms of assets under management and required performance level, and comparing those amounts to current (actual) and future (projected) amounts.
• Ascertaining if there is sufficient personnel or capacity to increase the fund's investment asset base.
• Ascertaining the existence of key person insurance on relevant individuals and the existence of a succession plan.
Fraud Risk
Fraud risk can always exist even though extensive due diligence has been performed on the manager and fund prior to investing. A funds fraud risk can be assessed by determining the existence of the following factors:
• Frequent related-party transactions, including trading through a broker or using a valuator who is a related party.
• Frequent instances of illiquidity, including significant concentrations of illiquid investments (especially those that are valued by the manager only).
• Frequent litigation as a defendant, especially regarding claims of fraud.
• Unreasonably high (stated) investment returns.
• Frequent personal trading by the manager of the same or similar securities as those held by the fund.
• Frequent shorting transactions.
Fraud risk may be mitigated by performing the following actions:
• Check the SEC website for any prior regulatory infractions.
• Check court records for any prior litigation and bankruptcy records for examples of financial irresponsibility.
• Inquire with service providers for assurance over their competence and independence from the manager.
• Perform extensive background checks on the manager.
Due Diligence Questionnaire

LO 72.5: Explain how due diligence can be performed on a fund's operational environment.
Investors should focus on several key areas when performing operational due diligence on a fund. The focus areas are internal control assessment, documents and disclosure, and service provider evaluation.
Internal Control Assessment
A starting point in due diligence is examining the qualifications and attitudes of the personnel. For instance, does the CEO believe in controls and compliance with the rules? An analyst must also assess whether the internal control staff have sufficient technical and work experience to perform their compliance duties properly. Have they been properly trained and do they continue to expand their skills in compliance? Some assurance may be required regarding whether the back and middle office managers are sufficiently experienced in performing supervisory duties. Finally, background checks on critical internal control staff members might be required.
Examining the funds policies and procedures may also be useful. Related documents may cover areas such as trading, derivatives usage, and transaction processing. One drawback is that these documents tend to be general and only demonstrate the intention to have a strong control environment. In other words, merely reading the documents provides little assurance that the policies and procedures are actually being followed or are effective. It is
usually a good sign if a fund has been proactive and obtained an audit report and opinion on the effectiveness of its controls. If this report is available, it should be reviewed.
The due diligence process should include an examination of the in-house or outsourced compliance system that is in place. Examples of specific items to consider include the code of ethics (if one exists) and any restrictions on employee trading and related-party transactions.
There should be an investigation into how the fund deals with counterparty risk arising from OTC derivatives and other counterparties. Is such risk mitigated by dealing with more than one counterparty? Are the counterparties monitored for risk on a daily basis?
Finally, there should be an assessment as to the effectiveness of corporate governance. Is it pervasive throughout the organization? Are examples of internal control breaches followed up with appropriate actions to remedy and prevent future recurrence?
Documents and Disclosure
As part of the due diligence process, investors must confirm with the fund's legal counsel its involvement in preparing the original version of the fund documents as well as any subsequent revisions. The investor should also confirm whether the law firm remains the fund's legal counsel. A physical check of the documents should be made to look for any changes made after the date indicated on the documents.
The investor should corroborate the terms of the offering memorandum by examining other documents such as the Form ADV, subscription agreement, and investment management agreement. Consistency is important here. Terms relating to fees, redemption rights, liquidity, and lockups should be examined closely and clarified with the manager if required.
Conflicts of interest that are disclosed in the offering memorandum should be scrutinized carefully. Lack of clarity in the disclosure may be a red flag and warrant further discussion with the manager and/or require independent information.
Similarly, lack of clarity or sufficiency in the disclosure of risks may warrant further investigation. The discussion of very general or irrelevant risk factors may be cause for concern.
The focus of any due diligence should be on the manager. As a starting point, the potential investor should determine the extent of the manager's authority. Are the provisions very broad (potentially more risky) or quite specific? Is the manager subject to limitations on the amount of leverage employed or on the percentage of the fund invested in specific securities, sectors, or industries? Can the manager be indemnified for his actions outside of fraud, gross negligence, or malicious intent? Additionally, there should be a consideration of the manager's reporting duties to investors (e.g., audited financial statements, disclosure of the tax treatment of the fund's income and transactions).
In analyzing the financial statements, the investor should begin by ensuring the audit opinion is unqualified (i.e., the auditor believes the financial statements contain no material misstatements). The balance sheet and income statement should be examined for
consistency with the fund's investment strategy (e.g., a high leverage fund should have high interest expense on the income statement and high liabilities on the balance sheet). Any inconsistencies should be discussed with the manager on a timely basis. In addition, the footnotes (which are also audited) should be examined carefully since they provide more detailed information on key items (e.g., contingent liabilities, related-party transactions) than the corresponding financial statements.
Fees paid to the manager by the fund should be scrutinized and recalculated. They should be corroborated with the offering memorandum. Specifically, there should be a check of any incentive fees paid in loss years.
Finally, there should be a check for the level of net contributions to the fund by the general partner. Any fund withdrawals should be questioned.
Service Provider Evaluation
Third-party service providers may be hired by a fund for trade execution, information technology, valuation, verification, and asset safeguarding purposes.
A starting point for assessing the actual service providers is to examine the internal control letters issued by its auditors and its audited financial statements. Further due diligence could be performed through in-person discussions regarding the service providers role.
Business Model and Fraud Risk