LO 40.3: Describe the operational data governance process, including the use of scorecards in managing information risk.
Operational data governance refers to the collective set of rules and processes regarding data that allow an organization to have sufficient confidence in the quality of its data.
Specifically, a data governance program should exist that clarifies the roles and responsibilities in managing data quality. A data quality scorecard could be used to monitor the success of such a program.
In short, operational data governance aims to detect data errors early on and then set in motion the steps needed to deal with them on a timely basis. As a result, there should be minimal or no subsequent impact on the organization.
Data Quality Inspection vs. Data Validation
Data validation is a one-time step that reviews and assesses whether data conforms to defined business specifications. In contrast, data quality inspection is an ongoing set of steps aimed to:
1. reduce the number of errors to a tolerable level,
2. spot data flaws and make appropriate adjustments to allow data processing to be completed, and
3. solve the cause of the errors and flaws in a timely manner.
The goal of data quality inspection is to catch issues early on, before they have a substantial negative impact on business operations.
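The distinction can be made concrete with a short sketch. The code below is purely illustrative and is not part of the assigned reading: the field names, business rules, and 2% tolerance are all invented assumptions.

```python
from datetime import date

def validate_record(record: dict) -> list[str]:
    """One-time validation: check that a record conforms to defined
    business specifications. Returns a list of rule violations."""
    errors = []
    if not record.get("trade_id"):  # hypothetical required field
        errors.append("missing trade_id")
    if record.get("notional", 0) <= 0:
        errors.append("notional must be positive")
    if record.get("settle_date", date.min) < record.get("trade_date", date.min):
        errors.append("settle_date precedes trade_date")
    return errors

def inspect_batch(records: list[dict], tolerance: float = 0.02) -> bool:
    """Ongoing inspection: flag flawed records so processing can still
    complete, and check the error rate against a tolerable level."""
    flawed = [r for r in records if validate_record(r)]
    for r in flawed:
        r["needs_review"] = True  # adjustment so downstream processing continues
    error_rate = len(flawed) / len(records) if records else 0.0
    return error_rate <= tolerance  # False signals the need to escalate
```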
Data Quality Scorecard
A base-level metric is straightforward in that it is measured against clear data quality criteria. It is relatively easy to quantify whether the criteria are met and to arrive at a data quality score.
In contrast, a complex metric is a combined score that could be a weighted average of several different metrics (customized to the specific user(s)). Such a combined metric allows for a qualitative reporting of the impact of data quality on the organization. A data quality scorecard can report the metric in one of three ways: by issue, by business process, or by business impact.
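As a simple illustration of a complex metric, the following sketch computes a weighted average of three base-level scores. All scores and weights are invented for illustration.

```python
base_scores = {            # base-level metrics, each scored 0-100
    "accuracy": 96.0,      # against clear data quality criteria
    "completeness": 88.0,
    "consistency": 92.0,
}
# Weights customized to a specific user of the scorecard.
weights = {"accuracy": 0.5, "completeness": 0.3, "consistency": 0.2}

# Complex metric: a weighted average of the base-level metrics.
complex_score = sum(base_scores[m] * weights[m] for m in base_scores)
print(f"Complex data quality score: {complex_score:.1f}")  # 92.8
```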
Complex Metric Scorecard Viewpoints
Data quality issues view: Considers the impact of a specific data quality problem across multiple business processes. The scorecard shows a combined and summarized view of the impacts for each data problem. By going into more detail, one can obtain further information on the sources of data problems, which allows for prioritization in solving individual problems.
Business process view: For each business process, the scorecard has complex metrics that quantify the impact of each data quality problem. It allows one to determine exactly where in the business process the data problem originates, which assists in solving the problem efficiently.
Business impact view: The scorecard provides a high-level understanding of the risks embedded in data quality problems (i.e., a combined and summarized view). It considers the various data quality problems that occur across various business processes. By going into more detail, one can identify the business processes where the problems occur; an even more detailed examination reveals the specific problems within each business process.
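To see how one set of underlying measurements can support all three views, consider the hypothetical rollup below. The issues, processes, and impact scores are invented, and the aggregation is only a sketch of the idea.

```python
from collections import defaultdict

# Invented measurements: (data quality issue, business process, impact score).
measurements = [
    ("missing data",      "trade settlement", 7.0),
    ("missing data",      "product pricing",  4.5),
    ("duplicate records", "trade settlement", 3.0),
    ("duplicate records", "reporting",        5.5),
]

def rollup(key_index: int) -> dict:
    """Summarize impact scores by the chosen key (0 = issue, 1 = process)."""
    totals = defaultdict(float)
    for row in measurements:
        totals[row[key_index]] += row[2]
    return dict(totals)

by_issue = rollup(0)    # data quality issues view: impact per problem
by_process = rollup(1)  # business process view: impact per process
overall = sum(row[2] for row in measurements)  # business impact view summary
```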
Motivation
Business managers may wish to take advantage of an opportunity to assess the impacts of flawed data against pre-defined parameters of acceptable data quality. Such an assessment can be made with a data quality scorecard, with data measured against the benchmark (acceptable data quality). The scorecard, therefore, serves as a strong management technique if it can summarize important organizational information as well as provide warning signs to management when corrective actions are required.
Mechanics
Regardless of the preferred view, a data quality scorecard comprises a hierarchy of base-level and complex metrics that tie into different levels of accountability within the organization. With regard to metrics, the same measurement might be used in different contexts, which allows for different error tolerances and weights. Finally, scorecards can be customized to present varying levels of detail depending on the intended user(s).
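A minimal sketch of the "same measurement, different context" idea appears below; the tolerances and weights are invented assumptions.

```python
# One shared base-level measurement reused in two contexts, each with
# its own error tolerance and weight in that context's complex metric.
completeness = 0.97  # fraction of complete records (invented)

contexts = {
    # context name: (error tolerance, weight)
    "regulatory reporting": (0.99, 0.6),  # strict tolerance, heavy weight
    "internal analytics":   (0.95, 0.2),  # looser tolerance, light weight
}

for name, (tolerance, weight) in contexts.items():
    status = "OK" if completeness >= tolerance else "BREACH"
    print(f"{name}: score={completeness:.2f}, tolerance={tolerance:.2f}, "
          f"weight={weight} -> {status}")
```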
Key Concepts
LO 40.1
Data errors (e.g., missing data, inconsistent data, nonstandard formats), whether accidental or not, may lead to inconsistent reporting, incorrect product pricing, or failures in trade settlement.
LO 40.2
Key dimensions that characterize acceptable data include: accuracy, completeness, consistency, reasonableness, currency, and uniqueness.
LO 40.3
Operational data governance refers to the collective set of rules and processes regarding data that allow an organization to have sufficient confidence in the quality of its data.
Three different viewpoints regarding scorecards include: data quality issues view, business process view, and business impact view.
Data quality scorecards serve as a strong management technique if they are able to summarize important organizational information as well as provide warning signs to management when corrective actions are required.
Concept Checkers
1. Ryan Vail is a corporate manager who recently made a series of incorrect business decisions as a result of faulty data obtained internally. Which of the following negative business impacts best describes his incorrect decisions?
A. Compliance impact.
B. Confidence-based impact.
C. Financial impact.
D. Risk impact.

2. Data consistency is important to ensure that there are no clear conflicts in data values between data sets. Which of the following types of data consistency refers to consistency between one set of data values and another set of data values in different records?
A. Record level.
B. Temporal level.
C. Cross-record level.
D. Cross-temporal level.

3. Which of the following data issues is least likely to increase risk for an organization?
A. Duplicate records.
B. Data normalization.
C. Nonstandard formats.
D. Data transformations.

4. Which of the following statements regarding data quality inspection is correct? It attempts to:
A. catch errors early in the process.
B. reduce the number of errors to zero.
C. solve the cause of any errors immediately.
D. review and assess whether data conforms with defined business specifications.

5. Which of the following viewpoints regarding data quality scorecards is best described as providing a high-level understanding of the risks embedded in data quality problems?
A. Business impact view.
B. Business process view.
C. Data quality issues view.
D. Data process issues view.
Concept Checker Answers
1. B An example of a confidence-based (negative) impact would be a manager who makes incorrect business decisions based on faulty data.
2. C Record level consistency is consistency between one set of data values and another set within the same record. Cross-record level consistency is consistency between one set of data values and another set in different records.
3. B Data normalization is a process to better organize data in order to minimize redundancy and dependency, so it is least likely to increase risk. All of the other data issues are likely to increase risk, especially complex data transformations.
4. A Data quality inspection is intended to catch issues early on before they have a substantial negative impact on business operations. The idea is to reduce the number of errors to a tolerable level, not necessarily to zero. In addition, it aims to solve the cause of the errors in a timely manner, not necessarily immediately.
5. A With the business impact view, the scorecard provides a high-level understanding of the risks embedded in data quality problems (i.e., a combined and summarized view). It considers various data quality problems that occur in various business processes.
The following is a review of the Operational and Integrated Risk Management principles designed to address the learning objectives set forth by GARP. This topic is also covered in:
OpRisk Data and Governance
Topic 41
Exam Focus
This topic discusses the seven level 1 categories of operational risk (OpRisk) events defined in Basel II and describes level 2 examples of operational risk events for each category. For the exam, understand how the collection and reporting of loss data, the risk control self-assessment (RCSA), the identification of key risk indicators (KRIs), and scenario analysis are all important elements of a firm's OpRisk process. Also, be familiar with the OpRisk profiles across various financial sectors, with emphasis on the highest frequency percentages and severity percentages. Finally, be prepared to describe the typical progression through four organizational risk designs for large firms.
Event-Driven Risk Categories