Jul 19

The Art of Control (part 3 of 3)

The 3 Elements of Integrated Control

1. Completeness
The first, most obvious, and most commonly used control area is to check whether all the required data for a report is present.

Primary and secondary data dependencies
Identifying the main data required by the report is a key step in determining completeness. Primary data dependencies are normally relatively straightforward to check for missing values. Secondary data dependencies are a little more complicated, especially when that data does not live in the system you are reporting from. As an example, take the valuation of a portfolio: on a primary level we receive the net asset value, while on a secondary level this is derived from the values of the individual positions of that portfolio. If a price is missing or a cash flow is not booked, then we have complete but incorrect data. In such cases, it is essential that the secondary-level data is available to check, or that it clearly indicates an error which can be caught by a quality control process.
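Below is a minimal sketch of such a secondary-level check, reconciling a reported net asset value against the sum of position values. The Position class, field names, and tolerance are illustrative assumptions, not an actual system's API.

```python
# Hypothetical sketch: reconcile a portfolio's reported net asset value
# (primary data) against the sum of its position values (secondary data).
from dataclasses import dataclass

@dataclass
class Position:
    instrument: str
    market_value: float | None  # None signals a missing price upstream

def check_nav_against_positions(nav: float, positions: list[Position],
                                tolerance: float = 0.01) -> list[str]:
    """Return a list of control findings; an empty list means the check passed."""
    findings = []
    missing = [p.instrument for p in positions if p.market_value is None]
    if missing:
        findings.append(f"missing market values: {missing}")
    total = sum(p.market_value or 0.0 for p in positions)
    if not missing and abs(total - nav) > tolerance:
        findings.append(f"NAV {nav} differs from position total {total:.2f}")
    return findings
```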

Static and Reference data

Probably the largest cause of the "small inconsequential change" phenomenon. Changes to static data can cause reporting queries to lose significant chunks of results. Similarly, undefined reference data, for example a portfolio's reporting currency, can wreak havoc on a report. All static and reference data requirements should be validated to ensure report completeness.
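As a sketch, a pre-run validation of reference data could look like the following; the required field names and the portfolio dictionary shape are assumptions for illustration.

```python
# Hypothetical sketch: verify that required reference data is defined
# before the report runs. Field names are illustrative assumptions.
REQUIRED_FIELDS = ("reporting_currency", "benchmark", "inception_date")

def check_reference_data(portfolio: dict) -> list[str]:
    """Flag any required reference field that is missing or empty."""
    return [
        f"reference field '{field}' not defined for {portfolio.get('id', '?')}"
        for field in REQUIRED_FIELDS
        if not portfolio.get(field)
    ]

print(check_reference_data({"id": "P1", "benchmark": "MSCI World"}))
# -> ["reference field 'reporting_currency' not defined for P1",
#     "reference field 'inception_date' not defined for P1"]
```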

Gaps

Not all missing data is readily identifiable, as data frequencies may be irregular. Where the frequencies are known, it should be possible to check a data set for any gaps.
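A gap check for a known monthly frequency might look like this sketch, using only the standard library; the (year, month) representation is an assumption.

```python
# Hypothetical sketch: detect gaps in a series with a known monthly frequency.
from datetime import date

def month_gaps(observed: set[tuple[int, int]],
               start: date, end: date) -> list[tuple[int, int]]:
    """Return the (year, month) pairs expected between start and end but not observed."""
    gaps = []
    y, m = start.year, start.month
    while (y, m) <= (end.year, end.month):
        if (y, m) not in observed:
            gaps.append((y, m))
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    return gaps

print(month_gaps({(2024, 1), (2024, 2), (2024, 4)},
                 date(2024, 1, 1), date(2024, 4, 30)))  # -> [(2024, 3)]
```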

2. Quality

The second area, which is less often applied, is to validate the quality level of the data behind the report.

Readiness of data
Integrating the control into the report benefits from also being bound to the main control process. Indicators that let the report know whether the base data is available and has been validated prevent people from having to analyze problems only to find out that they simply ran the report too early.
Additionally, this is an excellent way of flagging a report as a "first cut" or "early indications" report, and of making sure that this information remains clearly identifiable to anyone the report is distributed to.
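A readiness gate might be as simple as the following sketch; the validation-status store and banner wording are assumptions.

```python
# Hypothetical sketch: derive a report banner from upstream readiness flags
# maintained by the main control process.
def report_banner(validation_status: dict[str, bool]) -> str:
    """Label the report as final or first cut based on source-data validation flags."""
    pending = [src for src, ok in validation_status.items() if not ok]
    if not pending:
        return "FINAL - all source data validated"
    return "FIRST CUT / EARLY INDICATIONS - pending validation: " + ", ".join(pending)

print(report_banner({"prices": True, "cash_flows": False, "fx_rates": True}))
# -> FIRST CUT / EARLY INDICATIONS - pending validation: cash_flows
```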

Anomalies

Checking for data anomalies is very report and system specific. Where possible, a trend should be established and a plausibility check made against that trend. This can be, for example, a portfolio return that lies beyond a certain deviation point when compared to the average return of similarly managed portfolios, an instrument return that deviates strongly from its associated benchmark (using different boundaries for monthly, quarterly and yearly returns), or unexpected volatility in a series of currency rates.
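As a sketch of such a plausibility check, the following flags portfolio returns beyond a deviation threshold from the peer average; the three-sigma boundary is an illustrative assumption.

```python
# Hypothetical sketch: flag returns that deviate too far from the mean
# return of similarly managed portfolios.
from statistics import mean, stdev

def flag_outlier_returns(returns: dict[str, float],
                         max_sigma: float = 3.0) -> list[str]:
    """Return the portfolios whose return lies beyond max_sigma deviations from the peer mean."""
    values = list(returns.values())
    if len(values) < 3:
        return []  # too few peers to establish a trend
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # identical returns, nothing to flag
    return [pid for pid, r in returns.items() if abs(r - mu) > max_sigma * sigma]
```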

Duplication
A great way to mess up any report or query is to add a duplicate record. In most cases this creates the lovely effect of doubling all values. Such problems are usually easy to spot on a report but can be very troublesome to locate in the base data. More subtle duplications may cause only minor shifts and more easily go unnoticed.
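Locating duplicates in the base data can be sketched as a count over the business key, as below; the key fields and record shape are assumptions.

```python
# Hypothetical sketch: find records that share the same business key
# before they double values on the report.
from collections import Counter

def find_duplicates(records: list[dict], key_fields: tuple[str, ...]) -> list[tuple]:
    """Return business keys that occur more than once in the record set."""
    counts = Counter(tuple(rec[f] for f in key_fields) for rec in records)
    return [key for key, n in counts.items() if n > 1]

rows = [{"portfolio": "P1", "date": "2024-06-30", "value": 100.0},
        {"portfolio": "P1", "date": "2024-06-30", "value": 100.0}]
print(find_duplicates(rows, ("portfolio", "date")))  # -> [('P1', '2024-06-30')]
```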
Neutralizing known inconsistencies

This is an essential part of the control mechanism; without it, everything falls apart. The simple fact is that many of the discrepancies found by the above controls may simply reflect reality: they are in fact correct, or nothing can be done about them. The consequence is a report with an ever-growing list of errors, which people start ignoring, so that new problems get lost amongst the known ones. The ability to register known and verified problems, preferably with a comment explaining the issue, allows you to eliminate repetitive errors from the reports or expand upon them. This means that when someone sees an error on a report, they know it is something that has not already been analyzed.
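A register of known issues could be sketched as follows; the key structure and comments are purely illustrative.

```python
# Hypothetical sketch: annotate findings that match a register of known,
# verified issues so only genuinely new errors stand out.
KNOWN_ISSUES = {
    ("P7", "missing benchmark"): "Benchmark discontinued by vendor; replacement pending.",
}

def triage(findings: list[tuple[str, str]]) -> list[str]:
    """Mark each (portfolio, issue) finding as KNOWN (with its comment) or NEW."""
    lines = []
    for portfolio, issue in findings:
        comment = KNOWN_ISSUES.get((portfolio, issue))
        if comment:
            lines.append(f"KNOWN  {portfolio}: {issue} ({comment})")
        else:
            lines.append(f"NEW    {portfolio}: {issue}")
    return lines

print("\n".join(triage([("P7", "missing benchmark"), ("P2", "stale price")])))
```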

3. Reconciliation

Probably the most neglected, but most powerful, control mechanism that truly comes into its own as an integrated control.

It's all about cross-checking and is the best way to check consistency within a report.

Look for values that should sum up to another value that is independently calculated.

Reconciliation is a very report-specific area, but when used it can be a very strong validation mechanism. Identifying related values on a report provides assurance that all of these values are properly present. Sometimes simply adding some independently retrieved control figures can confirm the quality of the report.
For example, in a contribution report we expect the sum of the contributions to correspond to the portfolio return. An attribution report (depending on method) could compare the sum of the effects to the relative return. A portfolio report could compare the sum of the net asset values to the total assets under management of the firm.
The more connections you can determine, the better the validation of the report will be. As an added bonus, most of the additional data needed to check the values is summary data and is quick to retrieve, while the detail data is already present in the report itself.
While your other controls check the quality of the underlying data, this control checks whether the data is properly handled within the report. It is an ideal method for protecting against queries that go wrong due to configuration and reference data changes, as well as against calculation errors.
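The contribution example above can be sketched as a simple tolerance check; the instrument names, figures, and tolerance are illustrative.

```python
# Hypothetical sketch: reconcile the detail rows of a contribution report
# against an independently retrieved portfolio return.
def reconcile_contributions(contributions: dict[str, float],
                            portfolio_return: float,
                            tolerance: float = 1e-4) -> str:
    """Compare the sum of instrument contributions with the portfolio return."""
    total = sum(contributions.values())
    if abs(total - portfolio_return) <= tolerance:
        return f"OK: contributions sum to {total:.4%}, matching the portfolio return"
    return f"BREAK: contributions sum to {total:.4%} vs portfolio return {portfolio_return:.4%}"

print(reconcile_contributions({"Equity A": 0.012, "Bond B": -0.003, "Cash": 0.0005}, 0.0095))
# -> OK: contributions sum to 0.9500%, matching the portfolio return
```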

Conclusion

With ever increasing pressure on the time to market of reports and the increasing flexibility of reporting systems, there is a proportional increase in the risk of control process breaks. A few simple integrated controls can be implemented to support the standard control process and mitigate the regulatory and reputational risks. With integrated controls you can easily block reports from being distributed to customers (internal or external) in case of erroneous data or provide internal first-cut reports safe in the knowledge that the quality level of the data is accurately represented.