
On the Frontlines: Data-driven vs. Data-informed

Maosen Cai, CIA, CISA, CMA, FRM | Nov 10, 2021

Find the right balance between analytics approaches.

Data analytics can help uncover patterns and trends that internal auditors relying only on human judgment might have missed. However, there are pitfalls in being overly data-reliant and forgetting that data has limitations too. Auditors should seek a balance between being data-driven and data-informed.


Data is a powerful thing. The essential role that data analytics plays in today's audit profession cannot be emphasized enough, especially in light of the digital transformation taking place across industries. Many auditors use data in one of two ways. "Data-driven" internal auditors make use of abundant data sets and analytics techniques to reach unexpected and valuable audit insights. Meanwhile, "data-informed" internal auditors build on professional wisdom, experience, and pragmatism to carry out audit engagements, using any data they come across as support for their rationale and judgment.

Be it data-driven or data-informed, neither approach is absolutely right or wrong. However, the extremes of either style can have negative consequences. Being purely data-informed can become an excuse to avoid the hard work of distilling data. On the other hand, purely data-driven auditors may be tempted to over-analyze everything that comes into view without stepping back to look at the big picture with human judgment.

Auditors should therefore seek a balance between the two extremes. In particular, there are common pitfalls that internal auditors should take care to avoid if they are to reap the full potential of data analytics.

Focusing Only on the Known Unknowns

When planning an audit engagement, auditors can often easily compile a list of risks and concerns based on past understanding and the information at hand. From there, auditors build a series of analytics procedures and metrics to either confirm or dispel their skepticism. This approach of "finding the known unknowns" is undoubtedly fundamental to developing an audit analytics program. However, auditors who rely solely on it may miss what could prove to be their secret weapon: "finding the unknown unknowns," that is, identifying risks or anomalies not on the auditor's radar at the outset.

Admittedly, this is easier said than done. One approach is for auditors to conduct an exploratory analysis of any accessible key business metrics or datasets related to the audit scope before finalizing the audit plan. Ideally, this aligns the facts and numbers with auditor assumptions and creates opportunities to fine-tune the audit plan with more pertinent considerations.
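As a rough illustration of what such an exploratory pass might look like, the Python sketch below profiles a transaction extract before the audit plan is locked in. The file name and columns (posting_date, business_unit, amount) are hypothetical placeholders, and the checks shown are common starting points rather than a prescribed procedure.

    # Exploratory profiling of an in-scope extract (hypothetical file and columns).
    import pandas as pd

    df = pd.read_csv("transactions.csv", parse_dates=["posting_date"])

    # Overall profile: volumes, value ranges, obvious outliers.
    print(df.describe(include="all"))

    # Distribution checks that can surface "unknown unknowns":
    # unexpected categories, dormant units suddenly active, spikes by period.
    print(df["business_unit"].value_counts(dropna=False))
    print(df.groupby(df["posting_date"].dt.to_period("M"))["amount"].agg(["count", "sum"]))

    # Records outside the expected posting window become candidates for follow-up.
    out_of_window = df[(df["posting_date"] < "2021-01-01") | (df["posting_date"] > "2021-06-30")]
    print(f"{len(out_of_window)} records fall outside the expected period")

Anything surprising in a pass like this, such as an unexpectedly active business unit, is a candidate for a new line of inquiry when the plan is fine-tuned.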

Assuming the Data Is Clean and Organized

Garbage in, garbage out. Any analytics program is only as good as the data that feeds it, and auditors often have to compile data in disparate forms and patterns and from different sources. This exposes auditors to additional layers of data quality risk, both from bad data in its original form and from errors or omissions introduced during compilation. Jumping straight into analysis of such data without a sanity check could produce misleading audit results, ultimately putting the reputation of the audit department at risk.

Cleaning the incoming data can add to the upfront workload in any analytics journey. But the process of data cleaning can be a valuable exercise in itself, as the invalid datasets, once identified, may reveal important patterns for further investigation. As the saying goes, "the devil is in the details."
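For illustration only, the sketch below runs a handful of basic sanity checks before any analysis; the file name, column names, and checks are assumptions that would need to be adapted to the actual extract.

    # Basic data-quality checks before analysis (hypothetical column names).
    import pandas as pd

    df = pd.read_csv("transactions.csv", parse_dates=["posting_date"])

    checks = {
        # Completeness: key fields should not be missing.
        "missing_amount": df["amount"].isna().sum(),
        "missing_vendor": df["vendor_id"].isna().sum(),
        # Uniqueness: duplicate document numbers may indicate double extraction.
        "duplicate_documents": df.duplicated(subset=["document_no"]).sum(),
        # Validity: non-positive amounts where only positive values are expected.
        "non_positive_amounts": (df["amount"] <= 0).sum(),
        # Timeliness: postings dated in the future.
        "future_dated": (df["posting_date"] > pd.Timestamp.today()).sum(),
    }

    for name, count in checks.items():
        print(f"{name}: {count}")

    # Failing records are set aside rather than discarded: invalid data,
    # once identified, can itself be an audit lead worth investigating.
    suspects = df[df["amount"].isna() | df.duplicated(subset=["document_no"], keep=False)]
    suspects.to_csv("suspect_records.csv", index=False)

The point is less the specific checks than the habit: validate completeness, uniqueness, validity, and timeliness before trusting any downstream result.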

Confusing Correlation With Causation

With the right data and algorithms in place, auditors should be on track to identify patterns or anomalies in the data. However, they may be quick to believe that they have just discovered a "mine" of audit findings and chase it down that path. Until there is further evidence, auditors should bear in mind that these clues are merely correlated with a potential issue and may not tell the whole story or justify an audit finding.

In this case, a prudent auditor would stop and reflect on two underlying questions before taking further action (see the sketch after the list below):

  • Are there any other clues that would provide an opposing point of view?
  • What is not collected or apparent in the data? For example, is there data collection bias?
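To make the second question concrete, here is a small synthetic illustration (all numbers are made up): two metrics move together only because a third, unrecorded driver moves both, so the correlation alone would not justify a finding.

    # Toy example: correlation without causation, produced by a hidden driver.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    months = pd.period_range("2021-01", periods=24, freq="M")

    # Hidden driver: seasonal transaction volume (not in the observed dataset).
    volume = 1000 + 300 * np.sin(np.arange(24) / 12 * 2 * np.pi)

    # Both observed metrics depend on volume plus independent noise.
    overtime_hours = 0.05 * volume + rng.normal(0, 5, 24)
    exception_rate = 0.002 * volume + rng.normal(0, 0.1, 24)

    observed = pd.DataFrame({"overtime_hours": overtime_hours,
                             "exception_rate": exception_rate}, index=months)
    print(observed.corr())  # strong correlation, yet neither metric causes the other

    # Removing the shared driver leaves only independent noise,
    # so the apparent relationship largely disappears.
    print(np.corrcoef(overtime_hours - 0.05 * volume,
                      exception_rate - 0.002 * volume)[0, 1])

In practice the hidden driver is rarely known in advance, which is exactly why corroborating evidence is needed before a correlation becomes a finding.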

Data Reporting: Engaging in Information Overload

When preparing a report, it can be tempting for auditors to want to show the audience (and especially executives) everything from the analysis, as evidence of the hard work done and robustness of the process. It is an understandable mistake, but doing so actually forces the audience to repeat the laborious process that the auditor has gone through to reach the conclusion. Instead, auditors should ask themselves three questions before they proceed, relating to who, what, and how:

  • To whom are you communicating?
  • What do you want your audience to know or do?
  • How can you use data to help make your point?

All that being said, the only way auditors can successfully avoid these data analytics pitfalls is through practice. With experience, a balanced approach, and good judgment, data analytics can be an indispensable guide and a tool to help internal auditors bring more value to the profession.

Maosen Cai, CIA, CISA, CMA, FRM

Maosen Cai is audit analytics lead at a digital-only bank in Shenzhen, China.