As an assurance leader (internal audit / performance audit / risk assurance), you want to:
Provide assurance to the Board and to management
Help maximise value (efficiency, effectiveness, economy) and customer satisfaction
Ensure that compliance is maintained
As part of your overall assurance program, analytics can help achieve your objectives, allowing you to find both upside opportunities (e.g. recovering revenue leakage) and downside risks (e.g. control breakdowns).
Evaluating full data populations - going beyond sampling - vastly improves coverage and gives you higher confidence in your results.
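To make the full-population idea concrete, here is a minimal sketch in Python: a duplicate-payments check that examines every record rather than a sample. The invoice records and field layout are invented for illustration.

```python
from collections import defaultdict

# Hypothetical invoice records: (invoice_id, vendor, amount, date).
# In a real engagement this would be the full payments extract, not a sample.
invoices = [
    ("INV-001", "Acme Ltd", 1200.00, "2024-03-01"),
    ("INV-002", "Acme Ltd", 1200.00, "2024-03-01"),  # potential duplicate
    ("INV-003", "Beta Pty", 540.50, "2024-03-02"),
    ("INV-004", "Acme Ltd", 980.00, "2024-03-05"),
]

def find_duplicate_payments(records):
    """Group the full population by (vendor, amount, date) and flag
    any key that appears more than once - no sampling involved."""
    groups = defaultdict(list)
    for inv_id, vendor, amount, date in records:
        groups[(vendor, round(amount, 2), date)].append(inv_id)
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

# Flags INV-001 / INV-002 as a potential duplicate pair.
duplicates = find_duplicate_payments(invoices)
```

Because every record is evaluated, the result is a complete exception list rather than a sample-based estimate.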
You have implemented analytics, in some form.
But something is holding it back.
Something is just not quite working, and you know you can achieve more.
So first you need to find the problem(s).
Do any of the 5 problems below sound familiar?
Access to data - you can't get all the data, or you can't get it quickly enough
Low value - the analysis doesn't provide new insights
False positives - too many of them; results are overwhelmingly noisy, distracting your focus
Superficiality - the results are not deep enough to properly understand / refine the problems or to provide opportunities for improvement
Timing - the results are not available in time for reporting / concluding
How can you overcome these challenges?
Access to data
Agility and flexibility in approach: by adopting an iterative approach, you reduce the burden on management and get to a quicker initial result - a win-win. Start with a small set of data that can shape an initial set of results; discuss these within the audit team and build on them by adding more data if required and exploring in more depth; then, after further discussion (and perhaps debate), refine with additional data and/or further analysis.
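The iterative approach can be sketched in code. Below is a minimal Python illustration of two iterations: a first pass over a small extract with a simple rule, then a refined rule agreed after team discussion. All records, categories and thresholds are invented for the example.

```python
# Iteration 1: small extract (one month) with a simple agreed threshold,
# enough to shape the first discussion with the audit team.
def exceptions_over_threshold(records, threshold):
    """Initial pass: flag expense lines above a single threshold."""
    return [r for r in records if r["amount"] > threshold]

march = [
    {"id": 1, "amount": 150.0, "category": "travel"},
    {"id": 2, "amount": 4200.0, "category": "consulting"},
]
first_pass = exceptions_over_threshold(march, 1000)

# Iteration 2: after discussion, widen the extract and refine the rule
# to category-specific thresholds (hypothetical values).
thresholds = {"travel": 500, "consulting": 3000}

def refined_exceptions(records):
    """Refined pass: per-category thresholds, falling back to 1000."""
    return [r for r in records
            if r["amount"] > thresholds.get(r["category"], 1000)]

april = [
    {"id": 3, "amount": 650.0, "category": "travel"},
    {"id": 4, "amount": 2800.0, "category": "consulting"},
]
second_pass = refined_exceptions(march + april)
```

The refined rule catches a travel exception the blanket threshold missed, and drops a consulting line the team agreed was routine - exactly the kind of progressive refinement the iterative approach enables.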
Low value
Up-front identification of hypotheses: analytics test libraries are old hat. They can be useful in identifying some common tests for specific assurance objectives, but their value-add potential is limited. Ask your team to abandon this outdated approach - focus instead on your organisation's unique strategic objectives, and get the team together to brainstorm ideas.
False positives
There are techniques to deal with excessive false positives (e.g. supervised machine learning, as outlined in this previous article). Access to the right combination of analytics tools, as outlined in a recent article, can also help alleviate this problem.
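As one hedged illustration of the idea, the Python sketch below uses a deliberately simple stand-in for a trained model: it estimates each test rule's historical precision from labelled outcomes of past reviews, then uses those scores to triage new alerts so low-precision rules stop flooding the worklist. All rule names and labels are invented.

```python
from collections import Counter

# Hypothetical labelled history: (rule_name, was_true_positive),
# built from the audit team's dispositions of past alerts.
history = [
    ("duplicate_payment", True), ("duplicate_payment", True),
    ("duplicate_payment", False),
    ("weekend_posting", False), ("weekend_posting", False),
    ("weekend_posting", False), ("weekend_posting", True),
]

def precision_by_rule(labelled):
    """Estimate each rule's historical precision (true positives / total).
    A very simple stand-in for a supervised model."""
    hits, totals = Counter(), Counter()
    for rule, true_positive in labelled:
        totals[rule] += 1
        if true_positive:
            hits[rule] += 1
    return {rule: hits[rule] / totals[rule] for rule in totals}

def triage(alerts, scores, cutoff=0.5):
    """Rank alerts by score and keep only those above the cutoff;
    the rest go to a deprioritised review queue."""
    ranked = sorted(alerts, key=lambda a: scores.get(a, 0.0), reverse=True)
    return [a for a in ranked if scores.get(a, 0.0) >= cutoff]

scores = precision_by_rule(history)
worklist = triage(["weekend_posting", "duplicate_payment"], scores)
```

A real implementation would score individual alerts on their features (amount, vendor, timing) with a proper classifier, but the triage principle - let labelled history decide what reaches the auditor first - is the same.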
Superficiality
Validate assumptions and outcomes as early as possible. In some cases the agility and flexibility in approach outlined above can help overcome this challenge, given the iterative refinement and progressive levels of depth. Avoid over-emphasis on dashboards: visualisation tools (a.k.a. dashboarding) are important and necessary, but if they form the bulk of your "analytics" effort - without actual underlying data cleansing, blending and analysis - you may end up with superficial results.
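Here is a small Python sketch of the underlying cleansing and blending work that a dashboard alone would skip: normalise vendor names, then join a payments extract to a vendor master so the analysis runs on reconciled data rather than raw extracts. The vendor names and records are hypothetical.

```python
def clean_name(name):
    """Basic cleansing: trim whitespace, collapse case, drop punctuation
    variants so 'ACME Ltd.' and 'acme ltd' match."""
    return name.strip().lower().replace(".", "").replace(",", "")

# Hypothetical vendor master: cleansed name -> vendor ID.
vendor_master = {"acme ltd": "V001", "beta pty": "V002"}

payments = [
    {"vendor": " ACME Ltd. ", "amount": 1200.0},
    {"vendor": "Beta Pty", "amount": 540.5},
    {"vendor": "Gamma Inc", "amount": 75.0},  # not on the master - worth a look
]

def blend(payments, master):
    """Join payments to the vendor master on the cleansed name.
    Unmatched records are surfaced rather than silently dropped."""
    matched, unmatched = [], []
    for p in payments:
        key = clean_name(p["vendor"])
        if key in master:
            matched.append({**p, "vendor_id": master[key]})
        else:
            unmatched.append(p)
    return matched, unmatched

matched, unmatched = blend(payments, vendor_master)
```

The unmatched list - payments to vendors absent from the master - is itself an audit finding that no amount of dashboarding over the raw extract would surface.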
Timing
Try to plan the analytics work well in advance of the audit - generally three months before it is due to commence. In some cases, the agility and flexibility in approach outlined above can also help: you won't be waiting until the end for results. In other cases you may need to accept the outcome, particularly if the work was highly experimental; if so, carefully outline the lessons learnt and how to avoid a repeat, then feed that into the planning for future projects.
What else is your team struggling with? And how are you removing those blockers?