Risk Insights Blog


Repeatable analytics - whose job is it?

Over the past few years, and again at a recent ISACA conference in Chicago, there have been lots of discussions regarding analytics strategies for internal audit teams.


Among the strategies, repeatable analytics (e.g. continuous controls monitoring or CCM) seems to be a fairly common theme. Is this the easier route or the appropriate one?
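To make "repeatable analytics" concrete: a minimal sketch of the kind of control test that CCM typically schedules and re-runs. The data shape and field names here are hypothetical, purely for illustration.

```python
from collections import defaultdict

def duplicate_payment_check(payments):
    """Flag payments sharing vendor, amount and date -- a classic
    repeatable control test that could be re-run on a schedule (CCM)."""
    seen = defaultdict(list)
    for p in payments:
        seen[(p["vendor"], p["amount"], p["date"])].append(p["id"])
    return [ids for ids in seen.values() if len(ids) > 1]

# Illustrative records only -- not real data.
payments = [
    {"id": 1, "vendor": "Acme", "amount": 500.0, "date": "2024-01-10"},
    {"id": 2, "vendor": "Acme", "amount": 500.0, "date": "2024-01-10"},
    {"id": 3, "vendor": "Beta", "amount": 75.0,  "date": "2024-01-11"},
]
print(duplicate_payment_check(payments))  # [[1, 2]]
```

The "repeatable" part is simply that the same test runs unchanged against each new batch of transactions, which is exactly why the question of who owns it (first line or third line) matters.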



Here are some questions we need to ask:

  1. What is the role of the third line? Is it to assume management responsibility?
  2. Do teams still use fairly rigid internal audit plans?
  3. Are repeatable analytics simply resurrecting CCM without calling it that?
  4. Do teams continue to focus on the traditional rules-based approaches (and tools)?


Without diving into explicit answers, perhaps we need to consider:

  1. One IIA position paper on the Three Lines of Defense (3LOD) states: "Operational management naturally serves as the first line of defense because controls are designed into systems and processes under their guidance. There should be adequate managerial and supervisory controls in place to ensure compliance and to highlight control breakdown, inadequate processes, and unexpected events."
  2. Leading IA teams, with the exception of those that have significant supervisory oversight duties (which in itself may be an issue, but that's for another day), have moved to more flexible, agile audit planning approaches (e.g. 3+9 or 6+6 rolling plans).
  3. The views regarding CCM and where it should reside vary, but the position paper excerpt above seems to cover this - "first line ... controls in place ... highlight control breakdown".
  4. While the rules-based approaches have served IA folk well, it is time to move on; some of the traditional IA analytics software vendors have not moved on, and are keeping us in the past with them. A Deloitte article on fighting fraud talks about "investigative efforts" that can be impeded by (amongst others) "an over-reliance on rules based testing".
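The limitation of rules-based testing can be shown in a few lines. Below is an illustrative contrast (not any vendor's actual method) between a fixed-threshold rule and a simple data-driven outlier test; the threshold and z-score values are assumptions chosen for the example.

```python
import statistics

def rules_based_flags(amounts, threshold=10_000):
    # Traditional rule: flag anything over a fixed threshold.
    return [a for a in amounts if a > threshold]

def outlier_flags(amounts, z=2.0):
    # Data-driven alternative: flag values unusually far from the mean.
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [a for a in amounts if stdev and abs(a - mean) / stdev > z]

# Hypothetical transaction amounts; 9,999 sits just under the rule.
amounts = [120, 135, 980, 110, 125, 140, 9_999, 130]
print(rules_based_flags(amounts))  # [] -- 9,999 slips under the fixed rule
print(outlier_flags(amounts))      # [9999] -- caught as a statistical outlier
```

The point is not that z-scores are the answer, but that a rule only catches what its author anticipated, which is the over-reliance the Deloitte piece warns about.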

A recent series of reports has highlighted the failure of IA teams to leverage analytics. Could the choices above be the reason those strategies are not working?


Some of those reports talk about repeatable analytics being a core component of the IA analytics strategy. Do you agree?


This article is part of the assurance analytics series.


Go to the Assurance Analytics Guide


Tags: Fin Services, Public Sector, Assurance / Audit Analytics