
2021B01 - Audit technologies and auditor judgment (PhD project X. Li)


This study forms the basis of Study 1 of the PhD project of Xiaoxing Li at Vrije Universiteit Amsterdam (VU).

What?

In this proposal we posit that, in settings where analytic tools are very well calibrated (false positive rates are lower or hit rates are higher), expressing these rates to audit staff will improve their responses to fraud red flags identified by such tools and, in turn, their skeptical actions (vs. not expressing the rates). However, under conditions where analytic tools are effective but not as well calibrated (false positive rates are higher or hit rates are lower), we expect that the framing of the rate of calibration will affect auditor skepticism when an analytic tool identifies a fraud red flag. The costs associated with skepticism (e.g., budget overages), particularly in cases where skepticism does not identify a misstatement, can be a barrier to the application of skepticism. As such, framing the calibration of data analytic tools in terms of “false positive rates” could reduce skepticism, as it highlights the costs of skepticism. On the other hand, framing the calibration of analytic tools in terms of “hit rates” could improve skepticism, as it highlights the benefits of skepticism (e.g., identifying a misstatement).

We propose to examine the effects of conveying audit data analytic (ADA) calibration (explicit conveyance versus no conveyance) and the framing of such conveyance (hit rates versus false positive rates) on auditor skepticism.

Why?

The emergence of new ADA in the audit environment allows auditors to gain deeper insights into their clients’ data, but simultaneously creates unique challenges for auditors when exercising skepticism. Because data analytic approaches enable auditors to examine full populations (rather than samples) or to incorporate more diverse data into their testing, auditors are often faced with a larger number of anomalies or inconsistencies that must be investigated and evaluated. This creates a unique dilemma for auditors as they determine not only how best to utilize information from data analytic tools, but also how to manage the investigation of red flags identified by these tools.

One concern regarding the use of data analytics is the presence of false positives, or the extent to which these tools identify transactions or relationships as potential anomalies that, after further investigation, are determined to be reasonable, explained variations in the data. The frequency of false positives is likely to increase proportionately with the size and complexity of the data extracted from the population. Despite concerns over false positive rates, few studies have addressed the problems auditors face when processing the outliers identified by analytic tools. When false positive rates increase, auditors may be apt to ignore or dismiss red flags identified by analytic tools.

Knowledge dissemination:

FAR Literature Review - Involvement in the development of data analytics and auditors’ application of professional skepticism

FARview #24 with Xiaoxing Li

FAR Practice Note - Inheriting vs. Developing Data Analytic Tests and Auditors’ Professional Skepticism

FAR Working Paper 2023/09 - 21: Inheriting vs. Developing Data Analytic Tests and Auditors’ Professional Skepticism

  • Project Number
    2021B01
  • Research team
    Prof. dr. Anna Gold
    Dr. Joseph Brazel
    Dr. Tammie Schaefer
    Dr. Jennifer McCallen
    Xiaoxing Li, PhD student
  • Involved University
    Vrije Universiteit Amsterdam
  • Timeline
    09/20 - 12/24