Regardless, the moral of the story is that it is good to recognize that sometimes there is no rule of thumb to follow and no go-to answer that everyone will accept.
Sound judgment is what we exercise when two or more of our highest values conflict and our rules of thumb no longer apply. We struggle to teach this to our children, we struggle to learn it ourselves as we enter our careers, and it surely will be one of the "final frontiers" as we try to teach machines to take on more complex work.
Consider Isaac Asimov's three laws of robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These laws are appealing and read as very generic "if/then" statements. In a way, they serve as rules of thumb for how to act, and in principle they seem to cover every situation. Yet even when robots try to comply with them, our futuristic epics still manage to produce scenarios where a robot must decide which of two humans to save, hurt one human to protect another, or even prevent a human from eating bacon because his blood pressure has been high recently.
If you try to develop "follow-up" rules to reconcile these exceptions and make the three laws work, note that you have just stepped away from the rule-of-thumb approach. While you are at it, it may also be fun to consider whether you would want those algorithms to change if one of the humans in those scenarios were yourself or a loved one.
The point is that our manuals and procedures cannot address every possible situation, because whoever writes them cannot foresee every possible scenario. For the sake of argument, imagine these manuals did include a perfect solution to each situation. Then the time spent thumbing through the book to identify the ideal response would conflict with our need to act quickly: to save a life, engage the next customer, contain a scandal, and so on.
The good news is that sound judgment develops over time, based on experience and on observing how others handle difficult situations. That is why Karen wants to talk to her manager: the manager may have a better perspective or understanding of the situation and find ways to quickly resolve whatever she needs to discuss. In other words, the manager may know what to do even when the situation is not covered in the manual.
That said, internal auditors know managers make mistakes. We find judgment lapses so frequently that we have some projects focused on fraud and some consulting engagements focused on process improvement. That is evidence of how difficult it is to achieve sound judgment.
Returning to the idea that sound judgment can be taught: If this is true, over time we may be able to teach it to machines. Over time, but not yet.
For now, instead of teaching machines, perhaps we can teach the next generation of auditors. To do this, we simply ask machines to highlight potential exceptions for the auditor to assess and decide how to respond. Then we pair the machine with our junior team members so that both slowly learn how we would handle the situations they encounter. In the process, we teach them to trust each other while learning what can be expected, what is not normal, and what should be raised to the manager.
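The "machine flags, auditor decides" pairing described above can be sketched in a few lines of code. This is only an illustration; the field names and thresholds below are hypothetical, not drawn from any real audit system:

```python
def flag_claim(claim):
    """Return the reasons a claim looks unusual, or an empty list if none.

    These rules are deliberately simple: they only surface potential
    exceptions. They do not decide what to do about them.
    """
    reasons = []
    if claim["amount"] > 50_000:  # hypothetical review threshold
        reasons.append("amount exceeds review threshold")
    if claim["days_to_file"] > 365:
        reasons.append("filed more than a year after the loss")
    if claim["payee"] == claim["adjuster"]:
        reasons.append("payee matches adjuster")
    return reasons


def triage(claims):
    """Split claims into routine items and items escalated to an auditor.

    Anything the rules flag goes to a human reviewer, who applies the
    judgment the rules cannot, and raises the hard cases to the manager.
    """
    routine, escalate = [], []
    for claim in claims:
        reasons = flag_claim(claim)
        (escalate if reasons else routine).append((claim, reasons))
    return routine, escalate
```

The design choice mirrors the article's argument: the machine's job is limited to highlighting exceptions, and every flagged item still lands in front of a person who decides how to respond.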
Until then, let's continue to enjoy our bacon!
Francisco Aristiguieta, CIA, is responsible for internal audit analytics at Citizens Property Insurance Corp. in Jacksonville, Fla.