
Please, Sir, I Want Some More

Mike Jacka, CIA, CPA, CPCU, CLU

A group named Farnam Street has a podcast called “The Knowledge Project.” Check them and their podcast out; it will be worth your time.

In the second part of a recent episode titled “Winning at the Great Game,” Adam Robinson — author, educator, entrepreneur, and hedge fund advisor — made these comments related to critical thinking:

Human beings have a limited processing ability ... We can’t maintain a whole lot of information in our head at any one point. Because of that limited processing ability we have a hard time with too much information. So we like to think that the more information we get the “better informed” we are, and will make better decisions. But that’s not true. A seminal study was done by a psychologist named Paul Slovic back in 1974. Paul Slovic gets eight horse handicappers into a room and he says, “I want to see how good you guys are.” He says, “We’re going to spend today handicapping horse races.” Which is to say, predicting the winner of a horse race. And these had been races that had been run over the last few decades that Slovic had gotten the stats on. And he deleted the names of all the horses. Because if you knew the name of the horse it would give you an edge. So all you saw was numbers. That’s all you saw. So he said, “We’re going to handicap 40 horse races, and we’re going to do so in four rounds, 10 races each. And in the first round I’m going to give each of you horse bettors, handicappers, any five pieces of information you want.”

So you might want the weight of the jockey but that guy next to you, the other handicapper, he doesn’t really care about the weight of the jockey. He wants some other variables. So each of you, whatever five pieces you wanted and each of the horse handicappers wanted, they got. And at the end of the first round with five pieces of information, they were 17% accurate (and 19% confident). Which is pretty good. [T]here were 10 horses in every race. So we would expect 10% accuracy just blind guessing. Just pick one in 10, you got a 10% chance of getting the horse right. So if you’re betting 19% you’ve almost doubled your results. That’s pretty good. So almost identical confidence and accuracy with five pieces of information. Round two, they were given 10 pieces of information, then 20, and in round four they had 40 pieces of information. And there were 10 races. So this was statistically valid results. Their accuracy was still only 17% with 40 pieces of information, but their confidence almost doubled to 31%. It went from 19% to 31%. So they are now almost twice as confident as they ought to be. ... All the new information did was make them more confident in a decision they already made.
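The arithmetic behind those figures is worth making explicit. A minimal sketch, using only the numbers quoted above (the quote reports accuracy and confidence for the 5-piece and 40-piece rounds; the middle rounds are not given figures):

```python
# Figures as quoted: with 10 horses per race, blind guessing gives a
# 1-in-10 chance of picking the winner.
baseline = 1 / 10

# (pieces of information, accuracy, confidence) for the rounds reported
reported = [(5, 0.17, 0.19), (40, 0.17, 0.31)]

for info, acc, conf in reported:
    edge = acc / baseline  # how much better than chance the handicappers did
    print(f"{info} pieces of info: accuracy {acc:.0%} "
          f"({edge:.1f}x chance), confidence {conf:.0%}")
```

The point the numbers make: eight times as much information moved accuracy not at all (17% in both rounds), while confidence climbed from 19% to 31%.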

First, it feels like this explains my notoriously lousy results at the track. I can’t say exactly why, but I’m sure this somehow speaks to my ineptitude.

But, more to the point, this little snippet has a couple of takeaways for internal auditors.

Takeaway No. 1: We take a perverse pleasure in testing things to death. And the information age seems to have been specifically designed to give us a direct hit of that high we crave. Now we can literally test 100% of the data. Ah, what assurance, what perfection, what glory.

And yet, is it worth it? Will we have better information? Will we make better decisions? Will we understand more?

As the study quoted above indicates (and there are others out there that show similar results — so this isn’t a one-trick pony) there is a more-than-dandy chance we are wasting a lot of our time: time spent on increased testing, time spent double-checking to “just make sure,” and time spent wanting to verify everything with “just one more person.”

Not all our time testing and double-checking and verifying is wasted. But we need to take a serious look at the work we are actually doing. How much of that work is us just trying to be sure? How much of it actually proves something new? How much of it replicates what we already know? How much of it occurs because the internal auditor is afraid to make an inconsequential mistake, afraid to not be perfect, afraid that even the smallest chink in the armor will be seen as the reason to invalidate everything done in the past, present, and future?

Rather than spending our time on unending testing — trying to achieve 100% assurance — we should be spending our time understanding when enough is enough. And then using that newly discovered time to move on to something that is really important.

But there is a second takeaway, and it has to do with my blog post last week.

(Okay, face it. Some of you thought I’d forgotten my promise to continue talking about report writing, didn’t you? Face it. Some of you couldn’t care less, could you? Well, I’m going ahead, anyway.)

In that last blog post, I discussed how internal auditors were busy learning the tactics and craft of report writing, but were not focusing on what really makes reports successful — learning how to understand, react to, and work with other people.

My intent with this next blog post was to spend some time talking about emotional intelligence (EQ) and its role in report writing. In fact, my intent was to actually talk about how reports should be modified — tone, structure, anything — to meet the EQ needs of each individual client.

And my intent was to discuss how internal auditors need, above anything else, to be trained on EQ — to understand themselves, understand their clients, and understand how to bring the two together.

And as I started typing that particular blog post, I had one of those moments where I began to think that this was “Something Important” — a “BIG IDEA” that warranted investigation, research, and a grand investment of time.

(At this point, allow me to note that I have an ego roughly the size of Manitoba, so it is far from unusual for me to believe I have had a “Great Thought.” I mention this simply to provide you insight into the writer and, in reality, the less you know about this problem, the better.)

So I began my research and study and investigation and reading and cross-referencing and documenting and exploration and inquiries and reviews and burrowing and ferreting and plunging into the black hole where all “brilliant ideas” die in their own heat death. And, before I knew it, I was so submerged in information that I literally did not know where to turn. My mind was frozen, my thoughts awhirl with connected and unconnected data, and my blog post was a sick and dying obesity of useless information.

Let’s quote Robinson one more time. “We can’t maintain a whole lot of information in our head at any one point. Because of that limited processing ability we have a hard time with too much information.”

And there I sat with a whole lot of information and an inability to process it.

Yeah, we in internal audit love to test things to death. But, in all situations, don’t we also love to collect as much information as possible? How many interviews does it take to really understand the risks the organization faces? How much research does it take to really understand the changes facing an industry? How much background information does it take to really understand the way a process/department/organization works?

How much time do we waste trying to develop the big idea — in risk, in control, in processes, in objectives, in strategies, in anything — time that should be used looking into important ideas?

(Note: Big does not mean important. Note #2: Not all valuable ideas — in fact, few valuable ideas — are necessarily big ideas.)

We talk a lot about auditing smart. And this may well be the one area where we can start being the smartest: recognizing when enough is enough, and it is time to act, to start the audit, to start the test, to finish the test, to write the report, and to just get things done.

I still think there is something worth pursuing in the idea of “EQing Audit Reports” (patent pending, trademark applied for, copyright established, it’s mine and you can’t have it). But, if there is something to be found, it won’t be discovered in a fever-pitched, time-restricted, search-and-rescue approach to gathering and assimilating the information.

Any idea — anything we need to know — is best discovered as it comes to us. And once we are there — once we know what it is we need to know — we have to learn to quit looking for more.

Mike Jacka, CIA, CPA, CPCU, CLU

Co-founder and Chief Creative Pilot, Flying Pig Audit, Consulting, and Training Services (FPACTS), based in Phoenix.