On the Frontlines: Tech and Privacy Rights

Blogs | The Institute of Internal Auditors | Mar 03, 2023

Technologies like artificial intelligence and machine learning are changing how businesses operate, helping them run more efficiently. However, as digitization accelerates, organizations also need to be aware of the privacy risks associated with these technologies.

Internal Auditor recently sat down with David Helberg, MBA, CIA, CFE, CRMA, director of internal audit and corporate ethics and privacy officer at Cameco and a member of The IIA’s North American Board, to learn more about the implications of AI and machine learning for internal auditors.

What are the main ways that the rise of artificial intelligence and machine learning has changed the business landscape?

As an increasing number of organizations embrace new digital business models, the amount of data being generated has soared. These new digital footprints, coupled with the exponential growth in data generation, have exposed organizations to new and emerging threats and vulnerabilities, including increased cybersecurity risks and privacy breaches. Additionally, the recent growth in investment in AI, machine learning, and robotic process automation (RPA) has shone a spotlight on how privacy, security, and trust are more interconnected than ever in this new digital world and must be at the forefront of decision making when developing and implementing these new technologies.

Are there any common blind spots organizations have when it comes to the use of data and privacy laws?

When it comes to data protection and privacy laws and regulations, the most common blind spot I see among organizations is a lack of overall privacy preparedness and awareness of all applicable laws and regulations. There’s an ever-growing patchwork of new data protection and privacy laws and regulations that organizations need to navigate carefully, whether locally, nationally, or internationally. It can be challenging for organizations to understand not only which laws and regulations currently apply to them but also which ones they’ll need to comply with in the near future.

The internal audit function can provide value to its organization by conducting a comprehensive audit of the organization’s data protection and privacy posture amidst the changing compliance landscape.

What are some of the ethical dilemmas brought about by artificial intelligence and machine learning? How can internal auditors help manage some of these dilemmas?

Ethical dilemmas are often brought about by the data collected and how it’s being analyzed. For example, an RPA bot may be deployed to prevent fraud within an organization, but it could also collect unrelated data on an employee, potentially in violation of organization policy. Unauthorized disclosure of that information could also breach privacy laws and regulations.

Internal auditors can help mitigate these issues by working with their organization’s privacy office, IT team, HR team, and legal function to establish a sound foundation for how bots should be managed. It’s important that internal auditors complete privacy risk assessments for the activities they are undertaking and stay on top of new and emerging data protection and privacy laws and regulations.
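
To make the idea of managing what a bot collects more concrete, here is a minimal, hypothetical sketch (in Python, with invented field names) of the kind of control an internal auditor might look for: an allow-list filter that restricts a fraud-prevention bot's records to its documented purpose and logs any dropped fields for later review. It illustrates the general concept only and is not a prescribed implementation or part of any specific RPA product.

```python
# Hypothetical illustration: limit an RPA bot's output to fields approved
# for its documented fraud-prevention purpose, and log anything dropped
# so the event can be reviewed in a later privacy or audit assessment.
# All field names here are invented for the example.

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("rpa_privacy_filter")

# Fields the bot is authorized to collect under the assumed policy.
APPROVED_FIELDS = {"transaction_id", "amount", "vendor_id", "approval_status"}


def filter_record(record: dict) -> dict:
    """Return only approved fields; log the names of any dropped fields."""
    dropped = set(record) - APPROVED_FIELDS
    if dropped:
        # Log field names only, never their values, to avoid a secondary disclosure.
        log.warning("Dropped unapproved fields: %s", sorted(dropped))
    return {k: v for k, v in record.items() if k in APPROVED_FIELDS}


if __name__ == "__main__":
    raw = {
        "transaction_id": "T-1001",
        "amount": 2500.00,
        "vendor_id": "V-77",
        "approval_status": "pending",
        "employee_home_address": "123 Example St.",  # unrelated personal data
    }
    print(filter_record(raw))
```

In practice, the approved field list would come from the organization's privacy office and data-governance policy rather than being hard-coded, and the audit trail of dropped fields would feed into the privacy risk assessments described above.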

What are some of the main implications that big data has for conducting internal audit and anti-fraud procedures?

The first order of business for internal auditors is making sure that their own internal audit house is in order by complying with all applicable laws and regulations, especially data protection and privacy laws and regulations. Importantly, with the increase in remote work and the use of cloud-based systems, we need to be aware of data that may be transferred internationally and whether any national blocking statutes apply. While an internal auditor’s office may be based in the United States, they could also be working with internal audit team members in Europe, Asia, Australia, and Canada, for example, so it is imperative that they work collectively as a team to ensure that their sharing of information follows all applicable laws and regulations.

Why is it important for internal auditing teams to stay ahead of the curve when it comes to technological advancements?

When it comes to the use of artificial intelligence, RPA, and machine learning within the internal audit profession, I believe we should embrace these technological changes because they are here to stay. These technologies have limitless potential to push our profession forward at a rapid pace. However, in using them, we also need to recognize the increased privacy risks they bring and ensure our organizations are performing due diligence to stay compliant with relevant privacy laws and regulations.

To help educate internal auditors about the implications of this quickly evolving topic, Helberg will be presenting a session on “Ethical Dilemmas and Privacy Rights with the Use of Artificial Intelligence and Machine Learning Over Big Data for Chief Audit Executives” at The IIA’s 2023 General Audit Management (GAM) conference on March 14. This topic is among the more than 30 informative and engaging sessions designed to keep internal audit leaders abreast of the biggest changes in the profession and provide actionable information on how best to navigate and stay ahead of global trends.  
