Bates Research | 05-12-23
The Compliance Risk of Artificial Intelligence
The use of Artificial Intelligence (AI) and algorithms is becoming increasingly widespread in today's digital environment. The technology can provide many benefits, but it also introduces a distinct set of risks that organizations must consider when incorporating AI into their processes, as financial services regulators have noted in recent years.
Compliance risk is a key concern. Compliance risks associated with using AI include the potential for bias or discrimination in decision-making, as well as the threat of data breaches due to inadequate security measures. (See, e.g., the Bates article “Errors, biases and algorithms: how to interpret automated results” by Alex Russell.) In April 2023, the Federal Trade Commission (FTC) issued business guidance to help companies understand the compliance risks that come with using AI and algorithms, including strategies for managing the associated consumer protection risks. We take a look at some of those compliance considerations here.
Assess Compliance Policies and Procedures
Organizations should assess their existing policies and procedures related to privacy and data security to ensure they are designed to address the potential risks associated with AI. Organizations should also consider developing new policies and procedures specific to the use of AI, such as ensuring that algorithms are tested for bias before they are used in decision-making processes.
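As an illustration of what a pre-deployment bias test might look like, the following Python sketch compares approval rates across applicant groups. The column names, sample data, and tolerance threshold are hypothetical, and this demographic-parity check is only one of many possible fairness measures an organization might adopt.

```python
# Illustrative sketch only: a simple demographic-parity check on a model's
# approval decisions before deployment. Column names, sample data, and the
# tolerance threshold are hypothetical.
import pandas as pd

def demographic_parity_gap(decisions: pd.DataFrame,
                           group_col: str = "applicant_group",
                           outcome_col: str = "approved") -> float:
    """Return the largest difference in approval rates across groups."""
    rates = decisions.groupby(group_col)[outcome_col].mean()
    return float(rates.max() - rates.min())

if __name__ == "__main__":
    # Hypothetical pre-deployment test data: model decisions by group.
    sample = pd.DataFrame({
        "applicant_group": ["A", "A", "A", "B", "B", "B"],
        "approved":        [1,   1,   0,   1,   0,   0],
    })
    gap = demographic_parity_gap(sample)
    TOLERANCE = 0.10  # hypothetical internal threshold
    print(f"Approval-rate gap: {gap:.2f}")
    if gap > TOLERANCE:
        # Divergent approval rates would prompt escalation for compliance review.
        print("Potential bias detected: escalate for compliance review.")
```

A check like this could be run as part of a model-validation step each time an algorithm is retrained, with results documented for the compliance record.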
In addition, organizations must be aware of the potential for malicious actors to exploit AI technology for fraud or other illegal activities. Deepfake technology is a particularly concerning example of this risk: deepfakes can be used to produce false audio or video recordings of people saying or doing things they never actually said or did.
Test Security Vulnerabilities and Threats
One way to gauge consumer trust is through a luring test, a process by which developers can identify possible security vulnerabilities by replicating the techniques used by malicious actors. The FTC recently released guidelines on conducting luring tests responsibly, urging companies to think carefully about the data collection and storage policies associated with such tests, including obtaining customer consent before collecting any data and properly disposing of that data once the test is complete. Companies should also build safeguards into their systems to protect against unauthorized access to and use of customer data. By taking steps to ensure consumer trust in AI engineering, companies can improve not only their products' performance but also their customers' satisfaction.
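For illustration, the following Python sketch shows one way a testing program might enforce two of those data-handling expectations: collecting data only where customer consent is on file, and disposing of it when the test ends. The class, field names, and sample values are hypothetical and are not drawn from the FTC guidelines.

```python
# Illustrative sketch only: gate test-data collection behind recorded customer
# consent and purge the data when the exercise concludes. Names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TestDataStore:
    consents: Dict[str, bool] = field(default_factory=dict)     # customer_id -> consent on file
    records: Dict[str, List[str]] = field(default_factory=dict)  # collected test data

    def collect(self, customer_id: str, datum: str) -> None:
        # Only collect data for customers who have affirmatively consented.
        if not self.consents.get(customer_id, False):
            raise PermissionError(f"No consent on file for {customer_id}")
        self.records.setdefault(customer_id, []).append(datum)

    def dispose_all(self) -> None:
        # Dispose of all collected test data once the test is complete.
        self.records.clear()

if __name__ == "__main__":
    store = TestDataStore(consents={"cust-001": True})
    store.collect("cust-001", "simulated phishing response")
    store.dispose_all()  # required cleanup when the test concludes
```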
Cybersecurity
Organizations should protect themselves and their customers from these threats by investing in strong cybersecurity practices, keeping up with technological developments, and monitoring their systems for unauthorized access. Taking these proactive steps to address the compliance risks of AI technologies helps shield both the organization and its customers from potential harm.
Conclusion
Through responsible data collection and storage policies, businesses can ensure that their AI engineering meets both user expectations and privacy regulations. Doing so will benefit both companies and consumers in the long run as they continue to take advantage of the advancements made possible by AI technology. Non-compliance, by contrast, can lead to violations of FTC rules, as referenced in the Commission's April 2023 guidance regarding the appropriate use of AI.
About Bates:
Bates Group has been a trusted partner to its non-banking financial institution and financial services clients and their counsel for over 40 years, delivering superior quality and results on a cost-effective basis. With a full professional staff and a roster of over 175 financial industry and regulatory compliance experts, Bates offers services in AML and compliance, regulatory enforcement and internal investigations, litigation consultation and testimony, forensic accounting, damages, and big data consulting.
Bates Group's MSB, FinTech and Cryptocurrency team provides a full suite of Bank Secrecy Act, Anti-Money Laundering and Office of Foreign Assets Control (BSA/AML/OFAC) compliance consulting services, consumer compliance consulting, state money transmitter licensing acquisition and maintenance support, independent reviews, and corporate compliance training.
Contact Bates today for services and expertise you can count on.
About the author:
Brandi Reynolds
Chief Growth Officer and Senior Managing Director, Fintech & Banking Compliance