
Securities Enforcement & Litigation Insider
Preparing For A Greater AI Presence In The Securities Industry
This article originally appeared on Law360.
ChatGPT, an artificial intelligence program, has grabbed wide attention since its introduction to the public.
It has become the fastest-growing consumer application in history with more than 100 million monthly active users.[1] People are amazed by its ability to respond intelligently to complex queries.
ChatGPT is only one of the many AI tools that are being developed and used in various industries to improve efficiency and customer service.
AI has also become a disruptive force in the securities industry.
Some reports suggest that ChatGPT is better at selecting stocks than some of the most popular investment funds in the U.K.[2] According to a study at the University of Florida — "Can ChatGPT Forecast Stock Price Movements? Return Predictability and Large Language Models" — ChatGPT is also better at predicting how stocks will react to news headlines than traditional models.[3]
Several well-known investment firms have publicized their efforts to incorporate AI into their business models.
For example, JPMorgan Chase & Co. is developing a software service called IndexGPT that uses artificial intelligence to analyze and select securities for customers.[4] Morgan Stanley Wealth Management has also recently partnered with OpenAI to access, process and synthesize content for financial advisers.[5]
When it comes to financial firms incorporating AI into their securities-related businesses, these developments are just the tip of the iceberg.
However, with great potential, AI also brings significant risks.
The chair of the U.S. Securities and Exchange Commission, Gary Gensler, has raised several concerns over the use of AI in the securities industry in recent speeches. For example, in discussing the use of data analytics, Gensler noted that the choice of input data might perpetuate societal inequities and historical biases.[6]
That same analysis can be easily applied to the use of AI. Gensler also expressed concern about how the use of AI may exacerbate conflicts of interest, challenging academics and computer scientists to consider "how you could have a neutral algorithm that's not putting a platform or a business's revenue or profits ahead of the investing public."[7]
There is also concentration risk.
In the future, it is likely that only a couple of AI tools will dominate. The commonality of the AI tools could create risk via herding, interconnectedness and data concentration.[8] Gensler commented that future observers might say "the crisis in 2027 was because everything was relying on one base level, what's called [the] generative AI level, and a bunch of fintech apps are built on top of it."[9]
In response to the risks associated with AI, like other government agencies, the SEC is exploring new regulations or guidelines for the use of AI in the securities industry.
The SEC said recently that new rules to address conflicts of interest in the use of AI could be proposed as soon as October.[10] While considering potential rulemaking, Gensler emphasized that existing rules should be applied to regulate AI: according to him, these are "tried and true public policies" and less "about a new law or new rule about artificial intelligence."[11]
On April 6, the SEC Investor Advisory Committee submitted its recommendation on setting up ethical guidelines for AI and algorithmic models used by investment advisers and financial institutions.[12]
The IAC points out that the Investment Advisers Act of 1940 governs the use of AI by investment advisers.[13] Under the Advisers Act, investment advisers have an affirmative duty of care, loyalty, honesty and utmost good faith to act in the best interests of investors.[14]
The use of AI by advisers does not change the fiduciary duty of advisers. In addition, Rule 206(4)-7 of the Advisers Act requires all advisers to establish a compliance program that addresses the investment advisers' performance of regulatory obligations.[15]
Advisers should consider the unique aspects of AI in establishing their compliance programs.
The IAC encourages the SEC to consider the following three key tenets in developing its guidance to investment advisers on the use of AI.[16]
- Equity: Advisers should be aware that biases may be replicated in the inputs selected for the algorithms. Biased data could lead to flawed or discriminatory investment advice.
- Consistent and persistent testing: Regular testing should be done on AI to minimize the potential for biased inputs or outcomes.
- Governance and oversight: Advisers should have a management and governance framework to ensure that AI is used in the best interest of investors and without bias.
The IAC recommends tasking the SEC Division of Examinations with monitoring compliance with the new ethical artificial intelligence framework.
While the SEC is in the process of developing new rules and guidelines for the use of AI, investment advisers and financial institutions should take steps to make sure that they are in compliance with the current regulations under the Advisers Act.
A broad compliance framework and accountability mechanism should be put in place to manage the use of AI at the firm. There should also be an independent process to evaluate the potential conflicts of interest raised by the AI program.
Firms should conduct periodic review and testing to monitor the performance of the AI program and to detect potential biases in the input data and the algorithm. Extra steps should be taken to ensure the privacy and security of client data.
Firms should also make sure that relevant personnel possess sufficient knowledge and skills to understand how the AI works so that the process is transparent and comprehensible. There should also be adequate disclosure so that clients understand the nature of AI's involvement in the process.
The age of AI is coming, and AI will likely have a greater presence in the securities industry in the near future.
While AI has enormous potential to make investment more efficient and accessible, it also brings a set of new risks that are not comprehensively addressed by current risk frameworks.
Investment advisers and financial institutions should be proactive in understanding and managing the risks of AI so that they can harness its power while avoiding its potential dangers.
[1] David Shepardson & Diane Bartz, US Begins Study of Possible Rules to Regulate AI Like ChatGPT, REUTERS (April 12, 2023), https://www.reuters.com/technology/us-begins-study-possible-rules-regulate-ai-like-chatgpt-2023-04-11/.
[2] Anna Cooban, ChatGPT Can Pick Stocks Better Than Your Fund Manager, CNN (May 5, 2023), https://www.cnn.com/2023/05/05/investing/chatgpt-outperforms-investment-funds/index.html.
[3] Alejandro Lopez-Lira & Yuehua Tang, Can ChatGPT Forecast Stock Price Movements? Return Predictability and Large Language Models (April 6, 2023), available at SSRN: https://ssrn.com/abstract=4412788 or http://dx.doi.org/10.2139/ssrn.4412788.
[4] Hugh Son, JPMorgan Is Developing a ChatGPT-Like A.I. Service That Gives Investment Advice, CNBC (May 25, 2023), https://www.cnbc.com/2023/05/25/jpmorgan-develops-ai-investment-advisor.html.
[5] Morgan Stanley Wealth Management Announces Key Milestone in Innovation Journey with OpenAI, Morgan Stanley (March 14, 2023), https://www.morganstanley.com/press-releases/key-milestone-in-innovation-journey-with-openai.
[6] Chair Gary Gensler, "Investor Protection in a Digital Age," Remarks Before the 2022 NASAA Spring Meeting & Public Policy Symposium (May 17, 2022), https://www.sec.gov/news/speech/gensler-remarks-nasaa-spring-meeting-051722.
[7] Betsy Vereckey, SEC's Gary Gensler on How Artificial Intelligence Is Changing Finance, MIT (October 12, 2022), https://mitsloan.mit.edu/ideas-made-to-matter/secs-gary-gensler-how-artificial-intelligence-changing-finance.
[8] Chair Gary Gensler, supra note 6.
[9] Richard Vanderford, Next Financial Crisis Could Come From AI, SEC Chair Says, The Wall Street Journal (May 16, 2023), https://www.wsj.com/articles/next-financial-crisis-could-come-from-ai-sec-chair-says-fbe8ecc9.
[10] Lydia Beyoud, SEC to Weigh New Artificial-Intelligence Rules for Brokerages, Bloomberg (June 13, 2023), https://www.bloomberg.com/news/articles/2023-06-13/sec-to-weigh-new-artificial-intelligence-rules-for-brokerages#xj4y7vzkg.
[11] Vereckey, supra note 7.
[12] Investor Advisory Committee, Establishment of an Ethical Artificial Intelligence Framework for Investment Advisors (April 6, 2023), https://www.sec.gov/files/20230406-iac-letter-ethical-ai.pdf.
[13] Id.
[14] Id.
[15] 17 C.F.R. § 275.206(4)-7.
[16] Investor Advisory Committee, supra note 12.