In the following guest post, Sarah Abrams, Head of Professional Liability Claims at Bowhead Specialty, takes a look at the challenges that financial trading based on Artificial Intelligence (AI) could mean for D&O, Professional, and Cyber insurers. I would like to thank Sarah for allowing me to publish her article as a guest post on this site. I welcome guest post submissions from responsible authors on topics of interest to this blog’s readers. Please contact me directly if you would like to submit a guest post. Here is Sarah’s article.
The increasing use of and reliance on Artificial Intelligence (AI)-powered technology by Financial Institutions for algorithmic trading has created a developing area of underwriting exposure. In particular, which policy should respond to a trading loss resulting from an AI error: director and officer, cyber, and/or professional liability? Notably, no published case law examining this issue has yet emerged.
Algorithmic trading technology is being used to minimize transaction costs, improve order execution, and reduce human errors in trading securities. The algorithmic trading market is projected to be worth up to $19 billion annually by 2024.[i] AI trading uses a pre-defined algorithm to place trades automatically, without the need for human interaction. Decisions are made based on historical data analyzed by the AI trade bot.[ii] An important distinction at this point for traders, investors, and insurers is what roles an investment advisor and/or broker-dealer may play in the AI financial trade.
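To make "a pre-defined algorithm that places trades based on historical data" concrete, here is a minimal, hypothetical sketch of one of the simplest such rules, a moving-average crossover strategy. It is an illustration only; real AI trading systems are vastly more sophisticated, and the parameters below are invented.

```python
# Minimal sketch of a rule-based trading bot: a moving-average
# crossover strategy applied to historical prices. Hypothetical
# illustration only -- real AI trading systems are far more complex.

def moving_average(prices, window):
    """Simple moving average of the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=3, long=5):
    """Return 'BUY' when the short-term average crosses above the
    long-term average, 'SELL' when it crosses below, else 'HOLD'."""
    if len(prices) < long + 1:
        return "HOLD"  # not enough history to decide
    prev_short = moving_average(prices[:-1], short)
    prev_long = moving_average(prices[:-1], long)
    cur_short = moving_average(prices, short)
    cur_long = moving_average(prices, long)
    if prev_short <= prev_long and cur_short > cur_long:
        return "BUY"
    if prev_short >= prev_long and cur_short < cur_long:
        return "SELL"
    return "HOLD"

# A flat price history followed by a jump: the short-term average
# crosses above the long-term average, triggering a buy.
print(crossover_signal([10, 10, 10, 10, 10, 10, 10, 12]))  # -> BUY
```

Note that the rule fires automatically on whatever data it is fed; a coding error in a function like this, or market conditions outside its historical data, would produce trades no human reviewed, which is precisely the exposure discussed below.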
An investment advisor is a person or firm that manages investments. Investment advisors help investors buy and sell securities (think: stocks, bonds, or ETFs) and provide advice on matters such as portfolio management, asset allocation, market analysis, and wealth planning. Investment advisors have a fiduciary duty to their clients that requires them to put their clients’ interests ahead of their own.[iii] An advisor is not allowed to conduct trades with clients directly for the advisor’s own institutional account unless details about the transaction are disclosed to the client and the client gives consent.
By contrast, a broker-dealer is a person or firm that largely focuses on buying and selling securities – both for its clients and for itself. Broker-dealers that work with the public typically become members of the self-regulatory Financial Industry Regulatory Authority (FINRA). Broker-dealers owe a duty of fair dealing to their clients, which is generally seen as less onerous than the fiduciary duty that registered investment advisors owe their clients.[iv]
In July 2018, FINRA solicited comments from the industry on the potential challenges associated with using and supervising AI applications at broker-dealer firms.[v] In response, commenters recommended that FINRA undertake a broad review of the use of AI in the securities industry to better understand the varied applications of the technology, their associated challenges, and the measures taken by broker-dealers to address those challenges. FINRA noted in its June 2020 Report on Artificial Intelligence (AI) in the Securities Industry that, with respect to use of AI in Trading and Portfolio Management:
“Firms should bear in mind that use of AI in portfolio management and trading functions may also pose some unique challenges, particularly where the trading and execution applications are designed to act autonomously. Circumstances not captured in model training – such as unusual market volatility, natural disasters, pandemics, or geopolitical changes – may create a situation where the AI model no longer produces reliable predictions, and this could trigger undesired trading behavior resulting in negative consequences.”[vi]
In practice, AI is used as an umbrella term that encompasses a broad spectrum of different technologies, including Machine Learning, Supervised Machine Learning, Unsupervised Machine Learning, Reinforcement Learning and Deep Learning. Within portfolio management, firms noted the use of AI applications to identify new patterns and predict potential price movements of specific products or asset classes. Some broker-dealers that are also investment advisors aim to incorporate these predictions into their investment strategies to generate alpha for the portfolio.[vii] Securities industry participants are also using AI to make their trading functions more efficient by maximizing speed and price performance. Machine Learning is already being used by firms for smart order routing, price optimization, best execution, and optimal allocations of block trades.[viii]
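To illustrate the smart order routing and best-execution use cases mentioned above, here is a hypothetical sketch of a router that fills a buy order across trading venues at the lowest all-in cost. The venue names, prices, fees, and sizes are invented for illustration; real routers also weigh latency, fill probability, and market impact.

```python
# Hypothetical sketch of smart order routing: given quoted prices,
# per-share fees, and available sizes at several venues, fill a buy
# order at the lowest total cost. All venue data is invented.

def route_order(quantity, venues):
    """Greedily fill `quantity` shares from the cheapest venues.

    `venues` maps venue name -> (price_per_share, fee_per_share,
    available_size). Returns a list of (venue, shares) fills and
    the total cost of the filled shares.
    """
    # Rank venues by all-in cost per share (price + fee).
    ranked = sorted(venues.items(), key=lambda kv: kv[1][0] + kv[1][1])
    fills, remaining, total_cost = [], quantity, 0.0
    for name, (price, fee, size) in ranked:
        if remaining == 0:
            break
        take = min(remaining, size)  # take as much as the venue offers
        fills.append((name, take))
        total_cost += take * (price + fee)
        remaining -= take
    return fills, total_cost

venues = {
    "VENUE_A": (100.00, 0.02, 300),   # cheap but shallow
    "VENUE_B": (100.05, 0.01, 1000),  # slightly pricier, deep
    "VENUE_C": (100.01, 0.10, 500),   # low price, high fees
}
fills, cost = route_order(500, venues)
print(fills)  # -> [('VENUE_A', 300), ('VENUE_B', 200)]
```

Even in this toy version, the routing decision is made entirely by code; a mistake in how fees or sizes are weighted would silently produce worse executions, the kind of software deficiency whose coverage the following paragraphs question.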
Especially as case law addressing applicable coverage has not yet fully emerged, insurance underwriters of Investment Advisors and Broker Dealers should consider the risks stemming from client use of AI trading. The origin of the software platform and the code implemented, particularly where a third-party vendor is contractually engaged on behalf of a firm, will certainly be examined in the event of a loss. In addition, should Investment Advisor firms be held to the same standard of care as a Broker Dealer if the funds the firm is managing are traded using AI?
Unsurprisingly, cybersecurity is a tremendous pain point for AI trading platforms. In 2019, IBM found that the finance and insurance industry was the most attacked industry in terms of cybersecurity threats.[ix] The virtual threats against the financial industry can be both external and internal. In addition, the rise of financial artificial intelligence and related financial technology heightens the dangers of systemic risk and major financial accidents.[x] One such concern arises from AI trading code acting, in essence, as a Broker Dealer.
So should cyber insurance be the policy that responds when an AI trading platform fails? If Machine Learning is being utilized by a Broker Dealer to execute a trade, but there is a software deficiency, is there a “cyber event” that would trigger a cyber policy? Perhaps Director and Officer liability coverage is broad enough to encompass the Insured Financial Institution, particularly an Investment Advisor’s management reliance on AI for brokerage management.
But what about a professional liability or cyber exclusion to a D&O Policy? Is the input of code to direct an AI trade a professional service by a broker dealer? Does it matter how the AI trading software failed? If it was the result of a defined cyber event, the expectation would be that cyber insurance should respond. However, if it was the result of a coding error by a trader, is that really a cyber incident? And if the AI trading software simply malfunctions, is there coverage for the platform technology failure?
From an underwriting perspective, asking financial institutions about their use of AI trading, including any use of proprietary software, provides more clarity into the risk profile and future exposure. It really is a brave new world for insurers.
[ix] IBM, X-FORCE THREAT INTELLIGENCE INDEX 4 (2019), https://www.securindex.com/downloads/8b9f94c46a70c60b229b04609c07acff.pdf [https://perma.cc/3MDS-4TJW].