Joe Longo, the chair of the Australian Securities and Investments Commission, has used an address to the ISDA/AFMA Derivatives Forum in Sydney today (20 June) to warn of the challenges of artificial intelligence (AI) when it comes to the financial services sector.
After addressing the current levels of market volatility and how wholesale financial operators need to seize the opportunity that the current environment represents, Longo turned to the rise of tools such as ChatGPT and generative AI.
He began by noting that while the financial sector has used AI-based tools such as algorithmic trading for the past decade, generative AI is another beast altogether.
“Recent developments, especially in the field of generative AI, represent a step change, and potentially create new and different risks and issues,” Longo said. “The speed at which things are changing also seems to be accelerating. What is clear is that there is as yet no real consensus on how to regulate AI, if at all.”
A number of countries have proposed different regulatory approaches to AI, Longo noted. Some, like the European Union, have taken a legal approach, while others have taken a risk-based approach and others still — like the UK — a “‘pro-innovation’ devolved regulatory model”. Longo then praised Australia’s own recent approach on AI regulation, with its Safe and Responsible AI in Australia discussion paper, which was released on 1 June.
However, the crux of the ASIC chair’s point is that no technology should be used if we do not entirely understand how it operates, or — in the case of AI — how it comes to the conclusions that it does. Longo cited one of the early innovators of cybernetics, Professor Norbert Wiener, who in 1960 said: “If we use, to achieve our purposes, a mechanical agency with whose operation we cannot interfere effectively … we had better be quite sure that the purpose put into the machine is the purpose we really desire.”
For Longo, the key issue is “the safety and integrity of the financial ecosystem”.
“I want to take this opportunity, as an aside, to emphasise that ASIC has AI as a high and important priority,” Longo added. “Not just in regards to wholesale markets, but also its role in — and for — the whole economy, including consumers and small business.”
The ASIC chair believes that the “fear of being ‘left behind’” could be a key driver in how many organisations make use of such technology. Whether from rushing “too quickly” to adopt generative AI models or from not applying “appropriate controls”, making the wrong decisions could have disastrous consequences not just for one company, but for the wider market in general.
Longo noted that the effects of the ION Derivatives cyber attack in February 2023, which impacted traders across the US and Europe, are a perfect case in point.
“With this in mind, entities need to focus on robust governance and operational resilience measures,” Longo said. “This is nothing new — just because the technology has changed, nobody should think that means your existing obligations around good governance have changed with it. They haven’t. But it’s all too easy to forget this in the face of such rapid and unprecedented change.”
“Easy — and dangerous.”
As far as Longo is concerned, the key task is to understand what outcomes a company may seek to achieve by harnessing generative AI, and its impact on data security — especially when it comes to investor and consumer confidence.
“The point is, the fear of missed opportunities cannot be allowed to drive poor decisions, outcomes, or controls,” Longo said. “While the potential in this field is enormous, our vigilance must be unwavering. The industry will look to you to lead the way.”
Longo also announced that ASIC would be consulting on matters such as automated order processing rules in the coming financial year in regard to how AI can impact futures markets. An update to electronic trading guidelines can also be expected in that time frame.
Ultimately, Longo expects that establishing proper controls for AI should be part of any related technology’s design phase.
“It’s important that the whole financial market ecosystem works to uplift controls — just as a convoy must go at the pace of its slowest vessel,” Longo said, “so too is the financial ecosystem reduced to the strength of its weakest link”.
David Hollingworth has been writing about technology for over 20 years, and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.