
ChatGPT for financial services: is it worth the risk?

Read the original article by Steve Whiter, Appurity Managing Director, on financialit.net


ChatGPT could be transformative for the financial industry

The financial services sector is poised to benefit hugely from widespread use of ChatGPT and other language processing models. These novel technologies can help firms detect fraudulent or risky behaviour, enhance marketing and customer management efforts, improve user experiences, stay on top of market trends and even support compliance requirements. In short, ChatGPT could be transformative for the financial industry.

But of course, any new technology that promises to change work and society as we know it comes with a host of risks. Tech leaders in regulated industries – which require adherence to the highest levels of data protection, regulatory compliance, and cybersecurity frameworks – should take extra note. Before any firm adopts the technology to enhance its operations, here’s what needs to be considered:

The data problem

We know that ChatGPT and other language processing tools are trained on data to generate responses. When models are trained in this way, there’s always a risk that the input data comes from unreliable or untrustworthy sources. It could even be sourced without users’ explicit consent.

This poses both a bias and a privacy problem.

When language processing tools have access to personal or sensitive information, and are trained on this information, it is much more difficult for users to later retract access to it. Language processing tools aren’t simply databases from which users can ask to have their data deleted – their personal data may have been used to train and refine those models. Firms that must comply with stringent data protection policies may face an additional hurdle here where their users’ digital privacy rights are concerned. It’s not insurmountable, but it’s something to consider.

At the workplace level, if employees are simply using publicly available language processing tools to process data, this raises many questions about data protection and regulatory compliance – as well as ethical concerns. Firms will need to ensure they have robust management policies in place for the use of language processing tools, including exactly what data can and cannot be processed, and any additional layers of security to protect this data in case the models are exploited or fall victim to a cyber attack.
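
To make this concrete, here is a minimal sketch in Python of the kind of pre-submission check a firm might put in front of a public language processing tool, redacting obviously sensitive values before any text leaves the organisation. The patterns, function names and workflow are illustrative assumptions for the example, not a reference to any particular product or policy.

import re

# Illustrative patterns for data that a firm's policy might prohibit from
# being sent to an external tool. Real policies would be far more extensive.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_sort_code": re.compile(r"\b\d{2}-\d{2}-\d{2}\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text):
    """Replace policy-restricted values with placeholders and report what was found."""
    findings = []
    for label, pattern in SENSITIVE_PATTERNS.items():
        if pattern.search(text):
            findings.append(label)
            text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text, findings

if __name__ == "__main__":
    prompt = "Summarise this complaint from jane.doe@example.com about card 4111 1111 1111 1111."
    safe_prompt, flagged = redact(prompt)
    print(safe_prompt)           # sensitive values replaced before any external call
    print("Flagged:", flagged)   # could feed an approval or audit workflow

A check like this would sit alongside, not replace, the management policies described above.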

Firms also need to consider the effects of bias in the language models they use. Without full visibility into the data which the AI has been trained on, users may not be able to fully trust the output information they receive – especially if that information is used to help form a financial opinion or recommendation.

Cybersecurity concerns

Phishing and social engineering attacks are on the rise, and cybersecurity firms are already warning of ChatGPT’s potential to produce sophisticated phishing text or malicious code. In the financial industry, this might manifest as text generated to impersonate a respected institution or individual, used as part of a social engineering attempt to gain access to a user’s financial details or personal information.

As with any new technology or tool, malicious actors will always look for an opportunity to exploit the next big thing. ChatGPT’s owner, OpenAI, discovered this earlier this year when a vulnerability in an open-source library it uses to cache user information was exploited. This exploit enabled users to see the chat history of other active users – a significant privacy mishap. And while the incident didn’t have grave reputational effects for ChatGPT (the damage was largely contained), it’s an example of what can go wrong when tools and technologies aren’t appropriately secured. The more a firm relies on third parties to process, store, and share its data, the more it outsources the responsibility of securing this data. If you can’t guarantee that all third-party tools and technologies are securing your and your clients’ critical data to the highest standards, should you be using them at all?

Communication monitoring

Many firms operating in the financial industry are required to keep records of all conversations and communication they have with customers or clients. Ask yourself: where does language processing fit into this? Will your firm keep a complete paper trail of all data inputted to and outputted from language processing tools in case of an investigation or misconduct allegation? These are big questions, and tech leaders may find that there are no easy or straightforward answers.
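
For illustration, the sketch below shows one way such a paper trail could be kept: a small Python helper that appends every prompt and response to a JSON Lines audit log with a timestamp and content hash. The file path, field names and client wrapper are assumptions made for the example rather than a prescribed approach.

import json
import hashlib
from datetime import datetime, timezone

# Illustrative path; a real deployment would use controlled, tamper-evident storage
AUDIT_LOG = "llm_audit_log.jsonl"

def record_interaction(user_id, prompt, response):
    """Append one prompt/response pair to a JSON Lines audit trail."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "prompt": prompt,
        "response": response,
        # A content hash supports later checks that a stored record hasn't been altered
        "sha256": hashlib.sha256((prompt + response).encode("utf-8")).hexdigest(),
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example usage, wrapping whichever client the firm uses (hypothetical call shown):
# response = llm_client.complete(prompt)
# record_interaction("analyst-42", prompt, response)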

ChatGPT and other language processing tools have the potential to streamline operations and process data in new and effective ways. This represents a huge opportunity for many global businesses, including those in the financial services industry.

But the responsibility to protect and secure your firm and your clients’ data still falls to you, no matter which tools or technologies are introduced to enhance your workflow or operational processes. The only thing that’s changed is the scope of this responsibility – all firms that rely on language processing tools will now need to implement security strategies that cover these tools and institute additional protective measures to mitigate risks.

The key takeaway for firms interested in using language processing tools is that they must consider how these new technologies fit within their regulatory frameworks. Firms that make use of programs like ChatGPT will also need to make a renewed commitment to cybersecurity to ensure all processed data is protected to the highest standards. Be vigilant of the risks, acknowledge the additional work required to protect your firm and the data it holds, and continually assess your security defences.

