Chapter 1
Navigating the risk and controls implications
GenAI constraints are becoming better understood
GenAI is quite unlike other forms of artificial intelligence. Many business leaders expect it to behave like conventional software, delivering the same outputs for the same inputs. However, rather than following a rigidly defined, deterministic path, GenAI generates outputs probabilistically, so those outputs can vary from one run to the next.
To understand the difference, consider a scenario in which three academics are given an identical writing assignment. All three complete their papers with the same level of professionalism, but, just like the outputs of probabilistic GenAI, each paper is unique. This variance and non-repeatability are important considerations for heavily regulated financial services firms, as GenAI may not be the appropriate tool when certainty is required to achieve compliance.
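To see the contrast in miniature, the sketch below compares a conventional deterministic function with a toy generator that samples from a probability distribution. The vocabulary and weights are illustrative stand-ins, not output from a real model.

```python
import random

def deterministic_tool(balance: float, rate: float) -> float:
    # Conventional software: the same inputs always produce the same output.
    return round(balance * rate, 2)

def probabilistic_generator(prompt: str) -> str:
    # Toy stand-in for a GenAI model: the next phrase is sampled from a
    # probability distribution, so repeated calls with the same prompt can differ.
    candidates = ["promptly", "shortly", "within two business days"]
    weights = [0.5, 0.3, 0.2]
    choice = random.choices(candidates, weights=weights, k=1)[0]
    return f"{prompt} We will respond {choice}."

print(deterministic_tool(1000.0, 0.05))  # always 50.0
for _ in range(3):
    # Each call may return a different, equally valid sentence.
    print(probabilistic_generator("Thank you for contacting us."))
```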
Other key risks associated with GenAI include:
- Data privacy and security: Generative AI models require large amounts of data to train and generate content. This data must be protected, anonymised, or used only with appropriate consent to comply with data privacy laws and regulations, such as the Privacy Act 1988 and the Notifiable Data Breaches scheme (see the redaction sketch after this list).
- Ethical and social implications: GenAI algorithms need controls or guardrails to ensure outputs are unbiased and accurate. For example, a GenAI tool may produce inaccurate or unfair credit scores, loan offers, or investment advice based on biased or incomplete data.
- Legal and regulatory compliance: For example, GenAI models may produce documents that fail to meet the standards of clarity, completeness, or disclosure required by the Australian Securities and Investments Commission (ASIC) or the Australian Prudential Regulation Authority (APRA).
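As one illustration of the data privacy point above, the sketch below masks obvious personal identifiers before text is passed to a GenAI service. The patterns are illustrative assumptions only and are no substitute for a full de-identification and consent process.

```python
import re

# Illustrative patterns only; production de-identification needs far broader coverage.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b(?:\+?61|0)4\d{2}\s?\d{3}\s?\d{3}\b"),  # AU mobile format
    "ACCOUNT": re.compile(r"\b\d{6,10}\b"),
}

def redact(text: str) -> str:
    # Replace each match with a labelled placeholder before the text leaves the firm.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("Customer 0412 345 678 (jane.doe@example.com) queried account 12345678."))
# Customer [PHONE] ([EMAIL]) queried account [ACCOUNT].
```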
It is important to note that GenAI models are trained on general knowledge across a wide range of topics, so they typically require additional training to understand financial services terminology and data. GenAI also relies on prompt engineering: structuring the questions put to a model so that it returns the most consistent, relevant outputs.
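A minimal sketch of what a structured prompt can look like in practice is shown below; the template fields and wording are illustrative assumptions rather than a prescribed standard.

```python
# Illustrative prompt template: role, task, constraints and output format are
# spelled out explicitly so that responses stay consistent across calls.
PROMPT_TEMPLATE = """You are a compliance analyst at an Australian retail bank.

Task: Summarise the customer note below in no more than three bullet points.
Constraints:
- Use only information contained in the note; do not speculate.
- Flag any mention of hardship or a complaint with the word 'ESCALATE'.
Output format: plain-text bullet points.

Customer note:
{note}
"""

def build_prompt(note: str) -> str:
    # The structured text would be sent to whichever GenAI model the firm has approved.
    return PROMPT_TEMPLATE.format(note=note)

print(build_prompt("Customer called to query an unexpected account fee."))
```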
However, GenAI excels at aggregating and summarising text, assessing large volumes of data, and providing cognitive search across large corporate knowledge repositories. When used in the right context, GenAI can deliver a step change in knowledge-worker productivity. For example, one EY client was able to double their high-risk customer capture rate thanks to GenAI.
Chapter 2
Weighing the full cost of GenAI against the return on investment
Value comes from prioritisation
A GenAI financial services use case may look compelling on paper, but it is important to take a wider view of project value, comparing full cost with the projected return on investment to determine whether the use case is truly financially viable.
For example, GenAI can be particularly resource-intensive. Creating large language models requires significant computing power, and GenAI also needs high-quality data, sector-specific training and prompt engineering to generate the desired outputs. High levels of governance and oversight may also be needed to manage risk and facilitate regulatory compliance, which can make a solution particularly expensive in practice.
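As a simple way to make that comparison concrete, the sketch below nets hypothetical annual benefits against the full annual cost of a use case. All figures are placeholders for illustration, not EY benchmarks.

```python
# Hypothetical figures for illustration only.
annual_costs = {
    "model_and_compute": 400_000,          # hosting, inference and fine-tuning
    "data_and_prompt_engineering": 250_000,
    "governance_and_oversight": 200_000,   # risk reviews, monitoring, compliance
    "change_management": 150_000,
}
annual_benefit = 1_200_000  # e.g. analyst hours saved, valued at loaded cost

total_cost = sum(annual_costs.values())
roi = (annual_benefit - total_cost) / total_cost

print(f"Total annual cost: ${total_cost:,.0f}")
print(f"Annual benefit:    ${annual_benefit:,.0f}")
print(f"ROI:               {roi:.0%}")  # positive only if benefits exceed the full cost
```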
Chapter 3
What does a risk-based approach to GenAI adoption look like?
Test and learn
Careful prioritisation of GenAI use cases is key to navigating the challenges around risk, governance, and cost. EY clients successfully adopting GenAI tend to take a test-and-learn approach. This involves identifying relatively small, low-risk use cases that complement rather than disrupt an organisation’s existing IT, process and governance ecosystem.
According to global EY research, 91% of organisations are using AI primarily to optimise operations, develop self-service tools such as chatbots, or automate processes. Only 8% are driving innovation, such as new or improved offerings.
Australian tier one financial services companies generally have robust IT engineering capability and a sound technology backbone, so embracing a test-and-learn approach plays to the sector’s strengths.
For example, using GenAI in a contact centre to summarise a customer interaction can be a powerful but relatively low-risk first use case. When there is sufficient confidence in the tool, it can then be evolved to enable customer service agents to extract relevant real-time information during client calls. GenAI can perform lower-risk tasks like these faster than a human agent, enabling the agent to focus on the customer, make better decisions, streamline the interaction, and significantly improve the overall customer experience. EY clients have achieved up to an 80% increase in contact centre effectiveness by leveraging GenAI.
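A minimal sketch of how such a summarisation step might sit inside a contact-centre workflow is shown below. The model call is a placeholder, since the actual service would be whichever GenAI platform the firm has approved, and the transcript and summary are invented.

```python
from typing import Callable

def summarise_interaction(transcript: str, llm: Callable[[str], str]) -> str:
    # Build a constrained summarisation prompt, then pass it to the approved model.
    prompt = (
        "Summarise this contact-centre call in two sentences for the CRM record, "
        "noting any follow-up actions agreed with the customer.\n\n" + transcript
    )
    return llm(prompt)

# Placeholder standing in for whichever GenAI service the firm has approved.
def placeholder_model(prompt: str) -> str:
    return "Customer reported a duplicated annual fee; the agent agreed to reverse it within five days."

transcript = "Agent: How can I help?\nCustomer: I was charged the annual fee twice..."
print(summarise_interaction(transcript, placeholder_model))
```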
Vendors such as Microsoft, ServiceNow, and others also continue to release new GenAI capabilities in their products. Organisations should not discourage adoption of these tools, but they should ensure vendor contracts include appropriate clauses addressing GenAI-specific concerns. Organisations should also have clarity on how to measure GenAI usage within vendor platforms, understand the value being delivered, and have appropriate controls in place to manage risk.
Adopting a test-and-learn approach also gives an organisation an opportunity to learn about GenAI and evolve its existing AI guardrails and governance frameworks, without exposing itself to unacceptable levels of risk and cost.
Chapter 4
Building a GenAI full lifecycle strategy
Move ahead with confidence and reliability
Evolving guardrails is just one way EY teams are now helping clients across Australia and New Zealand optimise the GenAI lifecycle. Underpinned by the EY Responsible AI Framework, this lifecycle features four key steps which are central to successful GenAI implementation:
Step one: Strategise
Financial services organisations should identify high-impact use cases and analyse costs, benefits, and risks. Define measurable success criteria and develop an AI implementation plan that is aligned with business objectives.
Step two: Implement
Firms should establish guardrails for ethical use, privacy, and security. Ensure technical infrastructure, tools, and platforms are available for AI implementation. Validate AI initiatives with proof-of-concept projects. Scale minimum-viable products and implement a change management strategy, including training and communication, to facilitate a smooth transition to AI-enabled processes for all affected employees.
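As one example of what a guardrail can mean in practice, the sketch below screens generated text against a blocklist before it reaches a customer. The phrases and routing behaviour are illustrative assumptions, not a complete control framework.

```python
# Illustrative list only; a real control set would be far broader and
# maintained with compliance teams.
BLOCKED_PHRASES = ("guaranteed return", "risk-free investment")

def output_guardrail(generated_text: str) -> tuple[bool, str]:
    # Screen model output before it reaches a customer; anything containing a
    # prohibited phrase is held back for human review.
    lowered = generated_text.lower()
    for phrase in BLOCKED_PHRASES:
        if phrase in lowered:
            return False, f"Held for review: contains prohibited phrase '{phrase}'"
    return True, generated_text

print(output_guardrail("This product offers a guaranteed return of 12% a year."))
print(output_guardrail("Returns vary with market conditions; please read the PDS."))
```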
Step three: Run
Empower users with AI technologies through comprehensive support, including training and resources, while continuously evaluating value delivered by AI-driven processes, monitoring security, and gathering user feedback and business metrics.
Step four: Improve
Set a regular cadence of GenAI performance analysis. Identify successful implementations, assess obstacles to adoption, ensure compliance with regulations and policies, reallocate resources as required, and optimise personnel alignment for enhanced AI-enabled hyper-productivity and market advantage.
The most successful adopters of GenAI within the Australian financial services sector are likely to be those who use the technology to improve efficiency and productivity. These companies will focus on lower risk use cases that expedite rather than disrupt existing corporate systems and processes.
To be part of this group, firms will need to prioritise the greatest value-creation opportunities based on how GenAI can improve the organisation’s bottom line, using tools such as the EY.ai Value Accelerator to identify AI use cases that boost revenue, reduce cost, and optimise EBITDA.
Once organisations have prioritised individual use cases, avoided risk and generated compelling value, they can then use this success as the launchpad for a longer-term vision and direction for GenAI within their organisation.
Summary
GenAI presents many strategic benefits in financial services. Firms can safely get started through a test-and-learn approach, while establishing guardrails around risk, governance and spend.