HTI’s latest Insight Summary highlights a disconnect between what Australian consumers and workers want in relation to AI and what organisations deliver. The Insight Summary provides practical steps for corporate leaders to better align AI use and adoption with stakeholder needs.

The latest publication from HTI’s AI Corporate Governance Program, Insight Summary - Disconnected AI: the unmet expectations of consumers and workers, outlines the attitudes, concerns, expectations and experiences of consumers and workers towards the increasing use of AI systems.
Drawing on quantitative research into consumer expectations and worker experiences, conducted in partnership with Essential Media, the Insight Summary underscores why corporate leaders need to listen to these stakeholders when developing and deploying AI systems.
Consumers are worried and workers feel disempowered
While Australians recognise the potential benefits of AI, they have significant concerns about its increasing adoption. In short, consumers are worried and workers feel disempowered. There is a disconnect between what they want and what they expect organisations will provide. This is most evident in relation to privacy and transparency:
- 85% of consumers were concerned about a company misusing their personal information, or about a breach of their privacy arising from the failure of an AI system.
- Most consumers expect to be informed when an organisation is using an AI system to deliver a product or service in a wide range of different use cases and industries.
- Yet only one in three consumers expect that corporate leaders will keep their personal data secure and not misuse it, and will be transparent when AI is being used.
Why this matters for corporate leaders and what they can do
Stakeholders can offer valuable insights into the use and development of AI systems. Organisations that fail to deeply engage with consumers and workers are missing out on their expertise and less likely to gain their trust. Worse, they risk ‘so-so automation’, where AI adoption delivers limited productivity gains but increased dissatisfaction.
To meet the expectations of consumers and workers, corporate leaders should ensure and provide:
- Accountability: AI-related harms cannot be blamed on the algorithm. 68% of consumers believe executive officers and directors of the company using the AI system should be held responsible.
- Transparency: Consumers and workers want to know when AI systems are being used, yet many have low awareness and understanding of how AI is currently used.
- Redress mechanisms: Organisations should have review and challenge mechanisms in place for AI-related decisions. Consumers want to receive reasons for decisions made by AI systems, and workers want to be able to provide those reasons.
- Deep engagement: Organisations need to engage deeply with stakeholders throughout the AI lifecycle.
- Quality training for workers: Many workers lack sufficient training on AI systems, leading to frustration for both workers and consumers.
Corporate leaders can address these challenges by adopting human-centred AI governance, such as implementing the guardrails in the Voluntary AI Safety Standard. Without effective governance that responds to stakeholder concerns, organisations will struggle to meet expectations and build trust in their use of AI systems.
Read the Insight Summary - Disconnected AI: the unmet expectations of consumers and workers