
Artificial Intelligence Operations Policy


On this page

AI Statement of Intent | Purpose | Scope | Principles | Policy statements | Roles and responsibilities | Definitions | Approval information | Version history | References

AI Statement of Intent

UTS is a recognised global leader in Artificial Intelligence (AI). Our thought leadership and innovation extend across:

  1. The creation of new AI models, tools and technology through our research and research collaborations
  2. The implementation of AI in real-world applications through our innovations and partnerships with business, civil society, and government
  3. The education of our students to be skilful, responsible users of AI in their studies and future employment, and
  4. Our work to ensure that ethical, legal and other societal impacts are considered in the development and use of AI.

As a progressive organisation, UTS also evaluates and leverages AI models and tools in our business operations where it is appropriate to do so.

1. Purpose

1.1 The Artificial Intelligence Operations Policy (the policy) guides the use, procurement, development and management of artificial intelligence (AI) at UTS for the purposes of teaching, learning and operations. 

2. Scope

2.1 This policy applies to staff and affiliates (hereafter staff) who are responsible for AI at UTS for teaching, learning and operational functions.

2.2 This policy applies to the:

  1. development, approval, use and management of AI software, systems or platforms developed or created for use by UTS, and
  2. approval, use and management of AI software, systems or platforms procured for use by UTS.

2.3 This policy does not apply to research projects and outputs (including university consulting). Information on the use of AI in research, as well as required research approvals and ethics clearances, is provided in the Research Policy and the Use of AI in Research Guidelines.

3. Principles

3.1 AI at UTS may be used for administrative and operational functions, for teaching and learning activities, and to improve user experiences as part of the delivery of the university’s object and functions (refer University of Technology Sydney Act 1989 (NSW)).

3.2 To ensure confidence in UTS processes, and alignment with the UTS 2027 strategy, AI must be ethical, reliable, transparent and secure, and must comply with applicable laws and regulations.

3.3 UTS acknowledges that: 

  1. there are legitimate concerns about the use and reliability of AI; UTS will seek to balance opportunities for improvement and automation with managing and mitigating risks to the university and its community at all times
  2. algorithmic bias may result in erroneous or unjustified differential treatment which could have unintended or serious consequences for groups of individuals and/or for their human rights
  3. the use of AI will be increasingly regulated and, as such, this policy and any associated procedure must be maintained to ensure compliance with current and emerging regulatory standards and government advice. 

3.4 Use of AI systems must comply with the Privacy Policy, the Procurement Policy, the Information Security Policy and the Acceptable Use of Information Technology Resources Policy as appropriate.

4. Policy statements

How is AI used at UTS? 

4.1 At UTS, AI may be used to support teaching, learning and operational functions and activities, including but not limited to:

  1. making operations more efficient (increasing consistency, speed, agility and scalability)
  2. improving competitiveness by adapting quickly to changing conditions
  3. improving speed and agility to identify issues, risks and opportunities or improve controls
  4. protecting UTS physical and digital resources
  5. verifying identity and highlighting potential cheating
  6. providing feedback to students
  7. identifying at-risk and high-achieving students (to improve learning outcomes and provide additional support) 
  8. engaging in student support activities 
  9. improving student and staff experiences and/or capabilities
  10. recommending study pathways towards identified career goals
  11. recommending optimal allocation of campus facilities, and
  12. marketing activities including ranking prospects and personalisation.

AI risks and opportunities 

4.2 All AI systems exist on a spectrum of risk, ranging from low-risk (not automated, often non-operational, does not contain personal or sensitive data and does not have direct impacts on individuals) to high-risk automations (highly autonomous, normally operational AI systems with minimal controls that could impact individual and institutional safety and wellbeing). 
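For illustration only, the risk spectrum described in 4.2 can be sketched as a simple classification. The attribute names and the tier logic below are assumptions made for this sketch; they are not part of this policy, the procedure or the NSW framework, and formal assessments follow those documents (refer 4.3 and 4.4).

    from dataclasses import dataclass

    @dataclass
    class AISystemProfile:
        """Hypothetical attributes drawn from the spectrum described in 4.2."""
        automated: bool              # acts without human initiation
        operational: bool            # works on live data in a live environment
        handles_personal_data: bool  # contains personal or sensitive data
        impacts_individuals: bool    # outputs directly affect individuals
        has_human_controls: bool     # meaningful human controls are in place

    def indicative_risk_tier(profile: AISystemProfile) -> str:
        """Indicative tier only; formal assessment follows the NSW framework and the procedure."""
        if not (profile.automated or profile.operational
                or profile.handles_personal_data or profile.impacts_individuals):
            return "low"
        if profile.automated and profile.operational and not profile.has_human_controls:
            return "high"
        return "requires assessment"

    # Example: a highly autonomous, operational system with minimal controls sits at the high end.
    print(indicative_risk_tier(AISystemProfile(True, True, True, True, False)))  # high

Any such tier is only a starting point; clauses 4.3 and 4.4 set out how risks and opportunities are formally assessed and balanced.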

4.3 Guided by the NSW Government’s AI Assessment Framework (the NSW framework), operational and non-operational AI will be assessed for risks and opportunities as outlined in the Artificial Intelligence Operations Procedure (the procedure) and the Risk Management Policy.

4.4 UTS will:

  1. assess and manage the use of AI to understand and effectively manage these risks (refer the procedure), and 
  2. balance risks and opportunities to protect UTS and individuals and their human rights (refer also Risk Management Policy).

AI endorsement, approval and oversight

4.5 Business owners must submit proposed AI systems to the Head of Data Analytics and Artificial Intelligence for feedback before commencement of any development or procurement activities as outlined in the procedure. 

4.6 The Artificial Intelligence Operations Board (the board) will develop institutional knowledge and insights about the use, management and control of AI for the purposes of teaching, learning and operations at UTS. 

4.7 The board’s terms of reference and membership are approved by the Chief Operating Officer (COO) and are available to staff (refer the procedure). The board will provide an annual report to the University Leadership Team. 

4.8 The board is responsible for endorsing the use of AI as part of an identified business activity or solution before the procurement and/or development of AI systems. Where the board considers a system to be high risk, further advice (internal or external) may be sought before submission to the COO for endorsement.

4.9 Following the board’s or the COO’s endorsement of an AI system, the final approval for expenditure of funds will apply as follows:

  1. For projects that require funding: refer to the financial delegations for project approval authority (refer Delegations).
  2. For IT Capital Management Plan (ITCMP) projects: follow standard ITCMP processes (available at ITCMP project governance and delivery framework (Staff Connect)).
  3. For projects that result in the acquisition or procurement of an IT resource: refer to the Procurement Policy and the Acceptable Use of Information Technology Resources Policy for approval process and authority. 

4.10 AI system owners must work with AI business owners in the ongoing management, governance and monitoring of AI in systems once procured or implemented, including liaising with information system stewards (refer Data Governance Policy and Transparency and reliability) or the Information Technology Unit as appropriate. 

Ethical principles for the use of AI

4.11 UTS follows the NSW Government’s Mandatory Ethical Principles for the Use of AI, summarised as follows:

  1. Community benefit: AI should deliver the best outcome for human users, in this case, the UTS community, and provide key insights into decision-making. AI must be the most appropriate solution for a service delivery or a policy problem, considered against other analysis and policy tools. 
  2. Fairness: Use of AI will include safeguards to manage data bias or data quality risks. The best use of AI will depend on data quality and relevant data as well as careful data management to ensure potential data biases are identified and appropriately managed (refer also Data Governance Policy). 
  3. Privacy and security: AI will include the highest levels of assurance. The UTS community must have confidence that data is used safely and securely in a manner that is consistent with privacy, data sharing and information access requirements (refer also the Privacy Policy and the Records Management Policy). 
  4. Transparency: Review mechanisms will ensure that the UTS community can challenge and question AI-based outcomes and will have access to an efficient and transparent review mechanism if there are questions about the use of data or AI-informed outcomes (refer Policy breaches and complaints).
  5. Accountability: While AI is recognised for analysing and looking for patterns in large quantities of data, undertaking high-volume routine process work, or making recommendations based on complex information, AI-based functions and decisions must always be subject to human review and intervention. AI system owners and business owners are responsible for the management of their AI systems.

4.12 The ethical principles for the use of AI are applied at each phase of the AI system lifecycle. The lifecycle stages include:

  1. problem identification and requirements analysis
  2. planning, design, data collection and modelling 
  3. development and validation (including training and testing stages)
  4. deployment and implementation 
  5. monitoring, review and refinement (including when fixing any problems that occur) or destruction (removal of the system from use). 
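
The lifecycle stages listed above can be represented, for illustration only, as an ordered enumeration. This is a minimal sketch; the identifier names are assumptions made here and do not appear in the policy or the procedure.

    from enum import Enum, auto

    class AILifecycleStage(Enum):
        """The five stages enumerated in 4.12 (illustrative names only)."""
        PROBLEM_IDENTIFICATION_AND_REQUIREMENTS = auto()
        PLANNING_DESIGN_DATA_AND_MODELLING = auto()
        DEVELOPMENT_AND_VALIDATION = auto()          # includes training and testing
        DEPLOYMENT_AND_IMPLEMENTATION = auto()
        MONITORING_REVIEW_REFINEMENT_OR_DESTRUCTION = auto()

    def next_stage(stage: AILifecycleStage) -> "AILifecycleStage | None":
        """Return the following stage, or None once the final stage is reached."""
        stages = list(AILifecycleStage)
        i = stages.index(stage)
        return stages[i + 1] if i + 1 < len(stages) else None

Applying the ethical principles at each phase then amounts to reviewing them before a system moves to the next stage, in whatever form that review takes in practice.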

Transparency and reliability

4.13 AI systems must only operate in accordance with the primary purpose or objective outlined as part of the approval process. Where a change to the primary purpose is needed, a separate risk assessment and endorsement are required.

4.14 Where AI is used by UTS to automate a function, process or decision that may impact staff, students or others, this must be specifically identified in the relevant privacy and/or AI use disclosure notice, which must include a review or enquiry mechanism.

Privacy and records management 

4.15 Where personal information is captured, used or stored by an AI system the business owner must complete a privacy impact assessment (refer Privacy hub: Privacy impact assessments (SharePoint)). This must be approved and managed in line with the Privacy Policy and information security classifications.

4.16 Use of biometric identification must be treated as processing sensitive personal information (refer Privacy Policy). 

4.17 Data used to develop algorithms or AI systems, and any data generated, shared, managed and/or recorded as part of an AI system’s operation or algorithm, is considered corporate data and must be managed in line with the Data Governance Policy and the Privacy Policy. This requirement applies to the full system lifecycle and to the lifetime of the data (whichever is the longer). 

Policy breaches and complaints

4.18 Breaches of this policy or any associated procedures must be reported to the Chief Data Officer (CDO) and managed in line with the Code of Conduct and the relevant Enterprise agreement as appropriate. 

4.19 Any data breaches or suspected data breaches will be managed in line with the Data Breach Policy.

4.20 Complaints in relation to the use or outcomes of UTS AI systems will be managed in line with the Staff Complaints Policy or the Student Complaints Policy. 

5. Roles and responsibilities

5.1 Policy owner: The Chief Data Officer (CDO) is responsible for policy enforcement and compliance, ensuring that its principles and statements are observed. The CDO is also responsible for the approval of any associated university level procedure. 

5.2 Policy contact: The Head of Data Analytics and Artificial Intelligence (Data Analytics and Insights Unit) is responsible for the day-to-day implementation of the policy and acts as a primary point of contact for advice on fulfilling its provisions. The Head of Data Analytics and Artificial Intelligence is also responsible for providing guidance and advice on the application of this policy and the procedure and for recommending appropriate training for staff who have responsibility for AI.

5.3 Implementation and governance roles: 

The Chief Operating Officer (COO) is responsible for the establishment of the Artificial Intelligence Operations Board as outlined in this policy and the procedure. The COO is also responsible for endorsing high-risk AI systems before the normal approval processes apply, in line with this policy.

The Deputy Vice-Chancellor (Research) is responsible for the management of all research activities, including guidance on AI in research (refer the Research Policy and the Use of AI in Research Guidelines).

AI system owners and AI business owners are responsible for the approval, management and oversight of AI systems within their remit.

6. Definitions

The following definitions apply for this policy and all associated procedures. These are in addition to the definitions outlined in Schedule 1, Student Rules. Definitions in the singular also include the plural meaning of the word.

Algorithm means a series of specific directions or instructions built into computer software or systems to solve a defined problem or automate decision-making. Algorithms may use AI to produce improved outcomes. 

Algorithmic bias means an error or prejudice built into an algorithm (intentionally or unintentionally) that creates outcomes that are unfair, erroneous or favour one group of people over another and may impact an individual’s human rights.

Artificial intelligence (AI): UTS uses the NSW Government’s AI Assessment Framework definition, which states that AI is ‘intelligent technology, programs and the use of advanced computing algorithms that can augment decision-making by identifying meaningful patterns in data’. AI is used to solve problems autonomously and perform tasks to achieve defined objectives, in some cases without explicit human guidance.

AI system (also system) means any software system, technology or program at UTS that uses AI as part of its decision-making processes, in part or in whole. AI systems include capabilities, analytics and algorithms on new or existing IT resources. 

AI system lifecycle means the time from inception through to review and renewal or destruction of an AI system.

AI system owner means appropriate staff identified as responsible for the technical implementation of the AI system and its review and monitoring as outlined in the approved project. Where there are multiple AI system owners, a primary owner (normally the most senior staff member) must be identified. AI system owners must have a working relationship with any relevant information system steward (refer Data Governance Policy). 

Business owner (of AI systems) means staff with overall accountability for the AI system. Business owners are responsible for establishing the business case for the system, implementing the system (including identifying users, user training and risk management), monitoring its activities (including reporting and review, and ensuring benefits and outcomes are realised in line with the purpose), upholding AI ethics and providing support to the AI system owner(s). Business owners may be self-identified (during an approval or initial scoping exercise) or appointed in line with their position description. Normally business owners will be a dean, director, senior manager or equivalent.

Corporate data is defined in the Data Governance Policy.

Information system steward is defined in the Data Governance Policy. 

IT resource is defined in the Acceptable Use of Information Technology Resources Policy.

Non-operational AI system means an AI system that does not use a live environment for its source data but rather provides analysis and insight from historical data. While non-operational AI normally represents a lower level of risk, the risk level needs to be carefully assessed, particularly where the outputs may be used to influence decision-making or action.

Operational AI system means an AI system that either produces an act or decision or prompts a human to act. Normally these systems work in real time using a live environment for their source data. These systems generally present more risks than non-operational AI systems as they tend to have a real-world effect. However, not all operational AI systems are high risk (for example, digital information boards or apps that show the time of arrival of the next bus).

Personal information (which includes health information for the purposes of this policy) is defined in the Privacy Policy. 

Research project is defined in the Research Policy. 

Research output is defined in the Research Policy. 

Approval information

Policy contact: Head of Data Analytics and Artificial Intelligence
Approval authority: Vice-Chancellor
Review date: 2026
File number: UR23/685
Superseded documents: New policy

Version history

Version | Approved by | Approval date | Effective date | Sections modified
1.0 | Vice-Chancellor | 18/05/2023 | 09/06/2023 | New policy.
1.1 | Vice-Chancellor | 24/05/2024 | 29/05/2024 | Updated to include the AI Statement of Intent.
1.2 | Deputy Director, Corporate Governance (Delegation 3.14.2) | 20/06/2024 | 08/07/2024 | Updates to reflect the review of the Acceptable Use of Information Technology Resources Policy and the Information Security Policy.
1.3 | Deputy Director, Corporate Governance (Delegation 3.14.2) | 24/04/2025 | 30/04/2025 | Update to refer to the new Use of AI in Research Guidelines.

References

Acceptable Use of Information Technology Resources Policy

Artificial Intelligence Operations Procedure

Code of Conduct

Data Breach Policy

Data Governance Policy

Enterprise agreements

Information Security Policy

NSW Government AI Assessment Framework

NSW Government Mandatory Ethical Principles for the use of AI

Privacy Policy

Procurement Policy

Records Management Policy

Research Policy 

Risk Management Policy

Staff Complaints Policy 

Student Complaints Policy

Use of AI in Research Guidelines

UTS 2027 strategy

University of Technology Sydney Act 1989 (NSW)
