Studying with PILab


πLab is actively seeking high-quality students to pursue PhD degrees. Join our team and start an exciting career in a growing area!


Want to study for a PhD degree in any of the areas below? Please contact πLab or the relevant supervisor listed.

PILab areas of interest

Perceptually-accurate simulation of real surfaces and materials in virtual environments

Degree: PhD

Supervisor: A/Prof Stuart Perry (UTS)

Co-supervisors: Dr Juno Kim (UNSW) and A/Prof Don Bone (UTS)

Project description: Perceptually-accurate simulation of real surfaces and materials in virtual environments (joint project with School of Optometry and Vision Science, University of New South Wales):

An exciting opportunity is available to undertake a PhD conducting research in a cross-institutional collaboration in the field of surface and material appearance. Material appearance is the vivid perceptual experience we have of different material properties when we look at images (e.g., 3D shape, colour, gloss, lightness/albedo). Research in graphics and virtual reality aims to simulate the interaction of light with surfaces so as to generate these material experiences in artificial environments, both in real time and as realistically as possible. Much of the complexity of light's interaction with opaque objects can be simulated using computational models such as a bidirectional reflectance distribution function (BRDF). Although BRDF information is generally captured using highly specialised equipment, that equipment is usually not well suited to scanning real-life 3D objects: many real-life objects have complicated BRDFs and may even fall outside the scope of these models (e.g., when objects are semi-opaque). Hence, collecting accurate material appearance data for real 3D objects remains a challenging problem.

This project is primarily concerned with the collection of material appearance information from scans of real objects. Current techniques attempt to combine multiple image captures to gather enough information to fit a physical model of surface reflectance, such as a micro-facet model. However, because material appearance is directly related to how humans perceive materials, this project will also use psychophysical experimentation to identify the fundamental dimensions of model parameters needed for efficient capture and simulation of physical surface properties.
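For orientation only, the sketch below evaluates an isotropic GGX micro-facet specular BRDF in Python. It is a minimal, self-contained illustration of the kind of reflectance model referred to above; the specific model (GGX with a Schlick Fresnel term) and the parameter names (roughness, f0) are assumptions made for the example, not part of the project.

```python
import numpy as np

def ggx_brdf(n, l, v, roughness, f0):
    """Evaluate an isotropic GGX micro-facet specular BRDF.

    n, l, v   : unit surface normal, light direction and view direction.
    roughness : perceptual roughness in (0, 1]; alpha = roughness**2.
    f0        : Fresnel reflectance at normal incidence (scalar or RGB array).
    """
    h = (l + v) / np.linalg.norm(l + v)              # half vector
    n_dot_l = max(np.dot(n, l), 1e-6)
    n_dot_v = max(np.dot(n, v), 1e-6)
    n_dot_h = max(np.dot(n, h), 0.0)
    v_dot_h = max(np.dot(v, h), 0.0)

    alpha = roughness ** 2
    # GGX normal distribution function D(h).
    d = alpha ** 2 / (np.pi * (n_dot_h ** 2 * (alpha ** 2 - 1.0) + 1.0) ** 2)
    # Schlick approximation to the Fresnel term F(v, h).
    f = f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5
    # Smith masking/shadowing built from the Schlick-GGX G1 term.
    k = alpha / 2.0
    g1 = lambda x: x / (x * (1.0 - k) + k)
    g = g1(n_dot_l) * g1(n_dot_v)

    return d * f * g / (4.0 * n_dot_l * n_dot_v)

# Example: light and camera 45 degrees off the normal, moderately rough dielectric.
n = np.array([0.0, 0.0, 1.0])
l = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0)
v = np.array([0.0, -1.0, 1.0]) / np.sqrt(2.0)
print(ggx_brdf(n, l, v, roughness=0.3, f0=0.04))
```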

3D visual saliency detection

Degree: PhD

Supervisor: A/Prof Min Xu

Project description: 3D visual saliency detection:

Visual saliency has been widely researched as a way to estimate human gaze density in 2D images. In this research, we will explore human gaze density estimation in 3D. In a 3D environment, human gaze is attracted to salient regions that not only stand out through visual feature contrast but are also distinguishable in the depth map.
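As a toy baseline only, the sketch below combines 2D feature contrast with depth contrast into a single RGB-D saliency map using centre-surround differences; the choice of features, the Gaussian surround and the weighting are illustrative assumptions, not the lab's method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def rgbd_saliency(rgb, depth, sigma=16, w_depth=0.5):
    """Toy centre-surround saliency for an RGB-D frame.

    rgb     : (H, W, 3) float array in [0, 1].
    depth   : (H, W) float array (metres or normalised disparity).
    sigma   : scale of the blurred "surround" used for contrast.
    w_depth : weight given to depth contrast versus luminance contrast.
    Returns an (H, W) saliency map normalised to [0, 1].
    """
    # Feature contrast: luminance difference from a heavily blurred surround.
    lum = rgb.mean(axis=2)
    feat_contrast = np.abs(lum - gaussian_filter(lum, sigma))

    # Depth contrast: points that pop out of their local depth neighbourhood.
    depth_contrast = np.abs(depth - gaussian_filter(depth, sigma))

    def norm(x):
        return (x - x.min()) / (np.ptp(x) + 1e-8)

    return norm((1 - w_depth) * norm(feat_contrast) + w_depth * norm(depth_contrast))

# Quick check on random data standing in for a real RGB-D capture.
rng = np.random.default_rng(0)
print(rgbd_saliency(rng.random((120, 160, 3)), rng.random((120, 160))).shape)
```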

Emotion-based human computer (multimedia) interaction

Degree: PhD

Supervisor: A/Prof Min Xu 

Project description: Emotion-based human computer (multimedia) interaction:

This research will focus on human emotion estimation through the analysis of data from multiple wearable sensors. The research involves signal processing, wearable sensor data fusion and time-series data analysis.
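As a rough sketch of the kind of pipeline involved (the sensor channels, window sizes, features and classifier are all assumptions for illustration), the example below turns synchronised wearable channels into window-level features and fits an off-the-shelf classifier to placeholder emotion labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signals, win, step):
    """Slice synchronised sensor channels into windows and extract simple
    time-domain features (mean, std, min, max) for each channel.

    signals : (T, C) array, T samples of C wearable channels
              (e.g. heart rate, skin conductance, accelerometer magnitude).
    Returns an (N, 4*C) feature matrix for N windows.
    """
    feats = []
    for start in range(0, signals.shape[0] - win + 1, step):
        w = signals[start:start + win]
        feats.append(np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)]))
    return np.array(feats)

# Illustrative pipeline on synthetic data: 3 channels, 4 emotion classes.
rng = np.random.default_rng(0)
stream = rng.normal(size=(6000, 3))            # stand-in for a sensor stream
X = window_features(stream, win=200, step=100)
y = rng.integers(0, 4, size=len(X))            # placeholder emotion labels

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```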

Image Aesthetics Analysis

Degree: PhD

Supervisor: A/Prof Min Xu

Project description: Image Aesthetics Analysis:

Automated assessment of image aesthetics is a significant research problem due to its potential applications in areas such as image retrieval, image editing, design, and human-computer interaction. The research will create a machine expert system that can provide an automatic aesthetic rating for any image. Image analysis and machine learning (e.g. deep learning) are the key components.
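One plausible starting point, shown purely as a sketch, is a small CNN backbone fine-tuned to regress a scalar aesthetic score; the backbone choice, score range and training details below are assumptions, and random tensors stand in for a real aesthetics dataset.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class AestheticScorer(nn.Module):
    """Minimal CNN regressor mapping an image to a scalar aesthetic score."""

    def __init__(self):
        super().__init__()
        backbone = resnet18()                    # small ImageNet-style CNN
        backbone.fc = nn.Linear(backbone.fc.in_features, 1)
        self.backbone = backbone

    def forward(self, x):                        # x: (B, 3, 224, 224)
        return self.backbone(x).squeeze(-1)      # (B,) predicted scores

model = AestheticScorer()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# One illustrative training step on random data standing in for
# (image, human rating) pairs from an aesthetics dataset.
images = torch.randn(8, 3, 224, 224)
ratings = torch.rand(8) * 10                     # e.g. crowd-sourced scores 0-10
loss = loss_fn(model(images), ratings)
optimiser.zero_grad()
loss.backward()
optimiser.step()
print(float(loss))
```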


3D Point Cloud Segmentation and Analysis

Degree: PhD

Supervisors: A/Prof Stuart Perry and Dr Wenjing Jia

Project description: 3D Point Cloud Segmentation and Analysis 

Scanned 3D data is increasingly available and is produced by a variety of technologies such as LIDAR, SLAM and structured-light imaging systems. Although there has been considerable research into the segmentation and analysis of 2D imagery and video, there has been comparatively little research into the segmentation and analysis of 3D point cloud data. Segmentation and analysis of point cloud data is crucial to applications such as autonomous driving, 3D urban mapping and the 3D scanning of large cultural heritage sites.

This project is concerned with using advanced machine learning frameworks to identify and classify objects both in public datasets such as Semantic 3D (http://www.semantic3d.net/) and in datasets collected using equipment available at UTS, such as structured-light and stereo 3D capture technologies. The data may be static or dynamic point clouds, and the goal is to develop systems relevant to real-world problems such as safe systems for autonomous vehicles.
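To make the task concrete, the toy sketch below segments a synthetic point cloud by thresholding away an assumed ground plane and clustering the remaining points with Euclidean DBSCAN; systems targeting the datasets mentioned above would typically use learned features instead, so treat this only as a baseline illustration.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_point_cloud(points, ground_axis=2, ground_tol=0.05, eps=0.3, min_pts=20):
    """Tiny point-cloud segmentation sketch.

    1. Remove an (assumed) flat ground plane by thresholding height above
       the lowest points.
    2. Cluster the remaining points into objects with Euclidean DBSCAN.

    points : (N, 3) array of x, y, z coordinates.
    Returns per-point labels: -2 = ground, -1 = noise, 0..K-1 = object clusters.
    """
    z = points[:, ground_axis]
    ground = z < (np.percentile(z, 5) + ground_tol)

    labels = np.full(len(points), -2, dtype=int)
    objects = ~ground
    if objects.any():
        labels[objects] = DBSCAN(eps=eps, min_samples=min_pts).fit_predict(points[objects])
    return labels

# Example on a synthetic scene: a flat ground plane plus two box-like clusters.
rng = np.random.default_rng(1)
ground = np.column_stack([rng.uniform(-5, 5, 2000),
                          rng.uniform(-5, 5, 2000),
                          rng.normal(0.0, 0.01, 2000)])
box1 = rng.normal([1.0, 1.0, 0.5], 0.1, size=(300, 3))
box2 = rng.normal([-2.0, 0.5, 0.4], 0.1, size=(300, 3))
labels = segment_point_cloud(np.vstack([ground, box1, box2]))
print("object clusters found:", len(set(labels[labels >= 0])))
```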



