
Associate Professor Marian-Andrei Rizoiu. Photo Andy Roberts

Australia has a problem with vaccine hesitancy. Childhood immunisation rates have been falling since 2022, the first decline in more than 20 years, while a recent study found Australian parents viewed vaccines more negatively in 2023 than they did in 2017. Social media has been fertile ground for a vocal fringe to grow its anti-vaccination narrative, and it has almost certainly contributed to this growing problem.

From anti-vaxxers to the recent UK riots, evidence of the real-world harm done when online misinformation spreads is everywhere.

It’s real-world harm that UTS data scientist Associate Professor Marian-Andrei Rizoiu found out about when a close family member disappeared down a rabbit hole of extremist online communities.

“Like many families, mine is grappling with the effects of misinformation on someone we love and the damage it does to us all. I now have a kind of personal mission in my professional life – to help minimise these harms so others don’t go through what we are,” Marian-Andrei says.

He comes to this mission with a very special set of skills. 

“My expertise is building stochastic models. These are algorithms that predict possible outcomes while accounting for ‘noise’, or random variance. They’re used in predicting everything from earthquakes to stock market movements and brain activation patterns,” he explains.

“I was using these mathematical tools to understand how information spreads on online social platforms and within populations of individuals connected digitally. I explored things like how popularity emerges, notions of virality, engagement and the emergence of influence from a theoretical perspective.”

That was before the proliferation of conspiracy theories, anti-vaccination claims and political extremism had taken hold.

“Starting from the hypothesis that misinformation is a type of information, I’ve been looking at how these specific types of information propagate. Why do people consume misinformation? What is it about these narratives that makes them so catchy?

“Why do some people get to the point where they prefer this community, and buy this alternative view of reality, even over their own family members?”

Complicating this picture, misinformation is sometimes deliberately spread by people or agents with their own, often malicious, agendas.

“Information spreads by itself. But sometimes misinformation is being injected into conversations, which is known as opinion manipulation, digital propaganda or information operations. I realised the same tools can be applied when this disinformation is being weaponised,” he says.

Marian-Andrei’s research has mainly focused on three topics – the proliferation of anti-vaccination rhetoric in Australian social media networks, the impact of far-right content and toxic masculinity on young boys, and detecting and countering online foreign interference.

My research builds stochastic models. These are mathematical approximations of how information travels from one individual to another within online social networks.

This allows us to understand phenomena like popularity, virality, or even why misinformation catches on.

Successful solutions will allow us to design countermeasures and estimate their effectiveness, in order to safeguard Australia's future.
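The article doesn’t spell out the mathematics, but stochastic models of online diffusion are often built on self-exciting point processes, where every share temporarily raises the chance of further shares before the excitement decays. Purely as an illustration (the model choice, parameter names and values below are assumptions, not taken from Marian-Andrei’s work), a minimal simulation of such a process, a Hawkes process with an exponential kernel simulated via Ogata’s thinning algorithm, could look like this in Python:

```python
import math
import random


def simulate_hawkes(mu, alpha, decay, horizon, seed=1):
    """Simulate one cascade of a Hawkes process on [0, horizon]
    with an exponential kernel, using Ogata's thinning algorithm.

    mu     -- background rate (spontaneous posts)
    alpha  -- intensity jump contributed by each new share
    decay  -- how quickly that extra excitement fades
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        # Intensity at the current time; it only decays until the next event,
        # so this value is a valid upper bound for the thinning step.
        lam_bar = mu + sum(alpha * math.exp(-decay * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            return events
        lam_t = mu + sum(alpha * math.exp(-decay * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:
            events.append(t)  # accepted: the cascade grows, raising future intensity


if __name__ == "__main__":
    # Illustrative parameters only: each share triggers 0.8 further shares on average.
    cascade = simulate_hawkes(mu=0.2, alpha=0.8, decay=1.0, horizon=100.0)
    print(f"simulated {len(cascade)} shares")
```

Fitting parameters like these to observed cascades, rather than picking them by hand, is what turns a toy simulation into a forecasting tool.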

Marian-Andrei has used his models to look at the mix of misinformation emerging from grassroots conversations and disinformation being spread on purpose.

“Going into this problem with my computer scientist hat on, my models were my secret sauce,” he explains.

“But in reality, a computer scientist like me isn’t equipped with all they need. In order to make a real dent, I quickly realised I needed to partner up with social scientists, political scientists, digital communication experts, journalists and so on.”

An example of the success of this approach came after he met two UTS social science academics – Dr Francesco Bailo, a lecturer in digital communications, and Associate Professor Amelia Johns, a digital ethnographer. The three decided to work together to study anti-vaxxer activity.

“They were following groups of people online with fringe views on the topic. I started deploying some of the modelling,” he says.

“We further developed our collaboration and then kept applying our tools. We built prototypes and software dashboards together. We developed approaches for data labelling that minimise the time and effort of the experts involved. We deployed large language models.”

It’s a partnership that has yielded great insights and won funding from Facebook and the Defence Innovation Network.

Photo of Marian-Andrei Rizoiu with whiteboard

Now, Marian-Andrei is turning his attention to building tools that help people counter misinformation.

“It’s not like there’s a silver bullet approach where you click a button and it's all done. We cannot replace the experts. However, we can give new tools and methods to the operative, journalist or social scientist that automate their tasks and help augment their expertise,” he says.

“What really keeps me awake these days is when and how to counter misinformation. Let’s say you detect a particular actor pushing a particular narrative. How should you respond to it? Should you respond at all? This is very relevant for journalists and campaigners.”

He’s looking at predicting the popularity of content while accounting for the type of content, the actor propagating it and the platform they’re using.
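The article doesn’t give his formulas, but in self-exciting models of this kind a common back-of-the-envelope estimate of eventual popularity uses the branching factor n*, the average number of direct reshares each share triggers. When n* is below 1, a cascade seeded by k initial posts is expected to reach roughly k / (1 - n*) events in total. A small sketch of that calculation (the function name and numbers are illustrative only):

```python
def expected_cascade_size(initial_events: int, branching_factor: float) -> float:
    """Expected total events in a subcritical self-exciting cascade.

    Each event spawns branching_factor direct 'children' on average, so the
    geometric series 1 + n* + n*^2 + ... sums to 1 / (1 - n*) per seed event.
    """
    if not 0 <= branching_factor < 1:
        raise ValueError("formula only holds in the subcritical regime (n* < 1)")
    return initial_events / (1.0 - branching_factor)


# e.g. 20 seed posts, each share triggering 0.8 further shares on average:
print(expected_cascade_size(20, 0.8))  # -> 100.0
```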

“It’s a big decision whether to debunk misinformation or not. If you do, you’re coming in with a big canister of gasoline and hoping that it will put out the fire. If it puts it out then great. But if it doesn’t, you can inflame it.”

“That's what I'm concentrating on right now. If we need to react, what is the best reaction to control damage? When is the best time to make an intervention? What pre-emptive measures can we deploy early on to limit the damage?”

For everyone’s sake, we can only hope his tools can help stop the spread of misinformation that threatens all our futures.

Associate Professor Marian-Andrei Rizoiu is a finalist for the 2024 Department of Defence Eureka Prize for Outstanding Science in Safeguarding Australia. He previously won Academic of the Year and Excellence awards at the 2023 Australian Defence Industry Awards.