Big tech, bad ethics
This week saw the announcement of major media reforms intended to hold big tech to account. It’s been a long time coming, with privacy law, misinformation, online safety and the news media bargaining code among the many legal issues under ongoing review. Still, there were surprises. Expected reforms were absent; unexpected reforms made an appearance.
Top of the agenda is a social media ban for kids. And why not? There is a problem, clearly. Some have questioned how we verify the age of users, but isn’t the answer obvious? TikTok, YouTube and Insta collect rather a lot of data. I’m confident they’d have a pretty good idea of their users’ ages. The business model of digital platforms isn’t complicated. One, collect data. Two, attract advertisers.
On that note, this week the government revealed how its privacy law reform is progressing. Yesterday, a bill was introduced to Parliament containing the first tranche of reforms, including the introduction of a statutory tort for serious invasions of privacy. This is a major step forward, but as Privacy Commissioner Carly Kind says, there is a lot more to be done: ‘Further reform of the Privacy Act is urgent.’
Indeed. Also this week, Meta made unedifying admissions about the data it’s been scraping to train its AI, with no opt-out available for users. Here’s a question for Meta: apart from the data of users, are you also scraping the data of non-users of your services to train your AI? In other words, take someone who has never had an account with Facebook, Instagram, WhatsApp, Threads, Messenger, or any other Meta offering. Nonetheless, that person appears in users’ photos and posts. Is Meta scraping that non-user’s data? If the answer is yes, it’s time for serious self-reflection. And for tough law. Let’s see what the second tranche of privacy reform looks like.
For all these issues, a fundamental shift is needed. Let’s put the responsibility on digital platforms and switch from a caveat emptor (buyer beware) approach to a caveat venditor (seller beware) approach. I’ve previously argued this in relation to privacy and, in a new paper, in relation to algorithms and AI. On this line of thinking, it’s the responsibility of digital platforms to ascertain the age of users, and to ensure that data collection and use are ethical. Further, I argue that the law should encourage ‘light patterns’, instead of dark patterns that manipulate users into acting against their own best interests.
Sure, let’s encourage innovation, but not at the expense of our autonomy and democracy.
Sacha Molitorisz, Senior Lecturer - UTS Law