
Kate Emery: Why the ‘Balenciaga Pope’ scares me

Kate Emery, The West Australian

You might think you’ve seen a photo of the Pope dressed in a Balenciaga white puffer jacket so voluminous he’s more Michelin Man than holy man. You haven’t.

The image was created not by the Pope’s collision with high fashion and a spare $4000 down the back of his cassock but by a 31-year-old Chicago construction worker high on life — and, perhaps more relevantly, magic mushrooms — using artificial intelligence software known as Midjourney.

The real Pope, who may or may not own a puffer jacket of some kind, has since been treated in hospital for a respiratory infection.

But if the fake Pope fooled you, join the queue. The image was just believably bonkers enough that web culture expert Ryan Broderick suggested it “might be the first real mass-level AI misinformation case”.

People have been manipulating the truth of photos since just about photography’s earliest days, when Thomas Wedgwood was getting splashy with the silver nitrate.

Anyone who’s marvelled at a magazine cover star’s lack of pores has been fooled by a manipulated image. Photoshop creates convincing fakes for nefarious or comedic purposes, to be circulated by the credulous.

But what makes this new generation of AI fake images — so-called deepfakes — troubling is the ease with which they can be whipped up. Far be it from me to suggest the artist behind the puffy Pope wasn’t working at the top of his game while tripping on ’shrooms, but what Photoshop requires a certain level of expertise to achieve, Midjourney can do in seconds.

We’ve seen it with a Tom Cruise deepfake from a couple of years ago, and more recently with “photos” of Donald Trump praying and being arrested.

AI deepfake image of Donald Trump praying. Credit: Supplied

The deepfake Pope coincided with a call by Twitter boss Elon Musk and Apple co-founder Steve Wozniak, among others, to tap the brakes on AI lest the machines rise up to enslave us all (I paraphrase).

Apart from a worst-case Matrix-style scenario, the threat of deepfakes being used for extortion or blackmail is real: AI-generated email and voicemail scams are already out there.

Nobody was harmed by the fake Pope — unless, like me, you’ve never quite forgiven the Aussie inventor of the puffer jacket, George Finch, for his crime against waistlines.

But what if the next deepfake is the Pope (forgive me, your holiness) shooting up? Or a politician in Nazi regalia? Or you in a porno?

Australia’s eSafety Commissioner Julie Inman Grant says average Australians are increasingly at risk.

“As this technology becomes more accessible, we expect more and more everyday Australians to fall victim,” she says.

She wants tech companies to build solutions into their products.

eSafety’s image-based abuse scheme covers deepfakes and has a 90 per cent success rate. That’s good news, if not enough to reverse the heart attack Aunt Mary has the first time someone emails her “photos” of the Pope snogging (notable Hollywood lothario) Pete Davidson.

The Tom Cruise deepfake images that went viral. Credit: Chris Ume/TikTok

Brendan Walker-Munro is a senior research fellow at the University of Queensland. If and when the machines take over, he’ll probably be first up against the wall because he understands how these things work.

He says people who don’t know to be “really guarded” about what they see online are most at risk of being duped. Research suggests even when people know they’re being duped — not necessarily by photos but, say, a politician’s lies — the fake stuff can still influence voting behaviour.

But he is cautiously optimistic “guerrilla regulation” will help.

“There’s a lot of research being done into detecting deepfakes, there are tools out there and they’re becoming more numerous and better,” he says.

“I wouldn’t be surprised if we see over the next year there will start to be platforms where they say ‘provide the link and we can tell you if it’s real or not’.

“I think that these sorts of tools are going to come out at the same speed as, maybe even faster than, the deepfake technology.

“It’s an arms race.”

The arms race of the Cold War years gave us the scenario of mutually assured destruction. The best-case scenario for the deepfake arms race might be something closer to detente, where cooperation seems possible.

But there will be casualties and it’s not just the prospect of innocent people being targeted that should scare us. Just as dangerous is the prospect the guilty will get away with their crimes.

That photo of them accepting a sack of money with a dollar sign on it? Fake. At least that’s what they’ll say, and who do we believe when we can’t believe our eyes?
