Sunday, December 22, 2024

‘Terror is the word’: calls to act on deepfake porn

An inquiry is looking at laws to curb the non-consensual sharing of digitally altered sexual images. (AP PHOTO)

By Dominic Giannini in Canberra

Circulating fake sexual images can ruin lives, and while new laws have been welcomed, there are calls for more to be done to stop the images being created in the first place.

There has been an alarming increase in the generation of “deepfakes” with new technology powered by artificial intelligence (AI) making their creation a lot easier for everyone, a parliamentary committee heard.

Deepfakes involve digitally altered images of a person or their body, while AI can be used to generate an image based on a person’s photo or to superimpose their face onto pornographic material.

Young people had taken their own lives after becoming victims of the sharing of offensive deepfake material, Association of Services Against Sexual Violence CEO Nicole Lambert told the hearing on Tuesday.

There was also a lack of support services, Queensland Sexual Assault Network executive officer Angela Lynch added, saying a 12-year-old rape victim in her state remained on a waiting list for help.

Perpetrators of non-consensual deepfakes admitted in one survey that the biggest deterrent to committing the abuse would have been criminal penalties, Rape and Sexual Assault Research and Advocacy CEO Rachel Burgin said.

“So we’ve heard from the perpetrators’ mouths what would have stopped them,” she told the committee on Tuesday.

“We need more than signs at bus stops. What we’re doing for prevention in Australia doesn’t work, that’s why we’ve had more than 50 women killed at the hands of men this year.”

Deepfakes are often accompanied by doxxing, where personal information is shared online, which then makes people fear for their safety, she said.

“Terror is the word,” Ms Burgin said, adding that sexual violence was a precursor to homicide.

Social media platforms, search engines and payment platforms were complicit because they directed people toward such content and profited from transactions and advertising, abuse survivor Noelle Martin said.

Billions of people globally had viewed the top 40 non-consensual deepfake nude sites, and there was a plethora of apps promoting the undressing or “nudifying” of women, she added.

“It’s life-destroying and shattering,” Ms Martin said.

Tech companies “have been able to operate with impunity” for too long and needed to face severe fines or potentially criminal liability, she said.

“At the end of the day, we are seeing it on their platform, it’s been on their platform for a very long time, these deepfake sites are high up on their search results … regulators need to stop going easy.”

The use of deepfakes to control women in abusive relationships also churned stomachs at the inquiry.

“People will, and do, create images of their partners as a mechanism of exerting control over them in a family violence context,” Ms Burgin said.

More than 96 per cent of deepfakes target women, and a cybersecurity company that tracked deepfake videos since December 2018 found 90 to 95 per cent were non-consensual porn, according to law professor Rebecca Delfino.

More than 680,000 women had fake nude images created and shared without their knowledge through an AI chatbot in October 2020 and the phenomenon had become worse throughout the pandemic, she said.

The committee heard consent needed to be clearly defined in the new federal legislation, which should also be broadened to cover the creation of an image and threats to create one.

The laws would impose a six-year prison sentence for sharing sexually explicit deepfake images without consent, and a seven-year term for offenders who both created and distributed the image.

Attorney-General Mark Dreyfus argued the Commonwealth had legal limits to what it could tackle but Marque Lawyers managing partner Michael Bradley believed the federal government had the power to make the bill broader.

“In the terrorism realm, there are some pretty broad offences that … criminalise accessing and creating content so I don’t think it’s that much of a stretch,” Mr Bradley said.

The parliamentary committee will report by August 8.

