Thursday, January 16, 2025

Student AI deepfake images reflective of porn crisis

Senator Matt Canavan says standards of behaviour are not being taught to young boys. (Darren England/AAP PHOTOS)

By Cassandra Morgan and Holly Hales in Melbourne

The disturbing doctoring of photos of about 50 teenage girls to create fake nude images is part of an epidemic that’s causing great harm, experts warn.

Australia’s eSafety Commissioner Julie Inman Grant said on Wednesday that rapidly developing technology was making it harder to determine what content is fake.

It follows the arrest of a teenage boy after manipulated images were shared of girls in years 9 to 12 at Bacchus Marsh Grammar, northwest of Melbourne.

“Deepfakes, especially deepfake pornography, can be devastating to the person whose image is taken and altered without their knowledge or consent, no matter who they are,” Ms Inman Grant said.

“Image-based abuse, including the production of deepfaked images and videos, is a persistent online harm which also represents one of the most egregious invasions of privacy.”

Sexual Assault Services Victoria chief executive Kathleen Maltzahn said the images reflected a broader pornography-driven crisis in schools.

The availability of online AI programs meant boys and men could create and distribute deepfake porn, she said.

“We’re seeing significant levels of children perpetrating sexual harm against others.

“We need to get ahead as much as possible of the deepfake stuff and have structures or resources in place so we can work with schools, so that we can move boys away from this behaviour.”

The Victorian Department of Education needed to be better resourced to respond to sexual violence in schools, and the federal government should step up its regulation of social media companies, Ms Maltzahn said.

“Schools are not equipped to deal with this, and they come to our services, and our services are not funded at the level we need to be able to go into schools and give an emergency response,” she said.

“Pornography is significantly carried by Twitter and Facebook and Instagram and the rest, so we can do something about those companies.”

Laws cracking down on the sharing of sexually explicit AI-generated images and deepfakes without consent were introduced to federal parliament on June 5.

However, some politicians remain concerned about the persistent availability of online programs to create deepfake images.

Several apps offered prompts including “undressing someone”, “body retouch”, “hot looks”, “spicy images” and “pushing the boundaries” – usually under subscriber content.

Face swaps, replacing outfits with “short uniforms”, overlaying faces on pornographic content and inserting sexually explicit photos were accessible within half a dozen clicks.

Bacchus Marsh Grammar said it was counselling students after it was made aware of the production and circulation of video content including images of about 50 students.

“Bacchus Marsh Grammar is taking this matter very seriously and has contacted Victoria Police,” acting principal Kevin Richardson said in a statement.

Victoria Police officers arrested a teenager over the explicit images circulated online; he was released pending further inquiries.

The mother of one 16-year-old student learned of the images on Saturday and picked up her daughter from a sleepover.

“She was very upset, and she was throwing up and it was incredibly graphic,” Emily – who did not provide her surname – told ABC Radio Melbourne on Wednesday.

“She was cropped out but there’s just that feeling of ‘Will it come up, will this happen again?'”

Health Minister Mary-Anne Thomas described the incident as “absolutely abhorrent” and said it was a wake-up call to families and the community to have direct conversations with young people about respect.

“Quite clearly, young people are accessing material on the internet, and through social media, that is influencing their behaviours in ways that … (are) actually causing real harm to other young people,” she said.


2 Responses to “Student AI deepfake images reflective of porn crisis”

cbrapsycho says: 12 June 2024 at 11:44 am

If the leaders in our society (such as politicians, sports people, CEOs etc.) demonstrated respect to other people including colleagues, women and children, rather than abuse and disrespect, we might expect more of our kids. Too often the kids are just copying their role models.

More immediate and more public sanctions for all leaders would be good. Some sports quickly remove offenders, whilst political and business leaders often stay to continue the abuse until they’re forced out. This is the example set for our kids.

Curious Canberran says: 12 June 2024 at 4:44 pm

Obviously disturbing.
I hope the material was not created on the school’s premises using its resources (PCs, school AI accounts, etc.), nor disseminated using the school’s network or infrastructure – otherwise the school would bear partial responsibility.
If not, then it really doesn’t have much to do with the school, and I hope we don’t see the school used as some sort of scapegoat for actions outside its control.
