AI ‘deepnude’ apps infiltrating Australian schools



“See someone naked for free,” reads the website's tagline.

“Just color over the clothes, set the age and body type, and get a deepnude in seconds.”

More than 100,000 people, including Australians, use the “Undress AI” website every day, according to its parent company. Users upload a photo, choose from image settings like 'nude', 'BDSM' or 'sex', and from age options like 'subtract five', which uses AI to make the subject look five years younger.

The result is an automatically generated “deepnude” image for free in less than 30 seconds.

Undress AI is currently legal in Australia, as are dozens of similar apps. But many lack adequate controls to prevent them from being used to generate images of children.


There is evidence that pedophiles are using these apps to create and share child sexual abuse material, and the tools are also finding their way into schools, including Bacchus Marsh Grammar in Melbourne, where a teenage student was arrested earlier this month.

The use of technology to create realistic fake pornographic images, including of children, is not new. Perpetrators have been able to use image editing software such as Photoshop to paste a child's face onto the body of a porn actor.

What's new is that what once required hours of manual work, along with a desktop computer and some technical proficiency, can now be done in seconds thanks to the power and efficiency of AI.

These apps, readily available to Australian users through a quick Google search, make it easy for anyone to create a nude image of a child without their knowledge or consent. And they're growing in popularity: Web traffic analytics company Similarweb has found that they receive more than 20 million visitors each month worldwide.

Deepnude apps like Undress AI are trained on real images and data pulled from the internet.

The tools can be used for legitimate purposes – the fashion and entertainment industries can use them in place of human models, for example – but Australian regulators and educators are increasingly concerned about their use in the wrong hands, especially when children are involved.

Undress AI's parent company did not respond to requests for an interview.

Schools and families, as well as governments and regulators, are grappling with the dark underbelly of new AI technologies.

Julie Inman Grant, Australia's eSafety Commissioner, is the person responsible for keeping all Australians, including children, safe from harm online.


eSafety Commissioner Julie Inman Grant during a Senate Estimates hearing at Parliament House in Canberra. Credit: The Sydney Morning Herald

If Inman Grant is successful, tools like Undress AI that fail to adequately prevent the production of child sexual abuse material will be taken offline, or “deplatformed”.

This month her office released new standards that will, in part, specifically address websites that can be used to generate child sexual abuse material. They are expected to come into effect in six months, after a 15-sitting-day disallowance period in parliament.

“The rapid acceleration and proliferation of these really powerful AI technologies is quite amazing. You don't need thousands of images of the person or huge amounts of computing power… You can just collect images from social media and tell the app an age and body type, and it spits out an image in seconds,” Inman Grant said.

“I think that's just the tip of the iceberg, given how powerful these apps are and how accessible they are. And I don't think any of us could have predicted how quickly they've proliferated.

“There are literally thousands of such applications.”

Inman Grant said image-based abuse, including fake AI-generated nudes, was routinely reported to her office. She said about 85 percent of the intimate images and videos that were reported were successfully removed.

“All levels of government are taking this seriously, and there will be repercussions for the platforms and for the people who generate this material.”

“I almost threw up when I saw it”


The problem became a grim reality for students and their parents in June, when a teenager from Bacchus Marsh Grammar was arrested for creating nude images of around 50 of his classmates using an AI tool and then circulating them via Instagram and Snapchat.

Emily, a parent of one of the students at the school, is a trauma therapist and told ABC Radio she saw the photos when she picked up her 16-year-old daughter from a sleepover.

She had a bucket in the car for her daughter, who was “sick to her stomach” on the way home.

“She was really upset and throwing up. It was incredibly graphic,” Emily said.

“I mean, they're kids… The photos were mutilated, and so graphic. I almost threw up when I saw it.

“Fifty girls is a lot. It's really disturbing.”

Bacchus Marsh Grammar made headlines over the pornographic images, but campaigner Melinda Tankard Reist says the problem is widespread.


According to Emily, the victims' Instagram accounts were set to private, but that didn't stop the perpetrator from generating the nude images.

“There's a sense of … is this going to happen again? It's very traumatizing. How can we assure them that once the measures are put in place, it won't happen again?”

A Victoria Police spokeswoman said no charges had yet been laid and an investigation was ongoing.

Activist Melinda Tankard Reist leads Collective Shout, a campaign group fighting the exploitation of women and girls. She has been in contact with parents at Bacchus Marsh Grammar about the incident.

Tankard Reist said girls in schools across the country were being traumatized as a result of boys “turning into self-styled porn producers.”

“We use the term deepfakes, but I think that disguises the fact that this is a real girl whose face has been lifted from her social media profiles and superimposed on a naked body,” she said. “And you don't have to go to the dark web or some kind of secret place, it's all out there in the mainstream.

“I'm in schools all the time, all over the country, and some schools have gotten the media spotlight, but this happens everywhere.”

The Bacchus Marsh Grammar incident came after another Victorian student, from Melbourne's Salesian College, was expelled after he used artificial intelligence software to create “deepnudes” of one of his female teachers.

A loophole in the law

In Australia, the law is catching up with the issue.

Until now, laws specifically targeting AI deepfake pornography have existed only in Victoria, where using AI to generate and distribute sexualized deepfakes became illegal in 2022.


Attorney-General Mark Dreyfus has said new legislation will apply to sexual material depicting adults, with child abuse material already covered in the criminal code. Credit: Alex Ellinghausen

This month, the federal government introduced legislation to ban the creation and sharing of deepfake pornography, which is currently being debated in parliament. Offenders will face up to six years in prison for transmitting sexually explicit material without consent, and a further year if they also created the deepfake.

According to Attorney-General Mark Dreyfus, the legislation will apply to sexual material depicting adults, with child abuse material already dealt with under Australia's criminal code. AI-generated images are already illegal if they depict a person under 18 in a sexualized manner, he said.

“Overwhelmingly, women and girls are the targets of this offensive and degrading behavior. And it's a growing concern, with new and emerging technologies making it easier for abuse like this to happen,” Dreyfus said.

“We brought this legislation to parliament to address a gap in the law. Existing criminal offenses do not adequately cover cases where fake adult sexual material is shared online without consent.”

The federal government has also introduced an independent review of the Online Safety Act to ensure it is fit for purpose.

Noelle Martin is a lawyer and researcher, and at the age of 18 she was the target of sexual predators who made and shared pornographic images of her without her consent.


Noelle Martin is a lawyer, but she was once the victim of deepfakes created without her consent. Credit: Tony McDonough

For Martin, the younger a victim-survivor, the worse the damage.

“The harm to victim-survivors of manufactured intimate material is as serious as if the intimate material were real, and the consequences of both can be lethal,” she said.

“Especially for teenage girls, experiencing this form of abuse can make it harder to navigate daily life, school and enter the workforce.”

Such abuse “could deprive victims of reaching their full potential and potentially derail their hopes and dreams,” Martin said.

Martin wants to hold all parties in the deepfake pipeline accountable for facilitating the abuse, including social media sites that advertise deepfake providers, Google and other search engines that drive traffic to them, and credit card providers that facilitate their financial transactions.

“Ultimately, laws are only one part of dealing with this problem,” she said. “We also need better education in schools to prevent such abuse, specialist victim support services and robust means of removing such material once it has been distributed online.

“But countries, governments, law enforcement, regulators and digital platforms will need to cooperate and coordinate to tackle this problem. If they don't, this problem will only get worse.”




