UCC tool aims to reduce harm from explicit AI images


A “world-first tool” aimed at reducing the kinds of harmful engagement with explicit AI-generated images seen in the Grok controversy has been unveiled by researchers at University College Cork (UCC).

The free 10-minute intervention, dubbed ‘Deepfakes/Real Harms’, is designed to reduce users’ willingness to engage with harmful uses of deepfake technology, such as the creation of non-consensual explicit content.

“There is a tendency to anthropomorphise AI technology — blaming Grok for creating explicit images and even running headlines claiming Grok ‘apologised’ afterwards,” lead researcher John Twomey, from the UCC School of Applied Psychology, said. 

“But human users are the ones deciding to harass and defame people in this manner. 

"Our findings suggest that educating individuals about the harms of AI identity manipulation can help to stop this problem at source.”

The issue has come to the fore on a large scale in recent weeks, after the Elon Musk-owned X social media site began to comply with the requests of users who asked its AI tool Grok to manipulate photos of women and children, undressing them, putting them in bikinis, or placing them in sexually suggestive poses.


The UCC researchers said, against this backdrop, educating internet users to not engage with such AI-generated sexual exploitation must be a part of the response.

They found people’s engagement with non-consensual deepfake imagery was associated with the belief in several myths about deepfakes.

These included the beliefs that such images are only harmful if viewers think they are real, and that public figures are legitimate targets for the creation of such images.

Their 10-minute intervention focuses on encouraging what they call “reflection and empathy with victims of AI imagery abuse”, and they said it significantly reduced belief in common deepfake myths.

It also lowered users’ intention to engage with such harmful uses of deepfake technology, according to the researchers.

The research project’s principal investigator, Gillian Murphy, said referring to such imagery as “deepfake pornography” was deeply misleading, as pornography generally refers to an industry where participation is consensual.

“What we are seeing is the creation and circulation of non-consensual synthetic intimate imagery, and that distinction matters because it captures the real and lasting harm experienced by victims of all ages around the world,” she said.

“This toolkit does not relieve platforms and regulators of their responsibilities in tackling this abuse, but we believe it can be part of a multi-pronged approach.”



© Examiner Echo Group Limited
