Fury has erupted over deepfake porn images of Taylor Swift that have circulated widely on social media in recent days – but the problem is also traumatising scores of real life women in the UK.
New AI technology has made it easier for misogynistic trolls to steal photos of real women and transplant their features – such as their face – onto pornographic footage, which is then shared online without their consent.
The boom in deepfake porn is widely recognised as a growing problem, but slow progress in formulating new laws to tackle it means victims are often left without any legal recourse.
Researcher Kate Isaacs was scrolling through X when a video popped up in her notifications. When she clicked play, she realised the footage showed her own face superimposed onto the body of a woman in the middle of a sex act.
‘I remember feeling hot, having this wave come over me. My stomach dropped. I couldn’t even think straight. I was going, “Where was this? Has someone filmed this without me knowing? Why can’t I remember this? Who is this man?”’ she told the Mail.
Researcher Kate Isaacs was scrolling through X when a video popped up on her notifications that showed her in a deepfake porn video
Nassia Matsa, a tech writer and model, was travelling on the Tube when she noticed an advert for an insurance company that had used her face without permission
‘It was so convincing, it even took me a few minutes to realise that it wasn’t me. Anyone who knew me would think the same. It was devastating. I felt violated, and it was out there for everyone to see.’
The 30-year-old, who founded the #NotYourPorn campaign, never found out who made the offending porn video of her, but believes she was specifically targeted because she had spoken out about the rise of ‘non-consensual porn’ in the past.