Woman outraged after her photos were used to make AI-generated porn

A mother-of-one has revealed how she has become a victim of AI-deepfake porn - technology that's advancing faster than laws designed to protect victims.

Sharing her story Thursday, Alyssa Rosa, 29, revealed she learned of the pornographic images bearing her likeness after a woman contacted her on social media, saying she had found the artificially generated media on her boyfriend's phone.

Recognizing it was fake but noticing the unmistakable likeness of a real person, she tracked down Rosa to tell her of the infringement.

'I was mad,' Rosa, who lives in a small town in the southern part of the state with her son, told ABC-6 Action News while showing some of the messages sent by the suspected perpetrator.
'That kind of content never existed of me before. And now it does. And it's completely without my consent.'

Rosa went on to detail how she turned that anger into motivation to find the person responsible.

She soon learned the images were likely being created by a man she'd befriended on a social dating app, who went on to mine public parts of her social media profiles, she said.

'He would comment on my photos, like, saying, "thank you," "you're so beautiful,"' Rosa recalled. '[He’d] comment on photos with my son, saying, like, you know, "He’s so handsome."'

The horror, however, did not end there, as she laid bare in the interview.

'He said, "Isn't she gorgeous?",' Rosa said, reading texts from the suspected perpetrator that she obtained from the woman's boyfriend's phone.

'"I tried meeting up with her but she's dry (slang for not interested),"' the person, whom Rosa did not name, went on. '"So many hot pics on her page."'

'I wanted to know who was making them,' she told the Philadelphia news station of how she came to the realization.

'Like, obviously - what was the source? What was the reason?'

It became clear, she said, that the man responsible was the person she had met on the dating app, after she confirmed he was social media friends with a friend of the Good Samaritan's now ex-boyfriend.

She realized that, despite not being friends with her on social media, the suspected culprit had been creating the sexually explicit images from several of her real photos.

The alleged smut-peddler's scheme, fueled by easy access to her public Facebook and Instagram photos, was uncovered further when the woman who reached out agreed to share portions of the photos and messages that had been sent without Rosa's knowledge.

'I've already made clips of her,' one message from the alleged culprit read. 'What I would do to that.'

'Damn Tattoos and all. Where She lives?' read one reply from someone to whom the person had sent the artificial porn.

The alleged perpetrator, horrifyingly, responded: 'Near me, thankfully. She's a single mom but you know what that means.'

Disturbed beyond belief, Rosa spoke about how the plight remains ongoing, as a New Jersey law that would prohibit deepfake pornography and impose criminal and civil penalties for non-consensual disclosure, introduced in 2023, has yet to be approved.

Meanwhile, a new law doing the same in Pennsylvania will go into effect later this month, as some states continue, albeit slowly, to make the sort of AI-generated sexually explicit material that's plaguing Rosa illegal.

On the federal level, the practice is still not barred - though House Bill 125, which passed this year, will prohibit AI from being used to generate child sexual abuse images.

Moreover, just last week, the Senate passed the Take It Down Act - another federal measure that would force social media companies to remove sexually exploitative images such as deepfakes within 48 hours of being notified by a victim.

As it stands, the bill still needs to pass in the House of Representatives, which is now dominated by Republicans.

Rosa told ABC-6 she's still worried since she hasn't seen most of the content that is allegedly out there, being peddled by what appears to be the same person.

'One thing that really stuck with me in the screenshot she sent me is that he said, "I made so many clips of what that b***h would do,"' she said, reading off another disturbing text sent by the person. 'Like, it’s disgusting. Like, how dare you?'

US Representative Madeleine Dean of Pennsylvania, meanwhile, also sat down with the station, speaking about how she introduced the bipartisan No Fakes Act to protect people like Rosa from being victimized.

'We have to put guardrails in place,' she said. '[The act] gives a property right to you and to me—to our voice and likeness.'

In the interim, the rate at which deepfakes are circulating online continues to increase. Dean said there need to be concrete laws to punish those responsible, whether they be those creating the media or those spreading it.

'The Shield Act creates a new criminal offense for someone who knowingly mails or distributes intimate visual depictions,' added Dean.
'AI is moving so fast; sometimes for very good outcomes and sometimes for very tragic.'

Recent high-profile cases, like one earlier this year involving pop star Taylor Swift that saw AI-generated pornographic pictures of the singer distributed on X, have brought some attention to the phenomenon made possible by advancements in AI.

Rosa, as a result, said she feels violated and without a legal leg to stand on as content depicting her continues to spread.

She told the station she simply hopes the person responsible destroys the content, and that laws are soon put into place to protect other victims.

'There should be laws to protect you,' she said, at one point holding back tears. 'That's just way too much power for someone to have access to my likeness and do whatever they want.'

DailyMail.com has reached out to Rosa for comment and more information about her ordeal.