‘I was put in deepfake porn by my best friend’

Article information
  • Author, Kate West
  • Role, File on 4, BBC radio program
  • 1 hour ago

“Jodie” found deepfake pornographic images of herself on the internet — and then discovered something even more shocking. In an interview with the BBC’s File on 4 radio program, she describes the moment she realized that one of her best friends was responsible.

WARNING: Contains offensive language and descriptions of sexual violence.

In spring 2021, Jodie (not her real name) received a link to a pornography website from an anonymous email account.

When she clicked, she came across explicit images and a video of what appeared to be her having sex with several men. Jodie’s face had been digitally added to another woman’s body, in what is known as a “deepfake”.

Someone had posted photos of Jodie’s face on a pornography website, saying she made him “very horny”, and asking if other users of the site could make “fake porn” with her. In exchange for the deepfakes, the user offered to share more photos of Jodie and details about her.

Speaking for the first time about the experience, Jodie, who is now in her early 20s, recalls:

“I screamed, I cried, and I frantically scrolled through my phone trying to take in what I was reading and what I was seeing.”

“I knew it could really ruin my life,” she adds.

Jodie forced herself to browse the porn site, and says she felt her “entire world fall apart”.

She had come across a specific image—and noticed something terrible.

A series of disturbing events

It wasn’t the first time Jodie had been targeted.

In fact, it was the culmination of years of anonymous online abuse.

When Jodie was a teenager, she discovered her name and photos were being used on dating apps without her consent.

This went on for years, and she even received a Facebook message from a stranger in 2019 who said he would meet her at Liverpool Street station in London for a date.

She told the stranger that it was not her he had been talking to. She remembers feeling “nervous”, because he knew everything about her and had managed to find her online, tracking her down on Facebook after the “Jodie” on the dating app stopped responding.

In May 2020, during the lockdown imposed by the Covid-19 pandemic in the UK, Jodie was also alerted by a friend to a number of Twitter accounts that were posting photos of her, with captions suggesting she was a sex worker.

“What would you like to do with little teenager Jodie?” read a caption alongside an image of Jodie in a bikini, which had been taken from her private social media account.

The Twitter accounts responsible for posting these images had names such as “slut exposer” and “pervert boss”.

All of the images being used were ones Jodie had shared on her social media with close friends and family, and no one else.

She later discovered that these accounts also posted images of other women she knew from university, as well as from her hometown of Cambridge.

“At that moment, I had a very strong feeling that I was at the center of it, and that that person was trying to hurt me,” she says.

Fighting back

Jodie began contacting the other women in the photos to warn them, including a close friend we’ll call Daisy.

“I just felt bad,” says Daisy.

Together, the friends discovered many other Twitter accounts posting photos of them.

“The more we looked, the worse it got,” says Daisy.

She sent messages to Twitter users, asking where they got the photos. The answer was that the photos were “uploads” from anonymous senders who wanted them shared.

“Is it an ex-boyfriend or someone who is attracted to you?” one user responded.

Daisy and Jodie drew up a list of all the men who followed both of them on social media, and who could have had access to both sets of photos.

The friends concluded that it must be Jodie’s ex-boyfriend. She confronted him, and blocked him.

For a few months, the posts stopped — until someone got in touch via an anonymous email.

“Sorry to remain anonymous,” the email said, “but I saw this guy was posting photos of you on horrible subreddits (communities on the Reddit platform). I know this must be very scary.”

Jodie clicked on the link and was directed to the social network Reddit, where a user had posted photos of Jodie and two of her friends, numbering them: 1, 2 and 3.

Other people online were asked to play a game — which of these women would you have sex with, marry, or kill.

Below the post, 55 people had already commented.

The photos used on the site were recent, and had been posted after Jodie blocked her ex-boyfriend. The friends then realized that they had blamed the wrong person.

Six weeks later, the same anonymous person got in touch again by email – this time, to warn about the deepfakes.

‘The ultimate betrayal’

When compiling the list, Jodie and Daisy ruled out some men they completely trusted, such as family members—and Jodie’s best friend, Alex Woolf.

Jodie and Alex formed a strong bond of friendship when they were teenagers — it was their shared love of classical music that brought the two together.

Photo caption: Alex Woolf (interviewed here on the BBC’s Newsnight program) was one of Jodie’s best friends

Jodie sought comfort from Woolf when she discovered her name and photos were being used on dating apps without her consent.

Woolf graduated in music from Cambridge University and won the BBC Young Composer of the Year award in 2012, as well as appearing on the British program Mastermind in 2021.

“He [Woolf] was very aware of the issues women face, especially on the internet,” says Jodie.

“I really thought he was an ally.”

However, when she looked through the deepfake pornographic images, there was a photo of her in profile with King’s College, Cambridge, in the background.

She clearly remembered when it was taken — and that Woolf was also in the photo. He was also the only person she had shared the image with.

Photo caption: Jodie’s photo (blurred) with Alex, a cropped version of which was uploaded to the porn site

It was Woolf who had offered to share more of Jodie’s original photos in exchange for them being turned into deepfakes.

“He knew the profound impact it was having on my life,” says Jodie. “And yet, he did it.”

‘Utterly ashamed’

In August 2021, 26-year-old Woolf was convicted of taking images of 15 women, including Jodie, from social media and uploading them to pornography websites.

He was sentenced to 20 weeks in prison, supervised for two years, and ordered to pay each of his victims £100 compensation.

Woolf told the BBC that he is “utterly ashamed” of the behavior that led to his conviction, and that he is “deeply sorry” for his actions.

“I think about the suffering I caused every day, and I have no doubt that I will continue to do this for the rest of my life,” he said.

“There are no excuses for what I did, nor can I adequately explain why I acted so despicably based on those impulses at the time.”

Woolf denies having anything to do with the harassment Jodie suffered before the events for which he was accused.

For Jodie, finding out what her friend had done was the “ultimate betrayal and humiliation.”

“I replayed every conversation we had, when he had comforted me, supported me and been kind to me. It was all lies,” she says.

We reached out to X (formerly Twitter) and Reddit about the posts. X did not respond, but a Reddit spokesperson stated: “Non-consensual intimate images (NCIM) have no place on the Reddit platform. The subreddit in question has been banned.” The porn site was also taken offline.

In October 2023, sharing deepfake pornography became a criminal offense in the UK as part of the Online Safety Act.

There are tens of thousands of deepfake videos online. A recent survey found that 98% are pornographic.

Jodie is outraged, however, by the fact that the new law does not criminalize a person who asks others to create a deepfake, which is what Alex Woolf did. It is also not illegal to create a deepfake.

“This is affecting thousands of women, and we need to have the right laws and tools to stop people from doing this,” she concludes.
