

Wed Oct 19, 2022, 09:23 AM Oct 2022

'They said: aren't you that porn star?' The woman hunting down image-based abuse

“Faces of exes,” Mia Landsem read out loud, as she clicked on a link to a forum exposing intimate images of ex-girlfriends, her frowning brow illuminated by a three-screen computer. On the 25-year-old’s neck, underneath wisps of blond hair, are tattooed reminders in Norwegian to be “brave” and “don’t give a fuck.” An internet security expert by day, by night she has made it her mission to hunt down and report such images from her apartment in Oslo. “I try to focus on the worst ones,” she said. “I can maybe get a few groups removed in a day, but then 20 more appear.”

Digital image-based sexual abuse – a catch-all phrase that includes deepfake pornography, so-called “upskirting” and “revenge porn”, a term rejected by activists for implying the victim has done something wrong – is a global problem on the rise. Almost three out of four victims are women, according to a 2019 study by the University of Exeter. But there are male victims and female perpetrators.

Catching digital perpetrators was, in Landsem’s own words, initially a way to stay alive. At 18, she was at a bar in the Norwegian city of Trondheim when she noticed a group of guys sniggering at her. When she asked what was so funny, they said: “Aren’t you that porn star?” Landsem recalled. “I didn’t understand anything. Then they showed me that photo.”

In it, Landsem and her ex-boyfriend were having sex. She was 16 at the time. “I remember running to the toilet of the bar and crying,” Landsem said. Seeing how distressed she was, the men deleted the image. But it was already making the rounds of the city.

Although it is difficult to pin down how widespread digital intimate image-based abuse is, aid organisations in several countries reported that it exploded during the pandemic. “We’re seeing more and more content,” said Sophie Mortimer, manager at the UK Revenge Porn Helpline, whose caseload surged to a record of 3,146 cases in 2020. “We need to act in a global manner,” Mortimer said. “Because that’s how the internet works, it is a global thing.”


This is what brought down Katie Hill. Remember her?
