Noelle and Adam

Survivor Stories
Episode 3 | May 13, 2025 | 01:10:34

Hosted By

Mariska Hargitay and Debbie Millman

Show Notes

In the final episode of this season of Survivor Stories, we explore the devastating impact of image-based abuse, from AI-generated deepfakes to the distribution of nonconsensual intimate imagery. Meet Noelle Martin, who at just 18 discovered manipulated images of herself online that derailed her law career and led her to become a powerful advocate for change. Then hear from Adam Dodge, founder of EndTAB, who explains the evolving landscape of technology-enabled abuse and the progress being made to combat it. Together, they reveal how digital violations can be as traumatic as physical violence, plus what's being done to hold abusers and platforms accountable and change the culture at the root of the problem.


Episode Transcript

[00:00:04] Mariska Hargitay: Welcome. I'm Mariska Hargitay, and this is Survivor Stories, a podcast of the Joyful Heart Foundation, which I started more than 20 years ago to transform society's response to sexual assault, domestic violence, and child abuse. Thank you for joining us today. Survivor Stories introduces you to some of the remarkable people that we at the Foundation have met over the years. Through interviews with host and longtime board member Debbie Millman, you will hear from these people, some of whom have survived profoundly difficult, painful experiences of violence and gone on to change the world. I hope you find their stories as inspiring and enlightening and motivating as I do. [00:00:56] Debbie Millman: Please note that this episode contains depictions of sexual abuse and other violence. Please take care of yourself and go to our website, JoyfulHeartFoundation.org, to find resources and hotlines if needed. [00:01:15] Adam Dodge: It is extremely disempowering to have somebody hijack your body, sexualize it, and share it all over the internet. I think the reason that the trauma runs so deep with synthetic images like deepfakes and undressing apps is that it violates our bodily autonomy. [00:01:36] Noelle Martin: This was happening to so many people, but not many people had been speaking out because it exposes you to more harm, because you're speaking about the very thing that's out there and you don't want people to see. [00:01:51] Debbie Millman: Welcome to Survivor Stories, the podcast of the Joyful Heart Foundation, where we meet the people who are helping reshape society's approach to domestic violence, sexual assault, and child abuse. I'm your host, Debbie Millman, a longtime board member of Joyful Heart, working alongside our courageous founder, Mariska Hargitay. Today, we're exploring image-based abuse. While that term may be unfamiliar to you, you've probably heard of revenge porn and deepfakes. Those are both forms of image-based abuse involving intimate images that are distributed without the consent of the subject of the image. And in the case of deepfakes, it involves a manipulated image generated with the help of artificial intelligence, photo editing software, or other technology. Technology has changed virtually every aspect of our lives, so I suppose it's inevitable that it would change how perpetrators and abusers harm their victims. But the impact can be as devastating and life-shattering as more traditional forms of violence. First, we'll hear from Noelle Martin, a survivor of image-based abuse who is now a fearless advocate. We will be talking to her about her lived experience and her advocacy work. Then we'll hear from Adam Dodge, a lawyer and founder of EndTAB, an organization dedicated to ending technology-enabled abuse. [00:03:26] Debbie Millman: I'll be talking with Adam about the many forms image-based abuse takes and the progress made to date to fight it. Noelle, thank you for joining us today. Your journey to becoming an advocate started with your own personal experience, being a survivor of image-based abuse. You were 18 years old and happened to do a simple Google search of your name. That's when you discovered images of yourself that you didn't even know existed. The reason you'd never seen them before is because they were fake. These images were manufactured by taking real photographs of your face from your social media accounts and pasting them onto the bodies of adult film stars. 
You've described the experience as completely horrifying, dehumanizing and degrading. Noelle, this is such a profound violation and what I can only imagine was a tremendous, confusing, life-changing shock. Can you talk a little bit about what happened, where you were at that time in your life, and what it was like seeing these deepfake, pornographic images in order to help our listeners understand the painful, real-life impact image-based abuse can have? [00:04:54] Noelle Martin: Wow. Yeah. Thank you so much for that introduction. It has been an 11-year journey, and it started off when I was 18 years old and I did a simple Google search, but my experiences actually started before I actually found out I was being targeted. For a long time, they had been posting images, taking images of me, targeting me before I ever found out what was happening. And as I look back now, after 11 years, I really didn't stand a chance with this abuse. I had almost like my life derailed before I ever had a chance to understand and control it in many ways. But I was 18 when I found out, and I had been studying at university at the time. I was doing a law degree, and that's always what I wanted to do, and I was in Sydney at my uni accommodation and it was just late at night and out of pure curiosity I did that search and it was every emotion that you could think of. I mean, it was shock and horror and I didn't know what was happening. I didn't know why it was happening. I didn't know the term that we would use to describe it now. There were so many thoughts and feelings going through my mind. I didn't fully understand what it was going to mean and how much this would impact my life over the years. And it was like every day or every week and every year since, you learn more and more about not only what this abuse does, but also, and more importantly, what the societal, legal, cultural attitudes towards this abuse do to a person who experiences it. That's almost like the hardest part is not only dealing with it, but dealing with everyone else's views of it. [00:06:55] Debbie Millman: Noelle, do you know why you were targeted in the first place? [00:06:59] Noelle Martin: Well, I don't know exactly, but I do have my theories, and I believe that in the beginning they had targeted me because they fetishized me and my body type. A lot of the sites that had my images to begin with were for busty women and almost like amateur, like everyday people. And then over time, it became more and more mainstream. But when I started speaking out and fighting for law reform, it was almost as if the motivation changed and they targeted me, and they have continued to target me over the years because I was fighting back. And they almost increased and escalated the abuse to silence me and to intimidate me because I, I just, I didn't just take it. [00:07:57] Debbie Millman: Now you say "they." Are you aware of this being multiple sources? [00:08:03] Noelle Martin: I do believe it is more than one person. I think that it is a couple of main people. I don't know who they are and I don't believe it's anyone that I know personally. I think that it is a couple of people who are the main perpetrators, and then it's all the other secondary people who are like consuming it and then sharing it and then commenting. They're also perpetrators too. But the main people who are responsible for editing and manipulating the content, I think it's a couple of people, and that's why I refer to them as "they." 
[00:08:37] Debbie Millman: Did this impact your studies in terms of how people were treating you in school and in your life? [00:08:44] Noelle Martin: As someone who was studying at the time that this was happening, it impacted it because I was already studying law and that's all that I wanted to do. And this is the kind of field that is all about, you know, for better or worse, your name and your image and your reputation. And so I was studying, and I would have to go down these rabbit holes in my university breaks, or in the time that I had to study, trying to take down this content. So I was trying to, like, live my life and learn and educate myself for a career. But I was having to deal with these anonymous perpetrators, just trying to destroy my ability to self-determine as a young person. And in terms of other people knowing about it throughout my studies, luckily, and I say luckily, like, you know, probably not the best word, but luckily when it was happening in the beginning, it hadn't reached a point where that was like all you could see on Google. It took a bit of time for it to be something that you would see immediately. You'd have to search it through Google and know what to search for to find it. So it hadn't blown up at that time. One part of the reason why I spoke out was I wanted to get ahead of the narrative, to reclaim my narrative before it took on a life of its own. [00:10:07] Debbie Millman: And how were you able, if at all, to take some of it down or insist that it be taken down from the sites it was on? [00:10:17] Noelle Martin: So that is a whole process in itself, the takedown process. When I started that process, it was not even an option really. Realistically, I could never win that battle because by the time I found out, realistically it was already too late. Like the content had spread already. Before I found out it had already been on these sites, and even if I had somehow managed to take it down, I could never guarantee that it hadn't been saved on phones or laptops. I could never guarantee that it wouldn't resurface again in another form. It was never-ending. I didn't really understand that. So I went down that process and I contacted all the sites and I would do the reverse search on multiple images of me, and every time I would find new sites and find more content. And it was like looking for your own abuse and seeing more of it, and then reading the comments and realizing the scale, and then feeling so helpless and powerless because you can't do anything about it, and then feeling alone because it wasn't an issue that was talked about in the media. And so you're going down this path and you contact the sites, and then sometimes they respond. Sometimes they don't respond. If they respond, sometimes they'll remove it. And then two weeks later it'll pop up again. Sometimes you needed your ID to prove it was you. But it's ironic because they didn't need my ID to have the images out there in the first place. They needed ID for me to remove it. And then, you know, there was that one example of a perpetrator saying that they'd only take down my content if I sent them nude photos of myself within 24 hours. So, yeah. [00:12:11] Debbie Millman: Oh my gosh. So you're just constantly being retraumatized. [00:12:16] Noelle Martin: Yeah. It just didn't stop. [00:12:18] Debbie Millman: Did this impact your decision to pursue a specific kind of law? [00:12:23] Noelle Martin: Well, in Australia, you almost do it like a general degree. 
But this impacted my ability to really use the years that you have to study and focus on that. And that's money and that's time. And that is time in a field that is competitive. That puts me at a disadvantage. That puts you on a path that other people are on, but I'm being pushed back. Who's going to pay back the time and money I lost when I didn't get to make use of my degree? And not only did it impact that period of time, it impacted my ability to get a job with that degree. Like, I literally couldn't get a job as a lawyer and I'm a lawyer too. I was admitted, but I couldn't get a job in a law firm. It's quite graphic what's out there of me. But even though I fought back, that's still confronting for people. It's still an association that in certain work fields, they don't like. [00:13:30] Debbie Millman: I'm so sorry this happened to you, Noelle. When and how did you first begin your advocacy work? [00:13:38] Noelle Martin: So I first began technically in 2015. I wrote for my university magazine, but then I spoke out the next year nationally. This was pre-MeToo movement. This issue wasn't being talked about like it is today. If women spoke out, they were ripped apart, shamed and blamed like they are today, but back then people weren't really as mobilized on these issues as they are today. I spoke out because there was really nothing else I had. There were no laws that dealt with it. It was happening to me, but it was also happening to so many people, and it didn't seem like other people knew about it or were talking about it. So I had to do what I could. I thank the whole, you know, universe in some ways that I was so idealistic at the time. I thought that if you shared your heart and you were brave, people would respond or people would understand, but that was a serious wake-up call. I poured my heart and my soul out for the world and I was ripped to shreds. Not only was I ripped to shreds, it went all over the world. [00:14:56] Noelle Martin: And I think part of the reason was because this was happening to so many people. But not many people had been speaking out because it exposes you to more harm, because you're speaking about the very thing that's out there and you don't want people to see. So my story went all over the world because it was happening to people, but it was happening behind the scenes, and I almost put a face, a human story, to an issue that was bubbling under the surface. And I also wasn't a celebrity. Like, I was just an ordinary woman. And so, you know, it goes all around the world, and then I'm getting ripped apart. And it was just one of the most horrific experiences. And I think to this day, I'm still traumatized and I still have very little faith in humanity. And it almost like, viscerally hurts me when I see other survivors speak out and to see them get ripped to shreds. It makes me so sad to see it happen, and it happens all the time. [00:16:01] Debbie Millman: Why do you think that is the response from the public, to do that to victims of image-based abuse and almost every other type of sexual violence? [00:16:14] Noelle Martin: The misogyny is so deeply rooted in our society, and this almost hatred of women, and that women must know their place in society and be small and not challenge anything, because by doing that, they'll threaten and disrupt the systems and everything that's worked so far to keep those in power in power, or make those who uphold the system uncomfortable. 
But when it comes to this specific form of abuse, I think that I was particularly ripped apart because it wasn't talked about. And so the harms weren't as established in society like they are for other forms of abuse that happen. We know about physical sexual assaults. We know about family and domestic violence a lot more than we do about this issue. There's an established understanding, I think, about what that does to people. You're dealing with people who don't even understand that this is harmful. People don't even take it seriously because they think that you weren't physically touched, so you couldn't have been hurt or harmed. And so all the ignorance bubbled to the surface. And because I put a face to the issue, they directed it at me, but also they directed it at those that speak out because it challenges them and challenges people on their views on the world and maybe on things that they've participated in, and they don't want to face that. [00:17:44] Debbie Millman: Despite the harm you experienced by speaking up, you still continued to speak up. Where did that sense of courage and resilience come from? [00:17:59] Noelle Martin: So I, I don't know why I'm getting like a little bit emotional, but I, I mean, I think about this often and I do want to say that I can recognize that it took everything and it takes everything to speak out, but it's like you are hanging on by a thread every day. Like it is really not easy to do this all the time. And so I don't want to ever paint a picture of me being this, like, really, you know, courageous person, because behind the scenes I'm like struggling with coping mechanisms. What really I think has sustained me is this unwavering belief and conviction that what the perpetrators are doing is wrong, and that those that are impacted should not be impacted and should not be affected and not be targeted. And they deserve justice. If I didn't have that belief to my core, then I wouldn't have been able to sustain myself through the crap from the public. And then also the responses from those close to you. Because it's not just about the public. It's like family, it's friends, it's university people, because they are also part of the broader world that we live in, and not all of them are going to have the same views about things as you. They'll blame you for having a social media account. They'll blame you for what you wear. They won't understand it. They won't think it's harmful. And so that's also something that you have to navigate, is dealing with sometimes those closest to you not getting it. [00:19:40] Debbie Millman: What does your advocacy work entail now? [00:19:44] Noelle Martin: So now I speak out all over the world, wherever I have the opportunity to speak out. I am also working as a researcher, dealing with what these technologies of human replication are going to mean for us, and how they're going to manifest and how technology is evolving, because it's not just image-based abuse and deepfakes. There's so many different forms of this where people are replicating people's images online. So I'm coupling my activism and my advocacy in the media around the world, but also doing that with my research and my legal background so I can try and bring about some sort of meaningful justice. [00:20:25] Debbie Millman: You have become a warrior for other survivors in Australia and internationally. 
What progress have you seen, if any, that gives you hope that society is or can move in the right direction when it comes to combating image-based sexual abuse? [00:20:47] Noelle Martin: That's a really great question. There are a lot of things that give me hope. I have seen how there has been a shift. So six, seven, eight years ago, when people would speak out about this, they were met with a tidal wave of victim blaming and slut shaming. There weren't that many laws that dealt with this issue. Fast forward, we are seeing a tangible difference in A) awareness, B) understanding of the harms, C) laws that deal with this issue, D) the organizing that's developing with survivors in a way that hasn't been the case for the longest time. People are now more than ever before mobilizing around this issue. And that gives me hope. So even though there's a long way to go and laws constantly need to be reformed, there is movement happening and it's better than what we've seen before. [00:21:50] Debbie Millman: Can you talk about how you are working with lawmakers and policymakers to implement reforms to protect survivors, to regulate content, and to hold those who cause harm accountable? Since every country's laws are different, I'm wondering if you can offer some general lessons or things you've learned that can help advocates and survivors address this problem in the US and around the world? [00:22:20] Noelle Martin: So again, that's a great question. I mean, what I can provide are some just general lessons that we've learned from the Australian experience. So we have laws that deal with this. General laws. It's illegal to distribute non-consensual intimate images and videos. So you've got that as a baseline, but then you've got to deal with issues of how these laws are enforced and applied in practice. You've got law enforcement that don't take it seriously, and you've also got the judiciary that aren't applying the strongest punishments for this. It's under-criminalized, and it's an issue that needs to be dealt with seriously, at least at this stage of the process. And so you've got that in terms of the logistics and the practicalities of dealing with this issue, then you've got other issues to do with the fact that if you know who your perpetrator is and your perpetrator is in the same jurisdiction as you, you're more likely, I would say, to see some sort of justice. If you don't know who's responsible and they're not in the same jurisdiction, good luck, because it is difficult to try and navigate. Then you've got issues to do with the scale of the harm and the abuse, that it's all over the internet, potentially. And you can't monitor, you can't control, and you've got issues to do with the fact that it's individual victims who have to see and find and identify and re-identify and search for their own abusive material to get it removed. That needs to change. [00:23:55] Noelle Martin: That is traumatising for anyone. But then you've got issues to do with how do you take it down, and how do you make sure that it stays down, and how do you hold those that do the distributing accountable? In Australia, we have certain provisions in our laws where, if you're charged with this kind of offence or if you're convicted, then you have to take it down. Again, that's also so narrow and so limited because you're dealing with the World Wide Web, you know? Then you're seeing movements in this space and in policy discussions about removal as like the be-all and end-all solution to this problem. It's not. 
And it frustrates me so much because it is such a superficial response to this issue. You don't actually go after the root causes. You don't go after the perpetrators. You don't go after the tech companies. And then you've got issues with what is the meaningful justice for the victim as well. How are they compensated for the fact that their education has been destroyed, potentially, that they can't get a job? How are they compensated for the fact that their life has been derailed because of this? I mean, I could go on forever because ultimately the laws that we have are limited. They're narrow, they don't really tackle the issue. They're focused in on certain areas that are superficial, but ultimately it's better than nothing. There's always room for improvement. [00:25:27] Debbie Millman: Noelle Martin, thank you for doing such important work. Thank you for being on the front lines of this advocacy and effort to change the rules, change the laws, and ultimately change the world. I don't know how we could do this without you. [00:25:48] Noelle Martin: Thank you. That's very kind. Thank you. [00:25:52] Debbie Millman: Thank you for joining me today on Survivor Stories from the Joyful Heart Foundation. Thank you for sharing your insights and experiences. Your courage in confronting this abuse, image-based sexual abuse and more, is truly, truly inspiring. Thank you. Now we're turning our attention to the legal and technological aspects of image-based sexual abuse with our next guest, Adam Dodge. Adam Dodge is a lawyer and founder of EndTAB, which stands for Ending Tech Enabled Abuse. Adam and his organization have created practical tools to keep people safe in the digital age. I'm going to talk with Adam to better understand the complexities of image-based abuse, to detail how it's rapidly evolving, what's being done to stop it, and what we can do to reduce the chance that we'll be victims of online abuse. Adam, thank you for joining me on Survivor Stories. [00:27:04] Adam Dodge: Oh, thank you so much for having me. [00:27:06] Debbie Millman: Adam, we just heard from Noelle Martin, who shared her personal experience with deepfake images where her face was pasted onto the bodies of adult film stars. That is, sadly, only one form of image-based sexual abuse. What else falls into the category of image-based abuse? [00:27:29] Adam Dodge: So it's very diverse and I think a lot more nuanced than folks realize. And Noelle, just a shout out to Noelle. I love Noelle and she's such a trailblazer in this space and has done such an amazing job shining a light on this issue. But deepfakes are one form of image-based abuse, or image-based sexual abuse, that many folks recognize from headlines. But underneath this AI-generated imagery are many other ways that images are weaponized to harm primarily women and girls online. These include the non-consensual distribution of authentic, intimate images that are distributed without the consent of the person depicted, often known as revenge porn, though that is a term that we really try to stay away from because revenge is not always the motivation, and no victim wants to be associated with porn as part of their trauma. Cyberflashing is another form of image-based sexual abuse where someone sends an unsolicited nude image to an unsuspecting victim. You also have sextortion, where someone, either a current or former intimate partner or an online scammer or criminal, threatens to post an intimate or nude image of the victim unless they do something against their best interest. 
And finally, we get to this new era of synthetic imagery, which includes deepfakes, which are essentially a face-swapping technique using and leveraging neural networks and artificial intelligence to move a victim into a video that depicts them engaging in sex acts, or undressing apps, which are essentially perfect Photoshop that can render the victim from a photograph in the nude. I know that's a lot, but that is, I think, important to mention because a lot of folks just see the tip of the iceberg, but there's so much more underneath. [00:29:35] Debbie Millman: Oh, absolutely. I was looking on Instagram the other day, and someone had posted a rather silly video of Angelina Jolie dancing in a very sort of silly, clownish way, and it was obvious that it wasn't her, but not quite obvious enough that it wasn't really her. And I couldn't help but wonder what else can be done in this space. And what you're sharing with me about the depth at which this has grown is really terrifying. Is there anything that can be done to prevent someone from having their images shared in this non-consensual way? [00:30:20] Adam Dodge: That's a tricky question because that focuses on victims, right? Can victims stop this from happening? And I always try to move the conversation away from what victims should do and focus on the behavior of the person harming them, the platforms that are hosting this content, our institutions that are designed to keep us safe, like our legal and judicial and law enforcement institutions. Because in the face of this form of abuse, it's very natural for people to default to what can victims do to stop this? And there are things that they can do if they want that kind of guidance. So when people ask me, hey, I'm working with this victim who has had an intimate image created with AI or an authentic image shared without their consent, can you talk to them? The first thing I say is, how can I be helpful to you? What do you need? What would make you feel safe? And if they say, I want to prevent this from happening to me again, then I can give them advice with respect to what I call synthetic image-based abuse. So deepfakes or undressing apps. There is no technological solution to prevent this from happening because people are taking non-explicit photos and manipulating them. Frankly, technology is what has created this problem, so I'm not real keen on looking to them to be our saviour for a problem they created in the first place. When I think about prevention, I will often default to outreach and awareness and education, because the challenge that we're having with, for example, these undressing apps that are showing up in middle schools and high schools is that first contact with this technology is coming from algorithms on social media that are delivering targeted ads to young people, offering them these apps that will undress anybody in a photo and giving them promotional free uses. [00:32:14] Adam Dodge: And first contact really needs to come from trusted sources like parents and teachers and educators and community educators who can explain that these undressing apps, these deepfake apps, are not funny, they're not harmless, but in fact are a form of sexual violence. And every time you use one on somebody else, it's always a female student. It's not even possible in most cases to create these images of male-identifying individuals, because the AI is not trained to work on men's bodies. So it is truly violence-against-women technology. 
And so our voices need to be the first voices that young people hear about this technology so that when they encounter it when we're not there, hopefully they can make an informed decision about whether to use it or not. And look, their brains are still developing. They're going to make bad decisions, but I think we can at least start to mitigate some of the harm we're seeing by talking about this and being louder than the internet when we do. [00:33:22] Debbie Millman: What initially drew you to the field of fighting and preventing image-based sexual abuse and technology-enabled violence? [00:33:32] Adam Dodge: It's a great question. I am a little bit of an outlier. We're on a podcast, so you can't see me, but I'm a white cis male, and you don't see a lot of people that look like me in the gender-based violence prevention and response space. But through a series of different career choices, I found myself repeatedly pulled to work in the field of gender-based violence, and I'm so glad that I did, because it's been the most rewarding thing I've ever done in my life. I don't know why I'm drawn to this work. I'm not a survivor. I'm not part of a victim class. But I am so fulfilled by this work. And in doing this work, working with victims, helping them live lives free of abuse, what became really apparent was that technology was ubiquitous in these cases, and what was equally apparent is that neither I nor my staff knew what to do about it. And when we went out to seek training and resources for non-tech experts who work with victims on how to deal with tech-facilitated violence, it just didn't exist. And so I set about creating what I needed in that moment, and it snowballed from there. Other people got wind of what I was doing and started asking me to train, and that's how I got involved in this. And I lean very heavily into the image-based violence because it is the preferred weapon in the digital age. And so we have to focus on it if we want to address and prevent modern gender-based violence. [00:35:11] Debbie Millman: This type of gender-based violence and image-based sexual abuse is something that was brought to the attention of the Joyful Heart Board by our new executive director, Robyn Mazur, and we have been learning so much about this now. How big a problem is image-based sexual abuse, and is there a way to quantify its scope and how it's been growing? [00:35:40] Adam Dodge: I would say there is and there isn't a way to quantify it. I think there's a lot happening that is going unreported. But I can give you some examples of the order of magnitude of this problem. So with cyberflashing, for example, which is called, well, I'm not going to say the word, but it's a little vulgar. You know, in communities they call them d-pics, essentially. So these are male-identifying individuals sending unsolicited nude images below the waist and above the knee to women and girls, and I do a lot of talks in middle schools, high schools and colleges. And in fact, the number one reason I get pulled into middle schools to talk is because of that problem and sharing nudes without consent. And there was a survey done and research done where they interviewed women and girls ages 12 to 19, and asked them if they had received these. And 75% of the respondents had received an unsolicited nude by the time they reached 19. And when I talk about that stat to audiences, especially, I was just at Virginia Tech last week, I asked the audience: do you think 25%, do you think 50%, or do you think 75%? 
Everybody says 75%. [00:36:58] Adam Dodge: This is the worst-kept secret out there. Everybody knows how pervasive this is and it's sexual violence. If somebody engages in indecent exposure in the physical world and is arrested and convicted, they have to register as a sex offender. It is a very serious crime. And yet online it is essentially the cost of being a woman or girl. If you want to exist online, three quarters of you are going to be the victim of sexual violence. So it's normalized sexual violence. And again, it's so critical that we be talking about this because often there's a real disconnect between the people sending these and the people who are receiving them. So that's just one example. I've seen stats that say 1 in 12 women have been the victims of non-consensual distribution of intimate images. It's a really pervasive problem, primarily because the way we engage in intimacy today is often through our devices. And so that means video and that means photo. And it makes perfect sense that people will misuse that content to do what they've always done, which is harm women and girls. So this is sort of the modern landscape of what we've been dealing with for a long time. [00:38:18] Debbie Millman: For the many victims of image-based sexual abuse, the emotional toll can be profoundly traumatic. Can you talk a little bit about the impact on victims? [00:38:32] Adam Dodge: Sure. So I spent a lot of time talking about tech-enabled trauma and how the trauma from tech-facilitated violence is a different animal from offline abuse or physical abuse. And I often use non-consensual distribution of intimate images as the case study or example of how trauma has become amplified today in ways that we've never seen in human history. The example I often use is this idea of image-based sexual abuse, where you're sharing an intimate image without someone's consent, being a modern problem. It's not a modern problem. People have been sharing intimate images of their partners without their consent for a long time. I'm sure in the Renaissance, nude paintings were commissioned with the agreement that they would be kept private and then shown to somebody else without the consent of the person depicted. What is that? That is the non-consensual distribution of intimate images without the consent of the person depicted. But what's happened is technology. The internet, smartphones and social media have amplified this trauma in two really profound ways. One, when an image gets shared online, it becomes public to the world. So now we are taking something where if you physically showed a photo to a group of people without the person's consent, the blast radius was limited to that group of people and they can't take that photo with them. [00:40:03] Adam Dodge: It only exists in their minds. But in the digital age, when you share that photo with somebody digitally or post it online, not only is it public, but it is also permanent, and the victim has to live the rest of their life, the rest of their life, knowing that that photo can show up anytime, anywhere. A good example of this: Jennifer Lawrence and a host of other female celebrities were targeted over ten years ago in a nude photo leak, and their nude photos were non-consensually taken from their accounts and posted online. And she is giving interviews today saying that her trauma will last forever from that incident, because she continues to get alerts that people are sharing and viewing nude photos of her body. 
So that's how I try to get people to understand how deeply harmful image-based sexual abuse is today, compared to 30 years ago or something like that. [00:41:03] Debbie Millman: I remember when this occurred to Jennifer Lawrence and how she was beseeching the public to not look at these images because they would also be participating in that abusive relationship. Recently, when this happened to Taylor Swift, the Swifties became very vocal on social media, demanding that the images be taken down, demanding that people not look at them. And while it's certainly traumatic for any celebrity to have this occur in their lives, there does seem to be public outrage that's growing. When it happens to somebody that's not a celebrity, there's no one to defend them. There's no one to help them, to beseech any group of people to avoid looking at them. In your experience with survivors, what are some of the common challenges that they face now in coping with the aftermath of image-based sexual abuse? [00:42:14] Adam Dodge: How much time do we have? No there's-- [00:42:18] Debbie Millman: As much as you want. [00:42:19] Adam Dodge: There's, there's a lot here. So what's unique? There's a couple of things. And I'm really glad you brought up the Taylor Swift example, because it really did shine a very bright light on this issue. And it was one of the rare circumstances where you really saw this public outcry when a woman is targeted, because what we typically see is a pretty lukewarm response when intimate images of women are shared online or created with AI. If men are targeted, there's more of a response, especially with celebrities. An example of this is when Chris Evans mistakenly posted a full-frontal nude of himself on his Instagram story several years ago, and the response from the internet was just outrage and shaming anybody who was posting or sharing these photos, finding feeds where they were being shared and filling them with photos of Chris Evans and his dog so that people would have a harder time finding the image. And those of us in the field just sort of rolled our eyes because the gender divide when it comes to image-based sexual abuse was so plain. And so I don't think the Taylor Swift incident is a harbinger that there's going to be moral outrage when this happens in the future, but I think it does hopefully set an example. So when I work with victims and survivors, a couple of things that I do to sort of anticipate what I know are unique facets of the trauma when it comes to what I call synthetic image-based sexual abuse. [00:43:56] Adam Dodge: One, I tell them I believe them. I believe you. It's really important to be very clear and intentional with this, because what I found is victims are questioned because the synthetic image looks so real. They hear things like, this is clearly you. If you can't be honest with me, then I can't help you. And if they haven't heard that, some part of them, I think, is worried that someone's going to think that. And then the other thing I think that we can do that is really powerful is validate the harm. There is a bias when it comes to synthetic image-based abuse, that it's not as serious as authentic image-based abuse. This isn't a real photo or video, so it's not as harmful as if an authentic photo or video of you had been shared. And that is a false equivalency, because it is extremely disempowering to have somebody hijack your body, sexualize it, and share it all over the internet. 
The reason there's outrage, I think, and the reason that the trauma runs so deep with synthetic images like deepfakes and undressing apps, is that it violates our bodily autonomy. Historically, we think about bodily autonomy in the physical world, right? We think, oh, if you don't want to give your grandpa a hug when he comes over, you don't have to. [00:45:25] Adam Dodge: And we're practicing bodily autonomy with our kids so that when they get into relationships, when they're older, they will know that just because somebody cares about them doesn't mean that they can touch their body or violate their boundaries. But the bodily autonomy conversation tends to stop with the physical. And I am a big advocate of expanding the idea of bodily autonomy to digital representations of our body. In the physical world, I have more agency over my bodily autonomy. I can stop somebody from physically touching me. You know, there are certain circumstances in assault where that's not possible. But in general, we have a lot more agency. When violations of bodily autonomy go digital, that agency is gone and we can't stop people from manipulating our bodies online and sexualizing our bodies online and violating our boundaries online. And that is deeply, deeply traumatic and harmful and disturbing for folks. And so I think it's important when we have conversations with students and youth about bodily autonomy, we expand the conversation to include their online experiences so that people will be more intentional, hopefully, when asking for nude images, sharing nude images, or viewing nude images. [00:46:55] Debbie Millman: You know, the unique nature of image-based sexual abuse also complicates whatever legal recourse and community support mechanisms there are. And this is a distinct challenge for survivors: as the technology advances and continues to cause harm, it can take some time for the law to catch up. What is the current legal landscape for this issue? Because these photos are causing harm that goes to a person's professional life as well as their personal life. What kind of legal recourse do victims of image-based sexual abuse have? [00:47:40] Adam Dodge: There are options, but I want to be very clear about setting expectations that in practical circumstances, I don't see a lot of follow-through when it comes to either authentic image-based sexual abuse, sharing actual photos, or with deepfakes and undressing apps. Some states have laws. So let's talk about synthetic images and deepfakes and things like that. There are laws. We don't have a federal law yet. There is some bipartisan support for criminal and civil laws that would allow somebody to sue civilly if a synthetic image is created and shared of them online, and then a federal law that would impose criminal penalties. Because I'm a lawyer and because I focus on this area, I field a lot of inbound inquiries of the sort: this state has created this law, what do you think about it? And my response is always the same. Laws are great. We need laws. But creating a law, even the most perfect law to address this problem, is in and of itself insufficient. And to fully address this issue through the criminal legal system, I think it's helpful to think about this problem as a stool with three legs. One leg is the law, and it's important to have that. The other leg is enforcement: is law enforcement properly trained and staffed and resourced to investigate these tech-facilitated crimes? And I can tell you the answer is no. 
And then the third leg of the stool is interjurisdictional cooperation. [00:49:27] Adam Dodge: So if somebody in another state or another country is the one behind this, then we need to be able to cooperate with that other jurisdiction. And that is a fearsome challenge in and of itself, and can be a dead end if the person is in a country where we have no hope of cooperating. You know, if it's in North Korea, then that's a brick wall. So we really need all three of those things so that if a victim in a rural area goes into the local sheriff's department and says someone's creating deepfake pornography of me online, can you find out who it is? Investigate them, arrest them, file charges, and keep me safe? I think instinctively we all know that that's probably not going to happen, especially if the person's in another country and they're doing this anonymously. So we really need all three of these priorities to be addressed, because otherwise really great laws, I fear, set unrealistic expectations for victims. They think, oh my gosh, there's this great law. It's exactly what's happening to me. And then they go to report it and nothing happens. And I'm sure the Joyful Heart Foundation, given what you all have done with the rape kit backlogs, recognizes the limitations that these institutions have. Well, when we're talking about technology, it just adds another barrier to justice that we're hoping to unlock. But as it exists now, it's very challenging. [00:50:58] Debbie Millman: When a person is the victim of any type of sexual violence, there are recourses in terms of going to the police, having a rape kit made. And it doesn't seem, from what I'm hearing, that if a person is a victim of image-based sexual abuse, there's any one specific place that they can go to for any kind of recourse. Is that true? [00:51:33] Adam Dodge: Well, I was speaking about synthetic images. When we're talking about the non-consensual distribution of intimate, authentic images, you know, aka revenge porn, again, the term people recognize but we don't like to use, 48 states and D.C. have criminal laws against this, so you have a much better chance, I think, of employing these criminal and legal and judicial and legislative institutions to protect us. That said, I work with many victims who do not feel satisfied with the response. I mean, California, for example, was one of the first, if not the first state to criminalize the non-consensual distribution of intimate images, or NDII. And it's a misdemeanor, right? [00:52:26] Debbie Millman: A misdemeanor? [00:52:27] Adam Dodge: It's tough to get a new law labeled as a felony. Usually it's a misdemeanor, and then it graduates later through legislative advocacy. Or in Texas they have a cyberflashing law, which is a class C misdemeanor. So these are very low level, and, one, it deserves to be a felony. When you label something like this as a low-level misdemeanor, the message is that it's not as serious. And that is a trash opinion, in my estimation. It's so the opposite. And so what we found is that victims of these crimes learn that their best-case scenario outcome is fines, you know, maybe probation, maybe it gets pled down to something, you know, disturbing someone's peace or something like that. Why would they go through the agonizing and retraumatizing process of a criminal conviction when their best-case scenario is one that is not satisfying? At the end of the day, criminalization is certainly one way to address this. 
[00:53:37] Adam Dodge: I'm more of a proponent of outreach and education, because it's a lot of young folks using this tech and using it without any guidance from parents or teachers or trusted adults. And I think we can pivot to have these conversations earlier and more often in an age-appropriate way that will help disrupt this problem, because it's an upstream problem. How are they getting this information? Like with the undressing apps. These kids are not seeking out undressing apps. They are receiving targeted ads on social media about these undressing apps, which legitimizes them. And there's nobody talking to them about how dangerous they are. So I look at this and think, okay, well, let's make sure that, again, first contact with this information is not from an algorithm, but from a trusted adult or trusted institution. I can't tell you how often I talk to parents of middle schoolers who are given phones with little to no oversight, and it makes perfect sense that these kids are going to get drawn into unsafe situations and encouraged to make bad decisions that are harming people in a way that they don't fully comprehend or realize. [00:54:50] Debbie Millman: I have so many questions about this. [00:54:52] Adam Dodge: Yes. Go. [00:54:53] Debbie Millman: Well, first, if somebody is perpetrating any type of sexual violence and they get arrested and prosecuted and so forth, there is some recourse for a victim. If somebody is distributing those images, even if they are convicted of a misdemeanor, those images are still living online. What about the social media sites and sites like Google? Is there any recourse for being able to control one's own images or manipulated images that are being distributed? [00:55:34] Adam Dodge: It's very challenging. People are fond of saying the internet is forever once something goes up there. And sometimes when I talk to parents, I say, look, once an intimate image is shared, it's never coming down. And they say, well, what if it's posted and we get it taken down in 30 seconds? And I say no, because a hundred people could have downloaded that photo onto their devices and can reshare it whenever they want. You can never be sure of this. And so there's little to no regulation around this. I will say, when it comes to minors, you know, there is more robust regulation and legislation around CSAM. But generally speaking, know that you don't have a lot of options when an intimate image gets shared online. And unfortunately, the burden repeatedly gets shifted to victims. And I'll explain why. One, mainstream platforms, even sites like Pornhub, have the infrastructure for and portals to request that intimate images or videos shared without consent be taken down. But that puts the burden on the victim to find those images and make that request. Then Google says, well, we'll de-index them from Google search so that people can't find them. This is true for authentic and synthetic images. Again, it shifts the burden to the victim. What we would like to see, and what we are actively rallying for with Google, is to just de-index it, just make it unsearchable. You've already shown you're able to do this with child sexual abuse material. Do it with non-consensual synthetic imagery, deepfakes, undressing apps that are posted, things like photos that are posted. Just make it unsearchable. And this will kneecap these deepfake sites because 70% of their traffic comes from Google search. 
[00:57:28] Adam Dodge: If you take that away, they will die on the vine. But it's not happening as of yet, and we can't sue these platforms because they are immunized under a 1990s law called the Communications Decency Act. So Section 230 of that law essentially says that platforms are not liable for third-party content. So the law says unless Instagram is the one posting intimate images of you, you cannot sue Instagram. There are some promising shifts in this legal immunity problem, and we're seeing lawsuits, at the very least, not get thrown out at the outset by approaching this as a product liability problem. So if kids are getting contacted by traffickers on Snapchat, even though all these parental settings are set up, and yet this is continuing to happen, well, that's not a Section 230 issue. That is a faulty product where your product is putting kids at risk. And so we're starting to see people and organizations and nonprofits suing these platforms. Shout out to Carrie Goldberg, who's just this force of nature, a victims' rights attorney in New York. And she sued the platform Omegle, which is basically like a chat roulette where you go and you just randomly get matched with other people, and kids kept getting matched with adults who were exposing themselves or manipulating them. And she sued them, saying this is a faulty product. This is not about the things that people are doing to kids on their platforms. It's the fact that your product allows it to happen and even in some cases incentivizes it. And the platform shut down. So she got the platform shut down, which is just like a euphoric feeling because it almost never happens. [00:59:22] Debbie Millman: Adam, you've mentioned a few times that people are being fed ads for undressing apps algorithmically. That means that the sites that are allowing these ads could be liable. I mean, I'm not a lawyer, so I can't say that with any certainty, but my question is if, say, Instagram allows for an undressing app to be served to people on the social site, couldn't they determine that that ad was something that shouldn't even be on the site at all? [01:00:02] Adam Dodge: It depends on whose side you're on. Right? [01:00:05] Debbie Millman: But how is an undressing app like a freedom of speech situation? How is an undressing app even legal? [01:00:12] Adam Dodge: Well, social media platforms are public spaces. So this free speech argument doesn't usually get us very far. But if I were arguing this, I would say, look, your algorithm is feeding unsafe content to kids through ads, right? And that's a product liability issue. The platforms are likely to say, look, this is a slippery slope. And if you are asking us to review and approve every ad that goes on here, we can't do that. And this is really third-party content and we'll take it down. We're not legally obligated to take it down. We'll take these ads down. But this is a Section 230 issue. We are not creating these ads. We are not posting these ads. We're not responsible for this. [01:00:55] Debbie Millman: But they're getting the profit from the ads, they're getting paid to-- [01:00:58] Adam Dodge: For sure. Yeah. Oh, yeah. [01:01:00] Debbie Millman: And I believe that Apple does actually review all the apps. I can't see any upside to having an undressing app available to people. [01:01:09] Adam Dodge: Yeah. And I want to be clear, I don't think there are undressing apps in Apple's App Store. I don't, I don't think those are in there. 
I think people are going to websites to use these or, you know, downloading them on Android devices outside the Google Play Store, which is a big distinction between Android and Apple. With Apple, the only apps you can get are through the App Store, but that's not true with Android. Or it might just be a website that they go to. So there's a variety of different ways that folks can find this. This is what you get when you allow these companies to self-regulate. They don't put trust and safety at the forefront. Trust and safety is never prioritized over user growth and retention and revenue generation. [01:01:59] Debbie Millman: It sounds, ultimately, like the best solution to prevent image-based sexual abuse is changing the culture, which is a really tall order and generally takes decades, if it's possible at all. Is there anything that individuals can do in the meantime to protect ourselves, aside from never, ever posting a photo of ourselves to the internet? [01:02:28] Adam Dodge: Yeah, absolutely. So there are a variety of things that we can do to protect ourselves. Let's take cyberflashing, for example. Cyberflashing usually takes place when someone sends an unsolicited, intimate image to a person's device, either through a message or through a direct message on social media, or through Bluetooth sharing, either AirDrop on iPhones or Quick Share on Android devices. We have to look at that photo and be exposed to this unwanted nude image, and then delete it or block the person. So what Apple has done is they've created a sensitive content warning feature that just came out in the fall that is not turned on by default, but if you go into your privacy settings, you can turn it on. And if it's turned on, anytime someone AirDrops a nude image or messages a nude image to you, it is blurred in transit, whether that image is sent consensually or without consent, and then the recipient has the option to blur and unblur the photo. So it is one of the rare circumstances where a tech fix, a really simple one, can prevent the lion's share of cyberflashing on iOS or Apple devices. Android's not there yet, but you can adjust your Quick Share settings to turn it off or set it to contacts only. So if someone tries to Bluetooth share a nude that you don't want to see, they will be prevented from doing so. The other thing that I really encourage from a prevention standpoint is what I call safe sexting. This is not a new phenomenon. I am basically pivoting safe sex education to safe sexting education. We don't teach abstinence-only education because we know that doesn't work. We know kids are going to be having sex. [01:04:14] Adam Dodge: We want them to do it safely. We know kids are going to be sending nude images. We want them to do that safely as well. To do that safely, whether it's somebody you just met online or a partner you've been with in the physical world for a long time, any intimate image that is shared should never include identifying information. No faces, no identifying information in the background, no tattoos, no birthmarks, nothing. The reason Jennifer Lawrence's trauma is lasting forever is that she is identifiable in the photo. If it was just a picture of her body and not her face, she would have plausible deniability. She could say, that's not me, right? And people, I think, would lose interest. So if we can do this, we will see a steep drop in sextortion and in the non-consensual distribution of intimate images. 
If they are not identifiable images, the harm is reduced, because it's linking the victim to the photo where the harm is. And the last thing I'll mention is for victims of sextortion, where somebody is threatening to post an intimate image on, let's say, social media: there's a great nonprofit that can be found at the website StopNCII.org, which stands for stop non-consensual intimate imagery, that will create a digital fingerprint of the photo and share that fingerprint, not the photo, so the victim never provides the photo to anyone. That digital fingerprint is then shared with all social media platforms, who monitor their platforms in real time for anybody uploading a photo with that fingerprint and will proactively take it down. So those are three really powerful ways that we can either prevent image-based sexual abuse or engage in harm reduction. And I include them in every presentation that I do. [01:06:06] Debbie Millman: What other resources on these issues are going to be helpful to people? If someone is the victim of image-based sexual abuse, what would be the first thing you would tell someone to do? And where can they find help? [01:06:24] Adam Dodge: Different resources include the Cyber Civil Rights Initiative. So the Cyber Civil Rights Initiative is the preeminent image-based sexual abuse nonprofit in the country. They have a hotline. They have countless resources about the laws in different states and how to use copyright law to -- if a selfie is posted online without consent, the person who took the selfie owns the copyright, so you can execute what's called a DMCA takedown, a Digital Millennium Copyright Act takedown. It's a great one-stop shop to learn about your different options. If you know who the person is who is posting and sharing these images, domestic violence restraining orders or protective orders are a way to force somebody to stop sharing and take those images down. Another great resource that I direct folks to is a nonprofit called Chayn, c-h-a-y-n. They're based in the UK, and they have courses for victims to recover from trauma. And they have a specific course on recovering from image-based sexual abuse. The last thing I will tell folks is to never go through this alone. Because all the resources that are out there tend to place the burden on the victim, I really encourage people to activate their support networks. So if that's requesting takedowns, have your friends help you do it. If it's searching for where a photo might be posted, have your support network do it. If it's sharing these photos with StopNCII.org, have your friends help you. Have your friends help you save the evidence, because it's really important, when something is posted without consent, to take a screenshot and record the URL so that if you have to report it to law enforcement or a judicial institution or Title IX, they know where it was and where it was being hosted. [01:08:21] Adam Dodge: And not only does this help alleviate the burden that is often placed on victims unfairly when it comes to responding to these forms of violence, it also ensures that they don't isolate themselves, because it's very easy to isolate oneself when targeted in this way and just close yourself off. But that's a really unsafe place for victims to be, in isolation. As you said, it's pushing a boulder up a hill. But the more folks we get involved, the more hands we get on that boulder, the quicker we'll get there. 
And that's why I so appreciate what the Joyful Heart Foundation is doing in this field, because we need organizations like this one who have done such a tremendous job, and no one's asked me to plug Joyful Heart. I'm doing this of my own accord. I love this organization, but it's such a great model for how you can address a problem like the rape kit backlog, and have agency and give people hope that if you galvanize people, if you raise awareness, if you push reform, you can have impact. It's not impossible. It's actually very possible. And so the fact that you all are focusing on this issue gets me really excited, because we really need the help. [01:09:30] Debbie Millman: Thank you, Adam. How can people learn more about EndTAB and what you're doing? [01:09:36] Adam Dodge: Sure. So my website is EndTAB.org. You can find me on social media at @AdamRDodge, where I will post about these things. I also have a course for parents which can be found at the TechSavvyParent.com, which is really for parents of K through fifth graders on getting them ready for digital adolescence, talking to them before they start asking for a phone in middle school and an Instagram account, and getting in there earlier in an age-appropriate way so that we're responding, not reacting, when things start happening with our kids in their digital lives. [01:10:10] Debbie Millman: Adam Dodge, thank you so much for doing this important work, and thank you so much for talking with me today on Survivor Stories, the podcast of the Joyful Heart Foundation. I'm your host, Debbie Millman, and I want to thank you for joining us today.
