The Future Will Be Synthesised – E1: Deepfake Image Abuse 


About the author

Thomas Graham

At Metaphysic we are very proud that our own Head of Policy and Partnerships, Henry Ajder, is presenting a BBC Radio 4 and BBC Sounds documentary called “The Future Will Be Synthesised”.

Episode 1: Private Pain - Deepfake Image Abuse 

Henry interviews an amazing cast of expert commentators, including journalist Samantha Cole, activist and law reform campaigner Noelle Martin, and investigative tech reporter Jesselyn Cook.

This episode covers the following important discussion:

What do we want the synthetic future to look like? It’s seeping into our everyday lives, but are we ready? We need a conversation about the legal, policy, and ethical implications for society.

Deepfakes’ murky origins are in a form of sexual image abuse that is being used against hundreds of thousands of people, most of them women. Presenter and synthetic media expert Henry Ajder speaks to journalist Sam Cole, who first reported on deepfakes in 2018. She uncovered a Reddit forum sharing pornographic videos with the faces of famous Hollywood actresses transposed onto the bodies of porn performers. Since then, the technology has become much more accessible, and ordinary women have become the target. Henry interviews a woman who was targeted with deepfake image abuse, and considers what we can do to protect citizens from synthetic media’s malicious uses.

For those of you who can’t listen in, here are some highlights: 

Henry talks to Samantha Cole, Senior Editor of Motherboard at Vice, about her coverage of deepfake porn. You can find her recent coverage of these issues here, here, and here.

Setting the scene on what deepfake porn is, Sam Cole states:

“I guess as a journalist, we have eyeballs in like lots of places online. I’m always kind of monitoring a lot of different subcultures…We saw this one user posting to that forum, these video versions of that. So it was porn videos and then a celebrity’s face on it. It looked pretty good, like for what it was, it was, you know, short clips, like, you know, five or ten seconds and kind of like blurry and grainy.”

Sam Cole continues:

“So it was very big name, female celebrities. It’s also, you know, that’s what people wanted to see. And it was easy access to hours and hours and hours of footage. And then the porn videos were obviously also stolen without the consent of the people in them. These people who made this work had it stolen and then slapped someone else’s face onto their bodies. And they didn’t agree to that either.”  

Henry also talks to Noelle Martin, an activist and law reform campaigner at the University of Western Australia. Noelle has been a victim of deepfake porn and has since become a leading voice in the discussion around regulating the harms of deepfake porn and image abuse.

Noelle Martin states:

“About 10 years ago, I discovered that anonymous people around the world had been taking images of me from social media and doctoring those images into pornographic shots of me – into a whole suite of fake doctored images that they distributed all over these pornographic sites…these were distributed on sites without my consent.”

“For a few years, I would say, I spent trying to take the images down. I would contact the sites, contact the webmasters, and I learned very, very quickly that it was just an impossible task. And it was also quite distressing too, because every time you would try and do that, you would discover new images and you would discover just how much it had spread.”

Noelle Martin continues:

“We have laws in Australia that criminalize this conduct, but those laws are exceptionally limited in what they can actually do. So if somebody were to experience what I had experienced today, despite the fact that we have those laws, they would still be in the same position that I am. The laws are only effective in so far as you have a perpetrator that is known to the victim or a perpetrator that is in the same jurisdiction as the victim.”

“What needs to happen is that countries around the world need to domestically criminalize this. And then ultimately there needs to be major networking and collaboration with law enforcement and governments to actually make sure the people, you know, in the UK, can’t just do this to someone like me or someone in Australia can’t do this to someone in the UK and ruin their lives and then get away with it.”

Henry also talks to Jesselyn Cook, an investigative tech reporter at NBC. She has spoken to many victims and written on the topic. You can find examples of her work here.

Jesselyn Cook states:

“There is almost a cottage industry online where deep fake porn is a business and there are websites where you can request custom deep fakes or you can offer your paid services to create them at a really professional-looking level.”

“The creators and distributors of this content are often anonymous. So there’s really no way to sue them. You don’t know who you’re coming after and the platforms themselves in the US they’re protected by Section 230 of the Communications Decency Act, which shields these online intermediaries from liability.”

Jesselyn Cook continues:

“There was a lot of interest in the US about the potential political chaos that deep fakes could cause. It did spark this big panic ahead of our 2020 election that someone might come along and create a deep fake video of one of the candidates, um, saying something or doing something outrageous right before the vote.”

“And so my interest was kind of the lack of concern for the very real harm that had been inflicted on women and girls for the most part by deep fakes for years already. It was real harm. It was not being discussed. And instead, people were concerned about something that hadn’t even happened yet.”

Metaphysic builds software to create incredible content with the help of artificial intelligence. Co-founded by Chris Umé, the leading creator of ethical hyperreal synthetic media and creator of @deeptomcruise, Metaphysic has been featured in many leading publications around the world – find out more on CNN, 60 Minutes, and Business Insider. Metaphysic’s commitment to the evolution of ethical synthetic media is highlighted by our initiation and sponsorship of Synthetic Futures – a community dedicated to shaping a positive future for the technology in its many different forms. Find out more: www.metaphysic.ai and www.syntheticfutures.org.

For more info please contact info@metaphysic.ai or press@metaphysic.ai.
