A new issue has recently been in the news: so-called deep-fakes. These are video clips in which a celebrity’s face is merged onto another video, usually adult footage. Images and videos like these have been around for a while, but with new apps such as FakeApp, the process has become streamlined and straightforward enough that anyone can attempt it with just a photo set, an adult video, and a few days of rendering. Previously, creating them required a high level of technical knowledge and a powerful computer setup; these new apps instead use A.I. to produce extremely realistic deep-fake videos.
Most of the fake videos involve actresses or singers: Game of Thrones actresses such as Emilia Clarke, movie stars such as Emma Watson and Gal Gadot, and current pop stars such as Ariana Grande.
But people have also been making these fake videos of popular Korean celebrities, such as AOA’s Seolhyun and Blackpink’s Rose. It seems no one is safe from this new issue, and it has only just started. As more people discover the app and make their own videos, there will most likely be fakes of every female celebrity.
As a legal issue, not much can be done right now, because the videos are being made for personal use rather than to damage an individual’s image. They aren’t being used commercially, and the bodies aren’t the celebrities’ real bodies, so the celebrities have no privacy right to them; the videos are essentially composites and so fall under derivative works.
However, the backlash is already strong, with many media outlets lambasting the fake videos and their creators. Some have commented that this might be a push toward worldwide Internet censorship, because there is otherwise no way to control this type of thing.
Groups sharing these videos have already expanded greatly, such as the sub-reddit http://www.reddit.com/r/deepfakes . However, many websites are responding quickly and banning these clips, so it’s possible these groups will be pushed into invite-only areas.
Obviously, being the target of one of these videos would be very disturbing, and the next step beyond celebrities is even more disturbing: people making these videos of people they know in real life. With access to just someone’s Instagram or Facebook, they could create a fake porn video of a coworker or classmate. This opens up a whole host of problems that will be difficult to solve, not to mention the implications for fake news, such as fabricated sex tapes or videos made to ruin someone’s reputation.
We’re interested to know what our readers think of these fake videos. Are they utterly horrible and something you’d never view, or are they harmless because it’s just a face and not totally real? Do you see them as something similar to the celebrity nude-photo leak, “the Fappening”, or is it less of an issue because they are just fakes that experts can easily prove to be fakes? Whatever you think, the issue is going to get bigger, and you’ll be hearing more about deep-fakes soon.