Fake Adult Videos – Korean Celebrities Targeted

Recently in the news there has been a new issue with what are being called deep-fakes: video clips in which a celebrity's face is merged onto another video, usually an adult video. Fake images and videos like these have been around for a while, but with new apps like FakeApp the process has become streamlined and straightforward enough that anyone can attempt one with just a photo set, an adult video, and a few days of rendering. Previously you would have needed a high level of technical knowledge and powerful computing hardware to create them; these new apps instead use A.I. to produce extremely realistic deep-fake videos.

Most of the fake videos involve actresses or singers: Game of Thrones actresses such as Emilia Clarke, movie stars such as Emma Watson and Gal Gadot, and current pop stars such as Ariana Grande.

But people have also been making these fake videos of popular Korean celebrities, such as AOA's Seolhyun and Blackpink's Rose. It seems no one is safe from this new issue, and it has only just started. Most likely there will eventually be videos of every female celebrity, as more and more people discover the app and make their own.

As a legal issue, not much can be done right now, because the videos are being made for personal use rather than to damage an individual's image. They aren't being used commercially, and the bodies shown aren't the celebrities' real bodies, so the celebrities have no privacy right over them; essentially the videos are composites, and so fall under derivative work.


Blackpink’s Rose


AOA’s Seolhyun

However, the backlash is already strong, with many media outlets lambasting the fake videos and their creators. Some have commented that this could be a push towards worldwide Internet censorship, because there is no other way to control this kind of content.

Groups sharing these videos have already started to expand rapidly, such as the sub-reddit http://www.reddit.com/r/deepfakes . However, many websites are responding quickly and banning these clips, so it's possible these groups will be pushed into invite-only areas.

Obviously, being the target of one of these videos would be very disturbing, and the next step beyond celebrities is even more so: people making these videos of people they know in real life. With access to just someone's Instagram or Facebook, they could create a fake porn video of someone they work with or go to school with. This opens up a whole host of problems that will be difficult to solve, not to mention the implications for fake news, such as fabricated sex tapes made to ruin someone's reputation.

We're interested to know what our readers think of these fake videos. Are they utterly horrible and something you'd never view, or are they harmless because it's just a face and not totally real? Do you see them as something similar to the celebrity nude photo leak, "the Fappening", or as less of an issue because they are just fakes, and can easily be proven to be fakes by experts? Whatever you think, the issue is only going to get bigger, and you'll be hearing more about deep-fakes soon.

  1. Kok Fu SHeng 5th February 2018 Reply

    If this is done to destroy Korean or Asian celebrities, no no.. if American celebrities, serve them right.

    but I am more happy if this can destroy rich people’s lifestyle… dear creators, please do so to elite people… please.

    The elites have been arrogant for too long... e.g. Apple CEO.

  2. thill321 6th February 2018 Reply

    If this is done for personal entertainment without flaming comparisons/comments, I believe, most connoisseurs of female models/celebrities would be quite keen to keep these derivative works….
    *Cough* Basically, they are hot so that’s an A-okay in my books. And Flava already deals with asian babes strutting their ‘stuff’, then these ‘AV’ photosets could be an additional plus for redFlava

  3. Mr Ious 7th February 2018 Reply

    Actually it's okay and good for me to see that, lol. But it's going to be a hard time for them and their careers if their fake photos go worldwide.

    • Kok Fu SHeng 7th February 2018 Reply

      just use that to attack business people and elites, and destroy their life as a lesson to others not to show off and brag their riches, directly or indirectly. I want to see suicide rate among elites increase by 100%.

      • lol manlets 8th February 2018 Reply

        Seek professional help

  4. LeeC22 11th February 2018 Reply

    Celebrity fakes have been around for over 25 years. Celebrity fakes have been online for probably the same length of time. All DeepFakes did was make it easier to create videos instead of still images. So the outrage wasn't because of something new and dangerous, it was because of something easy. You would think, with the absurd level of frothing outrage, that this kind of thing had never happened before. If someone had really wanted to, they could have created this kind of thing by hand, but it would have taken a long time.

    And the Fappening had no connection to this, because that was idiot famous people putting stupid pictures in a place that could be hacked/attacked. Only a moron celebrity (or non-celebrity) would create sexual pictures/videos of themselves (and others) and then store them online... but again, Pamela Anderson and Tommy Lee... sex tapes are nothing new.

    The reaction against DeepFakes was motivated by money, nothing else. They fabricated a bunch of trumped up rules under the guise of “involuntary pornography” and then cleansed the sites based on them. Reddit had a sad-loser that infiltrated DeepFakes (shane something or other), then (according to sources) planted some kiddie porn and used that to get the whole thing shut down. They even shut down the groups that had zero porn in them, it was a fiasco of the highest order.

    Meanwhile, in the real world, Reddit actively protects groups that incite rape against women and minors. They protect groups that incite death-threats and hate-speech. Heck, they protect groups that enjoy killing animals on video… and no, not hunting, throwing pet dogs off bridges etc… and I haven’t even mentioned the dead babies, videos of people getting killed etc…

    In a world of real horrors, they were more concerned about unhappy ad-sponsors. What you have seen here, is how much censorship money can buy and how much power celebrities really have.

  5. kontolajaib 13th June 2018 Reply

    but, these are very beautiful ladies…
