Forget Scammers, Nearly All Deep-Fake Technology is Being Used for Porn

In 2017, the internet was shaken by the arrival of the deepfake phenomenon. Originally, the term referred to fake porn, where actresses’ heads were placed onto the bodies of porn stars. But it quickly set off alarm bells for its capacity to be used for more nefarious ends (assuming you’re not the kind of person who considers pornography the most nefarious end of all). “Deepfake” came to refer to any manipulated video in which people could be made to seem like they were doing or saying things they never did or said.

Obviously, the potential for the technology to deceive people and pull off ever more sophisticated online scams was huge, and many fretted about what this brave new fake world would bring us.

Well, as of 2019, the research has been done, and it turns out deepfakes are…mainly still just porn.

A 2019 report by DeepTrace, a company that makes tools to monitor synthetic media, found that 96 per cent of deepfake videos online are just fake porn, like in the olden days of 2017.

Of course, it’s no time for complacency: things could easily spiral from here. And hey, even fake porn is, in its way, a trifle disturbing. But for now, the potential for massive conspiracies to be implemented and millions of dollars ripped off lies dormant: the human need for pornography continues to dominate all.