
Concern as Deepfake videos double in 9 months


Earlier this month Facebook set up a $10M fund to help detect deepfakes.

What are Deepfakes?

Deepfakes are images and video manipulated to create realistic fake clips, and they’ve been in the news for a while now. Researchers at the University of Washington attracted media attention in 2017 for creating a fake video of then-President Obama.

What’s in the News?

While there is concern that deepfake videos could cause political unrest, help rig elections and create incidents on a national and global scale, the evidence so far shows that the majority of these videos are pornographic in nature. And while the technology is moving fast, it’s not yet a serious threat to corporations and governments.

The Real Issue Now?

Henry Ajder, Head of Research Analysis at Deeptrace, stressed: ‘The debate is all about the politics or fraud and a near-term threat, but a lot of people are forgetting that deepfake pornography is a very real, very current phenomenon that is harming a lot of women.’

Deeptrace, a cyber security company, found over 14,000 deepfake videos online, almost twice the 7,964 found in December 2018. They reported that 96% of these were pornographic, most of them replacing a pornographic actor’s face with a celebrity’s. Most targets were British and American actresses, though South Korean singers also featured significantly. Natalie Portman’s image was used in an explicit video, as were Emma Watson’s and Wonder Woman star Gal Gadot’s.

The term Deepfake was first used on Reddit in 2017. In just two years, the four leading deepfake pornography sites have attracted 134 million views to their videos.

Revenge Porn Tool

Deepfake technology, as with spyware, is rapidly becoming cheaper and more accessible. While some skill is still required, getting enough images (or selfies) isn’t a problem now, and machine learning has accelerated the process.

One app, Deepnude, allows users to synthetically remove clothes from a woman’s image for £40. Katelyn Bowden, founder of anti-revenge-porn campaign group Badass, called the app terrifying. ‘Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo,’ she said. ‘This tech should not be available to the public.’

BBC News reports that the technology assesses where clothes are in the image, masks them with matching skin tone, lighting and shadows, and fills in physical features.

The programmers have now shut it down, saying ‘The probability that people will misuse it is too high’ and that it had been created for ‘entertainment’.

Stay safe Online

Wayne & Team

Found this article useful?

Remember to share it with your family & friends.
