The Danger of Deepfakes – What You Need to Know
With the rise in the popularity and availability of video, it is possible to find footage of people doing just about anything you can imagine. From the wholesome to the unwholesome, video is here to stay. Unfortunately, one of the hallmarks of video, the assumption that the camera recorded something that actually happened, is no longer as reliable as it once was.
What Are Deepfakes?
Deepfakes are a new type of manipulated video in which machine learning is used to swap one person's face onto another person's body. They are worrisome because they can make it appear that someone did or said something they never did or said, which has major implications for politics and celebrity culture, among other areas.
The news site Motherboard broke the story on deepfakes in late 2017, showing how computer algorithms were being used to insert celebrity faces into pornographic videos. The software made it look as though various celebrities had appeared in adult videos, which alarmed both the celebrities involved and the public at large. Nor are these programs limited to celebrities: they can just as easily place the faces of friends and family members onto the bodies of people in videos.
Questions about consent and the larger implications of deepfakes naturally followed. The people inserted into these pornographic videos were never asked for their consent, and they had no practical way to refuse, because the videos are created by anonymous strangers on the internet.
The issues with deepfakes are significant for many reasons. No one should have to worry about appearing in a pornographic video without their knowledge. But the technology can be used for more than pornography: it can also be used to influence politics and other major issues. A political opponent could insert a candidate's face into a video, making it appear that the candidate said or did something damaging enough to ruin an election campaign. Good people could have their reputations destroyed, potential leaders could lose any chance to serve society, and the ability to trust video in general has been severely compromised.
Potential Solutions to Deepfakes
Two ideas have been put forward to address deepfakes. The first is to watermark videos to verify that they are legitimate and come from a reliable individual or organization. The second is to have social media platforms identify and remove deepfakes, since they are arguably the only organizations with sufficient power and resources to do so, and they are where deepfakes spread.
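The first idea, verifiable provenance, can be illustrated with a minimal sketch. This is not a visible watermark; it is a cryptographic tag computed over the video's bytes, assuming a hypothetical publisher who holds a secret key (`PUBLISHER_KEY` and both function names are invented for this example). Any edit to the footage, such as a swapped face, changes the bytes and breaks verification.

```python
import hashlib
import hmac

# Hypothetical secret held by a trusted publisher (illustrative only).
PUBLISHER_KEY = b"example-publisher-secret"

def sign_video(video_bytes: bytes, key: bytes = PUBLISHER_KEY) -> str:
    """Produce a provenance tag (HMAC-SHA256) over the raw video bytes."""
    return hmac.new(key, video_bytes, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, tag: str, key: bytes = PUBLISHER_KEY) -> bool:
    """Check that the video still matches the tag the publisher issued."""
    expected = sign_video(video_bytes, key)
    return hmac.compare_digest(expected, tag)

original = b"\x00\x01frame-data"
tag = sign_video(original)

# Untampered footage verifies; an altered (deepfaked) copy does not.
print(verify_video(original, tag))                # True
print(verify_video(original + b"edited", tag))    # False
```

Real proposals along these lines (such as content-credential standards) use public-key signatures rather than a shared secret, so anyone can verify a video without being able to forge a tag, but the principle is the same.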
Neither solution will stop deepfakes from being a problem, but both offer a promising start toward ensuring people are not fooled by fabricated videos.