Deepfake videos are the product of artificial intelligence or machine-learning applications that merge, replace, and superimpose images and video clips onto a source video, creating a fake video that appears authentic. The main problem with deepfake videos is that anyone can produce explicit content without the consent of those depicted. While some of these videos are humorous and benign, the majority are pornographic: the faces of celebrities and other well-known (and lesser-known) individuals have been superimposed onto the bodies of porn stars. The existence of this technology erodes trust in video evidence and diminishes its probative value in court. This article describes the current and future capabilities of the technology, stresses the need to plan for its treatment as evidence in court, and draws attention to its current and future impact on the authentication of video evidence in courts. Ultimately, as the technology improves, parallel detection technologies will need to be developed and utilised to identify and expose fake videos.