The myriad intellectual property issues arising from the creation and use of deepfakes recently came to prominence when Apple announced that it had obtained a US patent for a method of creating deepfakes. The announcement showed that deepfake technology has come of age and that Apple is, in principle, able to protect its technology, at least in the USA.
There may be legitimate uses of a deepfake, for example in satire or drama. Frequently, though, deepfakes are created and used maliciously: to spread false information, to defraud or as pornography.
It is relatively straightforward to create a deepfake, generally by using AI. There are also a number of readily available apps, including Zao, which let users add their faces to those of chosen TV and cinema celebrities.
From the intellectual property perspective, established UK law can cater for the consequences of the manufacture and use of deepfakes. This is largely because a deepfake is no more than a digitally created image which, albeit less technically precise, could also be created manually by an artist or a printer.
Copyright is the most relevant right. A deepfake created digitally potentially falls within the category of artistic works as a photograph under section 4(1) of the CDPA 1988 or, if in video format, as a film under section 5B(1). Section 9(3) establishes that the author of a computer-generated work is the person by whom the arrangements necessary for the creation of the work are undertaken. Subject, therefore, to fulfilling the requirements of the legislation, including originality, the copyright in a deepfake is in principle enforceable.
Copyright will not subsist in a deepfake that is an infringing copy of an earlier work and therefore lacks originality. There is also a defence to infringement, pursuant to section 171(3), where on grounds of public policy or public interest the work can be shown to be libellous, obscene, scandalous or irreligious. This is an uncertain but developing area of law, entwined with the ECHR but based also upon the inherent jurisdiction of the courts to refuse to enforce copyright where to do so would run counter to public policy.
There is also a “parody” defence available to an infringer, stemming from the Copyright and Rights in Performances (Quotation and Parody) Regulations 2014, which provide that reproducing an artistic work for the purposes of “caricature, parody or pastiche” may attract a fair dealing defence.
Other than copyright, there is a panoply of rights potentially available to prevent the promotion and circulation of a deepfake, including passing off, trademark law, infringement of privacy and breach of the Human Rights Act 1998, breach of confidence, defamation, false attribution of authorship under section 84 of the CDPA, breach of an author’s moral right of integrity, and the criminal law, including harassment under the Protection from Harassment Act 1997, obscenity law, trade descriptions and theft.
A good example of the use of passing off is Alan Clark v Associated Newspapers, in which the well-known politician and diarist successfully sued the Evening Standard for passing off and false attribution of authorship after it published a regular spoof diary column using a photograph of Clark headed “Not The Alan Clark Diary”.
As a sequel to that case, Alan Clark registered his photographic image as a trademark in classes 9 and 16, which would have enabled him to bring a simpler and more straightforward claim for trademark infringement against the newspaper. It is now common practice for celebrities to register their photographic images as trademarks to prevent their use by third parties in the course of trade. This potentially provides a remedy against a person using a deepfake image.
The right of privacy remains a creature of judicial determination, following the seminal Naomi Campbell decision, and continues to be a flexible cause of action. A claim for breach of privacy may arise where a private representation forms the basis of the deepfake footage. However, most deepfakes are derived from publicly available footage, so this cause of action may be of more limited use.
What is now apparent is that the courts will, in the near future, have to grapple with the consequences of the creation and reproduction of deepfakes using the well-stocked armoury of intellectual property law.
Written by Clive D Thorne, Partner, McCarty Denning