Can a Celebrity Claim “Memelord Immunity” Over Videoed Statements Due to Deep Fakes?

In a recent ruling, a judge rejected an unusual defence presented by Tesla during the discovery phase of a wrongful death lawsuit involving the controversial((Gardner, D. (2020) Tesla autopilot name banned in Germany, WhichCar. Are Media. Available at: https://www.whichcar.com.au/car-news/tesla-autopilot-name-banned-in-germany (Accessed: May 2, 2023).)) “Autopilot” system.

A Tesla cruising down the road at sunset. This image is created by AI.

Background to the case

Walter Huang, an Apple engineer, tragically died when his Tesla Model X, with Autopilot engaged, crashed into a highway median on March 23, 2018. Huang’s family filed a wrongful death lawsuit against Tesla, alleging that the Autopilot driver-assistance system was responsible for the fatal crash. The lawsuit contends that Autopilot misread the lane markings, failed to identify the concrete median, and accelerated into the barrier without braking.((Korosec, K. (2019) Tesla sued in wrongful death lawsuit that alleges autopilot caused crash, TechCrunch. Available at: https://techcrunch.com/2019/05/01/tesla-sued-in-wrongful-death-lawsuit-that-alleges-autopilot-caused-crash/ (Accessed: May 2, 2023).))

What did Musk claim about Autopilot?

Significantly for Tesla CEO Elon Musk, the family points to a series of statements he made about the reliability of the Autopilot software, arguing that they were potentially misleading.

Musk videos are faked online, so none should be considered

In a recent filing, Tesla argued that no videos featuring Mr Musk should be considered as evidence because he is a target for deep fakes. The filing stated:

“While it might seem unusual that Tesla could not admit or deny the authenticity of video and audio recordings containing Mr. Musk’s statements, the reality is that he, like many public figures, is the subject of numerous deep fake videos and audio recordings that purport to show him saying and doing things he never actually said or did…”((Sz Huang et al v. Tesla, Inc. et al., Submissions from Tesla, Lines 10 to 23, Section II.A, p. 4, Defendant Tesla’s opposition to plaintiff’s motion to compel re Tesla Inc supplemental responses to written discovery; motion for the deposition of Elon Musk; and motion for sanctions, 27 April 2023, County of Santa Clara, Superior Court of the State of California (USA). Available at: https://www.plainsite.org/dockets/3y1d4wlms/superior-court-of-california-county-of-santa-clara/sz-huang-et-al-v-tesla-inc-dba-tesla-motors-inc-et-al/ as Document 228.))

This image of a speech is AI generated. It is not real.

Judge denies blanket “memelord immunity”

Her Honour Judge Pennypacker responded to Tesla’s argument, stating((Pennypacker HH, Tentative Ruling, p. 28, Calendar Lines 4 and 5 for Sz Huang et al v. Tesla, Inc. et al., Case 19CV346663, 27 April 2023. Available at: https://cdn.arstechnica.net/wp-content/uploads/2023/04/musk-deepfake-ruling.pdf)):

“[Tesla’s] position is that because Mr. Musk is famous and might be more of a target for deep fakes, his public statements are immune. In other words, Mr. Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do. The Court is unwilling to set such a precedent by condoning Tesla’s approach here.”

Essentially, the judge has ordered that Mr Musk confirm or deny the videos’ authenticity under oath.

The rise of deep fakes – what are they?

Deep fakes are AI-generated videos or audio recordings that manipulate real content to create realistic but fabricated footage. The technology used for creating deep fakes has advanced rapidly in recent years, leading to a surge in convincing counterfeit media. These manipulated materials pose a significant challenge for verifying the authenticity of digital content, especially when it comes to public figures who may be targeted more frequently.

Deep fakes have been used in various malicious ways, such as spreading disinformation, tarnishing reputations, and even blackmail. As a result, they have become a growing concern for both the general public and legal professionals. It’s becoming increasingly important for individuals, businesses, and the legal system to develop strategies and tools to identify and counteract deep fakes, ensuring the reliability and trustworthiness of digital evidence.

Example of a disputed Tesla video

Here is an example of one of the disputed videos:

• A YouTube video from 2014 featuring Mr Musk answering questions at a shareholder meeting, with the upload date stamped by YouTube((Investary0 (2014) Tesla Motors 2014 Shareholder Meeting Part 2, YouTube. Relevant part at 15:42. Available at: https://www.youtube.com/watch?v=gXeqFrwfIsA&t=942s (Uploaded: 5 June 2014, Accessed: 2 May 2023).)). He said((Brodkin, J. (2023) Judge slams Tesla for claiming Musk quotes captured on video may be deepfakes, Ars Technica. Available at: https://arstechnica.com/tech-policy/2023/04/judge-slams-tesla-for-claiming-musk-quotes-captured-on-video-may-be-deepfakes/ (Accessed: 2 May 2023).)) “I’m confident that—in less than a year—you’ll be able to go from onramp to highway exit without touching any controls.”

Procedures for testing the validity of evidence exist – kind of

While it’s true that various tools can modify videos, including altering spoken words, and that deep fakes are becoming increasingly prevalent, the judge’s pushback serves as a reminder that there are already established procedures to address such blanket claims. For this type of evidence, a detailed forensic examination could offer insights.

It’s important to note that forensic examinations may not always produce conclusive results, but demand for such investigations is likely to grow. As deep fakes improve, the proportion of forensic video examinations that return an inconclusive result will also rise.
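To make this concrete, here is a minimal sketch, in Python, of the kind of shallow first pass an examiner might run before deeper analysis: pulling container and stream metadata out of a video file with ffprobe (part of the FFmpeg toolkit, which must be installed). The file name is hypothetical, and metadata alone proves little; a genuine forensic examination goes much further, down to frame-level and compression-artefact analysis.

```python
import json
import subprocess

def probe_video(path: str) -> dict:
    """Return container and stream metadata for a video file via ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

# Hypothetical file name, for illustration only.
meta = probe_video("disputed_statement.mp4")

# Creation times, encoder tags and re-encoding traces in this metadata
# can be inconsistent in manipulated files and justify deeper analysis.
print(meta["format"].get("tags", {}))
for stream in meta["streams"]:
    print(stream["codec_type"], stream.get("codec_name"))
```

Metadata like this is easily forged, so an absence of anomalies proves nothing; it simply helps an examiner decide where to look next.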

Practical actions an organisation can take to increase its deep fake defences

For businesses, it may be worthwhile to store copies of important videos and have them date-stamped by reliable third parties. This would provide evidence of any later manipulation of spoken words by company representatives and help maintain the integrity of their statements. A minimal sketch of this idea follows.
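The Python below computes a SHA-256 fingerprint of a video file and appends it to a local log. The file and log names are hypothetical; in practice the hash (not the footage itself) would also be lodged with an independent third party, for example an RFC 3161 trusted timestamping authority, so that the date cannot later be disputed.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical paths, for illustration only.
video = "statements/2023-05-ceo-interview.mp4"
record = {
    "file": video,
    "sha256": fingerprint(video),
    "recorded_at": datetime.now(timezone.utc).isoformat(),
}

# Append the fingerprint to a local log; a copy of the hash should also
# be lodged with a reliable third party to independently fix the date.
with Path("video_hash_log.jsonl").open("a") as log:
    log.write(json.dumps(record) + "\n")
```

If a published video later matches its logged hash, it has not been altered since the recorded date; any mismatch is strong evidence of tampering, although the hash alone cannot show what was changed.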

For a comprehensive cyber security review, reach out for a discussion.

Declaration on self-driving interest

It is well known that I am unable to drive due to extreme vision loss, and I have actively cheered on the development and release of self-driving systems.

Important note on general advice

I am a cyber security specialist, but I may not be YOUR cyber security specialist.

All cyber security and digital forensics decisions require careful consideration of your own circumstances and risks. General information is not tailored to your individual needs. You should seek the advice of a suitably qualified cyber security or digital forensics specialist.