{"id":1847,"date":"2023-11-09T14:03:01","date_gmt":"2023-11-09T20:03:01","guid":{"rendered":"https:\/\/dda.ndus.edu\/ddreview\/?p=1847"},"modified":"2024-09-03T09:40:26","modified_gmt":"2024-09-03T14:40:26","slug":"what-attorneys-should-know-about-deepfakes","status":"publish","type":"post","link":"https:\/\/dda.ndus.edu\/ddreview\/what-attorneys-should-know-about-deepfakes\/","title":{"rendered":"What Attorneys Should Know About Deepfakes"},"content":{"rendered":"\n<h2><em>AI as Problem and Solution<\/em><\/h2>\n\n\n\n<p class=\"has-drop-cap\">Artificial intelligence (AI) is advancing at a startling pace, and society is grappling with AI\u2019s potential to be both beneficial and harmful. \u201cDeepfakes\u201d are one of the harms enabled by AI that has begun to show AI\u2019s potential to spread misinformation, sow distrust, and enable fraud and other criminal acts. Law and technology experts have also begun to sound the alarm on the threats, which deepfakes may pose to fair adjudications in courts of law, as AI has the potential to permit inauthentic evidence to be admitted at trial, while simultaneously allowing authentic evidence to be rejected based upon improper claims of inauthenticity.<\/p>\n\n\n\n<p>In a deepfake, AI is used to create a new\u2014and fake\u2014image, video or audio, based upon a \u201csampling\u201d of actual images, video or audio of a real person. For example, deepfake technology could scan the video of an actual political speech, delivered by a real politician, and then create a fake video purporting to contain a speech delivered by that same politician. 
The term \u201cdeepfake\u201d is derived from the process used to create fake images, videos and audio, which uses \u201cdeep learning\u201d algorithms that process real-life data (such as voice patterns and images of a real-life speaker) to then produce fake output (such as phony audio and video of that same speaker).<\/p>\n\n\n\n<p>Deepfakes may be used for a variety of malicious purposes, with the common goal of tricking the public into believing that a person said or did something that the person did not actually say or do. Deepfakes have targeted politicians, such as one deepfake video purporting to show President Barack Obama launching into an obscenity-laced tirade against President Donald Trump, and another deepfake distributed by a Belgian political party purporting to show a speech delivered by President Trump urging Belgium to withdraw from an international agreement.i<\/p>\n\n\n\n<p>Deepfakes have also been distributed to vilify public figures and leaders of industry, such as a recent deepfake purporting to show Facebook CEO Mark Zuckerberg bragging about having \u201ctotal control of billions of people\u2019s stolen data.\u201dii Deepfakes have also been used to commit crimes. For example, one criminal scheme involved scammers using deepfake technology to impersonate the voice of a relative, placing desperate calls to unwitting victims and pleading for the victims to quickly transfer funds due to a phony emergency.iii<\/p>\n\n\n\n<h1><strong>Liar&#8217;s Dividend<\/strong><\/h1>\n\n\n\n<p>Legal experts predict that as deepfakes become more prevalent and difficult to detect, they will increasingly be the subject of evidentiary disputes in litigation. 
Deepfake technology has become easily accessible in recent years, and experts predict that parties will increasingly attempt to introduce into court evidence that is actually a deepfake.<\/p>\n\n\n\n<p>Additionally, experts predict that legitimate audio, videos and images will increasingly be challenged in court as being deepfakes, a phenomenon known as the \u201cliar\u2019s dividend.\u201d According to the liar\u2019s dividend, as society \u201cbecomes more aware of how easy it is to fake audio and video, bad actors can weaponize\u201d that skepticism. Because a \u201cskeptical public will be primed to doubt the authenticity of real audio and video evidence,\u201d actors can raise bad-faith challenges by alleging that authentic evidence is actually deepfake.iv Consequently, if \u201caccusations that evidence is deepfaked become more common, juries may come to expect even more proof that evidence is real,\u201d which could then require parties to expend additional resources to defend against unfounded claims that authentic evidence is fake.v<\/p>\n\n\n\n<p>A recent high-profile example of a deepfake claim being raised in court to cast doubt upon an authentic video occurred in a wrongful death case pending against automaker Tesla, where the court rejected Tesla\u2019s assertion that a widely publicized video of CEO Elon Musk being interviewed at an industry conference in 2016 was a deepfake. The court found Tesla\u2019s assertion \u201cdeeply troubling\u201d and responded by ordering a limited deposition of Musk on the issue of whether he made certain statements at the 2016 conference.vi<\/p>\n\n\n\n<h1><strong>Detecting Deepfakes<\/strong><\/h1>\n\n\n\n<p>Attorneys should be prepared to address deepfakes in their practices as deepfakes become more commonplace. 
The following are signs that a video, audio or image could be a deepfake:<\/p>\n\n\n\n<p><strong><em>Unreliable, questionable sources: <\/em><\/strong>Deepfakes are usually shared, at least initially, by unreliable, questionable, non-mainstream sources. If, for example, the originator of a videoed speech by a high-profile person is an unknown online entity, there is a strong likelihood that the recording is a deepfake.<\/p>\n\n\n\n<p><strong><em>Blurriness: <\/em><\/strong>In deepfakes, the target will often appear blurrier than the background. In particular, the hair and facial features of deepfake targets often appear blurry compared with other aspects of the video or image.<\/p>\n\n\n\n<p><strong><em>Mismatched audio: <\/em><\/strong>Deepfake visuals are often produced separately from deepfake audio, and then \u201cstitched\u201d together to create a final video. Consequently, visuals and audio can be misaligned, resulting in a mismatch between what is seen and what is heard. If, for example, there is a delay between what is heard and the movement of the speaker\u2019s mouth, such that it appears as though the speaker is lip-synching, this is a strong indication that the video has been deepfaked.<\/p>\n\n\n\n<p><strong><em>Mismatched lighting: <\/em><\/strong>Deepfakes will often retain the original lighting from the source video or image and transpose the original lighting into the new video or image, thus causing a mismatch of lighting within the final deepfake. If a video or image contains unusual, inexplicable shadowing, this is a telltale sign that it has been altered and might be a deepfake.<\/p>\n\n\n\n<h1><strong>AI to Detect Deepfakes<\/strong><\/h1>\n\n\n\n<p>As deepfake technology progresses, it will become difficult, and eventually impossible, for the human eye to detect deepfakes. Consequently, it will become necessary for attorneys to rely upon AI to detect deepfakes. 
Stated differently, we will need to rely upon AI to detect the works of other AIs, thus leading to an arms race between deepfake creators and deepfake detectors. In any event, attorneys should plan for a future in which they must safeguard against being fooled by deepfakes, be able to identify and counter deepfakes offered in evidence by their opponents, and be able to defend against bogus accusations that their own proffered evidence is deepfake. <em>\u25d9<\/em><\/p>\n\n\n\n<p><em>This article was originally published in the June 2023 issue of Wyoming Lawyer.<\/em><\/p>\n\n\n\n<hr class=\"wp-block-separator\" \/>\n\n\n\n<h2>References<\/h2>\n\n\n\n<p>i Ian Sample, \u201cWhat are Deepfakes \u2013 and How Can You Spot Them?\u201d The Guardian, January 13, 2020, <a href=\"https:\/\/www.theguardian.com\/technology\/2020\/jan\/13\/what-are-deepfakes-and-how-can-you-spot-them\">https:\/\/www.theguardian.com\/technology\/2020\/jan\/13\/what-are-deepfakes-and-how-can-you-spot-them<\/a>; Hans Von Der Burchard, \u201cBelgian Socialist Party Circulates \u2018Deep Fake\u2019 Donald Trump Video,\u201d Politico, May 21, 2018, <a href=\"https:\/\/www.politico.eu\/article\/spa-donald-trump-belgium-paris-climate-agreement-belgian-socialist-party-circulates-deep-fake-trump-video\/\">https:\/\/www.politico.eu\/article\/spa-donald-trump-belgium-paris-climate-agreement-belgian-socialist-party-circulates-deep-fake-trump-video\/<\/a><\/p>\n\n\n\n<p>ii Von Der Burchard, <em>supra <\/em>note 1.<\/p>\n\n\n\n<p>iii Pranshu Verma, \u201cThey Thought Loved Ones Were Calling for Help. It was an AI Scam,\u201d Washington Post, March 5, 2023, <a href=\"https:\/\/www.washingtonpost.com\/technology\/2023\/03\/05\/ai-voice-scam\/\">https:\/\/www.washingtonpost.com\/technology\/2023\/03\/05\/ai-voice-scam\/<\/a><\/p>\n\n\n\n<p>iv Shannon Bond, \u201cPeople Are Trying to Claim Real Videos are Deepfakes. 
The Courts are Not Amused.\u201d NPR, May 8, 2023, <a href=\"https:\/\/www.npr.org\/2023\/05\/08\/1174132413\/people-are-trying-to-claim-real-videos-are-deepfakes-the-courts-are-not-amused\">https:\/\/www.npr.org\/2023\/05\/08\/1174132413\/people-are-trying-to-claim-real-videos-are-deepfakes-the-courts-are-not-amused<\/a><\/p>\n\n\n\n<p>v <em>See id.<\/em><\/p>\n\n\n\n<p>vi \u201cElon Musk\u2019s Statements Could Be \u2018Deepfakes,\u2019 Tesla Defence Lawyers Tell Court,\u201d The Guardian, April 26, 2023, <a href=\"https:\/\/www.theguardian.com\/technology\/2023\/apr\/27\/elon-musks-statements-could-be-deepfakes-tesla-defence-lawyers-tell-court\">https:\/\/www.theguardian.com\/technology\/2023\/apr\/27\/elon-musks-statements-could-be-deepfakes-tesla-defence-lawyers-tell-court<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>AI as Problem and Solution Artificial intelligence (AI) is advancing at a startling pace, and society is grappling with AI\u2019s potential to be both beneficial and harmful. 
\u201cDeepfakes\u201d are one [&hellip;]<\/p>\n","protected":false},"author":127,"featured_media":1848,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[21,4,25,522,679,204,286],"tags":[48,611,596,607,504,610,605,51,608,606,264,601,604,597,600,599,598,609,602,603,586],"_links":{"self":[{"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/posts\/1847"}],"collection":[{"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/users\/127"}],"replies":[{"embeddable":true,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/comments?post=1847"}],"version-history":[{"count":4,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/posts\/1847\/revisions"}],"predecessor-version":[{"id":2040,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/posts\/1847\/revisions\/2040"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/media\/1848"}],"wp:attachment":[{"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/media?parent=1847"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/categories?post=1847"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dda.ndus.edu\/ddreview\/wp-json\/wp\/v2\/tags?post=1847"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}