Extract from Judge Scott Schlegel’s article “Navigating the Murky Waters of AI in Legal Evidence: A Closer Look at Authentication and Attorney Responsibilities”
Recent advancements in artificial intelligence, such as those announced by Adobe for Premiere Pro this week, have ushered in capabilities to seamlessly add or remove objects in videos and photos. These innovations, while impressive, present new challenges in legal contexts, particularly concerning the authenticity of evidence. How can courts distinguish between genuine and altered evidence?
The Rules of Evidence require that all evidence be authenticated before it can be admitted in court — that is, the proponent must show the evidence is what it purports to be. However, with AI's ability to alter reality so convincingly, traditional methods of authentication face new hurdles. The expertise needed to verify the integrity of digitally altered evidence is not always readily available, and acquiring such expert testimony can be prohibitively expensive for many parties.
Attorneys are bound by ethical duties to the court, including the obligation under Model Rule 3.3 to avoid presenting false evidence. With the advent of sophisticated AI tools, lawyers must scrutinize digital evidence more rigorously than ever before. Counsel should no longer simply accept digital evidence as presented by clients; they should actively verify its authenticity before offering it in court, especially when the evidence seems too good to be true.