When Evidence Can Be Deepfaked, How Do Courts Decide What’s Real?

This story was originally published on thewalrus.ca

By Linda Besner

This story contains details about domestic violence that some readers may find disturbing.


For years, there was a box on a back shelf of P’s house in Edmonton labelled “insurance.” (For reasons of privacy, we are using this initial only.) Her husband at the time thought it contained paperwork, but P had filled it with a different kind of insurance: a physical record of smashed phones, broken eyeglasses, and photographs of the bruises his violent episodes had left on her body. “I had a circle of blue bruises around my mouth because he would cover my mouth and smother me,” she told me.

Photographs, videos, and audio recordings are highly persuasive to judges and juries. When a crime occurs in private, with no witnesses, a court contest is a tussle in which two stories compete to offer the most plausible explanation of the same facts. Photographs and audio recordings join seemingly unimpeachable objectivity with emotional impact: one study says combining visual and oral testimony can increase information retention among jurors by 650 percent.

Criminal defence lawyer Emily Dixon told me that, if a client shows her an exonerating photo or video, she isn’t expected to run analytic tests before submitting it into evidence. It’s reasonable to assume that a photo is real—for now. Yet we are fast approaching a world in which we can no longer believe our eyes or ears. The onset of artificial intelligence, in the justice system as elsewhere, is poised to overturn existing practices.

Specialists can still spot the anomalies that distinguish AI-generated images from real photos. “But in a year,” digital forensics expert Simon Lavallée told me, “we won’t.” Maura R. Grossman is a lawyer who has long worked to promote the use of advanced technologies for legal tasks, such as document review. When it comes to the threat of deepfakes, however, Grossman believes Canada’s evidence laws will require an overhaul. “Before, if I wanted to fake your signature, I had to have some talent,” she told me. “In this day and age, you could make a deepfake of my voice in two minutes.” Juries, Grossman has written, may increasingly be skeptical of all evidence.

For complainants like P, the erosion of this trust carries a high cost: left with only doubt, we may be tempted to rely

...