Editor's note: All opinions, columns and letters reflect the views of the individual writer and not necessarily those of the IDS or its staffers.
In 1960, “The Twilight Zone” aired an episode about doppelgängers — doubles who slip into our world and seamlessly assume our lives. While the show left these figures’ origins an open question, the decades since the episode premiered have supplied an answer. Our age has engineered machines capable of imitating a person’s voice, style, judgment and presence. We call such machines artificial intelligence. Their product: AI deepfakes.
As AI doppelgängers enter our world, what could yours end up doing? You might not be a politician or celebrity. You don’t need to be. Anyone’s online photos can be, and are being, used to generate deepfakes. This phenomenon represents a growing trend in high school bullying. Earlier this month, Philadelphia police charged a student with creating sexualized images of peers. The potential for more such stories only increases now that ChatGPT is planning to allow erotic content. Elon Musk’s Grok already does.
Some governments, like Denmark’s, are granting all citizens copyright to their own bodies, voices and facial features in response to the rise of AI. Where such laws are not being introduced, people are taking matters into their own hands. In January, actor Matthew McConaughey was granted eight trademarks relating to his appearance and voice after applying to the U.S. Patent and Trademark Office.
The point is “to create a clear perimeter around ownership,” McConaughey told The Wall Street Journal.
In Indiana, lawmakers are considering House Bill 1182, which would define “possession of a digital sexual image and distribution of a digital sexual image” as a crime.
This family of legal approaches to regulating AI falls short. It relies on the logic that AI-generated images, specifically deepfakes, are criminal because the people generating these images are violating others’ property rights over themselves.
It may sound empowering to have ownership over yourself, but this implies you are the kind of thing that can be owned — and, if you wish, leased or sold. That’s just not the kind of thing a human is. Reducing part of your humanity to property requires separating your mind — which owns — from your body — which you use, exploit and, possibly, destroy. But you don’t simply captain this body like a vessel. It is you.
So, I have a suggestion for Indiana lawmakers. We must go beyond the way we currently conceive of AI regulatory law and legislate in a way that respects human dignity. What I propose might sound strange. That’s because nothing in our modern legal system is equipped to respond to AI’s actual threats.
Even before AI tested its limits, the ownership-over-self model was doomed to have gaps. It leaves dignity — the essential quality of the person — a nebulous concept, because it cannot be economically pinned down. Therefore, dignity becomes a secondary concern, frequently relegated to idealistic movements, before “being made serious,” or intelligible to our system, through translation into property terms.
Catholicism and Orthodox Christianity are one place we could draw inspiration from. In these traditions, icons — visual depictions of Jesus, Mary and the saints — are not neutral artworks. They represent the figures portrayed for the purpose of others’ veneration toward them. In the icon, they effectively become present for this task.
That situation is not unlike that of the AI-generated deepfake. So, the logic of the Christian icon provides a compelling avenue for rethinking the way we legislate against AI-generated images. Again, the deepfake is not a neutral artwork. It represents the figures portrayed for the purpose of others’ acts toward them, whatever form of irreverence that takes. Deeper than the crime our current system recognizes, the actual offense against the images’ subject is not possession of another's property. It’s that a certain, inappropriate action targets another person as present in their likeness.
In Roman law and its medieval offshoots, the concept of iniuria — the origin of the English word “injury” — included affronts to dignity, honor and physical integrity, often by proxy. Crimes against a corpse, for example, amounted to crimes against the person whose body it was. Yet clearly, a corpse would not have been owned by anyone at the time of the crime. Though the person was no longer present, their likeness stood in for them as recipient of the offense.
The crux of the matter is that we should treat wrongs against people as personal crimes, not as property infringement.
Eric Cannon (he/him) is a sophomore studying philosophy and political science and currently serves as a member of IU Student Government.