The Indiana Daily Student

city politics

Holcomb signs bill requiring election campaign communications with fabricated media to include disclaimer


Indiana Gov. Eric Holcomb signed a bill into law March 12 requiring certain election campaign communications that contain fabricated media to carry a disclaimer.

Under House Bill 1133, fabricated media is defined as an audio or visual recording of a person’s speech, conduct or appearance altered without their consent; an artificially generated audio or visual imitation of a person; or audio or visual material depicting the speech, conduct or appearance of an artificially generated person.  

To fall under the law, the fabricated media must inaccurately or fictionally depict the person or their speech and be unrecognizable to a “reasonable” person as an imitation or alteration.

The bill also allows candidates depicted in fabricated media without a disclaimer to bring civil action against whoever paid for the campaign communication, whoever sponsored it and, if they knowingly removed a disclaimer before sharing the material, those who released it.

This year’s election will be the first in which AI is used as a tool by campaigns, bill sponsor State Sen. Spencer Deery, R-West Lafayette, said.

“Sometimes that's being used in appropriate responsible ways to be more efficient and effective,” Deery said. “But we are already seeing it being used in ways that are intended to deceive voters, and that's what my real concern is right now.” 

In June 2023, Florida Gov. Ron DeSantis’s presidential campaign released a video on social media that appeared to use AI-generated images to depict former President Donald Trump hugging Dr. Anthony Fauci, according to CNN. This January, the office of the New Hampshire attorney general said it was investigating an apparent robocall that used AI to mimic President Biden’s voice to discourage voters from coming to the polls during the state’s primary elections, according to the Associated Press.

While the technology in use right now is still crude, Deery said, it was important to him to get Indiana’s legislation ahead of AI and to draw lines on what is and is not appropriate in the state.

“One of the worst things that could happen would be just to get to the point where it’s just normal, just what campaigns do,” Deery said. 

This is why, Deery said, he originally filed Senate Bill 7, which had slightly stricter requirements, imposing a criminal penalty for failing to disclose the use of fabricated media. Deery said his colleagues wanted to start with a civil penalty, which was in HB 1133, so the bills were merged.

HB 1133 didn’t originally apply to federal elections, Deery said, but it was important to him that the bill cover more than just state and local elections, which are easier for Indiana to regulate. Deery said the Senate also adjusted the bill’s language so a communication didn’t have to explicitly endorse a candidate for the law to apply.

“I imagine that over the years, we're going to have a lot of legislation related to artificial intelligence,” Deery said. “We didn't want to fix all of the problems that may be coming with AI, because it's going to change so quick, but this is one that I felt like we needed to act on really quick.” 

Scott Shackelford, executive director of the Ostrom Workshop at IU, which studies governance in various research areas, said fabricated media not only poses a threat to future elections but has already been used in past races. In Chicago’s 2023 mayoral race, Shackelford said, there were well-publicized attempts to insert deepfakes to sway the election’s outcome.

The European Union, Shackelford said, has taken a much more aggressive approach to regulating this area of technology, including the Digital Services Act, which came into force in November 2022, and the new AI Act, which is in the process of being formally adopted and translated. He said this has required platforms to quickly spot and take down disinformation and deepfakes or face fines.

“There's a whole rating scale for different uses of AI, so they're just kind of, frankly, a lot further along,” Shackelford said.

The U.S., in contrast, has largely left it up to the tech platforms to regulate this on their own, Shackelford said, and each company has its own guidelines with varying effectiveness. In October 2023, President Biden issued an executive order on safe, secure and trustworthy AI.

The order establishes new standards for AI safety and security and aims to protect privacy, advance equity and civil rights, protect consumers and preserve competition in the market. One standard the order establishes, for example, requires developers of the most powerful AI systems to share their safety test results and other critical information with the government.

Cybersecurity in elections has come a long way since 2016, Shackelford said. In many ways, he said, one could argue the 2020 and 2022 election cycles were the most secure in U.S. history, partly due to efforts by the Cybersecurity and Infrastructure Security Agency and the Department of Homeland Security to create new election security resources for communities.

National elections in the U.S. are a highly decentralized process, Shackelford said, with counts coming from local and state election officials across the country.

Shackelford said he expects more states to try different approaches to deal with AI, which could provide precedents for a future federal standard.  

As of November 2023, 30 states had passed more than 50 laws addressing AI in some capacity, according to the Brennan Center for Justice. While much of this enacted legislation does not directly address AI’s impact on elections, laws in Texas, California, Minnesota and Washington do. In Texas, for example, it is a criminal offense to create a deepfake video and disseminate it within 30 days of an election with the intent to harm a candidate or influence the outcome.

But due to gridlock in Congress, Shackelford said, his main concern is that states will put forward competing proposals, creating a patchwork that would be challenging for both platforms and users to navigate.

“Maybe you're still following races from your hometown, but maybe now you live in a different state,” Shackelford said. “And what if you comment on that or share something mistakenly online, so that opens you up to liability, that kind of thing.” 

The Indiana legislation applies to state, local and federal elections, as well as to elected officials who are not up for reelection.

The law, according to WNDU, went into effect immediately.
