U.S. Rep. Yvette Clarke (D-N.Y.) has proposed a bill that would require political campaigns and groups to reveal their use of AI-generated imagery in political ads.
Clarke said her bill is a direct response to the Republican National Committee's recent AI-produced ad depicting dystopian images and scenarios that it suggests could happen if President Biden were re-elected.
Clarke's bill aims to amend federal campaign finance laws to require political ads to disclose any AI-created content, as well as compel the Federal Election Commission (FEC) to create rules to enforce the disclosure.
- If passed into law, the requirement would take effect on Jan. 1, 2024.
Clarke noted in a news release that the 2024 elections will be the first time in U.S. history that generative AI is used in political advertising. The technology can "manipulate and deceive people on a large scale," she said.
- While the RNC tagged its AI footage with a disclaimer saying "built entirely with AI imagery," Clarke argued that other political groups might not follow suit.
"I think there are really important uses of AI, but there have to be some rules to the road so that the American people are not deceived or put into harm's way," she said.