State Representative Lipper-Garabedian Joins Massachusetts House to Pass Bills Regulating AI Use in Elections
Bills seek to protect the integrity of the electoral process
BOSTON – Wednesday, February 11, 2026 – State Representative Kate Lipper-Garabedian (D-Melrose) joined the Massachusetts House of Representatives today to pass bills regulating the use of artificial intelligence in political advertisements. The bills would require the disclosure of AI use in political advertisements and would ban deceptive communications about a candidate or about the electoral process within 90 days of an election.
“In the last few weeks, Massachusetts residents have viewed AI-generated content during hours of Super Bowl commercials and in a major social media ad for the gubernatorial race,” said State Representative Kate Lipper-Garabedian (D-Melrose). “As someone who thinks extensively about the exciting potential of technology, I recognize we also must safeguard against its risks as AI becomes an increasing part of our daily lives. Fair elections require accurate information, and the House is working to ensure that voters have the tools they need to make informed decisions based on facts. Thank you to Speaker Mariano, Chair Michlewitz, Chair Hunt, Chair Farley-Bouvier, and Leader Jones for your leadership on these bills.”
“As artificial intelligence continues to reshape our economy and many aspects of our daily lives, lawmakers have a responsibility to ensure that AI does not further the spread of misinformation in our politics. That’s why these bills are so important, as they mandate that campaigns disclose the use of AI in any political advertisements, and they ban the use of AI in campaign ads 90 days before an election,” said House Speaker Ronald J. Mariano (D-Quincy). “I want to thank my colleagues in the House for their work on this legislation, and for recognizing the importance of the safeguards that these bills put in place.”
An Act to Protect against Election Misinformation prohibits the distribution of deceptive communications within 90 days of an election, including:
Audio or visual media that depicts a candidate with the intent to injure their reputation or to deceive a voter into voting for or against them.
Media that concerns the safety or regular operations of an election with intent to disrupt the integrity of the electoral process.
Content with the intent to mislead voters as to the date and time of an election; the requirements, methods, or deadlines to vote; the certification of an election; and the express endorsement of a candidate or ballot initiative by a political party, elected official, nonprofit organization, or another person.
The bill authorizes a political candidate whose voice or likeness appears in materially deceptive audio or visual media to seek injunctive or other equitable relief prohibiting the distribution of the media, or to bring an action for damages and attorney’s fees against the party that distributed the media. Exemptions to the 90-day prohibition include: media outlets that air the ads or report on the ads as part of a newscast, as long as they clearly acknowledge that there are questions about their authenticity; websites, newspapers, magazines, and periodicals; and satire and parody.
An Act enhancing disclosure requirements for synthetic media in political advertising requires that any audio or video communication containing synthetic media that is intended to influence voting for or against a candidate or ballot proposition must disclose, at the beginning and end of the communication, that it contains AI-generated material. Violations are punishable by a fine of not more than $1,000.
An Act to Protect against Election Misinformation passed the House by a vote of 154-3. An Act enhancing disclosure requirements for synthetic media in political advertising passed the House by a vote of 157-0. Both bills now go to the Senate for consideration.
###