Gov. Gavin Newsom would be well-advised to check with legal experts before putting up posts like this one from July 28 on X: “Manipulating a voice in an ‘ad’ like this one should be illegal. I’ll be signing a bill in a matter of weeks to make sure it is.”
It included a picture reading: “Elon Musk retweets altered Kamala Harris campaign ad. In the video, Harris seemingly exposes herself as an incompetent candidate for president. The origin of the video isn’t known yet.”
The ad is a parody, and therefore protected by the First Amendment’s guarantee against “abridging the freedom of speech, or of the press.” Will Newsom seek to ban Saturday Night Live’s vicious parodies of President Trump? Not likely.
The controlling U.S. Supreme Court case is 1988’s Hustler Magazine, Inc. v. Falwell. The magazine, published by pornographer Larry Flynt, ran a parody ad of the Rev. Jerry Falwell, the co-founder of the Moral Majority conservative action group. It was headlined, “Jerry Falwell Talks About His First Time,” next to a bottle of Campari, an Italian apéritif. Falwell’s father was a bootlegger who died at age 55 from cirrhosis of the liver. Falwell claimed the ad defamed him and caused emotional distress.
The court handed down an 8-0 opinion written by conservative Chief Justice William Rehnquist. Its primary holding read, “The First Amendment protects parodies of celebrities or other public figures, even if they are aimed to cause distress to their targets.” This protection is even more important during a presidential election. If a candidate can’t handle the ribbing, how will he or she handle the immense pressures of the office?
Newsom did not specify which bill he was referring to. But at least 14 AI-related bills are still alive in the Legislature. The worst is Senate Bill 1047, by state Sen. Scott Wiener, D-San Francisco, euphemistically named the Safe and Secure Innovation for Frontier Artificial Intelligence Models Act. According to the Assembly Judiciary Committee’s summary, it would require AI developers “to take specific actions in order to mitigate the risk of catastrophic harms” from “covered models.” It defines a “covered model” as one trained using “a quantity of computing” whose cost exceeds $100 million.
And the bill would set up a new Frontier Model Division within the California Department of Technology. That bureaucratic mess would hamstring California’s world-leading AI industry and erode the bottom line of the state budget, which depends on AI companies making massive profits to be taxed.
The Financial Times quoted Yann LeCun, chief AI scientist at Meta/Facebook, who warned SB 1047’s “cascading liability clauses would make it very risky to open-source AI platforms . . . Meta will be fine, but AI start-ups will just die.”
The bill already passed the Senate and is now in the Assembly Appropriations Committee. The Assembly needs to kill this and the other AI bills when it reconvenes on Aug. 5. Federal regulation is adequate. Otherwise, California will see the AI future move even more quickly to Texas and China.