Newsom Threatens Laws Against Deepfakes After Kamala Harris Parody Video Goes Viral

Since both parody and satire are rightly protected under the First Amendment, it’s unclear what Newsom is going to try next.

In a clash between Gov. Gavin Newsom and tech magnate Elon Musk, the California governor announced his intention to sign a law targeting what he calls the misuse of AI in political advertising.

This decision escalates the ongoing dispute between the two influential figures. Newsom criticized a parody video shared by Musk, which seemed to showcase a campaign ad for Vice President Kamala Harris with a synthetic voiceover, by posting, “Manipulating a voice in an ‘ad’ like this one should be illegal. I’ll be signing a bill in a matter of weeks to make sure it is.”

Musk retorted sharply on social media, emphasizing the legality of parody in the United States.

“I, Kamala Harris, am your Democrat candidate for president because Joe Biden finally exposed his senility at the debate,” the synthetic Harris voice says in the video. “I was selected because I am the ultimate diversity hire. I’m both a woman and a person of color, so if you criticize anything, I say you’re both sexist and racist,” the video continues.

The legislation Newsom refers to is part of a broader legislative effort to combat “deceptive” practices in digital campaign materials.

Current proposals include the “Defending Democracy from Deepfake Deception Act of 2024” by Asm. Marc Berman, D-Menlo Park. This act mandates that social platforms block “misleading” electoral content 120 days before and 60 days after an election. Additionally, it would require platforms to label manipulated content outside these periods and enable California residents to flag such content as “deceptive.”

Another significant proposal, AB 2839 by Asm. Gail Pellerin, D-Santa Cruz, aims to extend the period during which it is illegal to distribute deceptive media of a candidate to 120 days before an election.

California already has a deepfake law (AB 730, updated by AB 972), designed to address the use of artificial intelligence to create deepfake media. Enacted to mitigate the potential harm caused by realistic but fabricated digital content, this law primarily targets deepfakes involving political candidates and explicit content without consent.

Key provisions of AB 972 include:

Political Ads: The law prohibits the distribution of manipulated videos, audio recordings, or images of a political candidate within 60 days of an election if the content is likely to deceive a reasonable person into believing it is authentic unless it contains a disclaimer stating it has been manipulated.

Explicit Content: It is illegal to create or distribute deepfake content that depicts a person engaging in sexual activity, or appearing nude, without that person’s consent. This is aimed at protecting individuals from non-consensual sexual content, commonly known as “revenge porn.”

Civil Recourse: Victims of unauthorized deepfakes have the right to sue for damages. This gives individuals a way to seek redress if they feel harmed by deepfake content.

However, the sticking point in the law for Newsom is that it contains exceptions for parody, satire, and other forms of speech protected under the First Amendment.

Since both parody and satire are rightly protected under the First Amendment, it’s unclear what Newsom is going to try next.

