What just happened? Remember the robocalls to New Hampshire residents in which a deepfaked voice of Joe Biden told them not to vote in the Democratic primary election? The FCC has fined the political consultant behind the scheme $6 million.
Louisiana Democratic political consultant Steven Kramer was indicted in May over the robocalls. The 39-second message, which told people to "save their votes" for the November presidential election, was created using a text-to-speech tool from ElevenLabs. The calls were spoofed to appear as though they came from the former chairwoman of the New Hampshire Democratic Party, The New York Times reports.
Kramer had worked for Biden's primary rival, Rep. Dean Phillips, who condemned the calls. Kramer claimed that he paid $500 to have the calls sent to voters as a way of raising awareness about the dangers artificial intelligence can pose to election campaigns, which sounds like a questionable justification.
"For me to do that and get $5 million worth of exposure, not for me," Kramer told CBS New York. "I kept myself anonymous so the regulations could just play themselves out or begin to play themselves out. I don't need to be famous. That's not my intention. My intention was to make a difference."
Making a strange story even weirder, Kramer hired an actual New Orleans magician named Paul Carpenter to create the fake Biden recording. Carpenter said producing the audio took only about 20 minutes and cost $1, and that Kramer paid him $150 via Venmo. He said he believed the work had been authorized by President Biden's campaign. Carpenter's account has since been shut down by ElevenLabs.
The FCC says Kramer violated the Truth in Caller ID Act, which makes spoofed calls illegal when they are made with the intent to defraud, cause harm, or wrongfully obtain anything of value. Earlier this year, the FCC voted to have the law apply to AI-generated deepfakes.
Kramer has 30 days to pay the $6 million fine. If he doesn't, the Department of Justice will handle collection. He is also facing 13 felony counts of voter suppression and 13 misdemeanor counts of impersonation of a candidate.
In July, ElevenLabs partnered with Reality Defender, a US-based firm that offers deepfake detection services to governments, officials, and enterprises. Reality Defender gets access to ElevenLabs' voice cloning data and models, allowing it to better detect fakes, while the AI company can use Reality Defender's tools to help prevent its products from being (further) misused.