AI-Generated Biden Calls Lead to $1M Penalty for US Firm
Imagine receiving a phone call from Joe Biden, only to find out it was completely fake. That’s what thousands of New Hampshire voters experienced during the 2024 primary election, thanks to AI-generated calls that mimicked the president’s voice. This deceptive tactic has now landed Lingo Telecom, the carrier that transmitted the calls, in hot water with the Federal Communications Commission (FCC).
The Deceptive Campaign
Lingo Telecom, a voice service provider, recently settled with the FCC for a whopping $1 million over claims of transmitting spoofed robocalls during the New Hampshire primary. The calls, orchestrated by political consultant Steve Kramer, falsely suggested that participating in the state’s presidential primary would prevent voters from casting their ballots in the November general election.
Kramer, who used Lingo Telecom’s technology, claimed his actions were meant to draw attention to the dangers of AI. To create the fake Biden voice, he reportedly enlisted the help of a New Orleans street magician skilled in generative AI technology.
Consequences for Lingo Telecom
As part of the settlement, Lingo Telecom must pay a $1 million civil penalty and implement a compliance plan adhering to the FCC’s caller ID authentication rules, known as the STIR/SHAKEN framework. This plan aims to combat ID spoofing and maintain the integrity of phone communications.
Despite initially disputing the FCC’s allegations, Lingo Telecom has agreed to comply with the regulatory requirements going forward.
Legal Troubles for the Consultant
Meanwhile, Steve Kramer faces even more severe consequences. The FCC has proposed a $6 million fine for his involvement in the fake Biden calls, and separate state criminal charges of voter suppression and impersonation of a candidate could carry up to seven years in prison.
This case serves as a stark reminder of the dangers posed by deepfakes and the importance of upholding election integrity. The FCC’s actions highlight the need for strict enforcement against deceptive practices that undermine democracy.
In a world where AI technology can blur the lines between reality and fiction, it’s crucial to hold individuals and companies accountable for their actions. Let this serve as a cautionary tale for those who seek to manipulate the truth for personal gain.