
Fake Biden Robocall Illustrates Risks Companies Face

A robocall that appeared to be an AI-generated voice imitating President Joe Biden should serve as a fresh reminder to corporate executives of the peril they could face from deepfakes. The bogus Biden robocalls went to New Hampshire voters, urging them not to vote in the state's primary and instead to save their votes for the general election. The voice sounded like Biden's and even employed words he's known to use, such as "malarkey." Biden's campaign said the voice was a fraud and asked the New Hampshire attorney general to investigate.

Artificial intelligence allows computers to learn voices, a task made especially easy when clips of someone giving speeches, interviews, and other commentary are available online. That technology is now readily available to anyone who wants to spread misinformation and disinformation. And that's why companies need to take note.

Businesses should have plans in place in case someone creates fake audio or video mimicking the CEO or other top executives making comments that could damage the company's and individuals' reputations. Those plans should include a full-court press to counter the deepfake. Depending on the content of the impersonation, that could include a company statement, a statement from the executive in question, and a recitation of any past statements that contradict what the fake says.

Companies should also familiarize themselves with the process for getting online platforms to remove fakes. They should have email addresses and templated removal requests ready to go. And if the fake message is especially damaging, businesses should consider a long-term reputation campaign.

#artificialintelligence #deepfake #reputationmanagement

Raymond F. Kerins Jr., Dan Childs, Justin Blum, Lauren Pearle, Madeline B, Justin McCormick, Hannah R. Hughes
