Bradley Childers, the Information Systems Manager at the Contra Costa County Clerk-Recorder-Elections Department, stands in a room with technicians verifying ballot signatures on Thursday, Feb. 29, 2024, in Martinez, Calif.
Less than three weeks after news of the fake-Biden robocalls broke in January, the U.S. Federal Communications Commission made it illegal to use AI-generated voices for unsolicited robocalls, with agency chairwoman Jessica Rosenworcel citing use of the technology by “bad actors” to “misinform voters” as well as to commit extortion and imitate celebrities.
“We should watch out for foreign interference — that’s been around for a while,” Hyde said. “We should worry about partisan actors ranging from the local to the national.” The federal Cybersecurity and Infrastructure Security Agency warns the technology could be used to spread false voting information by text, email, social media channels or publications. “AI tools could be used to make audio or video files impersonating election officials that spread incorrect information to the public about the security or integrity of the elections process,” the agency said in a bulletin about 2024 election security.
“It’s actually easier to make someone do nothing than do something,” said Cohen, a Toronto-based consultant who advises Fortune 500 companies.
Source: Tech Daily Report (techdailyreport.net)