PHILADELPHIA (CBS) – Misinformation is nothing new to politics. Americans have seen fake photos, videos, and headlines popping up in their social media feeds for years. But there’s something about the 2024 race that worries former Department of Homeland Security Chief of Staff Miles Taylor.
“I do think we’re in a more precarious moment than we have been in any election cycle when it comes to misinformation and disinformation,” Taylor said.
Taylor and his group, The Future US, have spent this election cycle warning state and local election officials about the impact artificial intelligence could have on the race for the White House. He says AI technology itself isn't the problem, but how it's being used to spread misinformation.
In March, CBS Philadelphia spoke with Taylor, who said the advancing tech has made it easier than ever for anyone to make realistic-looking images, videos, and audio recordings related to the elections. Six months later, Taylor says he’s seeing this issue grow.
“We are seeing an uptick in deepfakes. We are seeing an uptick in inauthentic content meant to portray things about one candidate or another candidate that are misleading,” Taylor said.
Taylor does note that AI has not been a massively disruptive force so far in the 2024 election. But misinformation hasn’t been hard to find online.
The News Literacy Project this year launched its "Misinformation Dashboard," compiling more than 600 of the most viral fake, AI-generated, and misleading pieces of content it has seen circulating on social media.
But Taylor said it’s not necessarily what he’s seeing right now that concerns him most. His biggest fear, he says, is what he calls a “November surprise.”
“I’m concerned that bad actors, regardless of what side they’re on on the political spectrum, will try to misrepresent what the results are after the election,” Taylor said.
Taylor points to the slew of legal challenges filed following the 2020 election, noting that many of them were dismissed or resolved rather quickly. But he's concerned that, with the rapid advancement of AI technology, convincing fake videos could drag out the post-election process even further.
“Things like security camera footage at a polling place of poll workers allegedly throwing out ballots or lighting ballots on fire,” Taylor said. “It’s going to take a while to prove that that might be a deepfake.”
Kathy Boockvar knows misinformation well. She served as Pennsylvania's Secretary of the Commonwealth in 2020, overseeing the state's hotly contested elections.
“AI doesn’t necessarily create new problems, it just exacerbates already existing problems,” Boockvar said. “And it exacerbates them in a very big way.”
Boockvar said she has a number of concerns as November nears. She worries about interactive, AI-generated phone calls made to sound like local election officials. She also worries about misinformation leading to threats against election workers.
One of the biggest differences she notices between now and the 2020 race is where this misinformation is coming from. Boockvar says that four years ago, most of the fake videos, pictures, and headlines officials were seeing came from China, Russia, and Iran. But now, she says, that's changed.
"It's now domestic actors who are doing this at least as much as our overseas adversaries, which is horrifying," Boockvar said.
Taylor said his group will continue to educate state and local election officials about how to handle questions over fake and misleading content surrounding the vote. But his concerns stretch beyond Election Day and 2024 as a whole; he calls this year's race "the tip of the iceberg."
“These elections, and the misinformation we see, are a preview of how these deepfake, AI-powered tactics will affect other parts of our lives going into 2025 and beyond,” Taylor said.
Source link : http://www.bing.com/news/apiclick.aspx?ref=FexRss&aid=&tid=66da6fe3686b4c0a92a086cef4262ce6&url=https%3A%2F%2Fwww.cbsnews.com%2Fphiladelphia%2Fnews%2Fai-2024-election-donald-trump-kamala-harris%2F&c=6780781942962005812&mkt=en-us
Publish date : 2024-09-05 12:40:00