RIT researchers developing tool to help journalists spot deepfakes

You can’t trust everything you see online. That statement is probably truer now than ever, as artificial intelligence is increasingly used to create videos that aren’t what they seem. Researchers at Rochester Institute of Technology’s Global Cybersecurity Institute are developing a tool to help journalists spot “deepfakes.”

Professors at RIT point to a video from a few years back, of actor and comedian Bill Hader’s appearance on a late-night talk show, as a perfect example of the power of artificial intelligence, or AI. As Hader impersonates actor Tom Cruise, video-manipulation techniques gradually transform Hader’s face into Cruise’s.

Such videos can be fun. But RIT researchers are also looking at the dangerous side of artificial intelligence: deepfakes. One such video depicts Russian President Vladimir Putin claiming that Russian troops would leave Ukraine.

“Well, we know none of that’s true,” said Dr. Matthew Wright, chair of computing security at RIT.

“AI can often be very confidently wrong,” said researcher and Ph.D. candidate John Sohrawardi. “Like 97% fake, but it's actually a real video.”

Sohrawardi and Wright are working on deepfake detection. They point to a manipulated video of former President Barack Obama, created by actor and director Jordan Peele, as one example of how realistic a fake can be.

“It's just the president saying something, it sounds like, we have a sense that these things can be faked,” said Wright. “But when it's a controversial or confidential source, then maybe it's more believable and it can be potentially more dangerous.”

The researchers’ “DeFake” project, still in the works, is a tool to help journalists spot deepfakes, especially when traditional fact-checking methods don’t work.

“This is where we come in, and hopefully we are building a tool that is somewhat reliable,” said Sohrawardi. “But at the same time, we don’t want them to rely on it 100%.”
