A deepfake caller pretending to be a Ukrainian official almost tricked a US Senator

Illustration: Cath Virginia / The Verge | Photos from Getty Images

The head of the Senate Foreign Relations Committee took a Zoom call with someone using deepfake technology to pose as a top Ukrainian official, the New York Times reports.

Sen. Benjamin L. Cardin (D-MD) received an email last Thursday that appeared to be from Dmytro Kuleba, Ukraine’s former foreign minister, asking him to meet on Zoom. The person on the other end of the call reportedly looked and sounded like Kuleba but was acting strangely. He asked Cardin “politically charged questions in relation to the upcoming election,” according to an email Senate security officials sent legislators that was obtained by the Times. The fake Kuleba demanded that Cardin give his opinion on foreign policy questions, including whether he supported firing long-range missiles into Russian territory.

The tenor of the conversation made Cardin suspicious, according to the Times’s report, and he reported it to the State Department. Officials there confirmed that Cardin hadn’t spoken to the real Kuleba but to an imposter, though it’s still unclear who was behind the call.

In a statement to the Times, Cardin said that “in recent days, a malign actor engaged in a deceptive attempt to have a conversation with me by posing as a known individual.” Cardin’s statement didn’t disclose who the “known individual” was, but the Senate security email did.

Senate security officials told lawmakers to be on the lookout for similar attempts, warning that “it is likely that other attempts will be made in the coming weeks.”

“While we have seen an increase of social engineering threats in the last several months and years, this attempt stands out due to its technical sophistication and believability,” the Senate security office email obtained by the Times read.

As AI tools become easier and cheaper to use, politically motivated deepfakes have increased in both frequency and effectiveness. In May, the Federal Communications Commission proposed levying multimillion-dollar fines on a political consultant behind a robocall campaign that impersonated President Joe Biden. On the call — which targeted New Hampshire voters ahead of the state’s primary election — the fake Biden told voters not to show up to the polls.

Elon Musk shared a deepfake video of Vice President Kamala Harris on X, in which Harris appeared to call herself “the ultimate diversity hire” who “had four years under the tutelage of the ultimate deep state puppet, a wonderful mentor, Joe Biden.” And former President Donald Trump posted an AI-generated “endorsement” from Taylor Swift on Truth Social in August — which Swift later cited in her real endorsement of Harris.