President Donald Trump stands behind a podium against a stately backdrop and speaks directly to the Belgian people.

“I had the (guts) to withdraw from the Paris climate agreement and so should you because what you guys are doing right now in Belgium is actually worse,” he says. His lips move unnaturally quickly, but the voice, body language, and mannerisms are all his, if a little out of place. “You guys agreed, but you’re not taking any measures, only blah blah blah, bing bang boom.”

He speaks for several minutes in the video, which has been viewed nearly 24,000 times on Twitter, before urging Belgians to sign a petition asking their government to pull out of the climate accord.

The video is a fairly obvious fake, created for a small Belgian socialist party using a rapidly improving technology that employs artificial intelligence to weave together two people's faces, words, or movements.

When done well, the videos, known as deepfakes, can be quite convincing. The unethical applications are legion, and their potential for tricking voters is a source of great concern for many in the artificial intelligence and political communities.

In one clip, produced for BuzzFeed as an example of the technology, actor Jordan Peele's impression of Barack Obama is integrated seamlessly into a video of the former president speaking.

“It’s an incredibly powerful fake in some ways because the whole video is real except for just around the mouth part and it is literally a very powerful example of putting words into a world leader’s mouth. It’s pretty compelling,” said Hany Farid, a leading computer forensics expert at Dartmouth College who develops techniques for detecting deepfakes. “And those are the types of videos that worry me — where you can generate world leaders saying just about anything.”

There are openly available software applications that allow anyone with a high-performance computer and some experience in computer science to create low-quality deepfakes. It takes more skill to produce a video that could reasonably be expected to dupe voters and media outlets, said Tim Hwang, director of the Ethics and Governance of AI Initiative at the Berkman Klein Center for Internet & Society.

There is an ongoing bet among a group of artificial intelligence researchers over whether a deepfake will have a measurable impact on a political campaign before the end of this year, and Hwang falls into the camp that believes it won't happen.

In large part that’s because it’s much more cost-effective to create an altered picture of a politician or write a baseless news story and share it on social media, he said. Those methods were widely deployed during the 2016 election and beyond.

But one of the more insidious aspects of deepfakes is that, regardless of the quality of the videos, the mere existence of the technology has the potential to influence elections, Farid said, because when authentic embarrassing videos do emerge, candidates can dismiss them as fakes.

Just last week in New Hampshire, state Rep. Frank Sapareto, R-Derry, was accused of assaulting a California man, Jonathan Carter, who claims in a lawsuit that he and Sapareto made an adult movie together. The filmmaker’s lawyer sent shots from the purported film to several news organizations.

Sapareto has denied the allegations and accused the filmmaker of attempting to torpedo his reelection campaign. On Tuesday he went further, telling NECN that, while the pictures were indeed of him, they had been doctored into the videos.

It’s the kind of defense — whether true or not — that could become more prevalent with the proliferation of deepfakes.

“I think there’s a greater distrust in what any campaign or politician says and the term fake news is thrown around a lot more freely now than it used to be … but if your campaign is being targeted with fake news and a hoax, you have to make an attempt to set the record straight and you have to do it convincingly,” said Ryan Williams, a Republican consultant who has worked on gubernatorial and presidential campaigns in New Hampshire. “Going out and lying about a video being fake is not a good strategy; it would eventually backfire on you.”

Farid said he’s recommended to campaigns that a staffer follow a candidate everywhere he or she goes with a video camera to ensure there’s always more than one copy of footage.

Earlier this year, a group of former politicians, intelligence officials, and academics from North America and Europe formed the Transatlantic Commission on Election Integrity to help democracies combat threats to their electoral systems.

Deepfakes are one of the many election interference tools the commission is looking to address, said Eileen Donahoe, a member of the group and executive director of the Stanford Global Digital Policy Incubator.

And she doesn’t believe that the modern method used by campaigns to dispel rumors and lies — simply shooting out a press release declaring it to be fake — will work.

“Right there, you’re playing into a narrative where you will lose,” Donahoe said. “We don’t want it to come out first as a partisan case. It needs to be presented as a threat to all of us. And then as individuals are hit with this, they can’t go back to the old playbook. … We have to educate politicians about what kind of narrative response will actually be effective and I think the risk is that the old playbook will just allow (deepfakes) to be utilized in this more polarized, poisoned environment of fake news.”

The Transatlantic Commission on Election Integrity is pursuing a two-pronged response to the problem: educating voters about deepfakes and developing technology to detect them.

There are techniques that work, Farid said. Subjects in deepfake videos tend not to blink as often as a real person, and there are ways to detect an absence of natural blood flow in faces that could indicate another person's face has been swapped into a video.
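The blink cue Farid describes can be illustrated with a toy sketch. The example below is purely hypothetical and assumes per-frame "eye aspect ratio" (EAR) values have already been extracted from a video with a facial-landmark detector; the threshold of 0.2 and the typical blink rate of 15 per minute are illustrative placeholders, not parameters from any real forensic system.

```python
# Toy blink-rate heuristic for flagging suspiciously blink-free footage.
# Assumes a list of per-frame eye aspect ratio (EAR) values: the ratio
# drops sharply when the eyes close. All constants are illustrative.

EAR_THRESHOLD = 0.2            # eyes count as "closed" below this ratio
TYPICAL_BLINKS_PER_MIN = 15.0  # rough resting-adult average (assumption)

def count_blinks(ear_series, threshold=EAR_THRESHOLD):
    """Count open-to-closed transitions in a per-frame EAR sequence."""
    blinks = 0
    eyes_closed = False
    for ear in ear_series:
        if ear < threshold and not eyes_closed:
            blinks += 1
            eyes_closed = True
        elif ear >= threshold:
            eyes_closed = False
    return blinks

def blink_rate_suspicious(ear_series, fps, min_fraction=0.25):
    """Flag clips whose blink rate falls far below the human norm."""
    minutes = len(ear_series) / fps / 60.0
    if minutes == 0:
        return False
    rate = count_blinks(ear_series) / minutes
    return rate < TYPICAL_BLINKS_PER_MIN * min_fraction

# Example: 30 seconds of video at 30 fps containing a single brief blink.
frames = [0.3] * 900
frames[100:105] = [0.1] * 5    # one five-frame blink
print(count_blinks(frames))                    # 1
print(blink_rate_suspicious(frames, fps=30))   # True: ~2 blinks/min
```

Real detectors work on raw video and are far more sophisticated, but the underlying idea is the same: measure a physiological signal a generator tends to get wrong and compare it against human baselines.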

But deepfake creators are finding ways to improve their product and avoid detection almost as fast as the forensic techniques can be developed.

“There’s this cat and mouse nature to the game — there’s always been this cat and mouse nature — the difference now is that it used to take years and now it takes months,” Farid said.