An algorithm for detecting face-swaps in videos

Originally published at:


So… pee pee tape is real


“…but it only slightly encumbers forgery detection for deep-learning methods trained exactly on the forged output data.”

Roger That!


Sounds like we need a trace buster buster buster


As you might imagine, the very techniques they’re using here could themselves be employed to produce better deepfakes.
I was going to say…

Though really the biggest problem with deep fakes is not the faked footage, but that legitimate footage will now be denied by those who don’t want to believe it. No proof will now ever suffice to convince some people of the existence of some events.


…until somebody decides to train the deep-fakes program against the detection system to make realistic faces that can’t be detected.
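That cat-and-mouse loop can be sketched in miniature. This is a purely illustrative toy, not anyone's actual system: the "detector" thresholds a single made-up artifact score, and the "forger" keeps nudging one parameter until its output slips under the threshold. All names and numbers here are hypothetical.

```python
# Toy sketch of adversarial evasion: a detector keys on an artifact score,
# and the forger iteratively adjusts its output to shrink that score.
def artifact_score(blur: float) -> float:
    # Pretend the detector looks for a blending artifact that fades
    # as the forger smooths its output more. Entirely made up.
    return max(0.0, 1.0 - blur)

THRESHOLD = 0.3  # detector flags anything scoring above this

blur = 0.0
steps = 0
while artifact_score(blur) > THRESHOLD:
    blur += 0.05  # forger's counter-move against the detector
    steps += 1

# After enough steps the fake evades the fixed detector.
print(steps, artifact_score(blur) <= THRESHOLD)
```

The point of the toy: any fixed detector becomes a training signal for the next generation of fakes, which is exactly the worry raised above.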


Really, we will have to add signatures to cameras so you can detect whether the footage has been tampered with after recording. Nikon had a system like that at one point for their SLRs.
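The idea of in-camera signing can be sketched with Python's standard library. This is a hedged illustration, not Nikon's actual scheme: the function names are made up, and a real camera would use an asymmetric signature (so verifiers don't need the secret) rather than the HMAC used here for brevity.

```python
# Sketch of tamper-evident footage: the camera signs the raw bytes at
# capture time; any post-recording edit breaks verification.
import hashlib
import hmac

def sign_footage(key: bytes, footage: bytes) -> bytes:
    # HMAC-SHA256 over the file bytes (hypothetical stand-in for a
    # real in-camera asymmetric signature).
    return hmac.new(key, footage, hashlib.sha256).digest()

def verify_footage(key: bytes, footage: bytes, tag: bytes) -> bool:
    # Constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(sign_footage(key, footage), tag)

key = b"camera-secret-key"
original = b"\x00\x01 frame data ..."
tag = sign_footage(key, original)

print(verify_footage(key, original, tag))            # True: untouched
print(verify_footage(key, original + b"edit", tag))  # False: modified
```

The catch, of course, is key management: if the camera's secret leaks, forged footage can be signed too, which is presumably why such systems haven't caught on.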


Yeah, although I’m afraid that won’t stop people in this new era of conspiracies and labeling stories “fake news.” I mean, Trump himself floated the idea of denying his “grab 'em by the pussy” tape, even after he had already admitted it was him. If he had known about deep fakes, I’m sure he would have denied it when it first came out.


Can I just get one of those by itself? (Asking for a friend.)

"Unkwalifide" Face-Swap Video


Posting this Kimmel video not because it’s funny (it is), but because of the use of face-swap video at the end.

Is this the first nationally broadcast face-swap video?

It can tell from some of the pixels and from seeing quite a few fakes in its time.

This is very interesting. I guess this was bound to happen. Besides all the celebrity deepfake porn, this tech will be needed to fight against deepfakes with a political agenda. We all know how much fake news and videos can impact elections or public opinion.

This topic was automatically closed after 5 days. New replies are no longer allowed.