Watch how AI accessing a huge face database can create better restorations



This is both creepy and awesome.


This has been inevitable. With enough computing power you can examine multiple images of the same person, or multiple frames of a moving picture, to accumulate details that, in aggregate, were not present in any single image. Then, using that accumulated unchanging image data, you can modify still or moving photography to create restored, rather than assumed, detail. It’s amazing, and I’ve been looking forward to it!
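A minimal sketch of that multi-frame idea, using plain NumPy and entirely synthetic data (the “face” here is just a random array, and real pipelines must also align frames first): averaging several aligned noisy captures cancels independent noise, recovering detail that no single frame shows clearly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" image region, and several noisy captures of it
# (stand-ins for multiple frames or photos of the same subject).
true_img = rng.uniform(0.0, 1.0, size=(32, 32))
frames = [true_img + rng.normal(0.0, 0.2, size=true_img.shape)
          for _ in range(16)]

# Stacking: averaging aligned frames suppresses independent noise,
# so the accumulated image is closer to the truth than any one frame.
stacked = np.mean(frames, axis=0)

err_single = np.abs(frames[0] - true_img).mean()
err_stacked = np.abs(stacked - true_img).mean()
```

With 16 frames the averaged noise standard deviation drops by a factor of four, so `err_stacked` comes out well below `err_single`.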

I’m also waiting for iconic porn from prior decades to get run through this process, giving history fetishists a huge boost. New image technology always ends up being used to make porn.


This technology is available to use at the MyHeritage dot com website (currently free). It’s even creepier when you upload a blurry photo of someone you know. “Computer… enhance!”


Really interesting how the period-specific film and lighting qualities must impact this. The image of Marilyn, specifically, looks like it was taken in full color in that era. I can’t help but think other images of her biased it toward the reddish tinge of older photos.


DFDNet does not use other images of the same person to restore the photo. It compares the subject’s features (eyes, nose, mouth, hair, jawline, etc.) one by one against thousands of high-quality stock images until it finds the closest match. The technique has a lot in common with deepfakes because the subject’s features are replaced with other people’s. The red tint is the result of the colorization app and has nothing to do with DFDNet.
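A rough illustration of that per-feature matching idea — a simplified nearest-neighbour analogy on synthetic vectors, not DFDNet’s actual learned-dictionary implementation: a degraded feature patch is replaced by the closest high-quality example from a library.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical library of high-quality feature patches (e.g. eyes),
# each flattened to a 64-dimensional vector.
library = rng.uniform(0.0, 1.0, size=(1000, 64))

# Degraded patch from the photo being restored: here, library entry 42
# plus a little noise, standing in for blur and film grain.
degraded = library[42] + rng.normal(0.0, 0.05, size=64)

# Nearest-neighbour lookup: swap the degraded feature for the closest
# high-quality example, one feature at a time.
distances = np.linalg.norm(library - degraded, axis=1)
best = int(np.argmin(distances))
restored = library[best]
```

This is also why the result can drift toward “plausible” rather than “accurate”: the restored feature is literally someone else’s, chosen for similarity.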


So is that a choice by the artist to bias it toward what looks (to me, anyway) like period-specific coloration?

Also, welcome to BBS!

I dunno. To me, from the upper lip on down looks slightly off on both Che and Charlie.


That can’t be Charlie Chaplin – there’s no toothbrush mustache.


Thanks! The color is automatically generated by a separate AI app so any period bias is likely to be coincidental.


Is the library of stock images mostly newer than the vintage photos that are being up-rezzed? If so, I wonder how much modern bias is imparted to the “restored” photos. Subtleties in hair styles, facial hair, makeup, even things like skin cleanliness and dental hygiene, may differ between the period of the original photos and more modern stock images.


This seems like a situation where you’ll need to take a hard look at whether “better” is “more pleasing/believable” or “more accurate representation of what is available in the source material”.

A bot with a massive library of other sources and a talent for confabulation, like this one, is definitely well placed to draw a ‘restoration’ that will be much more pleasing; but it also has good odds of producing something little more than inspired by the material being restored.

For some purposes that’s ideal. For others that’s basically the worst possible outcome. Say, uprezzing old movies to look good on 8k++ 4D TV and doing a tense round of ‘missile site or not missile site’ on grainy satellite footage, respectively. The former instance is pretty much intended to look good; the latter would…not be well served…by a system that can artfully draw on a library of other ground site infrastructure to paint a beautifully plausible missile site.

Precisely. Look at how badly it botched Shane MacGowan’s photo!

Came here to make the ‘computer enhance’ gag, was not disappointed to see that someone had made it already. :slight_smile:

This topic was automatically closed after 5 days. New replies are no longer allowed.