Samsung faked "moon photo" tech

Originally published at: Samsung faked "moon photo" tech | Boing Boing

8 Likes

Proof they were involved in faking the moon landing.

8 Likes

I have an iPhone 14 Pro and it can take some really good photos. It’s not really a great camera from a ‘taking pictures with a camera because that’s fun’ perspective, more from a ‘the resulting photograph’ and ‘it’s in your pocket so you’ll take the picture’ perspective. And it does all kinds of stuff: it’s a 48 MP main camera that makes 12 MP shots using a bunch of computational stuff no matter what mode you’re in (RAW or JPEG); it can do portrait-mode ‘fake bokeh’ quite well thanks to the Neural Engine™ and the VCSEL LiDAR chip (!!!); and it doesn’t whole-ass generate a moon picture and slap it on top of the blurry one you actually took. Boo, Samsung.

12 Likes

As someone who will hopefully be in the market for another smartphone to use for decent reference photos in the future…

Monkey Reaction GIF by Justin

12 Likes

Folk wisdom: The camera does not lie.

Samsung: Hold my beer!

17 Likes

The description of how a camera sees things regarding coloured pixels is a bit off, no?

Also… the Moon. Funny they would add all that code just to make the Moon look better, of all the things you could shoot. However, if shooting the Moon is a big marketing point, I can see how it would be worth fudging the results.

I have a little Panasonic video camera, does like 80x optical and 1400x digital… those are crazy numbers, but for being such a cheap camera, the optical zoom is incredible. There’s no way other than some future magic that a cell phone can have the same amounts of glass & travel to make that possible.

3 Likes

The moon is hard to photograph at night. Not by itself, though: you just need a sufficiently long lens and a tripod.

The problem is that the moon is lit by the sun and is brighter than whatever you have in the background. The exposure needed to get the background looking halfway decent makes the moon look blurry. The exposure needed to get the moon looking sharp makes the background look faint.

There are plenty of ways around this (everything from blending two shots at different exposures to just inserting the moon).
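
For the curious, here’s a minimal sketch of the “blend two exposures” approach, assuming you shot one frame exposed for the moon and one for the background from a tripod (the filenames and the highlight threshold are made up):

```python
import cv2
import numpy as np

# Two tripod shots of the same scene (hypothetical filenames): one exposed
# for the moon, one exposed for the background.
moon_exp = cv2.imread("short_exposure.jpg").astype(np.float32)
bg_exp = cv2.imread("long_exposure.jpg").astype(np.float32)

# Wherever the background frame blew out (the moon), take pixels from the
# short exposure instead; feather the mask so the seam isn't visible.
gray = cv2.cvtColor(bg_exp, cv2.COLOR_BGR2GRAY)
mask = cv2.GaussianBlur((gray > 250).astype(np.float32), (31, 31), 0)
mask = mask[..., None]  # broadcast the mask over the color channels

blended = mask * moon_exp + (1.0 - mask) * bg_exp
cv2.imwrite("blended.jpg", np.clip(blended, 0, 255).astype(np.uint8))
```

Crude, but it’s the honest version of the trick: every pixel came from something you actually shot.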

Another issue is that the moon appears to be pretty damn small in photos except at long focal lengths.

Samsung has just managed to conquer the technical challenges in the worst way possible.

16 Likes

Since the moon is tidally locked to the Earth, it’s very easy to train your model on other moon images and just slap that texture on when a moon-like thing is detected.

It is tidally locked, but its elliptical orbit and the tilt of its spin axis and orbital plane allow for libration. This means that over time you can see more than 50% of the lunar surface from Earth. I wonder whether Samsung accounts for that?
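
To illustrate how little machinery the crude version of that needs, here’s a deliberately dumb sketch: find one bright, roughly circular blob and paste a canned texture over it. This is a caricature of the idea, not Samsung’s actual pipeline, and every filename and threshold is made up (image-border handling omitted for brevity):

```python
import cv2
import numpy as np

frame = cv2.imread("zoomed_shot.jpg")      # hypothetical input photo
texture = cv2.imread("canned_moon.png")    # pre-baked sharp moon texture

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
# "Moon-like thing detected": look for a single bright circular blob.
circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=2, minDist=1000,
                           param1=100, param2=40, minRadius=20, maxRadius=300)

if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)
    patch = cv2.resize(texture, (2 * r, 2 * r))
    # "Slap that texture" over whatever was actually photographed.
    frame[y - r:y + r, x - r:x + r] = patch
    cv2.imwrite("enhanced.jpg", frame)
```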


14 Likes

star wars that's no moon GIF

11 Likes

Apparently, Samsung has been doing something similar to this for years. From a 2021 mspoweruser article:

AI will first start by detecting the scene/image at the preview stage by testing it against an AI model trained on hundreds of thousands of images. Once the camera detects and identifies the image as a certain scene, for example the Moon, it then offers a detail-enhancing function by reducing blur and noise. Additionally, in low-light/high-zoom situations, our Super Resolution processing happens (i.e., multiple frames/exposures are captured > a reference frame is selected > the frames are aligned and registered > solution output). The actual photo will typically be higher quality than the camera preview. This is due to the additional AI-based multi-image processing that occurs as the photo is captured.
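
The multi-frame part of that description is bog-standard computational photography. A toy version of “reference frame > alignment and registration > output” might look like this (this is the generic technique, not Samsung’s code; real pipelines add super-resolution, deghosting, and more):

```python
import cv2
import numpy as np

def align_and_merge(frames):
    """Pick a reference frame, register the others to it with ECC,
    then average to cut noise (roughly by sqrt(N))."""
    ref = frames[0]
    ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    stack = [ref.astype(np.float32)]
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        warp = np.eye(2, 3, dtype=np.float32)
        # Estimate the shift between this frame and the reference.
        _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                       cv2.MOTION_TRANSLATION)
        aligned = cv2.warpAffine(frame, warp, (ref.shape[1], ref.shape[0]),
                                 flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
        stack.append(aligned.astype(np.float32))
    return np.mean(stack, axis=0).astype(np.uint8)
```

None of that invents detail; it only recovers signal that was actually captured. The controversy is about the step beyond this.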

So what would happen if, for example, the moon sustained a major impact? Do you get a crisp representation of the moon with a crudely outlined new crater? Or would it smooth over the crater completely until the next update?

In reality (what does that even mean anymore?), the phone is adding detail that isn’t unequivocally there. This fact makes their statement that it’s simply “reducing blurs and noises” seem like so much blur and noise itself.

5 Likes

It’s pretty vague, but it doesn’t sound like it’s doing outright violence to how demosaicing works in cameras using color filter arrays (which is the vast majority of them, excluding dedicated greyscale stuff and anything that’s still using one of those Foveon sensors or a derivative of the concept, which is pretty exotic).
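
For anyone unfamiliar: demosaicing is the step that turns the sensor’s one-color-sample-per-pixel mosaic into a full-color image by interpolating the two missing channels at every pixel. In OpenCV it’s a one-liner (an RGGB Bayer layout and the input filename are assumptions here):

```python
import cv2

# Raw Bayer frame: a single color sample per pixel in an RGGB mosaic.
raw = cv2.imread("bayer_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
# Demosaic: interpolate the two missing color channels at every pixel.
rgb = cv2.cvtColor(raw, cv2.COLOR_BayerRG2BGR)
```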

1 Like

Elon Musk Moon GIF
It’d be funny if the AI got hacked.

14 Likes

How so? I guess it is debatable whether “how red/green/blue” is a good way to describe what those channels represent (is white greener than black?).

1 Like

It’s kinda remarkable, and a sign of the times, that it’s so much cheaper and easier to get decent photos by building ridiculously advanced AI (or at least what would have been considered advanced just a few years ago) into a camera than by improving the quality of the optics. But I definitely don’t like where this is going. AI is dirt cheap these days, so I expect most phone/camera makers will lean hard on it in the coming years to let cheaply built cameras deliver photos that are beautiful lies that can’t be trusted. It’s bad enough when bad actors fake photos, but if this tech becomes ubiquitous then we won’t be able to take even the most honest folks’ photos at face value.

Example: a photo where the camera takes its best guess as to what the number on a car license plate off in the distance says, which may or may not reflect reality.

12 Likes

As someone who worked in mobile computational photography for a few years (on barrel distortion correction and focus stacking), I hate to be the bearer of bad news: 90% of the advances in cellphone cameras in the last 20 years have been software processing. There’s been very little improvement in the lenses and sensors because we’re already up against physics there.
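
For a flavor of what that software processing looks like, barrel distortion correction is essentially a single calibrated warp. Here’s a minimal sketch with OpenCV; the camera matrix and distortion coefficients below are made-up placeholders, since real values come from lens calibration:

```python
import cv2
import numpy as np

img = cv2.imread("wide_angle_shot.jpg")  # hypothetical input
h, w = img.shape[:2]

# Camera matrix K and distortion coefficients (k1, k2, p1, p2, k3).
# Placeholder values; real ones come from a calibration procedure.
K = np.array([[w, 0, w / 2],
              [0, w, h / 2],
              [0, 0, 1]], dtype=np.float32)
dist = np.array([-0.25, 0.05, 0, 0, 0], dtype=np.float32)

undistorted = cv2.undistort(img, K, dist)
cv2.imwrite("corrected.jpg", undistorted)
```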

What Samsung has done is definitely pushing it, but there’s also no question this is the direction things are going. Smartphone cameras are already “strong suggestions of reality” and that suggestion will only get stronger.

13 Likes

So are there groups working on ‘remote vuln in planar lens stack’ using maybe structured light, like there are in crypto?
[Adds obvious a.f. orbiting DLP lens between moon and sun that puts adversarial images up on the moon to look like guac, Harrison Ford’s duck lips, etc.]

4 Likes

Algorithmically generated images coming out of cameras are going to be a huge issue, in large part because people don’t realize that’s what they’re looking at (even as the divergence between what’s in the image and what was actually photographed keeps growing). We’re already seeing problems with people using image editing tools that invent details that didn’t exist in the original image, then mistaking the artifacts for something real (and, worse, thinking they prove something). This is just going to get messier and messier as tech companies ramp up these kinds of features while consumers remain clueless about what’s going on.

4 Likes

Something I didn’t realize until I started trying to photograph the moon with a long lens and tripod is just how fast the moon moves. As @anonotwit says, anything but a very short exposure will make the moon look blurry, and that blur is coming from the moon’s motion, not camera shake or misfocus or anything like that (although I guess you can always get those too).

If all you want is a picture of the moon, using a short exposure isn’t a problem. The moon is very bright, so a short exposure will probably give you all the light you need. In fact, you’ll want a short exposure/small aperture/low ISO to get any kind of detail at all. Otherwise, you’ll just get a blazing white blob.

The problem, as @anonotwit points out, is that the moon will be so much brighter than everything else in your scene – unless the things in your scene are actually on fire – that exposing correctly for the moon will result in everything else disappearing into blackness.
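
To put rough numbers on that gap: the “Looney 11” rule of thumb says a sunlit full moon exposes at about f/11 with a shutter speed of 1/ISO, while a landscape lit only by moonlight is commonly quoted somewhere around -2 EV at ISO 100. That’s a spread of roughly 15-16 stops, which is more than a single exposure can hold:

```python
import math

def ev100(aperture, shutter_s):
    """Exposure value at ISO 100: EV = log2(N^2 / t)."""
    return math.log2(aperture ** 2 / shutter_s)

moon = ev100(11.0, 1 / 100)  # 'Looney 11': f/11 at 1/100 s, ISO 100 -> ~13.6 EV
landscape = -2.0             # commonly quoted value for a moonlit landscape
print(f"gap: ~{moon - landscape:.0f} stops")  # ~16 stops
```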

Most of the impressive moon shots you see use assorted tricks to get around this, including compositing multiple images shot at different exposures. Samsung’s trick sucks because it’s not under the control of the photographer. It’s just “Oh, that looks like the moon, here, have a moon. Splat!”

The problem with ‘computational photography’ is that it’s optimized for the assumed most common cases, so it will fail badly on edge cases. I’ve seen examples of the iPhone failing hard at, say, small lettering. A conventional camera might have delivered a blurry but legible impression of what was actually there; a computationally-assisted smartphone camera will chew it up and return total garbage because its models aren’t optimized to pay attention to that kind of scene element.

Samsung’s approach looks like a misstep, but I bet we’ll see more of this in future, not less. How about a phone camera that “optimizes” faces to yield more aesthetically pleasing images of the subjects? I’m sure that already happens to some degree, but the same reasoning that says “People will like our phone more if we fake their moon shots so that they look spectacular” will lead the manufacturers to reason that “People will like our phone more if we make their friends look like rockstars”.

I expect this to be a problem for people who use their smartphones for documentation – dentists, technicians, police officers. After a certain point, a smartphone photograph won’t be a trustworthy record of what was actually there any more.

Incidentally, all this suggests a simple strategy for aliens wanting to visit Earth unrecorded: just rig your UFO to look like a big bright disk with some vaguely moon-like details and the Earthlings won’t be able to get a single useful picture of your craft.

6 Likes

There are already filters that do that.

I’ve read that in South Korea, formal portrait photos, for instance for résumés, are always heavily Photoshopped. The photo studios do it automatically and for free, because it is simply part of the process of taking a photo. Koreans sometimes get into trouble at border controls because their passport photos are ’shopped to the point that they look like different people.

This universal culture of “improving” photos would explain why Samsung thought that automatically and secretly replacing the moon would be a great idea.

3 Likes

My mind goes to separate R, G & B sensor chips (3-CCD/3-CMOS), unlike the single CMOS sensor in the Samsung. The end result is only as good as the first data sample, no matter how good your algorithms and filters are. Video is somewhat forgiving in this respect, but for stills I would think you want THE best input stage possible.

1 Like