"Removing the water": physically-accurate color correction algorithm for underwater photography

Originally published at: https://boingboing.net/2019/11/14/removing-the-water-physic.html


I’ve taken so many underwater shots that look blandly blue-green — I’d love to run them through Sea-Thru.


That’s a clever bit of work and a great name.


Interesting - but isn’t what she’s doing just a roundabout way of white-balancing the image? As with the OP’s comment in the story, I felt that his own work balancing the image with Photoshop’s built-in tools looked more realistic than even Sea-Thru. It’s subjective to a degree, of course.

This made me think about underwater photography before Sea-Thru: how did professional photographers and videographers achieve the beautiful imagery they did before her invention? Proper post-production - colour correction, etc. - of course! So I am left wondering what Sea-Thru brings to the table. Is it really more accurate, or is it just another creative tool for underwater photographers, better in some situations than others but not a universal solution?


Sea-Thru uses photogrammetry: it estimates how far objects are from the camera, along with other environmental factors, before it corrects the scene, and it corrects each part of the scene differently as a result. It’s tested against reality (i.e. it’s trained on underwater scenes before and after the turbid water is drained).
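Roughly, that distance-dependent correction comes from inverting an underwater image-formation model. Here is a toy sketch of that idea - not the paper's actual algorithm, and every coefficient below is a made-up illustrative value:

```python
import numpy as np

def undo_water(image, depth, beta_d, beta_b, backscatter):
    """Invert a simplified underwater image-formation model:
    I = J * exp(-beta_d * z) + B * (1 - exp(-beta_b * z)).
    image: HxWx3 floats in [0, 1]; depth: HxW range map in meters.
    beta_d / beta_b: per-channel attenuation / backscatter coefficients;
    backscatter: per-channel veiling-light color B. All values hypothetical.
    """
    z = depth[..., None]                       # broadcast depth over channels
    direct = image - backscatter * (1.0 - np.exp(-beta_b * z))
    recovered = direct * np.exp(beta_d * z)    # undo per-channel attenuation
    return np.clip(recovered, 0.0, 1.0)

# Toy example: red attenuates fastest, so distant pixels get the biggest red boost.
img = np.full((2, 2, 3), 0.3)
depth = np.array([[1.0, 1.0], [5.0, 5.0]])    # near row, far row (meters)
beta_d = np.array([0.40, 0.10, 0.05])         # R, G, B attenuation (assumed)
beta_b = np.array([0.30, 0.15, 0.10])         # backscatter coefficients (assumed)
B = np.array([0.05, 0.15, 0.20])              # blue-green veiling light (assumed)
out = undo_water(img, depth, beta_d, beta_b, B)
```

The point is the structure, not the numbers: because the correction factor grows with the range-map value, pixels at different distances get different corrections - which a single global slider cannot do.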

All I did was adjust the whole image using the color correction sliders. No science at all, just whatever made it look “right”.

Science will always have to contend with aesthetic preference, especially when the aesthetic is contrived to make the science look bad. :smiling_imp:


Professional videographers use two other techniques: underwater lighting, and publishing only low-turbidity photos. She is doing color correction, but the corrections depend on distance - you can do this by hand by adjusting different areas of the picture differently, but it would be tedious and still not scientifically accurate (she uses a color chart to measure what corrections are needed, and builds a range map to measure distances).
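The color-chart part is simple in principle. A hypothetical minimal version - nothing like the paper's full method - just solves for the per-channel gains that map a photographed chart patch back to its known true color:

```python
import numpy as np

def gains_from_chart(measured_patch, reference_patch):
    """Per-channel correction gains from one known color-chart patch.
    measured_patch: mean RGB of the patch as photographed underwater;
    reference_patch: its true RGB under neutral light.
    Hypothetical helper for illustration only.
    """
    return np.asarray(reference_patch, dtype=float) / np.asarray(measured_patch, dtype=float)

# A neutral gray patch (0.5, 0.5, 0.5) photographed through blue-green water:
measured = np.array([0.20, 0.42, 0.48])       # assumed measurement
gains = gains_from_chart(measured, [0.5, 0.5, 0.5])
corrected = np.clip(measured * gains, 0.0, 1.0)   # back to neutral gray
```

A single chart only calibrates the correction at the chart's distance, though - which is exactly why the range map matters for everything else in the frame.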


Indeed. Looking closer at her sample image, I noted that Sea-Thru still has some issues - the coral has a fringe around it which, while the image may provide more accurate(?) data for scientists, would be aesthetically horrible for more creative purposes.


This reminds me of the Blue Sky photo of Mars, where the raw image was processed to give the photo a red sky, because… Mars! But a later study made the scientists realize that Mars’ sky is actually blue (in your face, Cohagen!).


This will be really helpful when scavenging through the remains of coastal metropolises wiped out by rising sea levels!


Very clever method, but really all you have to do is go to the Settings panel for the ocean and change “Ambient Occlusion” to Off.


Here, the paper’s example is seen raw (top left), corrected by Photoshop’s Auto Color tool (top right), corrected by Sea-Thru (bottom left) and corrected by me using the color balance sliders in Photoshop (bottom right).

Science! It works, bitches!


Hey, go easy on the “terrible edit” thing. The sequence was intended to show that Photoshop does the color correction poorly compared to Sea-Thru. Many people I’ve shown it to got the point. I do appreciate your own efforts in this regard. Nicely done.


The old-school method seems to be a variable red filter. It’s variable because the blue tinge varies with depth.

Better for closeups, and images without a lot of depth, maybe? The added problem is that a filter reduces the light available to take the picture.
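That light-loss tradeoff is easy to put a number on: a filter passing a fraction T of the light costs log2(1/T) stops of exposure. A quick back-of-the-envelope sketch, with an assumed transmission figure:

```python
import math

def stops_lost(transmission):
    """Exposure cost of a filter: stops of light lost for a given
    fractional transmission (e.g. 0.5 transmission = 1 full stop)."""
    return math.log2(1.0 / transmission)

# A dive filter passing ~50% of the light (assumed figure):
loss = stops_lost(0.5)   # 1.0 stop
```

One stop means doubling the ISO or halving the shutter speed to compensate - significant at depth, where light is already scarce.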


Passed this around the office today - a lot of interest. You might like some of these projects from clients using our software for underwater photogrammetry.


So how do we try the software? It’s a nice article, but is it available for use or for sale?

This topic was automatically closed after 5 days. New replies are no longer allowed.