Yes, a good photographer will already know where they want to focus and will have framed their subject in such a way as to accent this choice. They don’t shoot first and focus later.
This only deals with one kind of focus issue anyway, depth of field; it is still just as easy to end up with motion blur or any of the other focus problems.
I personally think that trying to market this tech to photographers is the wrong market. Where this would be best is in other applications, like fixed street cams, where there is no active photographer and you don’t know until afterwards whether there was something that needed to be captured.
Don’t get me wrong, it is super cool tech, and I love reading about it; I just see a disconnect between the tech and the application.
No attempt is even made to mention that the camera “reviewed” does not offer any other viewfinder. The point being that the photography “reviewers” are either treating big-bucks cameras like bulky telephones or else not even phoning it in.
It captures the angle of light, so it is indeed 3D: the same sort of 3D you’d get from a stereoscopic camera, to be viewed as an anaglyph (red/cyan glasses) or side by side (cross-eyed).
Capturing the angle of the light using a single sensor doesn’t give you multiple distanced perspectives (bi-optic). Their site says it “gives you the illusion of a 3D image through slight perspective shifts”, so what it does is calculate the depth of objects by focal distance and create a 3D image based on that; you end up with a 3D image that looks more like flat objects at different depths in a 3D field then true 3D.
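If it helps to picture how that depth estimate could work, here is a rough sketch of the usual depth-from-focus idea (my own illustration in Python with NumPy/SciPy, not Lytro's actual pipeline; the refocus stack input is assumed to come from whatever renders the light field at several focal distances):

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def depth_from_refocus_stack(refocus_stack, focal_depths, window=9):
    """Estimate a per-pixel depth map from a stack of refocused renders.

    refocus_stack: (n_depths, H, W) array -- the same shot rendered at each
                   refocus distance in focal_depths.
    focal_depths:  (n_depths,) array of refocus distances.
    Returns an (H, W) map holding, for each pixel, the distance at which
    that pixel looked sharpest.
    """
    # Local sharpness: absolute Laplacian response, averaged over a window.
    sharpness = np.stack([
        uniform_filter(np.abs(laplace(img)), size=window)
        for img in refocus_stack
    ])
    # For every pixel, pick the slice where it was sharpest.
    best = np.argmax(sharpness, axis=0)      # (H, W) slice indices
    return np.asarray(focal_depths)[best]    # (H, W) depth values
```

A map like that only gives you one depth per pixel, which is exactly why the result feels like cutouts stacked in space rather than real geometry.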
…but the 3D aspect is cool. Just not as cool as a camera meant to capture true 3D.
Shooting with the Illum requires a complete rewiring of the way you look at a scene. A great living picture has two subjects, one in the foreground and one in the background. I began trying to fill every photo with as many things as possible, to put a telling detail or funny image right behind the subject of my photo. The Illum has its own set of rules for how to get a great shot; it comes with a huge, fun, strange learning curve, and it’s unlike any camera I’ve ever shot with. You’re shooting with layers, shooting something that people will be able to interact with later.
Kind of reminds me of the iPhone 6s’s Live Photos: an alternative to traditional photography, and one that may not take off.
Excellent idea; that would be a great application of this tech.
possibly.
No, it isn’t that kind of 3D. It is “simulated 3D” in their terms; the results are closer to a pop-up book than the kind of 3D image you’d need for 3D printing or most of the things you’d use a 3D scanner for.
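To make the pop-up-book comparison concrete, here's a toy sketch (assuming you already have a photo plus a per-pixel depth map like the one sketched above; again, this is my own illustration, not Lytro's code) that fakes the "slight perspective shift" by quantizing depth into a few flat layers and sliding the near layers farther than the far ones:

```python
import numpy as np

def parallax_shift(image, depth_map, n_layers=4, max_shift=8):
    """Fake a small viewpoint shift from a single photo plus a depth map.

    image:     (H, W, 3) array.
    depth_map: (H, W) array, larger values = farther from the camera.
    Returns an (H, W, 3) image as seen from a slightly nudged viewpoint,
    built from a handful of flat layers slid sideways by different amounts.
    """
    out = np.zeros_like(image)
    # Quantize depth into n_layers bins: 0 = nearest ... n_layers-1 = farthest.
    edges = np.linspace(depth_map.min(), depth_map.max(), n_layers + 1)[1:-1]
    layer_of = np.digitize(depth_map, edges)
    # Paint far layers first so nearer layers cover them.
    for layer in range(n_layers - 1, -1, -1):
        # Near layers move the most (strong parallax); far layers barely move.
        shift = int(round(max_shift * (1 - layer / max(n_layers - 1, 1))))
        mask = layer_of == layer
        shifted_img = np.roll(image, shift, axis=1)
        shifted_mask = np.roll(mask, shift, axis=1)
        out[shifted_mask] = shifted_img[shifted_mask]
    return out
```

The blank holes left behind where a near layer slides away are the giveaway: you're looking at flat cutouts, not something you could feed to a 3D printer.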
I just finished watching the video. It was impressive how much 3D information they were able to reconstruct using the oil lens, which captures light from a 60-degree offset.
I’m assuming they were able to capture such a large offset because the microscope lens was so close to the subject?
I can see a lot of potential for this tech in various areas.
It actually could work well for your use case, simply because (a) you can do the focusing afterwards, on your desktop, and (b) the on-camera tools provide visual feedback on the depth-of-field / refocusing range, so you don’t need to go by the clarity of lines.
Nice one. That makes more sense, I guess: having flat 2D ‘planes’ at different depths.
On another topic, the pedant in me just has to point out the then/than grammatical error in your comment above.
I do apologise; you’re not the only one who makes this error, but it’s like fingernails on a chalkboard to me.