Lytro Illum Camera: The First High-End Camera to Capture the Entire Light Field

Yes, a good photographer will already know where they want to focus and will have framed their subject in such a way as to accent this choice. They don’t shoot first and focus later.

This only deals with one kind of focus issue anyway, depth of focus; it is still just as easy to get motion blur or any of the other focus problems.

I personally think that photographers are the wrong market for this tech. Where it would be best is other applications, like fixed street cams, where there is no active photographer and you don’t know until afterwards whether there was something that needed to be captured.

Don’t get me wrong, it is super cool tech and I love reading about it; I just see a disconnect between the tech and the application.

There’s no attempt to even mention that the camera being “reviewed” does not offer any other viewfinder. Point being, the photography “reviewers” are either treating big-bucks cameras like bulky telephones or else not even phoning it in.

It captures the angle of the light, so it is indeed 3D: the same sort of 3D you’d get from a stereoscopic camera, to be viewed as an anaglyph (red/cyan glasses) or side by side (cross-eyed).
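
For what it’s worth, the red/cyan version is simple to put together from any two aligned views; here’s a minimal numpy sketch (the make_anaglyph name and the two pre-aligned RGB inputs are my own assumptions, nothing camera-specific):

```python
import numpy as np

def make_anaglyph(left, right):
    """Compose a red/cyan anaglyph from two aligned RGB views
    (H x W x 3 uint8 arrays). Red comes from the left eye,
    green and blue from the right eye."""
    anaglyph = np.empty_like(left)
    anaglyph[..., 0] = left[..., 0]     # red channel from the left view
    anaglyph[..., 1:] = right[..., 1:]  # green + blue from the right view
    return anaglyph
```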

Just read more about it and saw some examples…

Capturing the angle of the light using a single sensor doesn’t give you multiple physically separated perspectives (bi-optic). Their site says it “gives you the illusion of a 3D image through slight perspective shifts”, so what it does is calculate the depth of objects by focal distance and create a 3D image based on that; you end up with a 3D image that looks more like flat objects at different depths in a 3D field then true 3D.

…but the 3D aspect is cool. Just not as cool as a camera meant to capture true 3D.
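
If you want to poke at why the perspective shifts are only “slight”: the microlens data is usually treated as a 4D light field, where each pixel under a lenslet samples a different direction. Taking the same angular sample under every lenslet gives one slightly shifted “pinhole” view, and shifting-and-summing those views synthetically refocuses. A rough sketch, assuming the capture has already been decoded into an array lf[u, v, y, x, channel] (that layout and the function names are my own, not Lytro’s pipeline):

```python
import numpy as np

def sub_aperture_view(lf, u, v):
    """One 'pinhole' view: the same angular sample taken under every lenslet.
    Stepping u or v by one gives the slight perspective shift."""
    return lf[u, v]                              # shape (Y, X, 3)

def refocus(lf, alpha):
    """Shift-and-sum refocusing: slide each sub-aperture view by an amount
    proportional to its angular offset, then average. alpha picks the
    synthetic focal plane (alpha = 1.0 keeps the captured focus)."""
    U, V, Y, X, C = lf.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((Y, X, C))
    for u in range(U):
        for v in range(V):
            dy = int(round((u - cu) * (1 - 1 / alpha)))
            dx = int(round((v - cv) * (1 - 1 / alpha)))
            # np.roll wraps at the borders; crude, but fine for a demo
            out += np.roll(lf[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)
```

The available shift is bounded by the diameter of the main lens aperture, which is why the parallax is subtle compared to a two-lens stereo rig and why the result reads as layers rather than full 3D geometry.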

To which review are you referring? The main post isn’t a review; it’s an ad for the store.

There’s an interesting review at The Verge.

Shooting with the Illum requires a complete rewiring of the way you look at a scene. A great living picture has two subjects, one in the foreground and one in the background. I began trying to fill every photo with as many things as possible, to put a telling detail or funny image right behind the subject of my photo. The Illum has its own set of rules for how to get a great shot; it comes with a huge, fun, strange learning curve, and it’s unlike any camera I’ve ever shot with. You’re shooting with layers, shooting something that people will be able to interact with later.

Kind of reminds me of the iPhone 6s’s Live Photos: an alternative to traditional photography, and one that may not take off.

Could that be used for rudimentary 3D scanning/rangefinding?

Could the thing be coupled to a microscope and solve the issue of a too-narrow focal plane?

But of course, it’s expensive, and not heavily discounted.

This video shows what’s possible with a microlens array.

excellent idea, that would be a great application of this tech.

possibly.

no. it isn’t that kind of 3D. it is “simulated 3D” in their terms; the results are closer to a popup book then the kind of 3D image you’d need for 3D printing or for most of the things you’d use a 3D scanner for.

see the video I posted. 3D reconstruction is demonstrated, though it’s 3D reconstruction of a thin section.

i just finished watching the video. it was impressive how much 3D information they were able to reconstruct using the oil lens, which captures light from a 60-degree offset.

I’m assuming they were able to capture such a large offset because the microscope lens was so close to the subject?

i can see a lot of potential for this tech in various areas.
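
I suspect the wide angle is less about the working distance per se and more about the numerical aperture: NA = n·sin(θ), and oil immersion pushes both n and the collectable half-angle way up. A quick back-of-the-envelope check (the NA = 1.4 and n ≈ 1.52 figures are typical oil-immersion values I’m assuming, not numbers from the video):

```python
import math

# Acceptance half-angle of an objective from its numerical aperture:
# NA = n * sin(theta)  =>  theta = asin(NA / n)
na = 1.4       # assumed: a typical high-end oil-immersion objective
n_oil = 1.518  # assumed: refractive index of standard immersion oil
theta = math.degrees(math.asin(na / n_oil))
print(f"acceptance half-angle ≈ {theta:.0f}°")   # ≈ 67°
```

Which is at least in the same ballpark as the 60-degree figure mentioned above.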

It actually could work well for your use case, simply because (a) you can do the focusing afterwards, on your desktop, and (b) the on-camera tools provide visual feedback on the depth of field / refocusing range, so you don’t need to go by the clarity of lines.

There’s a technical paper on the subject here. I’m not well versed in optics, though.

Nice one. That makes more sense I guess, having flat 2D ‘planes’ at different depths.
On another topic, the pedant in me just has to point out the then/than grammatical error in your post.

I do apologise; you’re not the only one who makes this error, but it’s like fingernails on a chalkboard to me.

Yes, of course. :smiley:

I thought about going layer by layer, tagging the areas that are best in focus as being at the given z-distance.
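
That layer-by-layer idea is basically depth-from-focus: render a stack of refocused planes, score local sharpness in each, and keep, per pixel, the z of the sharpest layer. A rough sketch of the tagging step (assuming the stack is already a (Z, H, W) numpy array of grayscale frames; the function name and the Laplacian focus measure are my choices, not anything from Lytro’s tools):

```python
import numpy as np
from scipy import ndimage

def depth_from_focus(stack, z_values):
    """stack: (Z, H, W) refocused grayscale frames; z_values: the
    z-distance assigned to each layer. Returns an (H, W) depth map that
    picks, per pixel, the layer with the strongest Laplacian response."""
    sharpness = np.stack([
        np.abs(ndimage.laplace(frame.astype(float)))   # edge energy = in focus
        for frame in stack
    ])
    best_layer = np.argmax(sharpness, axis=0)           # (H, W) layer indices
    return np.asarray(z_values)[best_layer]
```

The Laplacian is just one possible focus measure; local variance or gradient energy over a small window would work just as well.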

i am aware of that glitch; it is an old one and a hard one to shake. i try, but it crops back up. maybe electric shock therapy?

Maybe swap the A and E buttons on your keyboard.
:grin:

or map both to æ !!! (or is that making it worse? lol)

That’d work, or map it to the ampersand (&).

we could compromise on thæn


oops, @redesigned proposed this already
