In the last BB thread I asked how these cameras work. Is it an exposed sensor which uses a holographic process? If so how does it work with natural vs coherent light?
The way it works is you get 8 investors to give you $140 million in venture funding.
From Wikipedia: “Lytro’s light field sensor uses an array of micro-lenses placed in front of an otherwise conventional image sensor to sense intensity, color, and directional information.”
Ok, it does use the same principle as holography, but the microlenses covering patches of the sensor allow natural light to be used rather than the point-source coherent illumination required for ordinary holograms.
But this is the real deal and uses angle-sensitive pixels.
Am I correct in imagining this as an optical version of a phased-array synthetic aperture radar antenna?
No international shipping? Bummer…
Don’t worry. In a few years, you’ll be able to pick 'em up in your local dollar store.
I only buy items from the BB store if they’re more than 86% off. What deals!
Whoa, a TNG screenwriter responded to me!
I promise it is a real thing, now get back to work polarizing my sunglasses!
(edit) That is a pretty weak “repolarize the X” TNG-jargon joke, dobby…
Though reading the description, it is apparently the movement of the aircraft, more than sweeping the beam with the phased-array antenna, that creates the simulated larger aperture.
@Woodchuck45 - reverse the polarity on the phased lens array and you’ll get 87% off!
This is the regular price to buy one of these new on eBay. Or you can get a refurb one for $50 or less.
The concept is super neat, but the problem is that the resolution is very low, and the sensor is too small as well (which makes it hard to actually use the refocusing feature; everything is pretty much always in focus anyway).
I enjoyed the DigitalRevTV review though:
Any sort of beam scanning can be used. For capturing large swaths of terrain, it is better to scan sideways and fly over with an airplane. (Kind of like taking a gigapixel photo of terrain by mounting a linear image sensor on the plane and flying straight.) (Thought: what about using such a scanning sensor on a car, scanning sideways, and sensing position/acceleration for each scan column to stabilize the image in software?)
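The stabilization idea in that last parenthetical could look something like this. A minimal sketch, assuming you already have per-column drift estimates (e.g. integrated from an accelerometer); the function name and the nearest-pixel re-registration are my own illustration, not anything Lytro or SAR systems actually do:

```python
import numpy as np

def stabilize_linescan(columns, offsets_px):
    """Re-register line-scan columns using per-column drift estimates.

    columns    : (H, W) array, one sensor readout per column of the image
    offsets_px : (W,) measured vertical drift in pixels for each column,
                 e.g. derived from position/acceleration sensing
    """
    h, w = columns.shape
    out = np.zeros_like(columns)
    rows = np.arange(h)
    for x in range(w):
        # shift each column opposite to its measured drift (nearest pixel)
        src = np.clip(rows + int(round(offsets_px[x])), 0, h - 1)
        out[:, x] = columns[src, x]
    return out
```

With zero offsets it leaves the image untouched; nonzero offsets slide each column back into registration.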
Phased-array antennas and synthetic-aperture techniques can be used together.
What about using the LCD shutter panels from those active 3d glasses?
Hi, optical engineer here. I worked at a company that did advanced optical things, one of which was a high-quality array of microlenses. That’s what they use in this plenoptic camera. The light is relayed (not focussed) to the microlens array, each lens of which semi-focusses the incoming light rays onto a subsection of the sensor array – perhaps 8x8 pixels per microlens. A little work with matrices reveals the wavefront curvature within each sub-aperture (one per microlens). This info is used to reconstruct a ‘focussed’ image from any – or all – object distances.
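The reconstruction step described above is often implemented as shift-and-sum refocusing: treat each microlens sub-image as a sub-aperture view, shift each view in proportion to its angular offset, and average. A toy sketch (the array layout and the `alpha` parameterization are illustrative assumptions, not Lytro’s actual pipeline):

```python
import numpy as np

def refocus(lightfield, alpha):
    """Shift-and-sum refocusing of a 4-D light field.

    lightfield : (U, V, X, Y) array -- (U, V) indexes the sub-aperture
                 view under each microlens, (X, Y) the spatial sample.
    alpha      : refocus parameter; each view is shifted in proportion
                 to its angular offset from the central view.
    Returns an (X, Y) image focused at the plane selected by alpha.
    """
    U, V, X, Y = lightfield.shape
    out = np.zeros((X, Y))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            # shift this sub-aperture view, then accumulate
            out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

With `alpha = 0` this is just the average of all views (the nominal focal plane); sweeping `alpha` refocuses nearer or farther after the exposure, which is the whole trick of the camera.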
It’s not really like SAR, which is just a matter of doing FFTs on a time-series of input data.
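For comparison, the FFT-on-a-time-series flavor of SAR processing is essentially matched filtering (pulse compression): correlate the received echo against the reference chirp by multiplying spectra. A minimal sketch with made-up chirp parameters, just to show the shape of the computation:

```python
import numpy as np

def matched_filter_fft(echo, reference):
    """Pulse compression via FFT: cross-correlate echo with reference."""
    n = len(echo) + len(reference) - 1
    E = np.fft.fft(echo, n)
    R = np.fft.fft(reference, n)
    # correlation = IFFT of (echo spectrum x conjugate reference spectrum)
    return np.fft.ifft(E * np.conj(R))

# toy example: a linear-FM chirp buried in a delayed echo
t = np.linspace(0, 1, 256)
chirp = np.exp(1j * np.pi * 50 * t**2)
echo = np.zeros(512, dtype=complex)
echo[100:100 + 256] = chirp            # target at delay 100 samples
compressed = np.abs(matched_filter_fft(echo, chirp))
print(int(np.argmax(compressed)))      # peak lands at the target delay
```

The correlation peak collapses the long chirp into a sharp spike at the target’s delay, which is why the raw data can look like noise until the FFTs are done.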
This topic was automatically closed after 5 days. New replies are no longer allowed.