Originally published at: https://boingboing.net/2024/09/26/meta-shows-off-orion-augmented-reality-glasses.html
…
Regardless of how sleek the form factor eventually is, they're clearly still looking for a compelling app and failing to find one. Virtual ping pong? Really?
Also, some of their footage is clearly faked, because they're saying this isn't screen-based: the virtual stuff is projected on top of the reality you're seeing directly through the glasses. But that means the projected images would all be lighter than the background, because you can't project black.
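A minimal sketch of that additive-light reasoning, in Python (my own illustration, assuming a simple per-pixel additive model; nothing here is from Meta's actual display pipeline). A see-through projector can only add light to the real scene, so an overlaid pixel is never darker than the background behind it, which is why true black can't be rendered:

```python
def composite_additive(background, overlay):
    """Per-channel additive blend of two RGB tuples, clamped to the display max of 1.0."""
    return tuple(min(b + o, 1.0) for b, o in zip(background, overlay))

# A "black" overlay pixel (0, 0, 0) leaves the real background unchanged:
print(composite_additive((0.5, 0.5, 0.5), (0.0, 0.0, 0.0)))  # (0.5, 0.5, 0.5)

# Any non-zero overlay can only brighten the scene, never darken it:
print(composite_additive((0.5, 0.5, 0.5), (0.25, 0.0, 0.0)))  # (0.75, 0.5, 0.5)
```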
Yawn. I was hoping they’d be thicker.
I was hoping for more prominent warning lights on the front – a low-power red LED when not recording and a really bright flashing one for when the user is recording. It's far too easy to disable (or just conceal) the warning lights on similar intrusive devices already on the market, so maybe just have the entire frame glow?
The techbros who design these devices learned nothing from the Google “glasshole” experience. And Zuck famously cares nothing about anyone else’s privacy.
The glasses are tinted, and judging by how visible the projections are from the outside, they seem really bright. So contrast-wise it should still be pretty good. And a FOV of 70°, which is kinda amazing for this tech.
It's actually an impressive tech demo, despite the lack of clarity about what to do with it. And at a current manufacturing cost of $10,000(!), it's just that: a tech demo.
A tech demo that's currently not for sale, and still years away as a broad consumer product.
Oh, they learned. Just not the right lesson.
Bummer, I thought it would be glasses that let you see the world through the eyes of an Elvis impersonator.
I dunno. The demo posted by The Verge took place entirely in a room with controlled lighting and dark grey walls. Much darker than a typical room. The contrast was not as high as what’s in the promotional image that I posted, and it definitely couldn’t do the dark black bits that are shown in that image.
If the actual performance was really super impressive then I don’t think Meta would feel the need to create faked images and give their demos under such controlled lighting conditions.
I don’t know… Maybe if they’d add a fake nose & moustache?
“Any sufficiently advanced technology is indistinguishable from a rigged demo.”
— Andy Finkel
*glances at a banana*
WE THINK YOU DEFINITELY NEED THIS BANANA SLICER AVAILABLE ON THE META STORE FOR ONLY $19.99 DO NOT SHAKE YOUR HEAD TO COMPLETE PURCHASE
I’m so glad that Project Mephistopheles will let me bring the torment nexus with me, beyond the constraints of today’s limited device experiences! Why, this is the metaverse, nor am I out of it.
It always comes back to the Metaverse, and basically any version of it so far peddled to consumers is neither something people want nor something that offers a compelling use case that would inspire widespread adoption. This is true of the Quest VR headset, it’s true of the Apple Vision specs, etc.
The biggest problem is that every one of these devices adds a layer of difficulty to executing whatever it is you want to do, which is a fundamental no-no in UI/UX design, while simultaneously failing to offer a substantially better user experience or use case than what’s already available.
But the hype train needs to keep rolling, and Silicon Valley can feel that the steam AI was generating is running out. So, bereft of ideas, Meta comes out with an Apple Vision Pro me-too device.
Pity they aren’t interested in actually improving the user experience of their core products. But then, they’re just selling hype now to gin up stock valuations. The only thing this modern era of tech billionaire has done is successfully reinvent snake oil in the digital domain.
That and get themselves killed doing outlandish and wildly reckless shit like diving to the bottom of the ocean in a homemade submarine.
That’s extra. I mean it’s drop shipped from China and smells like industrial insecticide when it arrives, but it’s still extra.
Even the cyberpunk novel that introduced the term “Metaverse” never really offered a compelling use-case for the Metaverse.
The most interesting thing Hiro Protagonist did in the Metaverse of Snow Crash was to hang around an exclusive chatroom designed to look like an upscale bar while he was sipping a lukewarm beer in his shitty shipping container apartment.
Later, at the climax of the novel, there’s a huge gathering of techies in a virtual stadium, but why would anyone want to attend an online event that way? In the physical world, nosebleed seats are a necessary evil because you can’t give thousands of fans a front-and-center view at the same time. Ditto constraints like how fast people can travel from one location to another, or how many people can own property in the most desirable neighborhoods. But in Neal Stephenson’s Metaverse they’ve imposed these kinds of constraints as “features” just to make the world more lifelike. Seriously, where is the demand for that?
Today’s billionaire tech geeks took all the wrong lessons from the sci-fi of yesteryear.
Hey, this VR for the masses thing has been going on for the last 30 years or so; they’re bound to sort it out really soon now, right?