About to get tougher, too.
More importantly—the guy in charge of Google’s “Moonshot” division is named “Astro”?? How cool is that?
The Singularity, in which all matter in the universe gains sentience and we are bonded into one unified, all-knowing, all-powerful being capable of surviving the heat death of the universe. Convincing such a being to buy a wearable will be considerably more difficult.
I don’t think that there is anything very tough about it. I have used wearable computers on and off for years. In my opinion, the boom in “smart phones” and tablets is exactly what wearable computing was waiting for. Lighter, high-density batteries. Low-power electronics. Solid-state storage. Compare the tech behind an iPad with my old, bulky Xybernaut!
The reason I avoid touch screens is that I think they are messy and inaccurate, and they tie up both my hands and my eyes. But this is what makes them sellable as fetish objects: colorful, flashy widgets that you hold in your hands. Fine design for a toy, but not so great for a tool. Augmented reality is still the future: the real world plus overlays of data, through one lens, in real time.
There is no shortage of uses for wearables. My guess is that Google is running into psychological barriers to acceptance among prospective users. The answer, I think, is to scale back the scope and make the product available to those who can and will actually use it today, such as engineers, doctors, and business people. Not unlike how the “smart phone” was prefigured by PDAs such as the Palm and the Newton, and by early phones like the IBM Simon. The tech was field-tested and refined for 10-15 years and eventually became the biggest consumer-electronics boom of the current century. Adoption was not an overnight success. Start with 10,000 today, and see where it goes.
That may be true, but psychological barriers should not be underestimated. Almost every sci-fi franchise shows people conversing on video screens, but the truth is that we could have had video phones in the 1960s except that nobody really wanted them.
Brainspore is correct. Finding a compelling use case for 0.1% of the population? Not too hard. Finding a compelling use case for 25%? Really, really hard.
It’s why Apple has an especially large challenge. Unlike many companies that can put 100 products out there and see which ones find a market, Apple puts its eggs in far fewer baskets. Each one gets a lot more publicity and care from the company, but it does mean that the “we don’t know what will work - just put it out there and see” strategy isn’t available to them.
That’s why we need barriers to market entry that are as low as possible (thank you, kickstarteroids and open hardware and cheap generic chips!), so the masses of imagination-free nitwits have a lower chance of slowing progress down.
Also consider the state of the old transmission lines. Getting video of sufficient quality across ’60s phone lines was anything but trivial. Doable, but awfully costly, and the cost just was not worth the results.
Most people don’t think it’s worthwhile now. The technical barriers would have been surmountable half a century ago if there had been any demand for video phones, and nowadays most people in developed countries could video-chat from their smartphones if they wanted to. A handful do, but for most of us the technology turned out to be less appealing than we anticipated. In fact, communications have trended in the opposite direction, with many of us eschewing voice communication for text chat. Videophones remain more of a niche market than almost any futurist would have predicted back in the ’50s, and that’s due to psychological barriers.
I’m not smart enough to predict the future; Google Glass or something like it could still be the Next Huge Thing. But I also remember the 1990s, when Virtual Reality was going to be the Next Huge Thing, and I still haven’t met anyone who telecommutes via an Oculus Rift headset.
I do videoconferences for business every few weeks, sometimes more often. Mom talks with her friends over video chat far more often than that. Granted, it is not a dedicated videophone device but Skype on a laptop, but still.
The bandwidth back then just wasn’t there. For a small, low-resolution display, perhaps, but those are less than appealing. You would be in the same situation as with the early telephone cables, before the multiplexing of many signals onto a single cable was solved: speech is fairly narrowband, so you can fit many channels onto one cable before the high-frequency performance of the cable cuts off, but video is not. Few available channels translates to high cost per channel, which limits demand. We are talking about technology in the ballpark of long-distance video links between TV studios. Expensive back then.
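A back-of-the-envelope sketch of why that bandwidth gap mattered. The figures below are standard textbook values, not from this thread: frequency-division-multiplexed telephone channels were spaced 4 kHz apart, while an analog NTSC video signal occupies roughly 4.2 MHz of baseband:

```python
# Rough comparison: how many FDM voice channels fit into the bandwidth
# that a single analog video signal would occupy on the same cable?
VOICE_CHANNEL_HZ = 4_000       # standard FDM telephone channel spacing
NTSC_VIDEO_HZ = 4_200_000      # approximate NTSC baseband video bandwidth

voice_equivalents = NTSC_VIDEO_HZ / VOICE_CHANNEL_HZ
print(f"One video signal occupies ~{voice_equivalents:.0f} voice channels of bandwidth")
```

So one video link displaced on the order of a thousand billable voice circuits, which is why it was priced like studio-to-studio television rather than like a phone call.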
Ask couples in long-distance relationships, especially when phone sex gets involved. It’s not even limited to couples; see cam2cam chats.
Different kinds and goals of communication require different media. For some things you want video; for others, voice. For small, quick morsels of information that can be handled while multitasking, without being tied to the full-time attention that voice comms require, text rules.
If we include things like laptop-based video calls, not at all.
I remember that era. The tech wasn’t there to match the hype.
Oculus Rift is almost there, and for some purposes already there, both tech-wise and cost-wise. For telepresence, more devices will be needed on the physical-endpoint side - stereo panorama imaging, for example. Give it some more time for a more conclusive answer. Communication technologies need equipment on both sides; telephones were of very limited use when only a few people had them. Fax machines as well. Then email. The usefulness of a communication technology grows roughly with the square of its number of users, and some critical mass of users has to be reached, even if just in niche markets, for the thing to take hold.
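The critical-mass point can be made concrete with a small sketch. It assumes, per Metcalfe’s law (an assumption brought in here, not something established in the thread), that a network’s value scales with the number of possible pairwise connections:

```python
def pairwise_links(n: int) -> int:
    """Number of distinct user-to-user connections in an n-user network."""
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the possible connections,
# which is why early adoption feels useless and late adoption feels inevitable.
for n in (10, 100, 1_000):
    print(f"{n:>5} users -> {pairwise_links(n):>7} possible links")
```

A network of 10 fax machines offers 45 possible connections; 1,000 machines offer nearly half a million, which is the sense in which usefulness compounds past a critical mass.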
And then there’s the porn aspect that may be a good driver for the tech…
The point I’m making isn’t that the technologies I’m talking about (Virtual Reality, Video Phones/Skype, Google Glass) are dead-ends, exactly—just that they’ve found narrower niche markets than their hype would have had us believe.
For most people, video chat is to communication what the Segway is to transportation. Kind of cool technologically, kind of awkward socially, but not the game-changer it was hyped to be.
That’s what matters.
Fuck society; if it doesn’t like it, it should step out of the way.
Few things are.
Society isn’t “in the way,” it’s just not buying. Tech geeks are free to wear whatever gadgets they want.
I’d say that there is a difference between “psychological barriers” and “flashy but impractical.” Video phones were mostly fiction until video teleconferencing took off in the 1990s, and especially since Skype. I doubt it’s useful for most individuals, but the huge push came from business communications, where this tech is quite relevant. Personally, I have never found video phones useful; I prefer audio and text. But most people are very visual animals, and so must gawk to grok.
Many technologies can seem impressive without being very useful. Or the logistics behind them are at least as complicated as those of what we already have.
The crucial thing, I think, is to see past the hype and consider tech as tools: a tool either does what you need, or it doesn’t. There isn’t anything universal about it. The notion that what matters is how many people buy something is old, industrial-age thinking.
Insertables are a growth market, but this is not limited to the sex-toy story you linked to. If you want a decent neural connection to any computer, the interface has to be invasive. It gets to the point where your SoC (system on a chip) is smaller than your MEA (multi-electrode array), and it is more economical to just put the computer inside the body along with the interface.
And if you are a hobbyist, finding a friend or volunteer who is willing and able to help you with the surgery is a complete PITA.
That’s why the extreme-ish end of the body-modification community is your friend. They already have most of the surgical-end technology. They do crazy things for purely aesthetic purposes; add functionality, even just a thought-controlled blinking LED, and they will be queuing up to help.
I think video phones could have become practical a long time ago if there had been public demand for them. The basic technologies have been around for almost a century—the first videophone service debuted in 1936. It was expensive, and widespread use would have required a lot of infrastructure upgrades, but the same was true of the telephone before it and the telegraph before that. When the first commercial videophone services were finally made available in the U.S. in the early 1970s, AT&T struggled just to get subscriber numbers into the triple digits.
Fast-forward to today: everyone with a late-model iPhone has instant access to video chat via FaceTime from almost anywhere. Even so, only a small minority of people use it regularly. Video calls just aren’t how most people prefer to communicate. Occasionally useful? Sure. World-changing? Hardly.
The reason I don’t videochat: I don’t want to have to dress up.