Will smart watches hurt app development in the long run?

Not having a keyboard at all means the user can’t jot anything down without using voice commands. But since most people don’t have English as their native language, that technology is very, very sucky (my Swedish accent gets me in all sorts of trouble, for example), so I don’t see how the user could possibly use a smart watch for anything useful beyond consuming notifications or reading.

So, how do you make apps that are fun to use (where you don’t waste half your life doing meaningful input) but that still do what you want on the go? It seems like this would stifle app development and be too big a challenge for designers and programmers to overcome.


Have you seen this?


The jury is very much out on watches, in a way that it was not at all even with the original iPhone.

I personally think we need some kind of projection technology for watch-size devices to work well. Kind of like this


Wow, I had no idea it worked so well; all my friends have it set to English, and that’s the only experience I have with it! I recently busted my iPhone 4 and an iPhone 6 is on its way, so I will play with this a lot! Judging by the clip, my Gothenburg accent should not be a problem either. :smile:


I think you are missing one key piece here. We are entering the age of metrics: everything is going to be measured all the time.

In 15 years, every bowel movement you have will be captured, digitized, and data-mined.

Smart watches are a first move in this department. They capture heart rate, which is basic, but as time goes by they will start capturing a lot more: blood pressure, blood sugar levels, and so on.

This information will give people an edge in living longer.

I see smart watches as an early move into the world of digitized health records; even in its current prehistoric state, the technology is providing some use.


It’s a very recent development. It may, in fact, have come about because the Watch demanded it.

I do think it’s cool, and a programmer would applaud the types of input they have to work with to make an enjoyable app that does a lot, but it doesn’t fit what consumers have historically thought of as comfortable.

A smart phone is mostly used for pressing or typing. Texting is far more popular than phone calls. I think this is because we don’t like to announce to everyone what we’re doing; we get some privacy with our screen.

Before the smart phone, the laptop was (and still is) a very popular way to do some work or surf the web while on the move. The iPad is the in-between, mostly used for consuming. They too have a personal screen that lets you control, to some extent, who gets to look at it.

And going even further back in time, we have the newspaper. If I pull out a newspaper I might look very informed, but only I know whether I’m looking at the comics or the personal ads.

I think this is the reason voice commands haven’t really caught on, beyond the fact that they sort of suck. I don’t want to tell everyone what I’m doing, or bother other people, just to send a text to my wife.

A hologram would make the UI visible to everyone.

This, a hundredfold. Yes, transparent displays look cool and futurey, but they’re simply impossible to make practical enough to catch on. Never mind privacy: just think of how distracting and socially awkward it is to have a loud speakerphone conversation in public, and then add a big, bright, floaty visual aspect to the “look at me, I’m obnoxious” factor. It’s the same reason (on a different scale) we didn’t all switch to video calls the moment they became feasible, even though we had been predicting the damn thing for almost a century.

And of course, if you can see whatever is behind your display at all times, you had better have a perfectly black wall positioned there, or it gets hard to read, fine detail becomes impossible to make out, and any color-dependent work is completely messed up. Ever notice that sci-fi movie screens are early-’80s-green-screen-style monochrome (and usually large print) to kind of sidestep this? Now there’s progress, imaginary future designers. /curmudgeon


The computer interface reached its apotheosis in Kubrick’s 2001.


Well, interacting with natural language would be absolutely awesome, like with HAL, or through brain waves. However, as programmers know, shouting at the computer to do what you want, or having a desperate internal monologue, isn’t that effective.

We like to bash kids for shortening, twisting, or minting new words, but languages evolve. A language is a system, not a machine. If I buried an iPhone for a thousand years in a time capsule, whoever dug it up would need some sort of Rosetta Stone to use Siri properly.

So, until we have computers that can evolve their language along with us, I think there’s a limit to how compact a UI can be, and for what I want to do on the go, smart watches have crossed that limit.

Nah, HAL’s screens were animated by hand. You got something of a sense that the screens were very high resolution when it came to rendering nice clean text and line drawings. Kind of like an iPad, but without all the glossy photos and movies.

Then Hollywood designers discovered they could use 1980s graphics, with their low resolution and their beeps and their terrible colors and their damn teletypes, and now all those SF movies look terribly dated. They should have stuck with hand-animating the computers.


… nobody expects it!


This topic was automatically closed after 575 days. New replies are no longer allowed.