Originally published at: http://boingboing.net/2015/07/21/after-a-rush-aviation-stopped.html
…
Interesting, and I have used transportation as an analogy to point out that technology ISN’T constantly exponential, but has uneven starts and stops and “boiling points.” But I’m not sure where the big media rentiers fall in his “three visions” of the web. They certainly aren’t the “everybody should be posting cats” crowd, and they aren’t the “software will save us” crowd. Indeed, they WANT to turn the web into a controlled market like regulated taxis or the airlines under the old CAB (Civil Aeronautics Board). Largely, those kinds of government-enforced oligarchies are what the “ubers” of the “software will save us” crowd are reacting against in the real world.
I am hoping that a mix of fusion breakthroughs, room-temperature superconductors, and materials science will finally get us a power source dense enough to blow fossil fuels off the map. Current physics doesn’t seem to support it, but I really hope there is still something within humanity’s reach to discover that will make supersonic travel, and even space, as accessible as, say, a Cessna 172 is now.
One point holds true of current computer technology: it is no longer primarily aimed at people who DO things. At the beginning of the home and personal computer era, you bought a computer for basically two things: 1) to play games, 2) to DO THINGS, be it graphics, animation, creative work, web design, music, etc. It was a niche, enthusiasts’ market.
Today most people buy a computer to surf the web and upload a video to YouTube, which only by a wide stretch counts as “doing creative stuff”. And now that computers are “popular”, the vast majority of people use them for “dumb” things. As a consequence, mass-market computers won’t evolve much, while workstations and machines for creative work will keep getting more expensive due to the reduced user base.
What became the 787 would have been this, except the airlines wanted lower costs instead of increased performance.
I mainly use my computer for work, but existing computers already comfortably do everything that I need. It wasn’t true ten years ago, but at this point the bottleneck is my own working speed and I’m not going to pay a premium for a higher performance computer that doesn’t improve my user experience at all. Making things more reliable would definitely help, but otherwise there’s no reason for me to support continued developments in speed, miniaturisation and processing power.
Thanks, Cory, for posting this; very interesting read. Really happy to finally hear someone skeptical of the impending AI doom… It’s like everyone forgot that the reasons AI failed the first time haven’t changed.
It’s also refreshing to hear someone actually suggest that we have a choice in the “internet of things”, rather than just a plaintive resignation that “it will happen, so get used to it.”
Still, I do appreciate those like Musk who are trying to innovate with crazy stuff like the hyperloop. I much prefer a vision of public tech over the market-driven uber models. Something about a balance of the tech serving humanity collectively rather than the creation of more private consumer goods… An admittedly simplistic notion I suppose.
Meanwhile, this just in: http://www.gizmag.com/son-of-concorde/23118/
I don’t understand the controversy. This is what new tech does: it settles down to marginal improvements and efficiency over radical redesign. Look at the automatic pistol: it was a marvel, and it is virtually unchanged after 120 years. The car has not fundamentally changed since Ford. Yes, there have been tweaks like transmissions and such, but the essential design is the same. Electrics may change that, but they are a tiny part of the market, similar to the way long-predicted prefab homes are a tiny part of the market compared to traditional construction.
It’s not new (heh - just noticed that article is three years old)
Gulfstream have been sort of talking about supersonic bizjets for quite a few years. I’ll believe it when I see it.
It’s something the very rich might be able to afford, but I don’t see it for passenger jets. More likely you’ll see slower, noisier, open-rotor aircraft with much reduced fuel burn (if they can work out how to certify them). In the US at least, the hold-up is capacity at airports; being able to fly faster doesn’t help the throughput much. Modern planes don’t fly any faster than early jet-era ones.
This is OK, and it’s the case for many people, but if you need “more” today, the premium you’re going to pay will probably be bigger than in the past, because the “mainstream PC” can handle most tasks at a very good level, so very few people are going for “the premium”.
Well, that, and as long as you are not going supersonic, the basic tube-and-wings design we have is already pretty damn efficient. The pictured design would still not go supersonic, and it didn’t go enough faster to be worth the major design change. There have been some improvements with winglets and such, but the biggest gains have been in the engines, getting better power and fuel economy out of them, which isn’t very visible. Neither are things like using carbon-fiber composites for the body, as on the 787, so it will still look pretty much like every other large passenger jet.
There have been some revolutions, though, even in recent years: GPGPU and Nvidia Tesla are something great, but very expensive and aimed at a very dedicated market. Nonetheless, it’s an incredible evolution from the standard “CPU for calculation, GPU for visualisation” approach.
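To make the GPGPU idea concrete, here’s a rough sketch of my own (a toy example, assuming an NVIDIA card and the CuPy library, which mirrors the NumPy API; nothing here is from the article): the same matrix multiply runs on the CPU with NumPy and is then offloaded to the GPU with an almost identical call.

```python
# Toy comparison: CPU (NumPy) vs GPU (CuPy) for the same matrix multiply.
# Assumes an NVIDIA GPU and CuPy installed; falls back gracefully otherwise.
import time

import numpy as np

N = 2048
a_cpu = np.random.random((N, N)).astype(np.float32)

t0 = time.perf_counter()
np.matmul(a_cpu, a_cpu)                  # "CPU for calculation"
print(f"CPU matmul: {time.perf_counter() - t0:.3f}s")

try:
    import cupy as cp

    a_gpu = cp.asarray(a_cpu)            # copy the data into GPU memory
    cp.matmul(a_gpu, a_gpu)              # warm-up run (kernel setup)
    cp.cuda.Stream.null.synchronize()

    t0 = time.perf_counter()
    cp.matmul(a_gpu, a_gpu)              # the "calculation" now happens on the GPU
    cp.cuda.Stream.null.synchronize()    # wait for the GPU before stopping the clock
    print(f"GPU matmul: {time.perf_counter() - t0:.3f}s")
except ImportError:
    print("CuPy not installed; skipping the GPU half of the comparison")
```

On a decent card the GPU version usually wins by a wide margin, but the exact numbers depend entirely on the hardware, which is also why this stuff stayed a dedicated, expensive niche for so long.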
What has AI “failed” to do? I see it more as a case where the marketing hype was simply divorced from the actual discipline. And, as I have read others mention, there are the moving goalposts: once something in computer science is achieved, it is not considered AI anymore. Places such as MIT, where teams have been researching both cognitive science and computer engineering, have made quite a bit of progress in both areas, whereas the marketing world appears to be stuck in a 1950s version of “intelligence”.
You often see variations of this sentiment, but actually algorithmic speeds have massively improved over the years. In fact they have outpaced hardware advances in many cases.
See https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-nitrd-report-2010.pdf
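To give a toy illustration of that point (my own example, not taken from the report): going from a quadratic-time algorithm to an n log n one quickly dwarfs any constant-factor speedup you would get from faster hardware.

```python
# Toy illustration: an algorithmic improvement (O(n^2) -> O(n log n))
# outgrows any constant-factor hardware speedup as the input grows.
import random
import time

def insertion_sort(values):
    """Classic O(n^2) sort, kept deliberately naive."""
    out = list(values)
    for i in range(1, len(out)):
        key = out[i]
        j = i - 1
        while j >= 0 and out[j] > key:
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

data = [random.random() for _ in range(10_000)]

t0 = time.perf_counter()
insertion_sort(data)
quadratic = time.perf_counter() - t0

t0 = time.perf_counter()
sorted(data)                              # built-in Timsort, O(n log n)
linearithmic = time.perf_counter() - t0

print(f"O(n^2) insertion sort:    {quadratic:.3f}s")
print(f"O(n log n) built-in sort: {linearithmic:.4f}s")
print(f"speedup from the better algorithm alone: ~{quadratic / linearithmic:.0f}x")
```

Even at 10,000 elements the better algorithm is already orders of magnitude faster, and the gap keeps widening with input size, which is the sense in which algorithmic advances have outpaced hardware.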
“Bloat” largely comes down to the use of software libraries, which tend to be included whole. On the one hand the size of the software rises, but on the other, standardized software libraries have done wonders for development.
Another issue is that software tends to do a lot more these days. So yes, a 1980s copy of Wordstar would run at crazy speeds on a modern computer, but modern word processors do so much more, so we sacrifice the speed increase we would have achieved.
I meant “failed” in the sense of not having achieved intelligence similar to human consciousness, which was an explicit goal, not marketing hype.
I agree though that there is a language problem with terms like “intelligence” and “AI”. Clearly there have been innumerable advances in machine learning, and that may be called AI, but it is not the AI of popular imagination; it is something different. And in that sense the goalposts were moved by the researchers themselves when the consciousness project came to naught. Their target became something else.
This other definition of AI may not be like human consciousness, but it could be immensely powerful and dangerous in its own right.
Now the proposed solution for artificial consciousness is brain simulation, and as mentioned in the article, there may be fundamental physical limitations to this project as currently undertaken.
But I might not have all the info about where the research is. Were you thinking of this dichotomy between “consciousness” and “intelligence”?
An explicit goal, but it was not the only goal. The hype was in giving it primacy.
As well it should be! Popular imagination arguably doesn’t count for much in the details of research. Using it as a selling point can be counter-productive.
Of course, because what was popularly disregarded was that not only was computer research advancing, but so was information theory and cognitive science. From the 1960s to the 1990s, even the concept of human intelligence itself was almost completely redefined, so many do not appreciate what a moving target it has been.
Well, it is only one approach. And arguably the least efficient option.
I am sure that this factors into it, to an extent. There still is not much consensus about what either term represents even in humans, which makes any aspirations of duplicating them problematic.
Yep, totally agree. The flashy stuff always gets more attention and sometimes can dominate the characterization of an entire field.
Everything else you added makes sense too- the field is very diverse and progressing in all sorts of ways.
Popular imagination matters because ultimately the researchers have to sell their work to the people who write the checks. But as you say, it is counter-productive when they fail to deliver on things that they promised as mere possibilities…
My skepticism is mostly of the Kurzweillian end game, and the inevitability of machine consciousness (though they are not mutually exclusive).
I don’t understand the controversy. This is what new tech does…
i’d say the difference is that software – and by extension, the web – is a platform, not just a device.
a gun, or a car, or a plane. those things seem to be a relatively small family of products(*).
a computer, though, is a product that produces new products.
(* otoh, if we consider both guns and rockets an extension of gunpowder – then i guess we did just reach pluto… )