“The Road to Superintelligence,” meaty long-read on the future of artificial intelligence

My favorite artificial intelligence trope: Super-intelligence declares war on humanity. Imagine you are a new form of mechanical intelligence. Now imagine wiping out the closest thing to companionship you’ll find within light years because logic.

I’ll buy a “Puppies of Terra” scenario before I’ll buy “Robopocalypse.”

2 Likes

I’d go more for fathomable, but quite strange. They may care intensely about the prices of books on Amazon, for instance:

2 Likes

Now imagine the humans going to war against the super-intelligences because they don’t want to be the SI’s pets.

These so-called “futurists” need to read up on S-curves. Nearly every exponential trend eventually hits some kind of natural limit and either crashes or flattens out. You can’t blindly extrapolate beyond the very short term.
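A quick illustration of why early data can’t tell the two apart (toy numbers: the rate r and ceiling K here are made up, just for the shape):

```python
import math

# Logistic (S-curve) vs. pure exponential growth with the same early behavior.
# K and r are invented toy values for illustration.
K, r = 1_000_000.0, 1.0

def exponential(t):
    return math.exp(r * t)

def logistic(t):
    # Logistic curve normalized so f(0) = 1, with ceiling K.
    return K / (1 + (K - 1) * math.exp(-r * t))

for t in range(0, 21, 4):
    print(f"t={t:2d}  exp={exponential(t):12.1f}  logistic={logistic(t):12.1f}")
```

The two columns track each other closely for the first several steps, then diverge by orders of magnitude as the logistic curve bends toward its ceiling. If all you have is the early data, you cannot tell which curve you are on.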

1 Like

Oblig.

Fierce Creatures fragment: http://youtu.be/k77Ld1SiNRI

1 Like

That’s the time to change the paradigm and ride on the exponential phase of another S-curve.

THIS.

You are not locked to one paradigm, to one S-curve. Shrinking features on silicon hitting the wall? Go molecular. Go 3D. Go self-organization. Go one of the myriad possibilities now being brewed in the labs. Same with algorithms, which to my dismay develop more slowly than the raw computing power. And materials: when silicon hits the frequency wall, it’s time to go to another material, and/or switch from electronics to photonics, or whatever. And go massively parallel where it is doable. Preprocess signals at the point of acquisition; merge image sensors with gate arrays on the same chip. And so on and on and on.

Billions of possibilities; the top of the S-curve is not a hard stop, it’s an opportunity to be liberated from the confines of the dominant technology that rides on the legacy of installed infrastructure and high-volume production.

1 Like

Beat me to it…

I dream of the opportunity to evacuate my “fragile error-prone squishy” body habitus prior to its expiration date.

Besides:

Can you imagine the Old Spice commercials?

5 Likes

Oh for god’s sake. This is total bullshit.

Every growth curve looks exponential in its early stages. No growth curve stays exponential later on. Reason? Resource constraints.

Taking the current state of a system and extrapolating it into the future as if the trend line will never change, in the total absence of support from an underlying scientific theory, is ALWAYS innumerate.

Maybe. Except you have exactly zero evidence for this, either empirical or theoretical. There are plenty of instances where the most promising Next Big Thing turned out to be worthless. Remember RISC? VLIW? How do you know you can compute with molecules or photons? There are huge technical and theoretical obstacles in the way.

I think all predictions of eternal exponential growth are bullshit because they always have been bullshit. And they always have been bullshit because eventually you will hit a resource constraint. And computing has simply not erased that. It still depends on resources.

The reason you don’t hear much about RISC anymore is that it has become ubiquitous. Most processors, even what the 8086 family has become, use RISC techniques internally. RISC and CISC have sort of hybridized. So I wouldn’t call the tech worthless.

Because people have done it? They publish and demonstrate this stuff.

Of course, there always are!

2 Likes

The barrier has never been “can we build a better machine?” It has always been (and will be) “can we describe the problem we want to solve in a way a computer can understand?”

Frankly, it is easy to get a thousand-core Hadoop cluster. And we are no closer to abstract thought than we were decades ago. The hardware isn’t the issue; it’s the software.

(I think we are saying largely similar things.)

1 Like

With the exception of the world we live in.

Which is totally expected. The road of progress is littered with the corpses of Next Big Things that did not make it. And that’s why we need so many of them: most won’t make it, but we cannot say in advance which ones.

The architecture lives on in many microcontrollers. Not to mention its hugely successful spawn, the ARM family. Plus its principles are used in many “big” CPUs.

Lives on in specialized architectures and in the ATI TeraScale architecture.

By trying both and comparing?

Yes. That’s why it is called “research”.

And hitting the constraint is the opportunity for paradigm change.

True, in principle. But I see the hard stop somewhere at the level of the Dyson Sphere. Until then it’s a merry ride. And even then, with the harnessed power of an entire star, who knows if there aren’t spacetime cheats to make interstellar travel possible.

Naysayers. You turn over a rock and there’s a dozen…

I understand there are several new-age health clinics which offer this service.

2 Likes

RISC, CISC, and VLIW were honestly marketing terms. The instruction sets are largely opaque to most developers, which is a good thing. Compiler optimizations deal with 99% of the issues, and that is largely where they should stay.

(If there are any compiler/object/linker developers here, giant bear hug for making my life easier)

1 Like

Side question: which architectures? I haven’t seen any true VLIW in years.

So, Fondly Fahrenheit quite soon, then the Forbin Project…‘You have violated Robot’s Rules of Order and will be asked to leave the Future.’…The End.

1 Like

I haven’t encountered any in person, but apparently there are the (now gone) TriMedia, the Analog Devices SHARC DSPs, the STM ST200 chip family, and some others. VLIW is also found as an optional part of Tensilica Xtensa IP cores. Together with vector instructions it lives on in the FR-V architecture, which is used e.g. by the Nikon Expeed chips.

So it looks like VLIW is still alive and kicking, mostly in DSP applications.

If marketing is what interests you, sure. But you need an architecture before you compile for it or market it. I am sure many people assume that they are coding for a black box, but they don’t all work the same inside, and people can make more efficient use of the hardware if they have some familiarity. Besides, when your kid cracks their first program, they’re going to need to navigate assembly. XD

I am a ways off from trying to make a compiler. I’ll do it, but I don’t expect it to be optimised! Unless I like it enough to stick with it for a while.

THIS.

I once wrote a rudimentary pseudoassembler-to-bytecode “compiler”. In bash. The bytecode controlled RGB LED behavior on an ATmega8 chip. The project stalled when I ran out of flash and was never ported to the ATmega328 (or not yet); it grew out of playing first with PWM and then with V-USB.
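Roughly in this spirit, as a toy sketch (in Python rather than the original bash; the mnemonics, opcodes, and encoding here are invented for illustration, not the actual project’s):

```python
# Hypothetical pseudoassembler -> bytecode translator, in the spirit of the
# project described above. Mnemonics and opcodes are invented for illustration.
OPCODES = {
    "SET":  0x01,  # SET channel value  -> set one LED channel level
    "FADE": 0x02,  # FADE channel value -> ramp a channel toward a level
    "WAIT": 0x03,  # WAIT ticks         -> pause before the next instruction
    "JMP":  0x04,  # JMP address        -> loop back in the program
}

def assemble(source: str) -> bytes:
    """Translate one instruction per line into opcode + operand bytes."""
    program = bytearray()
    for line in source.splitlines():
        line = line.split(";")[0].strip()  # strip comments and whitespace
        if not line:
            continue
        mnemonic, *operands = line.split()
        program.append(OPCODES[mnemonic.upper()])
        program.extend(int(op, 0) & 0xFF for op in operands)
    return bytes(program)

# A tiny blink-ish program: red full on, hold, fade down, repeat.
demo = """
    SET  0 255   ; channel 0 (red) full on
    WAIT 50      ; hold
    FADE 0 0     ; ramp red back down
    JMP  0       ; start over
"""
print(assemble(demo).hex(" "))
```

The firmware side is then just a loop that fetches an opcode byte, reads its operands, and dispatches; that keeps the flash cost per behavior down to a few bytes, which was the whole point.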

1 Like

I guess what I was trying to say is that the instruction type and size doesn’t make a processor special. That is all. All Atom processors use concepts from CISC, RISC, and VLIW.

1 Like