Is it just me or should "singularity" refer to physics first and everything else second?
I think the "technological" clarification is simply getting dropped at this point.
The movie may be fun and all, but the "singularity" is bullshit based on a naive and ahistorical view of the world. Observing the early portion of an exponential curve does not justify assuming continued exponential growth: there are plenty of functions that start off looking exponential and end up leveling off at some finite value. (Such as just about every curve of population density.)
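To make that point concrete: a logistic curve (a standard toy model of population growth, with a hypothetical carrying capacity K picked here just for illustration) is nearly indistinguishable from a pure exponential early on, yet flattens out at a finite value later. A minimal sketch:

```python
import math

def exponential(t, r=0.5):
    """Pure exponential growth: e^(r*t)."""
    return math.exp(r * t)

def logistic(t, r=0.5, K=1000.0):
    """Logistic growth: looks exponential early, saturates at carrying capacity K."""
    return K / (1 + (K - 1) * math.exp(-r * t))

# Early on the two curves are nearly indistinguishable...
for t in (0, 2, 4):
    print(t, round(exponential(t), 2), round(logistic(t), 2))

# ...but much later the logistic curve has flattened out near K,
# while the exponential has blown up past 10^8.
print(round(exponential(40), 1), round(logistic(40), 1))
```

So "the data looks exponential so far" is consistent with both futures; the early portion of the curve can't tell them apart.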
It's pure mysticism to project current trends into an exponential future, as if history by necessity follows that mathematical curve.
For a moment I thought this was about 'THE' singularity, and your comment made my head hurt.
Timelike infinity flitted momentarily through the active memeplex.
Anyhoo, I'd like to hear your thoughts on other future projections such as the Boltzmann Brain.
That depends on which school of Singularity you are talking about. The Ray Kurzweil style theory about exponentially growing technology, the one that claims to predict WHEN the singularity is coming (2045), indeed has that weak point. However, the basic idea of the "intelligence explosion" doesn't need to presuppose that all technology will continue to grow exponentially (or even that technology has been growing exponentially until now), just that AIs have the potential to create a positive feedback loop of exponential self-growth.
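Whether that feedback loop explodes or fizzles depends entirely on what you assume about the returns to self-improvement, which is the actual open question. A toy sketch (the `gain` functions are purely hypothetical assumptions, not claims about real AI):

```python
def run_feedback(gain, steps=20, start=1.0):
    """Toy model of recursive self-improvement: each generation's
    ability to improve its successor is a function (gain) of its
    own current capability level."""
    level = start
    levels = [level]
    for _ in range(steps):
        level += gain(level)
        levels.append(level)
    return levels

# Assumption 1: returns proportional to current capability -> runaway growth.
explosive = run_feedback(lambda x: 0.5 * x)

# Assumption 2: diminishing returns as low-hanging fruit is picked -> slow crawl.
crawl = run_feedback(lambda x: 1.0 / x)

print(round(explosive[-1], 1))  # thousands after 20 generations
print(round(crawl[-1], 1))      # still single digits after 20 generations
```

The model is deliberately dumb; the point is just that "positive feedback loop" alone doesn't give you an explosion. You also need an assumption about the shape of the gain function.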
Hi everyone. I am the film's director/creator. I first wanted to thank everyone for the outpouring of support. You guys are helping us make this film better! We really appreciate it. We very much want to make a final film that does this topic justice.
I agree with alteregos - whether or not the positive feedback loop pans out, the idea has validity. It might be that there is a limit to intelligence, whatever that is. Our understanding of the universe may be pretty close to how it actually works. But there is also historical precedent for a complete rewriting of human understanding due to new ideas. So it is possible that creating a greater intelligence will prove to be much like breaking the sound barrier; something that is interesting but not life-changing for most of us. I think it is at least equally possible that a smart AI (or its successor) will see patterns and concepts that we cannot even imagine; we are, after all, still pretty limited apes. What it might see, and what it might do, is fairly unknowable.
In the film, we do not try to say what will happen. We didn't have the budget for it, for starters. We simply try to explore what it will be like to stand on the edge of that abyss. Once a super-intelligent AI comes about, it is fair to say that we may no longer be in control of what comes next. And I think that's a pretty interesting place to be.
Anyway, we love the discussion and are truly honored by the post.
Fiction that predicts the future does not have an ethical obligation to be accurate. That is not the point. Even Kurzweil is free to exaggerate to make his point. You have to look to the message. Are we ceding control to something we don't fully understand? Is it even possible to turn back at this point? These are valid and interesting questions that inspire speculation, which becomes theater because the questions are ones we all share. Art works in mysterious ways.
The answer is that there is a finite limit to intelligence and we have evolved to take full advantage of the available technology because life in the real world is brutal and pushes everything to its limits. The next part of the equation is how much difference can be wrung out of the different technology available to the ur-computer.
So what we will have is a well matched contest.
So, The Terminator with Google Glass instead of killer cyborgs?
...or at least design a better AI that can plan how to build such a thing, or at least design a better AI that can design a better AI that can plan how to build such a thing, or at least design a better AI that can design a better AI that can design a better AI that can plan how to build such a thing, or at least design a better AI that can design a better AI that can design a better AI that can design a better AI that can plan how to build such a thing, or at least design a better AI that can design a better AI that can design a better AI that can design a better AI that can design a better AI that can plan how to build such a thing, or...