I wish this pseudo-technical, new-age flim-flam of an idea would die already. Seriously, does anyone with a real technical background, who is not directly making money off of seminars, books, and articles selling it, actually believe this stuff?
I have a graduate degree in robotics, and I think that robots will be as capable as humans within a few decades. I also think it is likely we can engineer a solution to aging, but I'm not as familiar with the tech there.
Maybe movies with transhuman/posthuman themes, but the singularity implies things about the rate of technological progress. Certainly there wasn't much rapid progress in Terminator after the machines nuked the Earth (except, maybe, in the production of new terminator robots, and even that wasn't really fast).
Eagle Eye was a potential singularity movie, but in essence it was no different from WarGames: a story about a rampant AI with no progression beyond that. I, Robot displayed the advance of technology better than virtually any sci-fi movie (except maybe Bicentennial Man), and even it only showed ~3 generations of robots (the replaced, the being-replaced, and the new), yet it is in no sense a singularity movie.
I doubt we'll ever see a movie truly depict the singularity as anything more than a glimpse; any attempt will end up akin to virtually any other sci-fi movie in any other setting. We might get there with a great TV series, but that's doubtful.
IMO a singularity TV series would have to borrow the generation-skipping story progression of Taken, but instead of giant leaps it would jump by decades, then years, then months as technology advances up to the '2030' mark.
For movies we'll be stuck with a BS intro that pales in comparison to the introduction of each Fallout game, but which will inevitably be describing similar events.
None of those presents god-level (little g) AI created by humans, but A.I. (2001) comes fairly close to the idea. By the end of the movie the robots knew how to bring someone back from the dead for a day, and who knows what else.
There are a large number of assumptions built into the singularity concept, and CPUs are not getting exponentially faster, so at least one of those assumptions already seems incorrect. Honestly, the hardest problem IMO is the assumption that intelligence is fungible and that twice as much of it will provide anywhere near twice the result. The real world is fuzzy, and thinking faster/deeper only gets you so far.
Let's say we can build a machine smarter than a single person; that's still a long way from building something smarter than humanity, so there is no quick kickoff to infinity. Now suppose we build a billion of the things and they are all smarter than the average person. Nothing says you get anything close to a linear relationship between the intelligence designing an AI and how smart the resulting AI is, so there would not necessarily be a feedback loop that keeps producing ever more intelligent systems. Etc.
When talking about computation, the argument is usually presented as an exponential increase in processor computational power (FLOPS/MIPS) per unit cost, not processor speed. Even though processor speed peaked recently, computational power per unit cost has continued to increase.
Economics drives an increase in computation per unit cost, not processor speed. How the increase is accomplished is irrelevant.
Aside from that, the difference between one human and all of us is only a factor of about 2^32. So if you can put the intelligence of one human on a chip, then at one doubling every two years you'll be able to do the same for all of humanity 32 doublings, i.e. 64 years, later. And that's ignoring things like falling cost per chip or using multiple chips for the same price.
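If it helps, here is that arithmetic spelled out as a minimal sketch (the 2-year doubling period is my assumption of the usual Moore's-law cadence, not something guaranteed):

    # Back-of-the-envelope check of the arithmetic above, assuming
    # computation per unit cost doubles every 2 years (the usual
    # Moore's-law cadence; an assumption, not a guarantee).
    import math

    population = 2**32                 # the round figure used above (~4.3 billion)
    doublings = math.log2(population)  # 32 doublings needed
    years_per_doubling = 2             # assumed doubling period

    print(f"{doublings:.0f} doublings x {years_per_doubling} years each "
          f"= {doublings * years_per_doubling:.0f} years")
    # -> 32 doublings x 2 years each = 64 years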
Well first of all, we all know the limitations of exponential growth in the real world. What does 'enhancing capability and intelligence to an unimaginable scale' really mean? It all strikes me as uselessly fuzzy and mystical. Good for pop 'science' articles and sci-fi speculation (which, don't get me wrong, is fun in and of itself) but not terribly useful or informative otherwise.
'this stuff' := The Singularity - the idea that we will be able to design systems with enough 'intelligence' (itself a difficult term to define) to be imbued with novel creativity that surpasses all human understanding... in turn, these systems create ever more advanced creative systems until (insert your dis/utopian idea of preference here).
I read Charles Stross's Accelerando yesterday. One of the best books about the singularity. It's also free (Creative Commons license). Probably the best free ebook I've ever read.
On a somewhat related tangent: The Metamorphosis of Prime Intellect is a fantastic novella depicting rapid change caused by a technological singularity.
Isn't a lot of the singularity idea based on flawed math? I always hear about and see graphs of exponential curves about to "go vertical", but... exponential curves definitely don't do that. You can zoom in on any point of e^x and it will look like it's about to have a singularity...
In more general terms: if 'technology' has been getting better with some short doubling time for hundreds of years, why is it that the NEXT doubling is supposed to be the really significant one?
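You can even check the "zoom in anywhere" claim numerically. A minimal sketch (the function name is mine): rescale a window of e^x by its value at the window's center, and every window looks identical, so no point on the curve is special:

    # e^x is self-similar: a window around any center, divided by e^center,
    # traces the same curve. "About to go vertical" is an artifact of scale.
    import math

    def normalized_window(center, half_width=1.0, steps=5):
        step = 2 * half_width / (steps - 1)
        xs = [center - half_width + i * step for i in range(steps)]
        return [math.exp(x - center) for x in xs]  # equals e^x / e^center

    print(normalized_window(0))    # window on the "flat" part of the curve
    print(normalized_window(100))  # window far up the "vertical" part
    # Both print the same values (e^-1 through e^1): no doubling is special.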
There is a big difference between humans iterating to improve technology and machines iterating to improve technology that improves the machines. The latter is a stronger feedback loop.
But generally, I agree. We're biased to under-appreciate the impacts of past changes.
The last doubling was really significant. You know, instant free communication with humans anywhere in the world, as well as instant access to the sum total of human knowledge.
I bet that whole "light coming from little threads in your walls that kill you if you touch them" thing was quite a doozy too. Commercial air flight. Fertilizer. The idea of foreign pathogens causing disease.
Which is not to say that internet technology isn't wildly crazier than all of those, or that there won't continue to be wilder and crazier things to come. It is to say that we're not escaping exponential growth, and there still isn't anything like a spike in that sort of curve.
The singularity theory proposes an increase beyond exponential: some technology that speeds new technology's ability to speed new technology. Without that feedback (without efficient feedback, at least), everything "stays exponential". Which is only really depressing if you're looking for hypersmart AI and endless free human life in the next 20 years or so.
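For what it's worth, the distinction has a standard toy model (my sketch, not something from the article). Growth proportional to current capability gives a plain exponential, finite at every finite time; feedback strong enough to make growth super-linear in capability blows up at a finite time, an actual mathematical singularity:

    \frac{dx}{dt} = kx \;\Rightarrow\; x(t) = x_0 e^{kt} \quad \text{(finite for all } t\text{)}

    \frac{dx}{dt} = kx^2 \;\Rightarrow\; x(t) = \frac{x_0}{1 - k x_0 t} \quad \text{(blows up at } t^* = 1/(k x_0)\text{)}

Without a mechanism that pushes growth into the second regime, the curve "stays exponential", exactly as you say.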
Right, and the next doubling will be even better (twice as good!). But the sense that we're right on the threshold of the infinite future has probably been around at least since industrialization.
For example, that bit about instant access to people and knowledge was probably said about the telegraph too.
Sure, I agree. My only point is that people seem to underestimate the regular, non-singularity exponential doubling. Already some aspects of science and civilization have become more far-fetched than the sci-fi of 50 years ago. Like http://www.wired.com/magazine/2009/10/mf_optigenetics/
Reading The Singularity kinda reminded me of a Neal Stephenson novel. You start off with well-grounded reality, you progress smoothly, then you finish and wonder... wait, wtf just happened? How did I get to this wacky place?
I mean, we'll all be omniscient, immortal demi-gods in 39 years? Really?
The great thing about youth's obsession with 'immortality' is that once you hit the 35-40 age bracket and pick up the first of your chronic pain sources, the idea suddenly starts to look a LOT less attractive.
Eh, I don't know about this. It sounds like it has a plot and conflict, but will it be good storytelling? However, it's kind of cool to see so many big AI-community/singularitarian names in the cast credits!
I did some work earlier this year making some machinima video in Second Life for this movie. From what I've seen so far, it's just a documentary-style movie.