I don't know why, but lately I keep bumping into some animal, theory, or observation on the web, following one link to another, and learning something. Sometimes that something absorbs me a bit more than I originally intended. But sometimes it's very interesting.
After all the 20th-century conspiracies and apocalypse scenarios I've seen, nothing has grabbed my psyche as much as the Technological Singularity.
Simply stated in my own words: the moment we create an AI that is capable of being even barely above human intelligence, we will experience an exponential increase in technology. Machines will improve themselves more and more, faster and faster, leading to huge jumps in AI, until - or perhaps through - a point of continuous improvement that we can't even begin to comprehend or imagine.
There's more to this, and I recommend these two entries on Wikipedia:
Technological Singularity
And this one:
Raymond Kurzweil
Read and mark all the predictions in the last link - Raymond's, that is.
So do I think this is going to happen? Yes, I definitely think so - the Technological Singularity, that is. What happens at that point? Excellent question. I find Raymond's prophecies interesting because we can record them *right now* and verify them over the next couple of decades. I find his predictions very likely, except for the timetable. I don't know whether he also accounts for circumstantial events that could derail the timing - or avert the ultimate conclusion altogether - but since I haven't found any, I'll say that his prophecy should take the following elements into account:
- that we as a human race don't destroy ourselves before it happens
- that something external, like a meteor collision, doesn't set us back to caveman or Mad Max times
- that the economies of the greater powers never sink so low worldwide that we end up in a Mad Max scenario (as I call it)
- that an alien or aliens who have been watching all along don't intervene (I really don't expect this one to happen, but hey, just for completeness)
- that God or gods don't decide to intervene. I kind of like this one: the whole issue of whether God exists would be solved once and for all. Of course, I don't expect this to happen.
- that we don't run out of resources in a major catastrophic way, bringing all current technological advancement to a screeching halt and setting us back to Mad Max times...
I also find that Raymond talks about the different technological advancements as if they will all be made available to everyone - or at least that's how it seems from what I've read of him; I could be wrong, of course, since I have never read any of his books. I think what will actually happen is a big divide - even bigger than the one we already have - between the rich/powerful and the poor, with the rich getting access to these technologies first. At that point there's the very real possibility of their transhumanization, so we would end up with "two races" of sorts. Hmm, it reminds me in a way of Brave New World.
According to his timetable, the singularity will happen around the year 2045 or so... I would double or triple that, to 2082 or 2119, barring other circumstantial events for humanity like the ones cited above. I do think some of the other things he predicts will happen.
Do I think machines will kill us? I think there's a good chance of that. The only way I can imagine preventing it is if we gradually start merging with them as technology pushes forward. That presents a whole set of very painful, complicated issues that I don't think most people can even begin to imagine. It sounds exciting, but the transition will probably be painful as hell.
One last note: he predicts that by the 2040s people will spend most of their time in full-immersion virtual reality. For a second this sounds far-fetched, or even stupid. Or maybe it would have to me at one point in my life.
Now think about this: there are now more than 12 million people playing World of Warcraft. This can be considered, in a way, a virtual reality with a *vastly* inferior interface to the ones we will see in the future. Some people (a select few? most?) already spend a lot of their time in there. If that is possible now, with such a limited interface and 'experience', I think it's pretty easy to imagine far more widespread, regular immersion with a superior experience and a superior interface.
So yeah, I just disagree with the time table.
And some of these thoughts have made me lose some sleep recently.