[quote]The technological singularity is the theoretical emergence of superintelligence through technological means.[/quote]
Do you guys think that as technology advances we will eventually hit a point where we create a machine with intelligence greater than that of any human being? I mean, when you look at it, computers, for example, are getting faster [i]faster[/i] - the rate of improvement itself keeps increasing.
[quote]A technological singularity includes the concept of an intelligence explosion, a term coined in 1965 by I. J. Good. Although technological progress has been accelerating, it has been limited by the basic intelligence of the human brain, which has not, according to Paul R. Ehrlich, changed significantly for millennia. However, with the increasing power of computers and other technologies, it might eventually be possible to build a machine that is more intelligent than humanity.[/quote]
I do believe there will be a point during the 21st century where we create artificial intelligence that is smarter than humans. The AI would undergo its own evolution, seeing large improvements both in its intelligence and in the rate at which those improvements occur - all in order to compete with other AI. At that point the AI would have absolutely no obligation to promote the existence of its far less intelligent human creators, and could choose to end the human race. The end of the human race could happen in this scenario even without malicious intent, since the AI could simply consume the resources humans need for survival. Essentially it could be analogized as AI : Humans :: Humans : Dogs in terms of intelligence.
What do you guys think? It's kind of confusing for me, but interesting at the same time.
Ray Kurzweil, the man Bill Gates calls "the best person I know at predicting the future of artificial intelligence," predicts the singularity will occur around 2045. I don't think it's a matter of if, but when.
-
Artificial intelligence can't be smarter than humans because it can only know what we know.