"Sometime in the next few years or decades, humanity will become capable of surpassing the upper limit on intelligence that has held since the rise of the human species. We will become capable of technologically creating smarter-than-human intelligence - perhaps through enhancement of the human brain, direct links between computers and the brain, or Artificial Intelligence (AI).

Because a smarter-than-human intelligence could apply the very tools that were used to create it to further enhance its own intelligence, this unique event could trigger a positive feedback spiral of self-enhancement. This event is called the "singularity" by analogy with the singularity at the center of a black hole: just as our current model of physics breaks down when it attempts to describe the center of a black hole, our normal model of the future breaks down once the future contains smarter-than-human minds.

Because of the unique leverage that a benevolent smarter-than-human intelligence could offer in confronting the world's humanitarian problems, the non-profit Singularity Institute is attempting to create such a mind through the path of Artificial Intelligence.

Please consider the Singularity Institute's arguments, and help us ensure that the first smarter-than-human intelligence is also kinder-than-human.
