7 Totally Unexpected Outcomes That Could Follow The Singularity
http://io9.com/7-totally-unexpected-outc...-512600550
EXCERPT: By definition, the Technological Singularity is a blind spot in our predictive thinking. Futurists have a hard time imagining what life will be like after we create greater-than-human artificial intelligences. Here are seven outcomes of the Singularity that nobody thinks about — and which could leave us completely blindsided....
When Machines Learn Like Humans --"Our Last Great Invention?"
http://www.dailygalaxy.com/my_weblog/201...ntion.html
EXCERPT: [...] Brenden Lake, at New York University, and colleagues sought to develop a model that captured these human learning abilities. They focused on a large class of simple visual concepts -- handwritten characters from alphabets around the world -- building their model to "learn" this large class of visual symbols, and to generalize about it, from very few examples.
They call this modeling scheme the Bayesian program learning framework, or BPL. After developing the BPL approach, the researchers directly compared people, BPL, and other computational approaches on a set of five challenging concept learning tasks, including generating new examples of characters only seen a few times.
On a challenging one-shot classification task, the BPL model achieved human-level performance while outperforming recent deep learning approaches, the researchers show. Their model classifies, parses, and recreates handwritten characters, and can generate new letters of the alphabet that look 'right' as judged by Turing-like tests of the model's output in comparison to what real humans produce.
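The one-shot classification task described above can be illustrated with a toy sketch: given a single training example per character class, label a new query image. The snippet below uses simple nearest-neighbor matching on small binary arrays as a stand-in; the actual BPL model instead infers a generative, stroke-level program for each character, which is what lets it also parse and produce new examples.

```python
# Toy illustration of one-shot classification (NOT the BPL algorithm):
# one labeled example per class, classify a query by nearest neighbor.
import numpy as np

def one_shot_classify(support, query):
    """support: dict mapping class label -> single example array.
    Returns the label whose lone example is closest to the query."""
    return min(support, key=lambda c: np.linalg.norm(support[c] - query))

# Two 3x3 "characters": a vertical bar and a horizontal bar.
vert = np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]], dtype=float)
horiz = np.array([[0, 0, 0], [1, 1, 1], [0, 0, 0]], dtype=float)
support = {"vertical": vert, "horizontal": horiz}

# A slightly noisy vertical bar still matches the vertical class.
query = vert.copy()
query[0, 0] = 0.2
print(one_shot_classify(support, query))  # -> vertical
```

Pixel-space distance collapses on real handwriting, which is exactly why the paper's generative, program-induction approach (and the deep learning baselines it was compared against) matter for this task.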
So, what's in store for our future? "Let an ultra-intelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever," said I.J. Good, a British mathematician who worked as a cryptologist at Bletchley Park with Alan Turing, originated the idea behind the "technological singularity," and served as a consultant on supercomputers to Stanley Kubrick, director of the 1968 film 2001: A Space Odyssey. "Since the design of machines is one of these intellectual activities, an ultra-intelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultra-intelligent machine is the last invention that man need ever make...."