British researchers say they want to create a Centre for the Study of Existential Risk to analyse the ultimate risks to the future of mankind.
The purpose of the center, they said, would be to consider risks to mankind's survival from biotech, nanotech, extreme climate change, nuclear war and runaway artificial intelligence.
Artificial intelligence is a particular focus, researchers said, with concerns that super-intelligent machines could someday be a danger to the human race.
Exponential increases in computing power, they said, could eventually reach a critical turning point at which artificial intelligence allows computers to write their own programs and create the technology to produce their own successors.
"Think how it might be to compete for resources with the dominant species," said Cambridge University philosopher Huw Price. "Take gorillas for example -- the reason they are going extinct is not because humans are actively hostile towards them, but because we control the environments in ways that suit us, but are detrimental to their survival."
Experts from a number of fields, including law, computing and science, would advise the center and help with investigating the risks, Price said.
"At some point, this century or next, we may well be facing one of the major shifts in human history -- perhaps even cosmic history -- when intelligence escapes the constraints of biology," he said.
"Nature didn't anticipate us, and we in our turn shouldn't take artificial general intelligence for granted."