
nitpicker

(7,153 posts)
Tue Jan 19, 2016, 07:15 AM Jan 2016

Hawking: Humans at risk of lethal 'own goal'

http://www.bbc.com/news/science-environment-35344664

Hawking: Humans at risk of lethal 'own goal'

David Shukman
Science editor

19 January 2016

From the section Science & Environment

Humanity is at risk from a series of dangers of our own making, according to Prof Stephen Hawking. Nuclear war, global warming and genetically-engineered viruses are among the scenarios which he singles out. And he says that further progress in science and technology will create "new ways things can go wrong".
(snip)

He says that assuming humanity eventually establishes colonies on other worlds, it will be able to survive.

"Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years. By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race. However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period."
(snip)

On previous occasions, he has highlighted the potential risks of artificial intelligence (AI) becoming powerful enough to cause the extinction of the human race. But he insists that ways will be found to cope. "We are not going to stop making progress, or reverse it, so we have to recognise the dangers and control them. I'm an optimist, and I believe we can."
(snip)
 

djean111

(14,255 posts)
1. My grandson gets almost apoplectic when I agree with Hawking that AI might be dangerous -
Tue Jan 19, 2016, 10:17 AM
Jan 2016

I tell him that all AI starts with a decision tree created by a human, and that there are always unintended consequences and unforeseen anomalies. As a former programmer and quality assurance tester, I feel I can say that with some certainty. What is right and what is wrong, both in general and in specific circumstances, cannot be rigidly determined unless all outside anomalies are avoided or taken into account. JMO and all that.
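(A toy sketch of that point, with made-up thresholds of my own choosing, not anything from an actual system: a hand-written rule works fine on the cases its author imagined, and quietly does the wrong thing on an input nobody wrote a branch for.)

```python
# Hypothetical rule-based "AI" for an automatic braking decision.
# The thresholds below are invented for illustration only.

def should_brake(distance_m, closing_speed_mps):
    """Brake when an obstacle is close and we are closing on it."""
    if distance_m < 10 and closing_speed_mps > 0:
        return True
    return False

# Cases the designer thought of:
print(should_brake(5, 3))    # close and closing: brakes
print(should_brake(50, 3))   # far away: no braking

# Unforeseen anomaly: a faulty sensor reports an absurdly large
# distance for a real nearby obstacle. No branch handles bad data,
# so the rule silently chooses not to brake.
print(should_brake(1e9, 3))  # no braking, even though the reading is bogus
```

The tree is only as good as the anomalies its human author anticipated, which was the original point.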

Odin2005

(53,521 posts)
7. I hate the naive enthusiasm for AI and the "Technological Singularity" among...
Sat Jan 30, 2016, 06:31 PM
Jan 2016

...us Millennials. It's basically the Nerd Rapture.

FiveGoodMen

(20,018 posts)
4. If all 7+ billion on earth will die in whatever disaster we're discussing
Tue Jan 19, 2016, 08:26 PM
Jan 2016

Is it REALLY so important that someone's offspring somewhere are planting their feet on another planet?

We sure think we're the bomb. Universe would be meaningless without us, huh?

hunter

(38,328 posts)
5. I think that's the most useful way of looking at it.
Thu Jan 21, 2016, 01:07 PM
Jan 2016

Sure, it's going to demotivate a certain number of people ("Fuck it, I'll turn up the air conditioner to eleven and drive my 4WD over baby seals!") but those are not the people who were going to solve this problem anyways.

If hope doesn't make people do the right thing then hopelessness isn't going to change anything. It's just another vehicle on the road to hell. You can get to hell just as fast in a gas guzzling V8 Corvette as a Tesla.

Techno-utopias are just another twist on religion, another kind of heaven. Something will save us, be it fusion power or colonies in outer space.

Reality doesn't care about any of that. Reality is all in the moment. It's all about the tools you have in your hand today.

We've already got all the tools we need to fix this, to deal with the disasters we are making. If we don't fix it, who the hell cares what happens to us?

We're just another innovative species in the geologic record that enjoyed a very brief moment of exponential growth and then crashed and burned.

Our descendants, if any survive, will not be what we are. Most species in the history of life on earth reach a dead end. They are extinct.

proverbialwisdom

(4,959 posts)
6. If only common sense could be monetized it'd be so much easier to sort things out.
Sat Jan 30, 2016, 03:03 PM
Jan 2016
http://www.cnbc.com/2016/01/19/hawking-threats-to-human-survival-likely-from-new-science.html

Hawking: Threats to human survival likely from new science
Physicist Stephen Hawking has warned that new technologies will likely bring about "new ways things can go wrong" for human survival.

Tuesday, 19 Jan 2016 | 11:27 AM ET
The Associated Press


When asked how the world will end, Hawking said that increasingly, most of the threats humanity faces come from progress made in science and technology. He says they include nuclear war, catastrophic global warming and genetically engineered viruses.

(snip)

https://www.theguardian.com/science/2016/jan/19/stephen-hawking-warns-threats-to-humans-science-technology-bbc-reith-lecture

Most threats to humans come from science and technology, warns Hawking
Speaking ahead of his BBC Reith Lecture on black holes, Stephen Hawking discusses the danger inherent in progress and the chances of disaster on Earth

Ian Sample Science editor
Monday 18 January 2016 19.01 EST


The human race faces one of its most dangerous centuries yet as progress in science and technology becomes an ever greater threat to our existence, Stephen Hawking warns.

(snip)

For thirty years Hawking was Lucasian professor of mathematics, a post once held by Isaac Newton, and one of the most prestigious academic positions in the country. But Hawking said he felt closer to Galileo Galilei, the 16th century astronomer, who overturned the received wisdom of his time with rigorous observations. Given a time machine, Hawking named Galileo as the scientist he would travel back in time to meet.


