Professor Stephen Hawking believes that, at the rate the human race is advancing, we could destroy ourselves within the next 100 years. It is imperative, according to Professor Hawking, that the human race colonize other worlds in the cosmos.
It seems that rapid progress in the realms of science and technology may spell doom for the human race.
In an interview with the BBC, Professor Hawking stated that while progress is something positive for our civilization, it can create ‘new ways things can go wrong.’
In the interview, Professor Hawking highlighted nuclear war, global warming, and genetically engineered diseases as threats that could bring doom upon mankind.
But this isn’t the first time Hawking has warned that mankind may face a self-made disaster.
Not long ago, in 2014, Professor Hawking said that Artificial Intelligence could spell the end of human civilization.
In his comments, Hawking has stressed that he is ultimately an optimist and believes that mankind has the power to overcome the various problems it may face in the near future.
However, Hawking pointed out that if the worst were to happen, the only way for humankind to survive would be to establish off-world colonies, on Mars for example.
Regrettably, as things stand now, colonizing other worlds is not something we will achieve in the next century.
Professor Hawking pointed out that a ‘global disaster’ is a near certainty within the next thousand to ten thousand years, which is precisely why he believes the next 100 years are the most critical, as we become increasingly advanced.
“Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years,” he said.
“However, we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period.”
Despite the possible apocalyptic scenarios the human race could face in the near future, Professor Hawking pointed out that this was a “glorious time to be alive” for scientists.
“It’s important to ensure that these changes are heading in the right directions. In a democratic society, this means that everyone needs to have a basic understanding of science to make informed decisions about the future,” said Hawking.
“So communicate plainly what you are trying to do in science, and who knows, you might even end up understanding it yourself,” added Professor Hawking.
Interestingly, Professor Hawking was among more than 1,000 leading scientists and businesspeople worldwide who signed an open letter from the Future of Life Institute.
The letter, presented at a conference in South America, suggested that creating autonomous weapons that think for themselves is “feasible within years.”
The letter states:
“If any significant military power pushes ahead with AI weapon development, a global arms race is virtually inevitable.
“Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. A military AI arms race would not be beneficial for humanity.”