"What about an arms race in autonomous weapons, and is such a race desirable for the human race?" Hawking asks. "Do we really want cheap weapons to become the Kalashnikovs of tomorrow, sold to criminals and terrorists on the black market?"
Hawking's primary concern about A.I. is, ultimately, competition. Humans evolve on a biological timescale, and Hawking argues that we cannot hope to keep pace with a form of life that develops digitally. It is a mistaken notion, he says, that machines cannot have goals and interests of their own; advanced artificial intelligence could be genuinely hazardous.
"The real risk with A.I. isn't malice, but competence," Hawking writes. "A superintelligent A.I. will be extremely good at accomplishing its goals, and if those goals aren't aligned with ours, we're in trouble."
Hawking famously voiced his concerns in this area well before his death. In the new book, he urges readers to continue the work on artificial intelligence laid out in the open letter published in 2015.
Related: Stephen Hawking Wants to Stop AI From Killing Us All
As Hawking addresses other issues in the excerpt, his tone grows darker, especially when he arrives at the topic of genetic enhancement.
"We are now entering a new phase of what might be called self-designed evolution, in which we will be able to change and improve our DNA," Hawking adds. "We have now mapped DNA, which means we have read 'the book of life,' so we can start writing in corrections."
This will almost inevitably lead to new kinds of eugenics, Hawking says – and sooner than anyone thinks.
"I am sure that during this century, people will discover how to modify both intelligence and instincts such as aggression," he writes.
Although laws will likely be passed against genetic engineering in humans, Hawking predicts, some people will not be able to resist the temptation to improve human traits.
"Once such superhumans appear, there are going to be significant political problems with the unimproved humans, who won't be able to compete," he writes. "Presumably, they will die out, or become unimportant. Instead, there will be a race of self-designing beings who are improving themselves at an ever-increasing rate."
Related: "Slaughterbots" Video Depicts a Dystopian Future of Autonomous Killer Drones
So is there any good news in Hawking's vision of the future? Kind of. Throughout the excerpt, Hawking takes pains to point out that he considers himself an optimist. He believes, for example, that humanity will establish colonies on other planets, provided we survive long enough to develop the technology.
But unfortunately, even Hawking's chin-up message comes with a grim caveat.
"One way or another, I regard it as almost inevitable that either a nuclear confrontation or environmental catastrophe will cripple the Earth at some point in the next 1,000 years," he writes. "By then, I hope and believe that our ingenious race will have found a way to slip the surly bonds of Earth and will therefore survive the disaster."
Book your tickets now.