Stephen Hawking issues warning, says only 'world government' can save humanity from destruction!
New Delhi: World-renowned physicist Stephen Hawking has time and again warned about issues that concern the world at large.
The British physicist has been outspoken about the possible existence of alien civilisations, warning against efforts to contact them, and has also said that the human race is doomed unless it moves into space.
Now, in another cautionary statement, Hawking has said that human beings' aggressive instincts, together with the rapid pace of technological growth, may destroy us all in a nuclear or biological war.
He added that only a 'world government' may prevent this impending doom.
Despite the problems of mass species extinction, global warming and the threat of artificial intelligence, Hawking remains optimistic about the future of humanity.
He said that he looked back on his life with gratitude and towards the years to come with cautious hope.
However, he is worried that humans, as a species, may not have the skills to stay alive.
If humanity is to survive to see the future, then we might need to form a world government, Hawking said.
"We need to be quicker to identify such threats and act before they get out of control. This might mean some form of world government. But that might become a tyranny," said Hawking.
"All this may sound a bit doom-laden but I am an optimist. I think the human race will rise to meet these challenges," he said.
"Since civilisation began, aggression has been useful in as much as it has definite survival advantages," he said.
"It is hard-wired into our genes by Darwinian evolution. Now, however, technology has advanced at such a pace that this aggression may destroy us all by nuclear or biological war," he added.
"We need to control this inherited instinct by our logic and reason," Hawking was quoted as saying by 'The Times'.
He argued that there were new challenges too - among them environmental problems and his concern that artificial intelligence could supplant humans.
Hawking had earlier warned that the creation of powerful artificial intelligence may turn out to be "the worst thing ever to happen to humanity" despite its potential benefits.
(With PTI inputs)