The CEO of American aerospace manufacturer SpaceX is worried that Google may "accidentally" create an evil robot army that could turn against the human race.
Google's intensive research into robotics could "produce something evil by accident," technology entrepreneur Elon Musk said in a new self-titled authorized biography, The Independent reported.
Musk is the CEO of Space Exploration Technologies Corp. (SpaceX) and electric car manufacturer Tesla Motors, and a co-founder of the online payments system PayPal.
Without clearly specifying any ultimate purpose, Google has been acquiring robotics companies such as Boston Dynamics, which makes numerous types of robots, including a humanoid robot called Atlas and a four-legged running robot called BigDog, which was developed for and funded by the US military.
Speculation points to the possibility that the robots are designed to run on Google-developed artificial intelligence.
"You should presume that someday, we will be able to make machines that can reason, think and do things better than we can," Google CEO Larry Page said in a July 2014 interview.

In the past, Musk has publicly voiced his fear that the development of AI may prove dangerous and should be closely monitored.
AI is humankind’s biggest existential threat and its development is comparable to “summoning a demon,” he said.
"The risk of something seriously dangerous happening is in the five-year timeframe. Ten years at most," the SpaceX CEO wrote a few months ago in a leaked comment to an internet publication about the dangers of AI. "Please note that I am normally super pro-technology and have never raised this issue until recent months. This is not a case of crying wolf about something I don't understand."
Back in December, renowned physicist Stephen Hawking warned, “The development of full artificial intelligence could spell the end of the human race,” adding that "humans... couldn't compete and would be superseded" by robots.