AP Photo/Elizabeth Dalziel

Stephen Hawking has been vocal about the dangers of artificial intelligence (AI) and the threat it could pose to humanity. In his recent Reddit AMA, the famed physicist explained how that might happen. When a user asked how an AI could become smarter than its creator and endanger the human race, Hawking wrote:

It’s clearly possible for something to acquire higher intelligence than its ancestors: we evolved to be smarter than our ape-like ancestors, and Einstein was smarter than his parents. The line you ask about is where an AI becomes better than humans at AI design, so that it can recursively improve itself without human help. If this happens, we may face an intelligence explosion that ultimately results in machines whose…

