Speaking at the Collision tech conference, AI scientist Geoffrey Hinton, widely known as the “Godfather of AI,” urged governments to intervene and ensure that machines do not seize control of society. Hinton, who recently left Google after a decade there, emphasized the need to address the potential risks associated with AI, especially in the wake of ChatGPT's release.
Addressing a packed audience of more than 30,000 industry professionals in Toronto, Hinton highlighted the importance of understanding how AI could potentially try to take control. He expressed his concerns by stating, “Before AI is smarter than us, I think the people developing it should be encouraged to put a lot of work into understanding how it might try and take control away. Right now, there are 99 very smart people trying to make AI better and one very smart person trying to figure out how to stop it taking over, and maybe you want to be more balanced.”
Hinton’s concerns regarding AI were not without substance, as he emphasized the need to address the risks seriously. He stated, “I think it’s important that people understand this is not science fiction; this is not just fear-mongering. It is a real risk that we must think about, and we need to figure out in advance how to deal with it.”
Beyond the risk of losing control, Hinton warned that AI could deepen inequality, with the benefits of adoption flowing primarily to the wealthy while workers are left behind. “The wealth isn’t going to the people doing the work. It will go into making the rich richer and not the poorer, and that’s very bad for society,” he added.
Hinton also addressed the dangers of fake news generated by AI chatbots like ChatGPT. He expressed hope that AI-generated content could be watermarked, much as central banks watermark currency. The European Union is exploring such a technique as part of its AI Act, legislation currently being negotiated to establish AI standards in Europe.