Former Google CEO warns that AI could be exploited by hostile nations like North Korea, Iran, and Russia to target and “harm innocent people.”
Eric Schmidt, who held senior posts at Google from 2001 to 2017, told BBC Radio 4’s Today programme that those countries and terrorist groups could adopt and misuse the technology to develop weapons capable of “a bad biological attack from some evil person”.
The tech billionaire said: “The real fears that I have are not the ones that most people talk about AI – I talk about extreme risk. Think about North Korea, Iran, or even Russia, who have some evil goal. This technology is fast enough for them to adopt that they could misuse it and do real harm.”
Schmidt calls for government oversight
Schmidt called for government oversight on private tech companies that are developing AI models but warned that over-regulation could stifle innovation.
“It’s really important that governments understand what we’re doing and keep their eye on us,” he said.
“My experience with the tech leaders is that they do have an understanding of the impact they’re having, but they might make a different values judgment than the government would make.”
“Always worried about Osama bin Laden scenario”
In reference to the head of the al-Qaida terrorist group who orchestrated the 9/11 attacks in 2001, Schmidt said: “I’m always worried about the Osama bin Laden scenario, where you have some truly evil person who takes over some aspect of our modern life and uses it to harm innocent people.”
Schmidt agreed with the export controls introduced by former US president Joe Biden, which limited the sale of the microchips that power the most advanced AI systems to 18 countries in order to slow adversaries’ progress on AI research.
The former Google CEO was speaking from Paris, where the AI Action Summit ended with the US and UK declining to sign the summit’s agreement.
(With inputs from agencies)