
'Godfather Of AI' Quits Google To Warn About Dangers Of Technology

Swarajya Staff, May 03, 2023, 08:25 AM | Updated 08:25 AM IST
Geoffrey Hinton (Pic Via MIT Technology Review)

Geoffrey Hinton, who is sometimes referred to as the 'Godfather of AI', has quit Google so that he can warn about the dangers of the technology.

Hinton’s pioneering work on neural networks shaped artificial intelligence systems powering many of today’s products.

He worked part-time at Google for a decade on the tech giant’s AI development efforts.

He joined the company in 2013 and designed machine learning algorithms during his time there.

In a tweet Monday, Hinton said he left Google so he could speak freely about the risks of AI, rather than because of a desire to criticise Google specifically.

“I left so that I could talk about the dangers of AI without considering how this impacts Google,” Hinton said in a tweet.

“Google has acted very responsibly,” he added.


"I think it's very reasonable for people to be worrying about these issues now, even though it's not going to happen in the next year or two," Hinton was quoted as saying by CBS News.

The 75-year-old computer scientist has divided his time between the University of Toronto and Google since 2013, when the tech giant acquired Hinton’s AI startup DNNresearch.

Hinton’s company was a spinout from his research group, which was doing cutting-edge work with machine learning for image recognition at the time. Google used that technology to boost photo search and more.  

Hinton is best known for an algorithm called backpropagation, which he first proposed with two colleagues in the 1980s.

The technique, which allows artificial neural networks to learn, today underpins nearly all machine-learning models.

In a nutshell, backpropagation is a way to adjust the connections between artificial neurons over and over until a neural network produces the desired output. 
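The idea can be sketched in a few lines of code. This is a minimal illustration, not code from Hinton's work: a single artificial "neuron" with one connection weight is repeatedly nudged in the direction that reduces its prediction error, until its output matches the desired targets. The function name, learning rate, and training data are all illustrative.

```python
def train_single_weight(samples, lr=0.1, epochs=100):
    """Fit y = w * x by propagating the error gradient back to w."""
    w = 0.0  # initial connection strength
    for _ in range(epochs):
        for x, y in samples:
            pred = w * x               # forward pass: compute the output
            grad = 2 * (pred - y) * x  # backward pass: dLoss/dw for squared error
            w -= lr * grad             # adjust the connection to reduce the error
    return w

# Targets follow y = 2x, so the weight should settle near 2.0.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
print(round(train_single_weight(samples), 3))  # → 2.0
```

A real neural network applies the same adjust-by-gradient step to millions of connections at once, using the chain rule to pass error signals backwards through the layers.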

Hinton believed that backpropagation mimicked how biological brains learn. He has since looked for even better approximations, but has never improved on it.
