Tech

Musk And Other Tech Leaders Propose A Six-Month Break In The Development Of Advanced AI Systems

Swarajya Staff | Mar 30, 2023, 05:33 PM | Updated 05:33 PM IST
A group, including Elon Musk, has urged all AI labs to halt training of AI systems stronger than GPT-4 for six months.

Over 1,000 tech researchers and executives, including Elon Musk, have called for a six-month pause in the development of advanced artificial intelligence (AI) systems, such as OpenAI's GPT, in order to slow down a potentially "dangerous" arms race.

The letter warned that AI labs are locked in a race to create powerful digital minds that even their creators cannot understand, predict, or control, a situation it described as spiralling out of control.

Notable co-signatories of the letter, published by the Future of Life Institute, include prominent AI professors Stuart Russell and Yoshua Bengio, along with co-founders of Apple, Pinterest, and Skype.

The institute, led by AI researcher Max Tegmark, counts Musk among its major financiers. The founder of AI start-up Stability AI has also signed the letter.

The signatories urge all AI labs to halt training of AI systems stronger than GPT-4 for six months. The pause, they say, should be public and involve all key actors; failing that, governments should step in and impose a moratorium.


Google, Microsoft and Adobe have added AI features to their search engines and productivity tools, making AI accessible to millions of users.

The rapid development and deployment of AI has raised concern among researchers and ethicists regarding its impact on employment, public discourse, and humanity's ability to cope with it.

Engineers and researchers from Microsoft, Google, Amazon, Meta, and DeepMind also signed the letter. No OpenAI employee was among the initial signatories.

Governments globally are working on AI policy responses, while some tech firms are shrinking AI ethics teams.
