Tesla CEO Elon Musk, Apple co-founder Steve Wozniak, and numerous other technology leaders have called for a temporary halt to the development of powerful artificial intelligence systems, citing potential risks to society and humanity.
The AI chatbot ChatGPT has garnered global attention as knowledge workers use it for tasks such as composing emails and writing code in seconds. The mass-market AI breakthrough has spurred a competitive race among tech companies to integrate similar systems into their products and search engines.
The Future of Life Institute's open letter emphasized the potential impact of AI developments on information channels, employment prospects, and the acceleration of AI surpassing human intelligence.
The letter called for a six-month moratorium on training AI systems more powerful than GPT-4, the latest model from ChatGPT maker OpenAI, while the world assesses the technology's potential consequences. It argues that such decisions should not be delegated to unelected tech leaders, and that powerful AI systems should be developed only once there is confidence that their effects will be positive and their risks manageable.
Signatories include Pinterest co-founder Evan Sharp, Ripple co-founder Chris Larsen, DeepMind research scientists Zachary Kenton and Ramana Kumar, former presidential candidate Andrew Yang and academics from renowned institutions such as Stanford and Harvard.
The document calls for the pause to be public and verifiable, urging AI labs to jointly develop shared safety protocols for AI design and development, rigorously audited and overseen by independent outside experts.
Elon Musk has reportedly accused OpenAI of "training AI to be woke" and has approached top researchers about building an alternative. Microsoft's recent integration of ChatGPT technology into Bing and Edge, and Google's plans to add Bard, its experimental conversational AI service, to its search engine, have intensified the debate over AI safety.
The letter's recommendations align with ongoing efforts by Western governments to establish ethical standards for AI research.