
OpenAI removed its Artificial General Intelligence safety team

OpenAI has disbanded its AGI Readiness team, whose members will be dispersed throughout the company.

Artificial general intelligence (AGI) refers to software with human-like intelligence and the capacity to teach itself.

Some believe AGI could surpass human intellect, though this is the subject of much debate: many tech leaders claim it is nearly achievable, while others say it is impossible.

The overarching goal is for the software to execute tasks it was not explicitly trained or designed for, which places a greater emphasis on agency and self-sufficiency.

The disbandment

OpenAI’s AGI Readiness team assessed how well the company could handle advanced AI software, and how prepared the world at large is to manage it.

Miles Brundage was OpenAI’s senior advisor for AGI Readiness. On Oct. 23, he announced his departure in a Substack post, citing that the opportunity cost of staying had become too high.

He expressed that his impact could be greater outside the company and that he had ultimately fulfilled his purpose at OpenAI.

Brundage also stated that “Neither OpenAI nor any other frontier lab is ready, and the world is also not ready [for AGI]… AI is unlikely to be as safe and beneficial as possible without a concerted effort to make it so”.

Furthermore, he intends to create his own nonprofit or partner with an existing organization to advocate for AI policy research and awareness.

A word from OpenAI

An OpenAI spokesperson told CNBC: “We fully support Miles’ decision to pursue his policy research outside industry and are deeply grateful for his contributions.

His plan to go all-in on independent research on AI policy gives him the opportunity to have an impact on a wider scale, and we are excited to learn from his work and follow its impact.

We’re confident that in his new role, Miles will continue to raise the bar for the quality of policymaking in industry and government.”
