OpenAI Staffers Responsible for Safety Are Jumping Ship

  • 📰 Gizmodo


Sam Altman's AI startup recently lost several key members of its Superalignment team, which was supposed to ensure AI doesn't go rogue.

OpenAI launched its Superalignment team almost a year ago with the ultimate goal of controlling hypothetical super-intelligent AI systems and preventing them from turning against humans. Naturally, many people were concerned—why did a team like this need to exist in the first place? Now, something more concerning has occurred: the team’s leaders, Ilya Sutskever and Jan Leike, just quit OpenAI.

Ilya Sutskever and Jan Leike, the former leaders of OpenAI’s Superalignment team, quit simultaneously this week, one day after the company released its impressive GPT-4 Omni model. The goal of Superalignment, outlined during its July 2023 launch, was to develop methods for “steering or controlling a potentially superintelligent AI, and preventing it from going rogue.” At the time, OpenAI noted it was trying to build these superintelligent models but did not have a solution for controlling them.

Source: Tech Daily Report (techdailyreport.net)



Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Grok Army Suggests Potential Reason for OpenAI Cofounder's Resignation, But There's a Catch
The resignation of OpenAI cofounder Ilya Sutskever is making waves in the community; other top staffers are following suit.
Source: Utoday_en

Masters of the Universe: CEOs of OpenAI, Google, Microsoft to Join Federal AI Safety Panel
Source: BreitbartNews

CEOs of OpenAI, Google and Microsoft to join other tech leaders on federal AI safety panel
The US government has asked leading artificial intelligence companies for advice on how to use the technology they are creating to defend airlines, utilities and other critical infrastructure, particularly from AI-powered attacks.
Source: cnnbrk

OpenAI's Sam Altman and other tech leaders join the federal AI safety board
Source: engadget

Former chief safety officer details Metro's safety concerns
The former chief safety officer said Metro should consider stopping people from using the transit authority's property as shelter.
Source: FOXLA

Denver mayor creates neighborhood safety office in $11 million shift in public safety approach
Denver Mayor Mike Johnston will launch a new office within city government focused on neighborhood safety, one that will be independent from the city's police force and its safety department…
Source: denverpost