A former OpenAI safety employee said he quit because the company's leaders were 'building the Titanic' and wanted 'newer, shinier' things to sell

"I really didn't want to end up working for the Titanic of AI, and so that's why I resigned," said Saunders, who was on OpenAI's superalignment team.

Sam Altman, CEO of OpenAI, arrives at the Allen & Company Sun Valley Conference on July 9, 2024 in Sun Valley, Idaho.
  • An ex-OpenAI employee said the firm is going down the path of the Titanic with its safety decisions.
  • William Saunders warned of the hubris around the safety of the Titanic, which had been deemed "unsinkable."
  • Saunders, who was at OpenAI for three years, has been critical of the firm's corporate governance.

A former safety employee at OpenAI said the company is following in the footsteps of White Star Line, the company that built the Titanic.

"I really didn't want to end up working for the Titanic of AI, and so that's why I resigned," said William Saunders, who worked for three years as a member of technical staff on OpenAI's superalignment team.

He was speaking on an episode of tech YouTuber Alex Kantrowitz's podcast, released on July 3.

"During my three years at OpenAI, I would sometimes ask myself a question. Was the path that OpenAI was on more like the Apollo program or more like the Titanic?" he said.

The software engineer's concerns stem largely from OpenAI's plan to achieve Artificial General Intelligence — the point where AI can teach itself — while also debuting paid products.

"They're on this trajectory to change the world, and yet when they release things, their priorities are more like a product company. And I think that is what is most unsettling," Saunders said.

Apollo vs Titanic

As Saunders spent more time at OpenAI, he felt leaders were making decisions more akin to "building the Titanic, prioritizing getting out newer, shinier products."

He would have much preferred a mood like the Apollo space program's, which he characterized as an example of an ambitious project that "was about carefully predicting and assessing risks" while pushing scientific limits.

"Even when big problems happened, like Apollo 13, they had enough sort of like redundancy, and were able to adapt to the situation in order to bring everyone back safely," he said.

The Titanic, on the other hand, was built for White Star Line as the company raced its rivals to launch ever-bigger ocean liners, Saunders said.

Saunders fears that, like with the Titanic's safeguards, OpenAI could be relying too heavily on its current measures and research for AI safety.

"Lots of work went into making the ship safe and building watertight compartments so that they could say that it was unsinkable," he said. "But at the same time, there weren't enough lifeboats for everyone. So when disaster struck, a lot of people died."

To be sure, the Apollo missions were conducted against the backdrop of a Cold War space race with the Soviet Union. They also involved several serious casualties, including three NASA astronauts who died in 1967 due to an electrical fire during a ground test.

Explaining his analogy further in an email to Business Insider, Saunders wrote: "Yes, the Apollo program had its own tragedies. It is not possible to develop AGI or any new technology with zero risk. What I would like to see is the company taking all possible reasonable steps to prevent these risks."

OpenAI needs more 'lifeboats,' Saunders says

Saunders told BI that a "Titanic disaster" for AI could manifest in a model that can launch a large-scale cyberattack, persuade people en masse in a campaign, or help build biological weapons.

In the near term, OpenAI should invest in additional "lifeboats," like delaying the release of new language models so teams can research potential harms, he said in his email.

While on the superalignment team, Saunders led a group of four staff members dedicated to understanding how AI language models behave, an area he said humans don't know enough about.

"If in the future we build AI systems as smart or smarter than most humans, we will need techniques to be able to tell if these systems are hiding capabilities or motivations," he wrote in his email.

Ilya Sutskever, cofounder of OpenAI, left the firm in May after leading its superalignment division.

In his interview with Kantrowitz, Saunders added that company staff often discussed theories that AI could become a "wildly transformative" force within just a few years.

"I think when the company is talking about this, they have a duty to put in the work to prepare for that," he said.

But he's been disappointed with OpenAI's actions so far.

In his email to BI, he said: "While there are employees at OpenAI doing good work on understanding and preventing risks, I did not see a sufficient prioritization of this work."

Saunders left OpenAI in February. The company then dissolved its superalignment team in May, just days after announcing GPT-4o, its most advanced AI product available to the public.

OpenAI did not immediately respond to a request for comment from Business Insider sent outside regular business hours.

Tech companies like OpenAI, Apple, Google, and Meta have been engaged in an AI arms race, sparking an investment furor in what is widely predicted to be the next great industry disruptor, akin to the internet.

The breakneck pace of development has prompted some employees and experts to warn that more corporate governance is needed to avoid future catastrophes.

In early June, a group of former and current employees at Google DeepMind and OpenAI, including Saunders, published an open letter warning that current industry oversight standards were insufficient to safeguard against disaster for humanity.

Meanwhile, OpenAI cofounder and former chief scientist Ilya Sutskever, who led the firm's superalignment division, had resigned in May.

He has since founded another startup, Safe Superintelligence Inc., which he said would focus on researching AI while ensuring "safety always remains ahead."

Read the original article on Business Insider
