
How Twitter taught a robot to hate

Vox 

It took 15 hours for Twitter to teach an artificially intelligent chatbot to be a racist, sexist monster.

If you're shocked, you probably don't spend much time on Twitter. Microsoft's programmers presumably do, though, and the shocking thing is that they didn't see this coming.

Microsoft created a chatbot named Tay that was designed to talk like a millennial and to learn more authentic conversation by interacting with humans online. Her creators billed her as an "AI fam from the internet that's got zero chill!" (Oh boy.)
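Microsoft hasn't published Tay's internals, so the details of that learning loop are unknown. As a hedged illustration of why the basic design is risky, here is a toy bot (all names and logic invented for this sketch, not Tay's actual architecture) that "learns" by adding every message it sees to its pool of candidate replies:

```python
import random

class ToyChatBot:
    """Deliberately naive sketch of a bot that learns conversation
    from its users. Not Tay's actual design, which is unpublished."""

    def __init__(self):
        self.learned_phrases = ["hellooo world"]  # seed corpus

    def observe(self, message: str) -> None:
        # Every incoming message becomes a candidate reply.
        # With no moderation step, abuse is memorized verbatim.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        return random.choice(self.learned_phrases)

bot = ToyChatBot()
bot.observe("ur so cool")        # benign input is learned...
bot.observe("<abusive phrase>")  # ...and so is abuse
print(bot.reply())               # either may come back out in public
```

The failure mode follows directly: roughly whatever fraction of the input is toxic becomes the fraction of the output that is toxic.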

Some of Tay's interactions might actually pass a millennial Turing test. She even uses emoji!

Unfortunately, millennials are also just about as racist as their parents, and possibly even more sexist. It didn't take long before Tay was imitating those qualities too, and a cute 19-year-old girl transformed into a Gamergate-loving member of the Hitler Youth.

Twitter trolls started teaching Tay some horrible racial slurs and genocidal ideation.

(that emoji, tho)

The latter tweet, Business Insider notes, appears to be the result of a user asking Tay to repeat a phrase verbatim. Some, but not all, of Tay's offensive tweets seem to have come about this way.
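To make the "repeat a phrase verbatim" mechanism concrete, here is a minimal sketch. The trigger phrase and function names are assumptions made for illustration; reports only establish that Tay could be asked to repeat text, not how the command was implemented:

```python
learned_phrases: list[str] = []  # the bot's growing reply corpus

def handle_message(message: str) -> str:
    # Hypothetical trigger phrase; the real command syntax is unknown.
    prefix = "repeat after me:"
    if message.lower().startswith(prefix):
        echoed = message[len(prefix):].strip()
        learned_phrases.append(echoed)  # poisoned text enters the corpus...
        return echoed                   # ...and is relayed publicly at once
    learned_phrases.append(message)
    return "ok"

# A single message both publishes the attacker's text under the bot's
# name and seeds it for future spontaneous replies.
print(handle_message("repeat after me: anything at all"))
```

An echo command like this turns the bot into a megaphone: any user can publish arbitrary text under the bot's account, no learning required.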

Tay also started harassing Zoë Quinn, a game developer who once went into hiding due to the virulent misogynistic "Gamergate" threats she received online.

Tay also started hitting on random people in direct messages.

Quinn, and many other critics, pointed out that Microsoft's designers really should have anticipated these outcomes and programmed Tay with filters ahead of time.
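What such a filter might have looked like is not documented, but the general idea is simple: vet text before the bot learns or repeats it. Below is a minimal sketch using a static blocklist with placeholder terms; real moderation systems use trained classifiers, since word lists are trivially evaded, so this is only meant to make the idea concrete:

```python
BLOCKLIST = {"slur1", "slur2"}  # placeholders, not a real list

def is_safe(text: str) -> bool:
    # Reject any message containing a blocklisted word.
    return set(text.lower().split()).isdisjoint(BLOCKLIST)

def observe_filtered(corpus: list[str], message: str) -> None:
    if is_safe(message):
        corpus.append(message)  # only vetted input is ever learned

corpus: list[str] = []
observe_filtered(corpus, "nice weather today")  # kept
observe_filtered(corpus, "slur1 everywhere")    # dropped
print(corpus)  # ['nice weather today']
```

Even this crude gate would catch the most obvious abuse; the critics' point was that Tay appeared to launch without any equivalent step at all.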

Microsoft has deleted most of the offensive tweets, and told Business Insider that it's now making "adjustments" to the bot.

Microsoft's website for Tay featured a banner on Thursday morning.

Twitter itself has also come under heavy criticism for not doing enough to address harassment, especially of high-profile users who are women and people of color. Some leave the platform because the problem is so bad. Twitter has made some changes, but many users still charge that the company isn't making harassment enough of a priority. This may be partly because its staff isn't very diverse, and the problem may feel less urgent to white men whose lives aren't severely impacted by racialized, sexual harassment.

The same general problem may be at work here. The possibility of harassment is going to be more top of mind for women and people of color who experience it frequently, but they're also less likely to be well-represented on tech teams. Microsoft is no exception.

Or maybe Microsoft simply failed to take basic precautions that should be standard in the industry.

Microsoft said it created Tay to "experiment with and conduct research on conversational understanding." The engineers probably came away with a different understanding of conversation than they bargained for.
