Opinion | Free Speech Isn’t a Barrier to Regulating Social Media. Profits Are.

There’s a First-Amendment-friendly way to clean up social media. But tech CEOs won’t like it.


You can’t use a mega-sound system to hold a political rally in front of a hospital in the middle of the night. You can’t pack a theater so full of people that no one can reach the fire exits without being trampled. In the physical world, these kinds of noise control and fire safety regulations uneventfully coexist with our First Amendment free speech and free assembly rights. They’re accepted as common-sense ways to keep us safe and preserve our sanity.

The same ideas can be applied to social media. By reverse engineering the noise and lack of crowd control that has overrun social media platforms, we can make the internet a more peaceful, reliable, less polarizing place.

And we can do it without the government policing speech. In fact, Congress does not have to do anything. It doesn’t even need to touch Section 230, the now infamous 1996 law that gives social media platforms immunity for the harmful content — from healthcare hoaxes to election misinformation to Russian and Chinese state-sponsored propaganda — that has created a world of chaos and division, where so many people don’t believe even the most basic truths. Instead, the Federal Trade Commission and other consumer protection regulators around the world could enforce the contracts the platforms already have with their users.

Meta, the parent company of Facebook and Instagram, already promises users that it will enforce "community standards" that prohibit, among other abuses: inciting violence, "inauthentic" behavior such as setting up fake accounts, promoting suicide, bullying, hate speech, graphic or sexually explicit content, human exploitation, misinformation "that will cause imminent physical harm," health care misinformation and misinformation about elections and voting.

This list could double as an accurate catalog of the categories of harmful content that have flourished on Facebook and Instagram.

Most of the other platforms have similar lists of prohibited content and contracts with their users. These terms of service do not say, “We take our role seriously, but our algorithms encourage a lot of that content, and the volume of it flowing through our platform makes it impossible to prevent much of it from being posted even if we wanted to. Sorry.” Yet that’s basically what they say whenever they embark on another session of their yearslong apology tours testifying in front of Congress and similar tribunals around the world.



The FTC is responsible for protecting consumers, including by suing companies that defraud them by violating the terms of a contract. Section 5 of the law creating the FTC declares that “unfair or deceptive acts or practices in or affecting commerce … are … unlawful,” and empowers the commission to prevent companies from using such deceptive practices. The commission’s Policy Statement on Deception defines “deceptive” practices as a material “representation, omission or practice that is likely to mislead a consumer acting reasonably in the circumstances.” In fact, the FTC has already taken action against Facebook for violating the privacy promises it makes in those same terms of service. It imposed a $5 billion fine in 2020.

The FTC’s website explains that “the Commission may use rulemaking to address unfair or deceptive practices or unfair methods of competition that occur commonly, in lieu of relying solely on actions against individual respondents.” Accordingly, the commission could enforce the content promises in these terms of service by promulgating a rule that any digital platform must prominently and clearly spell out in its terms of service what content it will allow and not allow — and then, as with their privacy assurances, make sure they keep those promises.

Spelling out those terms prominently and clearly would mean posting a large chart on a prominent screen listing all possible offending content and requiring the platform to check a box if the content is prohibited or allowed.

In keeping with First Amendment restrictions on government regulation of the content of speech, it would be up to the platforms to decide which content to prohibit — that is, to check or not check each box. A platform that wants to allow misinformation or hate speech could choose to do so. However, it would have to level with its users in that prominent chart by declaring that it is choosing to allow it. This would give a stricter platform a competitive advantage in the marketplace; a platform that has to declare in large print that it allows misinformation or hate speech is likely to turn off many potential users and advertisers.

First Amendment protections would prohibit the government from forcing the platforms to prohibit hate speech or most misinformation. Yet nothing stops the proprietors of a platform from making those decisions by defining what they consider hate speech or harmful misinformation and screening it out. That's called editing, which is protected by the First Amendment when private parties, not the government, do it. In fact, editing is what the authors of Section 230 had in mind when they wrote it; it shielded platforms from liability not only for what they do allow but also for what they do not allow. This is why the provision is captioned "Protection for 'Good Samaritan' blocking and screening of offensive material." Let's make those who run the platforms be Good Samaritans.


This would be a logical and content-neutral way for the commission “to address deceptive practices” — in this case an obvious and widespread failure by the platforms to deliver on the promises made in their contracts with users. The FTC rule would require that the platforms prove that their declarations about what content they will not allow are real — not aspirational. As with inspectors checking enforcement of building codes, making sure there is adequate exit access in a crowded theater or catering hall, the FTC rule should require that each platform demonstrate that it has the capability to screen the volume of its content.

If this means that a platform has to cut its profit margins to hire thousands of people to screen all content before it is posted, or that it has to drastically lower the volume of users or the amount of content that they can post, so be it.

We should have learned by now that the ability of anyone anywhere to send any kind of video or text message instantly to everyone everywhere may be a technological marvel, but it is anything but a positive development. A quieter, less spontaneous online community is superior to the alternative: people live-streaming a murder, summoning rioters to the Capitol or, as happened in the days after the Hamas attack on Israel, Facebook, X and TikTok hosting hundreds of lurid videos of the terrorists celebrating as they committed unspeakable atrocities, videos that quickly drew millions of views worldwide.

The promises from the Silicon Valley witnesses at these now routine congressional hearings to work harder and do better are meaningless because they do not manage the volume and velocity of what gets posted. Instead, they apologize for being overwhelmed by the fire hose of content that is the essence of their business model — and the controllable but uncontrolled source of their bonanza profits.

To enforce this capability requirement, the FTC would have independent auditors review the platforms’ content on a regular basis to determine whether they have proved capable of keeping their promises. The audits should be publicly available. And if the audits demonstrate that a platform’s promises are not being kept, fines or even an order to suspend service would follow.

The FTC has the regulatory authority to proceed on its own, without Congress, to enforce the platforms' own contractual promises. And you can encourage the commissioners to do just that. The FTC website invites consumers to report fraud. Although this is clearly meant for specific complaints about an online scam or unwelcome robocall, if you think your social media company is not keeping its promises about preventing harmful content, you can report it.
