Apple scraps plans to scan iPhones for child abuse images

The initial plan was that iPhone users’ entire photo libraries would be checked for known child abuse images if they were stored on its iCloud service (Picture: Unsplash)

Apple has officially scrapped its controversial plan to scan iCloud images for Child Sexual Abuse Material (CSAM).

On Wednesday, the company announced that it would not be moving forward with its plans for on-device scanning.

‘We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos,’ said Apple in a statement to Wired.

‘Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.’

In August 2021, Apple announced that it was in the process of developing a system which would automatically recognise illegal images when they were uploaded to iCloud and alert the authorities.

Apple has officially scrapped its controversial plan to scan iCloud images for Child Sexual Abuse Material (Picture: Getty Images)

The initial plan was that iPhone users’ entire photo libraries would be checked for known child abuse images if they were stored on its iCloud service.
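The scanning approach described above is typically built on perceptual hashing: each photo is reduced to a compact fingerprint that is compared against a database of fingerprints of known illegal images. Apple's actual system used a proprietary scheme called NeuralHash combined with cryptographic matching, which is not public; the toy sketch below only illustrates the general idea, with every function name and threshold being an assumption for illustration.

```python
# Conceptual sketch only. Apple's real design used "NeuralHash" plus
# private set intersection; this toy "average hash" merely shows how
# fingerprint matching against a blocklist works in principle.

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_hash(image_hash, known_hashes, threshold=5):
    """Flag an image whose hash is close to any hash on the blocklist."""
    return any(hamming_distance(image_hash, h) <= threshold
               for h in known_hashes)
```

Because the comparison happens on hashes rather than the photos themselves, a provider never needs to see the image content directly; critics nevertheless argued that whoever controls the blocklist controls what gets flagged.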

Later, the technology giant announced the launch had been pushed back to ‘make improvements’ after campaigners said the programme breached privacy standards.

Some suggested the tool could be hijacked by authoritarian governments to look for other types of images.

Now, it seems the plan has been scrapped entirely and the company will focus on deepening its investment in the ‘Communication Safety’ feature.

In April, Apple released the child safety feature to detect nudity in messages using artificial intelligence (AI).

On Wednesday, the company announced that it would not be moving forward with its plans for on-device scanning (Picture: Unsplash)

The feature was launched in the US last year and then expanded to the Messages apps on iOS, iPadOS, and macOS in the UK, Canada, New Zealand, and Australia.

Now parents can turn on warnings on their children’s iPhones so that all photos sent or received by the child on Apple Message will be scanned for nudity.

Once enabled, if nudity is found in photos received by a child, the photo will be blurred, and the child will be warned that it may contain sensitive content and be nudged towards resources from child safety groups.
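The blur-and-warn flow above can be summarised as a simple decision: if the parental setting is on and the on-device classifier flags a photo, the photo is blurred and the child is shown a warning with safety resources. The sketch below is a hypothetical simplification; the real Communication Safety feature runs an on-device machine-learning classifier and its internal APIs are not public in this form.

```python
# Hypothetical sketch of the Communication Safety decision flow.
# The real feature uses an on-device ML nudity classifier; here the
# classifier's verdict is passed in as a boolean for illustration.

def handle_incoming_photo(flagged_as_nudity: bool, feature_enabled: bool):
    """Return the UI actions to take for a photo received by a child."""
    if feature_enabled and flagged_as_nudity:
        return ["blur_photo", "show_warning", "offer_safety_resources"]
    return ["show_photo"]
```

The key design point the article highlights is that all of this runs on the device: unlike the scrapped iCloud scanning plan, nothing about the photo leaves the phone.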

