Plugavel.
How will Apple analyze your photos to hunt down pedophiles?

7 August 2021
in Tech
Apple will compare images before they are sent to iCloud in order to detect child pornography photos. © Jan Vasek, Pixabay

Apple has just announced new measures to protect children. Among them is a system that will allow the firm to track down pedophiles by analyzing images transferred to iCloud in the United States and reporting any that appear in a national database of known child pornography images.


Apple has just announced a new feature on iPhone and iPad aimed at protecting children. Images sent to its iCloud online storage will be analyzed to detect child pornography. Already launched in a test phase, the feature will be extended to all users in the United States with the launch of the iOS 15 update this fall.

The photos will be processed locally, directly on the user's device. Before being transferred to iCloud, each photo will be run through a hash function, which computes a cryptographic fingerprint. It is this fingerprint, not the image itself, that is then compared against a database provided by the National Center for Missing and Exploited Children (NCMEC). This database will be integrated directly into iOS 15 and will contain the fingerprints of all child pornography images already catalogued.
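The matching step described above can be sketched as follows. This is a simplified illustration, not Apple's implementation: it uses an ordinary SHA-256 hash, whereas Apple's system relies on a perceptual hash (NeuralHash) that also matches visually similar images, and the database contents here are hypothetical placeholders.

```python
import hashlib

# Hypothetical set of fingerprints of known illegal images, standing in
# for the hash database provided by NCMEC and shipped inside iOS 15.
known_fingerprints = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the image locally, on the device.
    SHA-256 here is a stand-in for Apple's perceptual NeuralHash."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_database(image_bytes: bytes) -> bool:
    """Only the fingerprint is compared; the photo itself is never inspected."""
    return fingerprint(image_bytes) in known_fingerprints

# An ordinary photo produces a fingerprint that is not in the database.
print(matches_database(b"family holiday photo"))  # False
```

Because only fingerprints are compared, the device never needs to "look at" the photo's content, which is what allows the processing to stay on the device.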

An alert is triggered beyond a certain threshold

This method preserves the confidentiality of the data, since all processing happens locally and only the cryptographic fingerprint is compared. If the system detects a match with the database, it uploads the result, along with additional data in encrypted form, to iCloud together with the photo. Beyond a certain, unspecified threshold, an Apple employee can access these images to determine whether they actually constitute child pornography. If they do, Apple deactivates the user's account and contacts the NCMEC.
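The threshold behavior can be illustrated with a minimal sketch. The threshold value below is hypothetical, since Apple has not disclosed the real number, and the plain counter is a simplification: Apple's actual system uses a cryptographic threshold secret-sharing scheme so that match reports remain unreadable until the threshold is crossed.

```python
# Hypothetical threshold: Apple has not disclosed the real value.
REVIEW_THRESHOLD = 10

def should_escalate(match_count: int, threshold: int = REVIEW_THRESHOLD) -> bool:
    """Below the threshold, the encrypted match reports stay sealed;
    only at or above it can a human reviewer examine the flagged images."""
    return match_count >= threshold

print(should_escalate(3))   # False: too few matches, nothing is revealed
print(should_escalate(12))  # True: escalated for human review
```

The point of the threshold is that a single accidental match reveals nothing; only a pattern of many matches opens the account to review.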

Other online storage services have already implemented similar systems. However, Apple's can only report images that have already been identified by the authorities. Since the content of the images is not analyzed, the system cannot, for example, flag parents who photograph their children in the bath.


Tags: analyze, Apple, child protection, hunt, iCloud, iOS, iOS 15, iPad, iPhone, pedophile network, pedophiles, photos
© 2021 Plugavel - News about technology and cars on one site Plugavel.