Apple's system for detecting child sexual abuse material (CSAM) generates hash values from images stored on users' devices and in the cloud and compares them against a database of hashes of known CSAM ...
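The following is a minimal sketch of how hash-based matching against a database of known hashes might look in principle. It is not Apple's actual implementation: Apple's system uses a perceptual hash (NeuralHash) and private set intersection rather than the simple cryptographic hash and plain set lookup assumed here, and the database contents, file paths, and function names below are illustrative.

```python
# Illustrative sketch only -- NOT Apple's NeuralHash/PSI implementation.
# Assumptions: a plain SHA-256 over file bytes stands in for the perceptual
# hash, and KNOWN_HASHES stands in for the database of known-CSAM hashes
# supplied by child-safety organizations.
import hashlib
from pathlib import Path

# Hypothetical database of known hashes (placeholder value).
KNOWN_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def hash_image(path: Path) -> str:
    """Compute a hash of the image file's bytes (illustrative only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def is_known_match(path: Path) -> bool:
    """Return True if the image's hash appears in the known-hash database."""
    return hash_image(path) in KNOWN_HASHES


if __name__ == "__main__":
    # Scan a hypothetical local photo folder and report matches.
    for image in Path("photos").glob("*.jpg"):
        if is_known_match(image):
            print(f"{image} matches a known hash")
```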
In August 2021, Apple announced a mechanism to scan images stored on iOS devices and in iCloud and check them against hashes of known CSAM, with rollout planned for the latter half of 2021. It ...