Apple is explaining more of the process involved in its new anti-child abuse measures.
Apple said the technology is limited specifically to detecting child sexual abuse material (CSAM) and cannot be turned into a surveillance tool.
“With this new technology, Apple will learn about known CSAM photos being stored in iCloud Photos where the account is storing a collection of known CSAM. Apple will not learn anything about other data stored solely on device.”
The technology works by scanning an image on the user's device before it is uploaded to iCloud Photos. If the image matches known CSAM, a cryptographic safety voucher is attached to the upload, and Apple can access the contents of those vouchers only once an account crosses a threshold of matches.
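Conceptually, the flow resembles a threshold check over on-device hash matches. The Swift sketch below is an illustration only, not Apple's implementation: the hash function, the `knownHashes` list, and the `matchThreshold` value are all placeholder assumptions, and Apple's real system uses a perceptual hash (NeuralHash) plus cryptographic techniques that keep individual match results hidden until the threshold is crossed.

```swift
import Foundation
import CryptoKit

// Minimal sketch, assuming a plain SHA-256 digest and an in-memory hash list.
// Apple's system instead uses a perceptual hash and private set intersection,
// which are not reproduced here.

struct SafetyVoucher {
    let imageID: String
    let isMatch: Bool  // in Apple's design this result is hidden until the threshold is met
}

// Hypothetical database of known hashes; the real one ships inside the OS in blinded form.
let knownHashes: Set<String> = [
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"  // example entry
]
let matchThreshold = 30  // illustrative value only

func hashImage(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Produced on-device before an image is uploaded.
func makeVoucher(imageID: String, imageData: Data) -> SafetyVoucher {
    SafetyVoucher(imageID: imageID, isMatch: knownHashes.contains(hashImage(imageData)))
}

// Vouchers only become meaningful once enough of them match.
func accountExceedsThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.isMatch).count >= matchThreshold
}
```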
Groups such as the Electronic Frontier Foundation voiced their concerns about the technology last week, saying that the tech could be “repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”
One of the significant challenges in this space is protecting children while also preserving the privacy of users.