Apple to scan iPhones for child sex abuse images

Apple has announced details of a system to find known child sexual abuse material (CSAM) on US customers’ devices. Before an image is stored in iCloud Photos, the technology will check it for matches against a database of already-known CSAM. Apple said that if a match is found, a human reviewer will then assess the image.
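The check described above can be sketched in miniature: derive a fingerprint from the image and look it up in a set of known fingerprints before upload. This is a simplified illustration only; Apple's actual system reportedly uses a perceptual "NeuralHash" that matches visually similar images, whereas the cryptographic SHA-256 used here matches only byte-identical files, and every name in this sketch is hypothetical.

```python
import hashlib

# Hypothetical database of known-CSAM fingerprints (placeholder value).
KNOWN_FINGERPRINTS = set()

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash of the image content; a real
    # system would use a hash robust to resizing and re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes) -> bool:
    # Before upload, compare the image's fingerprint against the
    # known database; a match is queued for human review rather
    # than reported automatically.
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

In this sketch an ordinary photo produces no match, and only images whose fingerprint already appears in the database are flagged for a reviewer.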