Why Apple's Image-Scanning Tech Isn't at All Private
Apple recently introduced new technology to spot child sexual abuse material (CSAM), but it's drawing more criticism than praise from the privacy community.

Key Takeaways
- Apple's new policy against child sexual abuse material has caused controversy among users and privacy experts.
- The technology works by scanning images in iCloud for CSAM and using machine learning to identify explicit photos in Messages.
- Experts say that no matter how private Apple claims its scanning technology is, it ultimately opens a back door where anything could happen...