MacRumors / 01 September, 2023

Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos

Apple on Thursday provided its fullest explanation yet for why it abandoned its controversial plan last year to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos. Apple's statement, shared with Wired and reproduced below, came in response…
