Apple has further detailed that its child safety mechanism will require at least 30 photos to match Child Sexual Abuse Material (CSAM) hashes flagged by organisations in at least two countries before an account is surfaced for human review.
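For illustration, here is a minimal Python sketch of the two safeguards described above: a hash counts only if organisations in at least two distinct countries have flagged it, and an account crosses the review threshold only at 30 or more matches. All names here (`build_match_database`, `should_flag_for_review`, `CSAM_THRESHOLD`) are hypothetical; Apple's actual system performs the matching on-device with cryptographic thresholding, which this sketch does not model.

```python
from collections import defaultdict

CSAM_THRESHOLD = 30  # matches required before human review, per Apple

def build_match_database(org_hash_sets):
    """Keep only hashes flagged by organisations in at least two
    distinct countries. `org_hash_sets` is an iterable of
    (country, set_of_hashes) pairs."""
    countries_per_hash = defaultdict(set)
    for country, hashes in org_hash_sets:
        for h in hashes:
            countries_per_hash[h].add(country)
    return {h for h, countries in countries_per_hash.items()
            if len(countries) >= 2}

def should_flag_for_review(photo_hashes, match_db):
    """Flag an account only once at least CSAM_THRESHOLD of its
    photo hashes appear in the cross-jurisdiction database."""
    matches = sum(1 for h in photo_hashes if h in match_db)
    return matches >= CSAM_THRESHOLD
```

The two-country intersection is meant to prevent any single organisation (or government) from unilaterally inserting hashes, and the 30-match threshold reduces the chance that isolated false positives trigger a review.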

from Gadgets.NDTV
