Apple is reportedly developing a tool that would use hashing algorithms to scan iPhone photos for child sexual abuse material (CSAM). The system is said to run on the user's device, rather than in the cloud, for greater security and privacy.
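The core idea behind hash-based matching is that an image is reduced to a compact fingerprint, which is then compared against a database of fingerprints of known material, so the images themselves never need to be inspected directly. Apple's reported system is said to use a perceptual hash (which tolerates resizing and minor edits); the sketch below uses an ordinary cryptographic hash purely to illustrate the lookup pattern, and every name and value in it is hypothetical:

```python
import hashlib

# Hypothetical database of fingerprints of known images
# (illustrative placeholder values, not real data).
known_hashes = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Fingerprint the image and check it against the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

print(matches_known_hash(b"example-flagged-image-bytes"))  # True
print(matches_known_hash(b"unrelated-photo-bytes"))        # False
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; a perceptual hash of the kind Apple reportedly uses instead maps visually similar images to similar fingerprints, which is what makes on-device matching robust to recompression and cropping.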

from Gadgets.NDTV

