Pro@programming.dev to Technology@lemmy.world, English · 2 days ago
Google will use hashes to find and remove nonconsensual intimate imagery from Search (blog.google)
gian@lemmy.grys.it · 2 days ago
They say they use PDQ for images, which should output similar hashes for similar images (but why MD5 for video?). So matching is probably just a threshold problem. The algorithm is explained here: https://raw.githubusercontent.com/facebook/ThreatExchange/main/hashing/hashing.pdf. It is not a hash in the cryptographic sense.
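(For readers unfamiliar with PDQ: it outputs a 256-bit perceptual hash, and "similar" means a small Hamming distance between hashes. Below is a minimal sketch of that threshold matching in Python; the `pdq_match` helper and the 31-bit threshold are illustrative assumptions based on the commonly cited PDQ default, not Google's actual implementation.)

```python
# Sketch of PDQ-style threshold matching, assuming two 256-bit hashes
# given as 64-character hex strings (the format PDQ tooling emits).

def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Count differing bits between two equal-length hex-encoded hashes."""
    assert len(hash_a) == len(hash_b)
    xored = int(hash_a, 16) ^ int(hash_b, 16)
    return bin(xored).count("1")

def pdq_match(hash_a: str, hash_b: str, threshold: int = 31) -> bool:
    """Declare a match when the Hamming distance is at or below the
    threshold. 31 of 256 bits is the often-quoted PDQ default; this is
    an assumed value, tune it for your own false-positive tolerance."""
    return hamming_distance(hash_a, hash_b) <= threshold
```

This is exactly where it differs from a cryptographic hash like MD5: changing one pixel flips roughly half of an MD5 digest's bits, while a perceptual hash changes only a few bits for small edits, so re-encoded or resized copies still land within the threshold.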
Lorem Ipsum dolor sit amet@lemmy.world · 2 days ago
There was a GitHub thread about this when it came up for CSAM scanning; people managed to circumvent it easily. I'm rather confident this will end up similarly.