iPhone Backups
Apple announced its CSAM detection plans last year. If you were backing up a photo to iCloud Photos, it would scan the photo while it was still on your device, create a hash of it, and try to match that against known child sexual abuse imagery. Now Apple has said that that program is dead. It wouldn't tell us who the researchers were. They're like, just enjoy it. Just shut up.
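To make the hash-and-match idea concrete, here is a minimal, hypothetical sketch of that general approach: hash a photo on the device and check the result against a set of known hashes. This is only an illustration of the concept described above; Apple's announced system relied on a perceptual hash (NeuralHash) and additional cryptographic matching, not a plain cryptographic digest, and the function names and hash values below are invented for the example.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image hashes (placeholder values, not real data).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_photo(path: Path) -> str:
    """Hash the photo's raw bytes on the device, before it is uploaded."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_imagery(path: Path) -> bool:
    """Return True only if the photo's hash appears in the known-hash set."""
    return hash_photo(path) in KNOWN_HASHES
```

A cryptographic hash like SHA-256 only matches byte-identical files; the point of a perceptual hash in the real design was that visually similar images map to the same value, which is what made the matching useful and also what drew scrutiny from researchers.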