John Siracusa: I'm trying my darnedest to understand this a little better, but I need a little more time to get it 100% right. Photos are compared on-device to a list of known CSAM from the National Center for Missing and Exploited Children. Apple will provide some sort of tool to NCMEC to scan all the files in their database, the things that they know are bad. They will scan all that, and that will generate a bunch of hash numbers, which Apple can then compare your photos against.
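The flow John describes can be sketched in a few lines. This is a loose illustration, not Apple's implementation: the real system uses a perceptual hash ("NeuralHash") and private set intersection, whereas this sketch substitutes SHA-256 and a plain set lookup just to show the "hash the photo, check it against known-bad hashes" shape. All file contents and hashes below are made up.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Stand-in hash function; the real system uses a perceptual hash,
    so near-identical images hash alike. SHA-256 does not do that."""
    return hashlib.sha256(data).hexdigest()

# Hashes derived from NCMEC's database of known-bad material
# (hypothetical placeholder bytes here).
known_bad_hashes = {
    image_hash(b"known-bad-image-1"),
    image_hash(b"known-bad-image-2"),
}

def matches_known_csam(photo_bytes: bytes) -> bool:
    # On-device check: hash the user's photo, test set membership.
    return image_hash(photo_bytes) in known_bad_hashes

print(matches_known_csam(b"vacation-photo"))     # False: not in the database
print(matches_known_csam(b"known-bad-image-1"))  # True: hash matches
```

The key design point the hosts go on to discuss is that only hashes are compared, so the database of images never needs to be on the device.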
- Pre-show: Marco files a verbal Radar for Apple Music
- Follow-up:
- Apple’s New Child Safety Features
#askatp
- What are our current thoughts on cryptocurrency, a.k.a. accelerating the heat death of the universe? (via Lalo Vargas)
- Is there any way to limit Messages’ size on disk? (via Richie Aharonian)
- What camera/lens should I rent for a Disney World trip? (via Andrew Nelson)
- Post-show: Some more camera banter
Sponsored by:
- Burrow: Setting a new standard in furniture. Get $75 off and free shipping.
- ExpressVPN: The fastest and most reliable VPN. Get an extra three months free with a 1-year package.
- Memberful: Monetize your passion with membership. Start your free trial today.
Become a member for ad-free episodes and our early-release, unedited “bootleg” feed!