The technology works by using a system called NeuralHash, which is designed to detect known images of child sexual abuse. It scans photos on the device itself, before they are uploaded to iCloud. If the system finds a match, the photo in question is reviewed by a human, and action is taken only if abusive material is confirmed. Apple says scanning will initially be limited to the United States, with other countries added one by one. But according to an internal Slack thread some 800 messages long, many Apple employees spoke out against the plan, as told to Reuters by anonymous employees who saw the thread.
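To make the matching idea concrete, here is a toy sketch of perceptual-hash matching against a database of known image fingerprints. This is an illustration only, not Apple's actual NeuralHash (which uses a neural network plus cryptographic private set intersection); the "average hash" technique, the 8x8 grayscale input, and all function names below are assumptions chosen for brevity.

```python
# Toy perceptual-hash match: NOT Apple's NeuralHash, just an
# "average hash" sketch to show the general matching idea.

def average_hash(pixels):
    """Hash an 8x8 grayscale image (list of 64 ints, 0-255) into a
    64-bit fingerprint: each bit records whether that pixel is
    brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_database(pixels, known_hashes, threshold=5):
    """Flag an image for (hypothetical) human review if its hash is
    within `threshold` bits of any entry in the known-hash set."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

The key property a perceptual hash aims for: a slightly altered copy of a known image still lands within a few bits of the original's fingerprint, while unrelated images land far away, so matching can happen on-device without comparing raw image data.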
Podcasting from Italy after a two-week news break, Jason covers Casey Newton's failed dunk on Superhuman's $75M raise (2:58), the privacy implications of Apple's approach to preventing the distribution of child sexual abuse material (23:14), the Helium Networks internet project (35:02), crypto regulation (43:03), and how WhatsApp is handling Afghanistan (49:45).