
Rethinking Email
AppStories
Intro
The hosts explain their strategy of updating the robots.txt file to block AI crawlers — particularly those that identify themselves as AI bots — from scanning their websites, in response to companies harvesting web content to train large language models.
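A robots.txt rule of the kind the hosts describe might look like the following sketch. The user agents shown are illustrative examples of crawlers that publicly identify themselves as AI-related (GPTBot is OpenAI's documented crawler; CCBot belongs to Common Crawl); the episode does not specify which bots the hosts actually block, and compliance is voluntary on the crawler's part.

```
# Block self-identified AI crawlers (example list, not exhaustive)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else may crawl normally
User-agent: *
Allow: /
```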