GPT-3 is a computer program that works by guessing what the next word will be. It has been shaped into something that comes up with coherent text most of the time. But it's not fundamentally an intelligent system in the way that we think it is, yet we're describing it as though it is and as though we need to teach it how to be ethical. What this reminds me of the most (I might have ranted about this on the show before) is that there was a fad for these things where it was, "I taught an AI to write a Seinfeld episode," and then there would be some semi-nonsensical text. That doesn't hurt anybody for us to maintain
Is artificial intelligence a problem, or is the real problem how we're using the term in the first place? Linguistics professor Emily Bender joins Adam to discuss why we should resist the urge to be impressed by big tech's AI promises, and how our belief in the fantasy of AI could be worse than the reality. You can follow Emily on Twitter at @emilymbender.