
Danny Postma: OpenAI, GPT-3, AI Business Ideas & His 7-Figure Sale
Creator Lab - interviews with entrepreneurs and startup founders
How to Train a Computer to Know What to Do Next
GPT-3 is basically trained on 100 billion parameters: basically Wikipedia, books, everything that's on the internet. It's been trained in such a way that, at its most basic, if you type something, it knows the next words to write. So what we can do with it is feed it text. You could feed it, for example, three really good Facebook advertisements, and it learns from those examples, so that if you then give it a non-ad example, like a description of your product, it knows how to turn that product description into a Facebook ad, because it knows from the examples you put in before it what to do next.
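The workflow Danny describes here is few-shot prompting: put a handful of example ads in the prompt, end with a new product description, and let the model continue the pattern. Below is a minimal sketch of what that could look like with the OpenAI Python library's GPT-3 Completion endpoint as it existed at the time; the example ads, product descriptions, and parameter values are illustrative assumptions, not taken from the episode.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

# Few-shot prompt: a few example product -> ad pairs, then the new product
# description whose ad we want GPT-3 to write next.
prompt = """Product: Noise-cancelling headphones
Facebook ad: Silence the world. Hear only what matters. Shop now.

Product: Organic cold-brew coffee
Facebook ad: Smooth, bold, never bitter. Your mornings just got an upgrade.

Product: A lightweight travel backpack with a built-in laptop sleeve
Facebook ad:"""

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base model name at the time
    prompt=prompt,
    max_tokens=60,
    temperature=0.7,
    stop=["\n\n"],      # stop once the generated ad is finished
)

print(response.choices[0].text.strip())
```

The model has no ad-writing feature built in; the in-prompt examples alone steer it to continue the "Facebook ad:" line in the same style.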