GPT-3 and OpenAI - The Transformer
GPT-3 uses the transformer architecture from 2017. The context window is about 2048 symbols, the tokens of the language. These tokens are not characters; they are projected into a vector space where words that statistically co-occur a lot are already neighbors. So the question is: what do you compute the statistics over? And the context is significantly larger than just the adjacent word. Yes.
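The idea that co-occurrence statistics place related words near each other can be illustrated with a minimal sketch. This is not GPT-3's actual embedding training, just a toy distributional model: the corpus, the window choice (a whole short sentence), and the use of raw counts with cosine similarity are all simplifying assumptions.

```python
import math
from collections import defaultdict
from itertools import combinations

# Hypothetical toy corpus standing in for real training text.
corpus = [
    "the cat sat on the mat".split(),
    "the cat chased the mouse".split(),
    "the dog sat on the rug".split(),
    "the dog chased the cat".split(),
]

# Count how often word pairs co-occur within the same sentence.
cooc = defaultdict(lambda: defaultdict(int))
for sent in corpus:
    for a, b in combinations(sent, 2):
        if a != b:
            cooc[a][b] += 1
            cooc[b][a] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors (dicts)."""
    dot = sum(u[w] * v.get(w, 0) for w in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

# "cat" and "dog" appear in similar contexts, so their count vectors
# point in similar directions; "cat" and "rug" share less context.
sim_cat_dog = cosine(cooc["cat"], cooc["dog"])
sim_cat_rug = cosine(cooc["cat"], cooc["rug"])
print(sim_cat_dog > sim_cat_rug)
```

Real systems learn dense embeddings rather than raw count vectors, but the geometric effect the transcript describes, statistical neighbors becoming spatial neighbors, is the same.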