Transformers are a key building block of modern neural networks for natural language processing.
The transformer architecture trains far more efficiently than traditional sequence models such as recurrent neural networks, because its self-attention mechanism lets every token in a sequence be processed in parallel rather than one step at a time.
Transformers can learn abstract concepts from large amounts of training data without explicit supervision.
Scaling up the training data, even with relatively simple models, can produce capabilities that were previously considered impossible for computers.
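
For readers who want to see the idea in code, here is a minimal, illustrative sketch of scaled dot-product self-attention, the mechanism at the heart of the transformer. The function name, NumPy implementation, and toy dimensions are our own choices for this example, not anything discussed in the episode:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K: arrays of shape (seq_len, d_k); V: shape (seq_len, d_v).
    Every position attends to every other position in a single matrix
    multiplication, which is why transformers parallelize so well
    compared to step-by-step recurrent models.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over each row
    return weights @ V                                # weighted mix of value vectors

# Toy example: 4 tokens, each an 8-dimensional embedding.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)           # self-attention: Q = K = V
print(out.shape)                                      # (4, 8)
```

In a real transformer this operation is wrapped with learned projection matrices, multiple attention heads, and feed-forward layers, but the parallel all-to-all comparison shown above is the core idea.
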
This week OpenAI released ChatGPT, its prototype AI chatbot. On this episode of Intercom on Product, Intercom Co-Founder and Chief Strategy Officer Des Traynor sits down with our Director of Machine Learning, Fergal Reid, for reaction and analysis of the implications of ChatGPT and what the future holds.