AI & Crypto - A Panel Discussion with Semiotic Labs

GRTiQ Podcast

Machine Learning Research

Data curation is a very important process in machine learning. Transformer-based models such as GPT are very data hungry. To give some numbers that might help our listeners build intuition for the different ChatGPT variants: when I was leaving NVIDIA, my colleagues from ADLR trained the Megatron-Turing model with 530 billion parameters. This model has 105 layers and actually needed a supercomputer to train. It's worth remembering that there is a price: we need data, and the crypto industry is just discovering AI, right?
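To see why a model of that size needs a supercomputer, here is a rough back-of-the-envelope sketch (not from the episode) of the training memory footprint, assuming mixed-precision training with Adam optimizer state; the per-parameter byte counts and the 80 GB GPU size are illustrative assumptions, not figures quoted by the panel.

```python
# Back-of-the-envelope memory estimate for training a ~530B-parameter
# transformer. All constants below are illustrative assumptions, not
# figures quoted in the episode.

PARAMS = 530e9          # parameter count discussed in the excerpt
GPU_MEMORY_GB = 80      # assumed 80 GB accelerator

# Rough per-parameter training footprint (mixed precision + Adam):
#   2 bytes fp16 weights + 2 bytes fp16 gradients
#   + 4 bytes fp32 master weights + 8 bytes fp32 Adam moments
BYTES_PER_PARAM = 2 + 2 + 4 + 8

total_gb = PARAMS * BYTES_PER_PARAM / 1e9
min_gpus = total_gb / GPU_MEMORY_GB

print(f"Model + optimizer state: ~{total_gb / 1e3:.1f} TB")
print(f"GPUs needed just to hold that state: ~{min_gpus:.0f}")
# ~8.5 TB of state -> more than a hundred 80 GB GPUs before counting
# activations, data parallelism, or headroom - hence a supercomputer.
```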

