Scaling Is Not the Most Satisfying Solution
It was about two orders of magnitude in model size, which, loosely speaking, represents how big we could make these models and still train them on existing hardware. You can fine-tune the model to pretty good performance on a new task in, let's say, a few hours on a single Google Cloud TPU. And Colab gives you the TPU for free.
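To make that workflow concrete, here is a minimal sketch of fine-tuning on a free Colab TPU, assuming a TensorFlow/Keras setup; the model architecture, input shapes, and synthetic data below are illustrative placeholders, not details from the episode.

    import numpy as np
    import tensorflow as tf

    # Connect to the TPU runtime that Colab provides (tpu="" finds it automatically).
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)

    # Build and compile the model inside the TPU strategy scope so its
    # variables are replicated across the TPU cores.
    with strategy.scope():
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(768,)),
            tf.keras.layers.Dense(2, activation="softmax"),  # placeholder task head
        ])
        model.compile(
            optimizer="adam",
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"],
        )

    # Synthetic stand-in for a small downstream dataset, just to show the loop.
    x = np.random.rand(1024, 768).astype("float32")
    y = np.random.randint(0, 2, size=(1024,))
    model.fit(x, y, epochs=2, batch_size=128)

In practice the placeholder Sequential model would be replaced by a pretrained encoder plus a small task-specific head, and the synthetic arrays by the new task's training data; the TPU connection and strategy-scope pattern stay the same.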