Building models under 10 megabytes that run significantly faster, require far less compute, and can outperform larger models like GPT-4 given only a small amount of task-specific data is emerging as a promising workflow. A key focus is making this process more accessible to people without a deep machine learning background.
There hasn't been a boom like the AI boom since the dot-com days. It may look like a space destined to be controlled by a couple of tech giants, but Ines Montani thinks open source will play an important role in the future of AI. I hope you join us for this excellent conversation about the future of AI and open source.
Episode sponsors
Sentry Error Monitoring, Code TALKPYTHON
Porkbun
Talk Python Courses
Links from the show