npm Under Siege: The “Shai-Hulud” Worm Attack

Front-End Fire

Running LLMs in the browser and offline

TJ and Jack compare Chrome's built-in AI, WebLLM, MediaPipe, and ONNX as options for running LLM inference locally in the browser and offline.

