

Local AI Models with Joe Finney
Oct 2, 2025
Joe Finney, a mobile product owner and MVP, dives into the world of local AI models. He discusses the advantages of using models like Tesseract for local OCR and the integration of Windows AI APIs. Joe highlights the potential of Hugging Face models for various tasks and shares insights on managing hardware requirements when running local models. He also weighs the pros and cons of running models locally versus in the cloud, emphasizing security and cost control. His pragmatic advice encourages developers to experiment with local solutions for optimal productivity.
AI Snips
TextGrab Evolved Into PowerToys Feature
- Joe built TextGrab, a local OCR app that became the basis for the PowerToys Text Extractor utility.
- He upgraded its OCR over time, moving from the built-in Windows OCR API to Tesseract and then to newer local models as they improved.
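To make the Tesseract route concrete, here is a minimal sketch of local OCR using the `pytesseract` wrapper. This is an illustration in the spirit of the TextGrab approach, not Joe's actual code; it assumes the Tesseract binary plus the `pytesseract` and `Pillow` packages are installed, and `screenshot.png` is a placeholder path.

```python
def clean_ocr_text(raw: str) -> str:
    """Drop empty lines and stray whitespace from raw OCR output."""
    lines = [line.strip() for line in raw.splitlines()]
    return "\n".join(line for line in lines if line)

def extract_text(image_path: str) -> str:
    """Run Tesseract OCR on an image file and return cleaned text."""
    # Lazy imports keep the heavy OCR dependency optional until call time.
    import pytesseract
    from PIL import Image

    raw = pytesseract.image_to_string(Image.open(image_path))
    return clean_ocr_text(raw)

if __name__ == "__main__":
    # Placeholder image path; point this at any screenshot to try it.
    print(extract_text("screenshot.png"))
```

Everything runs on-device: no image data leaves the machine, which is the security upside of local models Joe highlights.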
Prefer Built-In Windows AI APIs First
- Use the new Windows AI APIs when you want simple local AI features in a Windows app.
- Check device support and rely on built-in models instead of shipping huge model files with your app.
Non-LLM Models Are Broad And Useful
- Hugging Face hosts a wide range of non-LLM models for OCR, image segmentation, object detection, and more.
- Many useful ML capabilities existed before the LLM boom and remain valuable for specialized tasks.
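As a hedged sketch of the kind of non-LLM task mentioned above, the snippet below runs a Hugging Face object-detection model locally through the `transformers` pipeline API. The checkpoint name `facebook/detr-resnet-50` is a real detection model on the Hub, but treat the choice as an example; any detection checkpoint would follow the same pattern, and the model is downloaded once, then runs entirely on-device.

```python
def top_detections(results, min_score=0.9):
    """Keep only high-confidence detections, sorted best-first.

    `results` is a list of dicts with at least a "score" key, the
    shape returned by the transformers object-detection pipeline.
    """
    kept = [r for r in results if r["score"] >= min_score]
    return sorted(kept, key=lambda r: r["score"], reverse=True)

def detect_objects(image_path: str):
    """Run a local object-detection model on an image file."""
    # Lazy import: transformers (and the model weights) are only
    # needed when detection is actually invoked.
    from transformers import pipeline

    detector = pipeline("object-detection", model="facebook/detr-resnet-50")
    return top_detections(detector(image_path))

if __name__ == "__main__":
    # Placeholder image path for illustration.
    for det in detect_objects("photo.jpg"):
        print(det["label"], round(det["score"], 2))
```

This mirrors the episode's point: specialized pre-LLM capabilities like detection are one pipeline call away, with no cloud API bill and full control over where the data goes.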