TechFirst with John Koetsier

John Koetsier
Nov 1, 2025 • 18min

Amazon, NVIDIA, and a new "physical AI" fellowship

What happens when Amazon, NVIDIA, and MassRobotics team up to merge generative AI with robotics?

In this episode of TechFirst we chat with Amazon's Taimur Rashid, Head of Generative AI and Innovation Delivery. We talk about "physical AI" ... AI with spatial awareness and the ability to act safely and intelligently in the real world.

We also chat about the first cohort of a new accelerator for robotics startups. It's sponsored by Amazon and NVIDIA, run by MassRobotics, and includes startups doing autonomous ships, autonomous construction robots, smart farms, hospital robots, manufacturing and assembly robots, exoskeletons, and more.

We talk about:
- Why "physical AI" is the missing piece for robots to become truly useful and scalable
- How startups in Amazon's and NVIDIA's new Physical AI Fellowship are pushing the limits of robotics, from exoskeletons to farm bots
- What makes robotic hands so hard to build
- The generalist vs. specialist debate in humanoid robots
- How AI is already making Amazon warehouses 25% more efficient

This is a deep dive into the next phase of AI evolution: intelligence that can think, move, and act.

⸻

00:00 — Intro: Is physical AI the missing piece?
00:46 — What is "physical AI"?
02:30 — How LLMs fit into the physical world
03:25 — Why safety is the first principle of physical AI
04:20 — Why physical AI matters now
05:45 — Workforce shortages and trillion-dollar opportunities
07:00 — Falling costs of sensors and robotics hardware
07:45 — The biggest challenges: data, actuation, and precision
09:30 — The fine-grained problem: how robots pick up a berry vs. an orange
11:10 — Inside the first Physical AI cohort: 8 startups to watch
12:25 — Bedrock Robotics: autonomy for construction vehicles
12:55 — Diligent Robotics: socially intelligent humanoids in hospitals
14:00 — Generalist vs. specialist robots: why we'll need both
15:30 — The future of physical AI in healthcare and manufacturing
16:10 — How Amazon is already using robots for 25% more efficiency
17:20 — The fellowship's future: expanding beyond startups
18:10 — Wrap-up and key takeaways
Oct 28, 2025 • 30min

AGI: will it kill us or save us?

Artificial general intelligence (AGI) could be humanity's greatest invention ... or our biggest risk.

In this episode of TechFirst, I talk with Dr. Ben Goertzel, CEO and founder of SingularityNET, about the future of AGI, the possibility of superintelligence, and what happens when machines think beyond human programming.

We cover:
• Is AGI inevitable? How soon will it arrive?
• Will AGI kill us … or save us?
• Why decentralization and blockchain could make AGI safer
• How large language models (LLMs) fit into the path toward AGI
• The risks of an AGI arms race between the U.S. and China
• Why Ben Goertzel created Meta, a new AGI programming language

📌 Topics include AI safety, decentralized AI, blockchain for AI, LLMs, reasoning engines, superintelligence timelines, and the role of governments and corporations in shaping the future of AI.

⏱️ Chapters
00:00 – Intro: Will AGI kill us or save us?
01:02 – Ben Goertzel in Istanbul & the Beneficial AGI Conference
02:47 – Is AGI inevitable?
05:08 – Defining AGI: generalization beyond programming
07:15 – Emotions, agency, and artificial minds
08:47 – The AGI arms race: US vs. China vs. decentralization
13:09 – Risks of narrow or bounded AGI
15:27 – Decentralization and open-source as safeguards
18:21 – Can LLMs become AGI?
20:18 – Using LLMs as reasoning guides
21:55 – Hybrid models: LLMs plus reasoning engines
23:22 – Hallucination: humans vs. machines
25:26 – How LLMs accelerate AI research
26:55 – How close are we to AGI?
28:18 – Why Goertzel built a new AGI language (Meta)
29:43 – Meta: from AI coding to smart contracts
30:06 – Closing thoughts
Oct 15, 2025 • 28min

9 million robot deliveries (!!!)

What changes when robots deliver everything?

Starship Technologies has already completed 9 million autonomous deliveries, crossed roads over 200 million times, and operates thousands of sidewalk delivery robots across Europe and the U.S. Now they're scaling into American cities ... and they say they're ready to change your world.

In this episode of TechFirst, I speak with Ahti Heinla, co-founder and CEO of Starship and co-founder of Skype, about:
- How Starship's robots navigate without GPS
- What makes sidewalk delivery better than drones
- Solving the last-mile problem in snow, darkness, and dense cities
- How Starship is already profitable and fully autonomous
- What it all means for the future of commerce and city life

Heinla says: "Ten years ago we had a prototype. Now we have a commercial product that is doing millions of deliveries."

Watch to learn why the future of delivery might roll ... as well as fly.

🔗 Learn more: https://www.starship.xyz
🎧 Subscribe to TechFirst: https://www.youtube.com/@johnkoetsier

00:00 - Intro: What changes when robots deliver everything?
01:37 - Meet Starship: 9 million robot deliveries and counting
02:45 - Why it took 10 years to go from prototype to product
05:03 - When robot delivery becomes normal (and where it already is)
08:30 - How Starship robots handle cities, traffic, and construction
11:20 - Snow, darkness, and all-weather autonomy
13:19 - Reliability, unit economics, and competing with human couriers
16:23 - Inside the tech: sensors, AI, and why GPS isn't enough
18:03 - Real-time mapping, climbing curbs, and reaching your door
19:54 - How Starship scales without local depots or chargers
22:04 - How city life and commerce change with robot delivery
25:53 - Do robots increase customer orders? (Short answer: yes)
27:05 - Hot food, Grubhub integration, and thermal insulation
28:26 - Will Starship use drones in the future?
29:38 - What U.S. cities are next for robot delivery?
Oct 11, 2025 • 18min

1 million qubits in 50 square millimeters (!!)

Imagine a quantum computer with a million physical qubits in a space smaller than a sticky note. That's exactly what Quantum Art is building.

In this TechFirst episode, I chat with CEO Tal David, who shares his team's vision to deliver quantum systems with:
• 100x more parallel operations
• 100x more gates per second
• A footprint up to 50x smaller than competitors

We also dive into the four key tech breakthroughs behind this roadmap to scale Quantum Art's computer:
1. Multi-qubit gates capable of 1,000 2-qubit operations in a single step
2. Optical segmentation using laser-defined tweezers
3. Dynamic reconfiguration of ion cores at microsecond speed
4. Modular, ultra-dense 2D architectures scaling to 1M+ qubits

We also cover:
- How Quantum Art plans to reach fault tolerance by 2033
- Early commercial viability with 1,000 physical qubits by 2027
- Why not moving qubits might be the biggest innovation of all
- The quantum computing future of healthcare, logistics, aerospace, and energy

🎧 Chapters
00:00 – Intro: 1M qubits in 50mm²
01:45 – Vision: impact in business, humanity, and national tech
03:07 – Multi-qubit gates (1,000 ops in one step)
05:00 – Optical segmentation with tweezers
06:30 – Rapid reconfiguration: no shuttling, no delay
08:40 – Modular 2D architecture & ultra-density
10:30 – Physical vs logical qubits
13:00 – Quantum advantage by 2027
16:00 – Addressing the quantum computing skeptics
17:30 – Real-world use cases: aerospace, automotive, energy
19:00 – Why it's called Quantum Art

👉 Subscribe for more deep tech interviews on quantum, robotics, AI, and the future of computing.
Sep 30, 2025 • 31min

Robotic hands: a $50 trillion opportunity

Are humanoid robots distracting us from the real unlock in robotics ... hands?

In this TechFirst episode, host John Koetsier digs into the hardest (and most valuable) problem in robotics: dexterous manipulation. Guest Mike Obolonsky, Partner at Cortical Ventures, argues that about $50 trillion of global economic activity flows through "hands work," yet manipulation startups have raised only a fraction of what locomotion and autonomy companies have. We break down why hands are so hard (actuators, tactile sensing, proprioception, control, data) and what gets unlocked when we finally crack them.

What we'll talk through:
• Why "navigation ≠ manipulation" and why most real-world jobs need hands
• The funding mismatch: billions to autonomy & humanoids vs. comparatively little to hands
• The tech stack for dexterity: actuators, tactile sensors (pressure, vibration, shear), feedback, and AI
• Grasping vs. manipulation: picking, placing, using tools (e.g., dishwashers to scalpels)
• Reliability in the wild: interventions per hour, wet/greasy plates, occlusions, bimanual dexterity
• Practical paths: task-specific grippers, modular end-effectors, and "good enough" today vs. general purpose tomorrow
• The moonshot: what 70–90% human-level hands could do for productivity on Earth ... and off-planet

Chapters
00:00 Intro: are we underinvesting in robotic hands?
01:10 Why hands matter more than legs (economics of manipulation)
04:30 Funding realities: autonomy & humanoids vs. hands
08:40 Locomotion progress vs. manipulation bottlenecks
12:10 Teleop now, autonomy later: how data gets gathered
14:20 What's missing: actuators, tactile sensing, proprioception
17:10 Perception limits in the real world (wet dishes, occlusions)
22:00 General-purpose dexterity vs. task-specific ROI
26:00 Startup landscape & reliability (interventions/hour)
29:00 Modular end-effectors and upgrade paths
30:10 The moonshot: productivity explosion when hands are solved

Who should watch: robotics founders, VCs, AI researchers, operators in warehousing & manufacturing, and anyone tracking humanoids beyond the hype.

If you enjoyed this, subscribe for more deep-tech conversations, drop a comment with your take on the "hands vs. legs" debate, and share with someone building robots.

Keywords: robotic hands, dexterous manipulation, humanoid robots, tactile sensing, actuators, proprioception, warehouse automation, AI robotics, Cortical Ventures, TechFirst, John Koetsier, Mike Obolonsky

#Robotics #AI #Humanoids #RobotHands #Manipulation #Automation #TechFirst
Aug 30, 2025 • 31min

Do robots really need legs?

Are humanoid robots the future … or a $100B mistake?

Over 100 companies, from Meta to Amazon, are betting big on humanoids. But are we chasing a sci-fi dream that's not practical or profitable?

In this TechFirst episode, I chat with Bren Pierce, robotics OG and CEO of Kinisi Robots. We cover:
- Why legs might be overhyped
- How LLMs are transforming robots into agents
- The real cost (and complexity) of robotic hands
- Why warehouse robots work best with wheels
- The geopolitical robot arms race between China, the US, and Europe
- Hot takes, historical context, and a glimpse into the next 10 years of AI + robotics

Timestamps:
0:00 – Are humanoids a dumb idea?
1:30 – Why legs might not matter (yet)
5:00 – LLMs as the real unlock
12:00 – The hand is 50% of the challenge
17:00 – Speed limits = compute limits
23:00 – Robot geopolitics & supply chains
30:00 – What the next 5 years look like

Subscribe for more on AI, robotics, and tech megatrends.
Aug 27, 2025 • 25min

This kills 10,000 weeds per minute with lasers

The future could be much healthier for both farmers and everyone who eats, thanks to farm robots that kill weeds with lasers.

In this episode of TechFirst, we chat with Paul Mikesell, CEO of Carbon Robotics, to discuss groundbreaking advancements in agricultural technology. Paul shares updates since our last conversation in 2021, including the launch of LaserWeeder G2 and Carbon's autonomous tractor technology: AutoTractor.

LaserWeeder G2 quick facts:
- Modular design: swappable laser "modules" that adapt to different row sizes (80-inch, 40-inch, etc.)
- Laser hardware: each module has 2 lasers; a standard 20-foot machine = 12 modules = 24 lasers
- Laser precision: targets the plant's meristem (≈3mm on small weeds) with pinpoint accuracy
- Weed kill speed: 20–150 milliseconds per weed (including detection + laser fire)
- Throughput: 8,000–10,000 weeds per minute (Gen 2, up from ~5,000/min on Gen 1)
- Coverage rate: 3–4 acres per hour on the 20-foot G2 model
- ROI timeline: farmers typically achieve payback in under 3 years
- Yield impact: up to 50% higher yields in some conventional crops due to eliminating herbicide damage
- Price: standard 20-foot LaserWeeder G2 = $1.4M; larger models scale from there
- Global usage: units in the U.S. (Midwest corn & soy, Idaho & Arizona veggies) and Europe (Spain, Italy tunnel farming)

We chat about how these innovations are transforming weed control and farm management with AI, computer vision, and autonomous systems; the precision and efficiency of laser weeding; the practical challenges addressed by autonomous tractors; and the significant ROI and yield improvements for farmers. This is a must-watch for anyone interested in the future of farming and sustainable agriculture.

00:00 Introduction to TechFirst and Carbon Robotics
01:10 The Science Behind Laser Weeding
05:46 Introducing LaserWeeder G2
06:39 Modular System and New Laser Technology
09:26 Manufacturing and Cost Efficiency
11:47 ROI and Benefits for Farmers
13:24 LaserWeeder Specifications
14:08 Performance and Efficiency
14:49 Introduction to AutoTractor
17:23 Challenges in Autonomous Farming
18:23 Remote Intervention and Starlink Integration
23:23 Future of Farming Technology
24:50 Health and Environmental Benefits
25:18 Conclusion and Farewell
Aug 9, 2025 • 17min

Smart farm robot cuts herbicide, fertilizer use by 90%

Can robots reduce herbicide and fertilizer use on farms by up to 90%? Probably yes.

In this episode of TechFirst we chat with Verdant Robotics' CEO Gabe Sibley about SharpShooter, the company's state-of-the-art farm tech that precisely targets herbicide and fertilizer application, massively reducing chemical use.

That's huge for the environment. It's also huge for farmers' pocketbooks ... because herbicide and fertilizer are increasingly expensive.

We dive into:
- How SharpShooter targets plants with pinpoint accuracy — 240 shots per second
- Why this approach can save farmers millions in input costs
- The environmental benefits for soil, water, and food
- How AI and edge computing make split-second farm decisions possible
- The future of robotics in agriculture

If you're interested in agtech, AI, or sustainable farming, this one's for you.

00:00 Introduction to Robotic Farming
00:28 Interview with Gabe Sibley, CEO of Verdant Robotics
00:50 How SharpShooter Technology Works
02:40 Economic and Environmental Benefits
04:59 Technical Specifications and Capabilities
11:11 Future of Agricultural Automation
11:54 Personal Insights and Motivation
16:39 Conclusion and Final Thoughts
Jul 19, 2025 • 36min

Welcome to the agentic browser

Will your next browser be AI-enabled? AI-first? Perhaps even an AI agent?

In this episode of TechFirst, John Koetsier sits down with Henrik Lexow, Senior Product Leader at Opera, to explore Opera Neon, a big step toward agentic browsers that think, act, and create alongside you. (And buy stuff you want, simplify hard problems, and do some of your work for you.)

Opera's new browser integrates real AI agents capable of executing multi-step tasks, interacting with web apps, summarizing content, and even building playable games or interactive tools, all inside your browser.

We chat about:
• What an agentic browser is and why it matters
• How AI agents like Neon Do and Neon Make automate complex workflows
• Opera's vision for personal, on-device, privacy-aligned AI
• Live demos of shopping, summarizing, and game creation using AI
• Why your browser might replace your operating system

🎮 Watch Henrik demo the Neon agent building a Snake game from scratch
🛍️ See AI navigate Amazon, add items to cart, and act independently
🧠 Learn why context is king and how this changes everything about search, tabs, and multitasking

00:00 Introduction: Should Your Browser Be an AI Agent?
00:52 The Evolution of AI in Browsers
04:53 Introducing Opera's Agentic Browser
11:51 Neon: The Future of Browsing
20:26 Exploring the Cart Functionality
20:53 Future of AI in Shopping
22:39 Trust and Privacy in AI
25:05 Neon Make: Generative AI Capabilities
26:05 Creating a Snake Game with Neon
28:33 Analyzing Car Insurance Policies
31:58 Sharing and Publishing with Neon
35:53 Conclusion and Future Prospects
Jul 4, 2025 • 41min

Nuclear waste can solve our AI power problem (and more)

Can nuclear waste solve the energy crisis caused by AI data centers? Maybe. And maybe much more, including providing rare elements and isotopes we need, like rhodium, palladium, ruthenium, krypton-85, americium-241, and more.

Amazingly:
- 96% of nuclear fuel's energy is left after it's "used"
- Recycling can reduce 10,000-year waste storage needs to just 300 years
- Curio's new process avoids toxic nitric acid and extracts valuable isotopes
- 1 recycling plant could meet a third of America's nuclear fuel needs
- Nuclear recycling could enable AI, space travel, and medical breakthroughs

In this episode of TechFirst, host John Koetsier talks with Ed McGinnis, CEO of Curio and former Acting Assistant Secretary for Nuclear Energy at the U.S. Department of Energy. McGinnis is on a mission to revolutionize how we think about nuclear waste, turning it into a powerful resource for energy, rare isotopes, and even precious metals like rhodium.

Watch now and subscribe for more deep tech insights.
