Episode 28: How will augmented reality and AI tools revolutionize how we interact with our devices? Matt Wolfe (https://x.com/mreflow) and Nathan Lands (https://x.com/NathanLands) ponder whether AI-generated entities, like podcast hosts, change our understanding of reality.
In this episode, Matt and Nathan share insights on new AI tools like NotebookLM and OpenAI’s advanced voice mode, and how these technologies could transform learning and human-computer interaction. From content creation and translation to personal and business tasks, the hosts navigate the thrilling yet unsettling advancements in AI technology.
Check out The Next Wave YouTube Channel if you want to see Matt and Nathan on screen: https://lnk.to/thenextwavepd
—
Show Notes:
- (00:00) AI translation struggles led to amusing moments.
- (05:13) Sam Altman uses advanced voice as a companion.
- (09:00) OpenAI preempted Meta's advanced voice mode launch.
- (12:05) Glasses remember parking spots using vision features.
- (14:43) AI-driven voice interaction is the future.
- (19:41) Microphone mistaken for surveillance device at conference.
- (22:23) NotebookLM impresses with versatile document integration.
- (25:36) Technological acceleration expected; improvements surpass expectations.
- (30:13) Complex topics explained well using podcasts.
- (31:42) Automated podcast creation with AI tools nearing.
—
Mentions:
—
Check Out Matt’s Stuff:
• Future Tools - https://futuretools.beehiiv.com/
• Blog - https://www.mattwolfe.com/
• YouTube - https://www.youtube.com/@mreflow
—
Check Out Nathan's Stuff:
—
The Next Wave is a HubSpot Original Podcast // Brought to you by The HubSpot Podcast Network // Production by Darren Clarke // Editing by Ezra Bakker Trupiano