
# The Good Stuff, with Pete and Andy - Episode 37: Stocking Fillers - Predictions for 2026
Hosts: Pete and Andy (sporting festive headgear at the beach)
Pete and Andy kick off their Christmas predictions episode with zero preparation and maximum confidence. They tackle the biggest question: what will happen in 2026?
Key Moments:
[00:51] Andy's two-for-one Christmas prediction: most AI predictions won't eventuate, no AGI in 2026
[03:09] Revised timeline: feels more like a 10-year transition than 5 years
[04:02] Pete's first prediction: models become less relevant, focus shifts to agent tooling
[05:40] Are model releases becoming underwhelming? Each new breakthrough feels less significant
[05:53] Elon's take: "The whole framework is incorrect—throwing more data doesn't give you intelligence"
[07:08] Why software automation is so valuable: automate software and you get a free option on everything else
[08:17] Less discussion about new model releases—when Opus came out, people raved for a few days then moved on
[09:05] Death of one-shot benchmarking: "Nobody uses these things in that fashion"
[10:26] The Presidio Bitcoin example: Gemini 3 got everything wrong because the task was too broad
[12:07] Flow matters: sequential tasks build on previous work, jumping around creates cognitive load
[13:15] Andy's prediction: website development will be completely commoditized
[14:01] Big agencies won't be able to justify high spends anymore—except for enterprise clients
[16:24] The WYSIWYG problem: hard to update flat-file websites without developer tools
[18:13] Pete's prediction: agents move out of the terminal
[19:08] The model switching advantage: when one gets nerfed, quickly switch to another
[20:30] Why Goose is better for non-coding tasks: less sandboxed, happier to just do work
[22:06] The spare machine problem: agents need machine access to be powerful
[23:47] Most people still talk about Copilot, not Claude Code or Codex
[24:44] The Excel analogy: "This is like super Excel, why wouldn't you want to learn it?"
[27:06] Calling them "coding agents" creates mental resistance for non-coders
[27:27] The agent becomes the engineer; you become the product manager/technical architect
[30:46] The killer use case isn't software—it's the business you already have
[31:20] Service 10x the market with no additional headcount: "That's insane"
[32:29] Value will accrue to small businesses, not S&P 500 companies with cultural inertia
[33:50] Andy's prediction: the politics around AI turn more negative in 2026
[34:36] Politicians will use fear-mongering based on job loss to accrue more power
[36:43] Greater divide between adopters and laggards: companies that embraced AI take a big leap forward
[38:38] Small, nimble SMEs will be the standout stories, not big enterprises
[39:32] More of your business can run in software than you thought it could
[40:40] The Replit debate: "Why would I use Replit when I can just change Wingman myself?"
[41:40] Wingman needs the concept of a business inside it—metadata that flows through apps
[44:26] Pete's hot take: large models are a dead end, we don't need bigger models with more data
[45:20] "The model should understand what you're doing, not be responsible for knowing stuff"
[46:40] Fast models over big models: "I think that becomes the rallying cry"
[48:02] The inverse direction: everyone has been focusing on thinking time, making models slower and more deliberate
[49:12] Speed unlock: "If it took 10 seconds instead of 10 minutes, you'd use it 100 times more"
[51:15] Pete's fundamental belief: "I don't think you want the model to know stuff—it's a bug we ship as a feature"
[52:56] Why domain-specific models don't make sense: graphs do the heavy lifting of knowing
[55:05] Timeline check: specialized models and speed focus probably not a 2026 thing
[57:24] Rise of the Agents: more use, simpler to use, non-coding use cases becoming clear
[58:38] Final agreement: agents are the future, not bigger models
