
FutureCraft GTM: Boring Problems, Big Wins, Community‑Driven AI Adoption
AI is not overhyped; it is under-implemented. Ken Roden and Erin Mills chat with Sheena Miles about how to move from tool obsession to behavior change, her three-stage framework, and the practical KPIs that prove progress before revenue shows up. We also talk about AI policy that unlocks safe experimentation and community as an accelerator, and Sheena demos how she spins up n8n workflows from a prompt.
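The live demo (17:30) imports a skeleton workflow and adapts it. If you want to try that starting point yourself, here is a minimal sketch of importing an exported skeleton into an n8n instance via its public REST API. The endpoint, header name, and payload shape follow n8n's v1 public API and may differ by version; the base URL, API key, and skeleton file name are placeholders, not from the episode.

```python
import json
import requests

# Placeholders for illustration only; point these at your own n8n instance.
N8N_BASE_URL = "http://localhost:5678"
N8N_API_KEY = "your-api-key"

def import_skeleton(path: str) -> dict:
    """Load an exported skeleton workflow and create it via n8n's public API."""
    with open(path) as f:
        skeleton = json.load(f)

    payload = {
        "name": skeleton.get("name", "Imported skeleton"),
        "nodes": skeleton["nodes"],
        "connections": skeleton["connections"],
        "settings": skeleton.get("settings", {}),
    }
    resp = requests.post(
        f"{N8N_BASE_URL}/api/v1/workflows",
        headers={"X-N8N-API-KEY": N8N_API_KEY},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    workflow = import_skeleton("skeleton_workflow.json")  # hypothetical export file
    print(f"Imported workflow id: {workflow.get('id')}")
```

From there you adapt the imported nodes to your own data sources, which is the "import a skeleton and adapt" pattern from the demo.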
Chapter markers
00:00, Cold open and disclaimer
01:00, Is AI overhyped, what is really failing
03:20, Early indicators versus lagging revenue, set better goals
04:20, Exec view, target 3 percent faster time to market
06:00, Avoid AI slop, find repetitive, boring work
07:00, Guest intro
09:00, Real state of adoption, dual speed orgs and siloed champions
10:45, Teach concepts, not tools
12:00, Policy, security review, AI council
14:00, Behavior beats features
15:30, Community for accountability and shared assets
17:30, Live n8n demo, import a skeleton workflow and adapt
35:00, AI first versus AI native, embed into workflows
36:30, Influence without authority, solve a champion’s boring problem
38:00, Inclusion and usage gaps, why it matters to the business
40:00, Skills that matter now, prompting, rapid testing, communicating thought process
43:00, Why to be optimistic
45:00, Lightning round
48:00, Host debrief and takeaways
Key takeaways
• Hype versus reality: most failures come from vague goals and tool-first rollouts, not from AI itself.
• Measure what you can now: speed to market, cycle time, sprint throughput, and ticket deflection, before revenue shows up (see the sketch after this list).
• Framework: Activate, Amplify, Accelerate. Start small, spread what works, then institutionalize.
• Policy unlocks velocity: simple rules for data handling and tool vetting, plus a cross-functional council.
• Behavior over features: learn inputs and outputs so skills transfer across tools.
• Community compounds: accountability and shared templates speed up learning.
• Start with boring problems: compliance questionnaires, asset generation, ticket clustering, call insights.
• AI first versus AI native: move from sidecar tools to embedded workflows with human review gates.
• Inclusion is a business lever: close usage gaps or accept a productivity gap.
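As a concrete take on "measure what you can now," here is a minimal sketch of computing two of the metrics above, median cycle time and ticket deflection, from exported records. The field names and sample numbers are hypothetical.

```python
from datetime import datetime
from statistics import median

# Hypothetical records; swap in exports from your project tracker and help desk.
issues = [
    {"opened": "2024-05-01", "closed": "2024-05-06"},
    {"opened": "2024-05-03", "closed": "2024-05-05"},
]
tickets = {"total": 400, "deflected_by_ai": 92}

def cycle_time_days(issue: dict) -> int:
    """Days between an issue being opened and closed."""
    opened = datetime.fromisoformat(issue["opened"])
    closed = datetime.fromisoformat(issue["closed"])
    return (closed - opened).days

median_cycle = median(cycle_time_days(i) for i in issues)
deflection_rate = tickets["deflected_by_ai"] / tickets["total"]

print(f"Median cycle time: {median_cycle} days")
print(f"Ticket deflection: {deflection_rate:.1%}")
```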
Activate: prove value safely
• Define the problem, validate AI fit, run a small pilot.
• Track accuracy thresholds and time saved.
• Example: auto-draft responses to repetitive compliance questionnaires from a vetted knowledge base (sketched below).
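A minimal sketch of that Activate example, using simple fuzzy matching against a vetted knowledge base as a stand-in for whatever retrieval or model your team actually approves. The threshold, sample questions, and time-saved estimate are assumptions for illustration.

```python
from difflib import get_close_matches

# Vetted knowledge base: approved answers keyed by canonical question (contents are illustrative).
KNOWLEDGE_BASE = {
    "Do you encrypt data at rest?": "Yes, all customer data is encrypted at rest using AES-256.",
    "Do you support SSO?": "Yes, SAML 2.0 and OIDC single sign-on are supported.",
}

MATCH_THRESHOLD = 0.85        # below this, route the question to a human instead of auto-drafting
MINUTES_SAVED_PER_ANSWER = 6  # rough estimate for reporting time saved during the pilot

def draft_response(question: str) -> tuple[str, bool]:
    """Return (draft, needs_review). Only drafts from vetted answers; everything else goes to a human."""
    match = get_close_matches(question, list(KNOWLEDGE_BASE), n=1, cutoff=MATCH_THRESHOLD)
    if match:
        return KNOWLEDGE_BASE[match[0]], False
    return "", True  # no confident match: a human answers, and the KB gains a new vetted entry

questionnaire = ["Do you encrypt data at rest?", "Where are your data centers located?"]
drafted = [q for q in questionnaire if not draft_response(q)[1]]
print(f"Auto-drafted {len(drafted)}/{len(questionnaire)} answers, "
      f"~{len(drafted) * MINUTES_SAVED_PER_ANSWER} minutes saved")
```

The point of the pilot is the two numbers at the end: an accuracy bar you refuse to drop below, and a time-saved figure you can report before revenue moves.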
Amplify: spread what works
• Connect adjacent teams, add light governance, share patterns.
• Run cross team pilots and publish playbooks.
• Example: connect support tickets, payments, compliance, and partner success data to detect issues proactively (sketched below).
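A minimal sketch of the Amplify example: pull one signal per team, combine them into a per-account risk score, and flag accounts to escalate. The field names, weights, and threshold are assumptions, not from the episode.

```python
from collections import defaultdict

# Hypothetical per-account signals pulled from each team's system of record.
support_tickets = [{"account": "acme", "severity": "high"}, {"account": "globex", "severity": "low"}]
failed_payments = [{"account": "acme"}]
compliance_flags = [{"account": "initech"}]
partner_escalations = [{"account": "acme"}]

def risk_scores() -> dict[str, int]:
    """Combine signals across teams into a simple per-account risk score."""
    scores: dict[str, int] = defaultdict(int)
    for t in support_tickets:
        scores[t["account"]] += 2 if t["severity"] == "high" else 1
    for p in failed_payments:
        scores[p["account"]] += 3
    for c in compliance_flags:
        scores[c["account"]] += 2
    for e in partner_escalations:
        scores[e["account"]] += 2
    return dict(scores)

AT_RISK_THRESHOLD = 5
for account, score in sorted(risk_scores().items(), key=lambda kv: -kv[1]):
    status = "escalate proactively" if score >= AT_RISK_THRESHOLD else "monitor"
    print(f"{account}: score {score} -> {status}")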
Accelerate: institutionalize
• Assign ownership, embed training, integrate tools, set ROI guardrails.
• Roll out across channels and systems with quality gates.
• Example: an ad copy system owned by demand gen, with content as QA, used across paid, email, and social (sketched below).
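A minimal sketch of the Accelerate example's quality gate: demand gen generates copy, content's checks run before anything ships, and failures route back to a human reviewer. The channel limits, banned terms, and generation placeholder are assumptions for illustration.

```python
# Hypothetical channel constraints; real limits and banned terms come from the content QA team.
CHANNEL_LIMITS = {"paid": 60, "email": 150, "social": 280}
BANNED_TERMS = {"guaranteed", "best-in-class"}

def generate_ad_copy(brief: str, channel: str) -> str:
    """Placeholder for the team's approved generation step (model, template, etc.)."""
    return f"{brief} - built for {channel} audiences."

def quality_gate(copy: str, channel: str) -> list[str]:
    """Content-as-QA checks that run before anything ships; failures route to a human reviewer."""
    issues = []
    if len(copy) > CHANNEL_LIMITS[channel]:
        issues.append(f"over {CHANNEL_LIMITS[channel]} characters")
    if any(term in copy.lower() for term in BANNED_TERMS):
        issues.append("contains banned claim language")
    return issues

brief = "Automate compliance questionnaires in minutes"
for channel in CHANNEL_LIMITS:
    copy = generate_ad_copy(brief, channel)
    problems = quality_gate(copy, channel)
    print(f"[{channel}] {'needs review: ' + ', '.join(problems) if problems else 'passed gate'}")
```

Ownership matters as much as the gate itself: one team owns the system, another owns the checks, and the same pipeline serves every channel.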
“Policy enables speed if you write it to unblock safe experiments.”
“Stop memorizing tool steps, learn the concepts so they transfer.”
“Solve the boring problem first, that is where AI pays for itself.”
“If NRR belongs to someone, it belongs to everyone.”
Resources & Links
About FutureCraft
Stay tuned for more insightful episodes of the FutureCraft podcast, where we continue to explore the evolving intersection of AI and GTM. Check out the full episode for the in-depth discussion and much more.
To listen to the full episode and stay updated on future episodes, visit our website: https://www.futurecraftai.media/
Disclaimer: This podcast is for informational and entertainment purposes only and should not be considered advice. The views and opinions expressed in this podcast are our own and do not represent those of any company or business we currently work for/with or have worked for/with in the past.
Music: Far Away - MK2
