
Risk Management Show: The AI Data Privacy Risk Financial Advisors Ignore with Daniel Yoo
Feb 2, 2026
Daniel Yoo is the founder and CEO of Finmate AI and a former licensed financial advisor who managed large client assets. He discusses the risks of AI-driven note-taking, the sensitive client data captured in meetings, why common AI safeguards can fail, and the red flags to watch for when evaluating AI vendors. He also offers short, urgent takes on data deletion and on building proper data architectures.
AI Necessitates Data And Creates Immediate Risk
- AI requires extensive data to train and refine its models, which immediately creates client-data risk in financial services.
- Daniel Yoo emphasizes that any AI touching client data raises privacy and regulatory concerns.
Proactively Delete Processed Client Data
- Avoid pulling PII from advisors' existing systems, and proactively delete processed data on a regular schedule.
- Daniel Yoo says periodic full data deletion is safer than vague promises not to use the data later (a minimal sketch of such a retention job follows below).
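
To make the deletion point concrete, here is a minimal sketch, in Python, of the kind of scheduled hard-deletion job described: processed meeting artifacts are purged once they pass a fixed retention window, rather than being kept under a vague no-reuse promise. The store, field names, and 30-day window are assumptions for illustration, not Finmate AI's actual architecture.

```python
"""Hypothetical retention job: hard-delete processed client data past a fixed window."""
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30  # assumed policy window; set per your compliance requirements


class MeetingArtifactStore:
    """Illustrative wrapper around wherever transcripts and notes are persisted."""

    def __init__(self):
        # Each artifact records the client, when it was processed, and the payload.
        self._artifacts = []

    def add(self, client_id, payload):
        self._artifacts.append({
            "client_id": client_id,
            "processed_at": datetime.now(timezone.utc),
            "payload": payload,
        })

    def purge_older_than(self, cutoff):
        """Hard-delete everything processed before the cutoff; return count removed."""
        before = len(self._artifacts)
        self._artifacts = [a for a in self._artifacts if a["processed_at"] >= cutoff]
        return before - len(self._artifacts)


def run_retention_job(store: MeetingArtifactStore) -> int:
    """Periodic job: delete all processed client data older than the retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)
    return store.purge_older_than(cutoff)
```

In practice this would run on a scheduler against the vendor's real datastore; the point is that deletion is automatic and total after the window, not discretionary.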
Meetings Capture Deeply Sensitive Client Details
- Client meetings often capture both PII and sensitive behavioral details that go well beyond basic financial facts.
- These softer details raise the stakes because they can harm clients if exposed or reused by AI vendors.

