

How Does Data Normalization Impact AI? - with Mika Newton
May 6, 2025
Explore how data normalization shapes AI in healthcare. The conversation dives into the challenges large language models face with medical coding systems like ICD-10. Key insights reveal the essential role of proprietary data and editorial standards in improving AI accuracy. Discover how generative AI is transforming data representation through ontologies and knowledge graphs, and how pairing AI with human expertise in managing medical terminology promises more usable health information.
AI Snips
LLMs Need Medical Context
- Foundational LLMs are trained on general internet data, not specialized medical content.
- Using proprietary medical code sets with LLMs improves AI accuracy and reduces hallucinations in healthcare coding.
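The grounding idea above can be sketched in a few lines: instead of asking the model to recall ICD-10 codes from general training data, the curated code set is inlined into the prompt so the model selects from known-good entries. This is a minimal illustration, not the method discussed in the episode; the three-code dictionary and the prompt wording are hypothetical stand-ins for a proprietary code set.

```python
# Sketch: grounding an LLM prompt with a curated code set.
# ICD10_EXCERPT is a tiny hypothetical excerpt; real proprietary sets carry
# tens of thousands of codes plus synonymy and editorial notes.
ICD10_EXCERPT = {
    "E11.9": "Type 2 diabetes mellitus without complications",
    "I10": "Essential (primary) hypertension",
    "J45.909": "Unspecified asthma, uncomplicated",
}

def build_grounded_prompt(clinical_note: str) -> str:
    """Inline the curated codes so the model selects rather than invents."""
    context = "\n".join(f"{code}: {desc}" for code, desc in ICD10_EXCERPT.items())
    return (
        "Using ONLY the ICD-10 codes listed below, code this note.\n"
        f"Codes:\n{context}\n\n"
        f"Note: {clinical_note}\n"
        "Answer with exactly one code from the list."
    )

prompt = build_grounded_prompt(
    "Patient presents with poorly controlled type 2 diabetes."
)
```

The prompt would then be sent to whatever LLM is in use; constraining the answer space this way is one common tactic for reducing hallucinated codes.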
Proprietary Medical Content Enhances AI
- Proprietary medical concepts, synonymy, and editorial policies form the essential context for accurate AI-driven coding.
- Supplying these assets as context helps LLMs give far more accurate and relevant answers.
Use AI Agents with a Human in the Loop
- Use LLMs to speed up ontology creation and dictionary migration tasks with human review.
- Employ AI agents for terminology mapping but keep humans in the loop for exceptions.
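The exception-handling pattern above can be sketched as a confidence-gated triage: confident mappings are auto-accepted, and everything else lands in a human review queue. This is a hypothetical illustration, assuming a string-similarity score (`difflib.SequenceMatcher`) as a stand-in for whatever confidence signal an LLM-based mapping agent would produce; the dictionary and threshold are invented for the example.

```python
# Sketch: confidence-gated terminology mapping with a human review queue.
# SequenceMatcher.ratio() stands in for a model confidence score here.
from difflib import SequenceMatcher

TARGET_DICTIONARY = {  # hypothetical target vocabulary
    "myocardial infarction": "I21.9",
    "hypertension": "I10",
    "type 2 diabetes mellitus": "E11.9",
}

REVIEW_THRESHOLD = 0.85  # below this, the mapping goes to a human coder

def map_term(source_term: str):
    """Return (target_term, code, score) for the closest dictionary entry."""
    def score(candidate: str) -> float:
        return SequenceMatcher(None, source_term.lower(), candidate).ratio()
    target, code = max(TARGET_DICTIONARY.items(), key=lambda kv: score(kv[0]))
    return target, code, score(target)

def triage(source_terms):
    """Auto-accept confident matches; queue exceptions for human review."""
    accepted, review_queue = [], []
    for term in source_terms:
        target, code, s = map_term(term)
        bucket = accepted if s >= REVIEW_THRESHOLD else review_queue
        bucket.append((term, target, code, round(s, 2)))
    return accepted, review_queue

accepted, review_queue = triage(["hypertension", "heart attack"])
# "hypertension" maps exactly and is accepted; "heart attack" scores low
# against every entry and is routed to the review queue.
```

In practice the synonym "heart attack" is exactly the kind of exception a human (or a richer synonymy layer in the proprietary content) would resolve; the threshold trades automation rate against review workload.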