Using Trino And Iceberg As The Foundation Of Your Data Lakehouse

Data Engineering Podcast

NOTE

Data Lake and Lakehouse Insights

Data lakes historically ran on distributed file systems such as HDFS, but now predominantly use cloud object storage such as S3, GCS, or Azure Blob Storage. A lakehouse builds on the lake by standardizing data representations with open file and table formats and keeping data in its native form within the lake, so that transformations and maintenance operate directly on that data rather than going through proprietary formats or import/export cycles. This in-place operation is what distinguishes a lakehouse from a data lake with bolt-on functionality.
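As a minimal sketch of that in-place pattern, the snippet below uses the `trino` Python client (installed via `pip install trino`) to create and query an Iceberg table through Trino. It assumes a Trino cluster with an Iceberg catalog named `iceberg` already configured against object storage; the host, schema, table, and column names are hypothetical.

```python
# Sketch: operating directly on open-format data in the lake via Trino's
# Iceberg connector. Assumes an `iceberg` catalog is configured; names
# below are hypothetical placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.com",  # hypothetical coordinator host
    port=8080,
    user="analyst",
    catalog="iceberg",         # Iceberg connector catalog
    schema="analytics",        # hypothetical schema
)
cur = conn.cursor()

# Data lands as Parquet on object storage; no proprietary format,
# no import step before it can be queried.
cur.execute("""
    CREATE TABLE IF NOT EXISTS page_views (
        user_id BIGINT,
        url VARCHAR,
        viewed_at TIMESTAMP(6)
    )
    WITH (format = 'PARQUET')
""")

# Query the table where it lives in the lake; Trino reads the open
# files in place rather than exporting them into an engine-local store.
cur.execute("SELECT url, count(*) AS views FROM page_views GROUP BY url")
for row in cur.fetchall():
    print(row)
```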
