Strive for Diversity in Model Training Data
Training image generation models on skewed data sets produces outputs that reflect the bias in the data. To compensate, Google engineered the prompts behind Gemini's image generation to inject diversity into its outputs, and the fix overshot: even prompts asking for historically accurate depictions consistently returned racially mixed results, going as far as racially diverse Nazis, and the overly diverse images sparked controversy.
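To make "skewed data set" concrete, here is a minimal sketch of a pre-training audit that measures how evenly a metadata attribute is represented, so imbalance can be addressed in the data itself rather than patched with prompt injection at generation time. The dataset, attribute names, and tolerance threshold are illustrative assumptions, not anything Google has published.

```python
from collections import Counter

def audit_attribute_balance(records, attribute, tolerance=0.5):
    """Report how evenly an attribute is represented in a training set.

    records   : iterable of dicts describing training examples
    attribute : metadata key to audit (e.g. a demographic label)
    tolerance : allowed relative deviation from a uniform share
    """
    counts = Counter(r[attribute] for r in records if attribute in r)
    total = sum(counts.values())
    if total == 0:
        raise ValueError(f"No records carry the attribute '{attribute}'")

    uniform_share = 1 / len(counts)
    report = {}
    for value, count in counts.items():
        share = count / total
        # Flag groups that sit far above or below an even split.
        skewed = abs(share - uniform_share) > tolerance * uniform_share
        report[value] = {"share": round(share, 3), "skewed": skewed}
    return report

# Hypothetical usage: metadata rows for an image-caption training set.
examples = [
    {"caption": "portrait of a doctor", "region": "north_america"},
    {"caption": "portrait of a doctor", "region": "north_america"},
    {"caption": "portrait of a doctor", "region": "east_asia"},
    {"caption": "portrait of a doctor", "region": "west_africa"},
]
print(audit_attribute_balance(examples, "region"))
```

Running an audit like this before training surfaces over- and under-represented groups early, when rebalancing or targeted data collection is still cheap compared with correcting the model's behaviour after deployment.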