Kinfolk Keep Trying To Warn Companies About The Potential Harms AI Can Do To Underserved Communities - The Tech Jawn 111
Dec 19, 2023
A Facebook DEI lead embezzled $4 million, Elon Musk wants DEI initiatives to end, a Cox Media Group subsidiary claims it can listen to ambient sounds through device mics, and Black employee group presidents warn of AI harm to underserved communities.
Deep dives
Employees Warn of Bias in AI Technology
Former presidents of the Black employees group at NVIDIA, Masheika Allgood and Alexander Tsado, presented a 22-page report to CEO Jensen Huang urging the company to address potential bias in AI technology. They emphasized the need to prioritize examining how AI systems could harm underserved communities. The meeting with Huang left them devastated, however, and both eventually left the company. At the time, only 1% of employees at NVIDIA, a trillion-dollar chip maker, identified as Black. Despite the concerns raised, Huang dismissed the need for demographic diversity, citing the presence of diversity of thought.
Lack of Accountability in AI Bias
The lack of accountability in addressing AI bias is evident as companies like NVIDIA downplay the significance of diversity and its potential impact on AI technology. Because there have been no direct consequences yet, many companies wait for evidence of negative outcomes before taking action. Poor choices of data sets and insufficient consideration of underrepresented groups in the development of AI models contribute to bias. By neglecting these issues, companies risk backlash, legal battles, and public apologies when AI bias leads to real-world harm.
The AI Train Runs Amok
Companies' reluctance to prioritize addressing bias in AI technology allows the AI train to run amok. The rapid advancement of AI without careful consideration of potential bias poses risks to underrepresented communities. Opening the floodgates without proper safeguards jeopardizes the fairness and inclusivity of AI systems. Failing to account for diverse perspectives and experiences in AI development perpetuates discrimination and undermines the potential benefits of AI technology.
Apple vs. Epic and Google Lawsuits
The episode discusses the ongoing lawsuits involving Apple, Epic, and Google. The hosts question whether Apple may need to change its App Store policies depending on the outcomes, highlight the similarities between Apple's and Google's app store models, and express surprise that Google lost its case while Apple was able to avoid a jury trial. They also raise the possibility of Epic appealing against Apple after its win against Google.
Cox Media Subsidiary's Controversial Listening Practices
The episode then turns to a Cox Media Group subsidiary that claims to listen to conversations through phone microphones in order to serve targeted ads. The hosts question the legality and ethics of this practice, touching on the consent buried in terms and conditions for software updates and the privacy concerns this listening technology raises. The importance of perspectives from diverse groups, particularly communities of color, in developing AI models is highlighted, and the episode closes on the need for inclusivity in the tech industry.
A DEI lead at Facebook has admitted to embezzling $4 million in a scheme that involved fake expense reports, payments via Cash App, and all kinds of easily "figureoutable" fraud.
Elon Musk has declared that Diversity, Equity, and Inclusion initiatives need to die and has even equated DEI to being a different type of discrimination.
A Cox Media Group subsidiary claims it can listen to ambient sound via device microphones and market to customers based on what it hears.
And Black employee group presidents at NVIDIA warn the CEO that the company's work in AI could harm underserved communities.