Artificial intelligence is being used by campaigns to promote their own candidates as well as to produce attack ads. It also creates an opening for what we call dark money groups, the 501(c)(4) political advocacy groups. That's a real wide opening for groups that aren't necessarily representing a specific person but rather an interest area or a larger party.
As the saying goes: a lie gets halfway around the world before the truth has a chance to put its pants on. As AI is increasing productivity across industries, it’s also raising concern about how to regulate its output and keep it from putting many of us out of work. And as the next campaign season approaches, another question comes into focus: what about its potential to quickly create and spread misinformation about political rivals?
Bloomberg’s Laura Davison and Emily Birnbaum raise the curtain on the little-regulated and largely vexing ease with which political hay and deepfakes can be made and disseminated via a chatbot.
Read more: AI Is Making Politics Easier, Cheaper and More Dangerous
Listen to The Big Take podcast every weekday and subscribe to our daily newsletter: https://bloom.bg/3F3EJAK
Have questions or comments for Wes and the team? Reach us at bigtake@bloomberg.net.
See omnystudio.com/listener for privacy information.