The question is, can algorithms know when they're being gamed, and when they're amplifying hate or falsehoods or other harmful things? And according to Facebook's own logic, they can't know. The example of that is — do you remember trending topics on Facebook? They used to have, on the right-hand side, here's the most popular news stories. And they had human beings, human editors, who were curating that. They had some contractors. Facebook got accused by conservatives in the United States saying, oh, you're biased against conservatives. So Facebook said, we're going to get rid of our human editors, and we're going to have just the machines decide.
Brittany Kaiser, a former Cambridge Analytica insider, witnessed a two-day presentation at the company that shocked her and her co-workers. It laid out a new method of campaigning, in which candidates greet voters with a thousand faces and speak in a thousand tongues, automatically generating messages increasingly aimed at an audience of one. She explains how these methods of persuasion have shaped elections worldwide, enabling candidates to sway voters in strange and startling ways.