Light listening: Algorithmic surveillance

We’re so used to hearing about algorithms that most people don’t spend much time thinking about them. They operate in the background, invisibly shaping our decisions as we go about our day. Most of us have little idea how we’re being manipulated by this technology.

This 22-minute talk from techno-sociologist Zeynep Tufekci is the antidote to that ignorance. As Dr Tufekci explains, these algorithms do more than make ads follow us around. They power Facebook’s dark ads, which are used to manipulate voters, and they form the foundation for surveillance authoritarianism. Worse yet, it’s hard to know exactly how these algorithms operate or how we’re being affected.

Here’s a snippet from her talk:

Now, we started from someplace seemingly innocuous — online ads following us around — and we’ve landed someplace else. As a public and as citizens, we no longer know if we’re seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we’re just at the beginning stages of this. These algorithms can quite easily infer things like people’s ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender, just from Facebook likes. These algorithms can identify protesters even if their faces are partially concealed. These algorithms may be able to detect people’s sexual orientation just from their dating profile pictures.

Now, these are probabilistic guesses, so they’re not going to be 100 percent right, but I don’t see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems. Imagine what a state can do with the immense amount of data it has on its citizens. China is already using face detection technology to identify and arrest people. And here’s the tragedy: we’re building this infrastructure of surveillance authoritarianism merely to get people to click on ads. And this won’t be Orwell’s authoritarianism. This isn’t “1984.” Now, if authoritarianism is using overt fear to terrorize us, we’ll all be scared, but we’ll know it, we’ll hate it and we’ll resist it. But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they’re doing it at scale through our private screens so that we don’t even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider’s web and we may not even know we’re in it.

These 22 minutes will bring you up to speed on how algorithms are shaping our lives and what that means for the future.

While the talk above focuses largely on Facebook, Dr Tufekci points out that Amazon, too, is leading the way in algorithmic surveillance, especially with the release of its Echo Look.

If this subject interests you, check out the book Weapons of Math Destruction by Cathy O’Neil. It’s a deeper dive into how algorithms shape our lives, and it’s a quick read.