The big, disturbing AI experiment in the classroom

“These classrooms are laboratories for future generations… just how this all works out won’t be apparent until they become adult citizens.”

China’s capacity to implement and integrate new AI technology into its society never fails to shock me. The video below, on all the ways China is using AI in the classroom, is no exception.

Watching it reminded me of a term I just learned: parental anxiety management. The term comes from the article The Case Against Spying on Your Kids With Apps, which does an excellent job of showing off the creepy ways parents can track their kids, alongside reasons why they shouldn’t. The takeaway is that surveillance tech markets itself as the solution to parents’ collective anxiety about their kids’ safety and health.

While China’s government certainly plays a massive role in the expansion of AI, the cultural acceptance of new technology that supposedly benefits our kids isn’t limited to China. Watch the Chinese parents share why they think it’s a good idea to use artificial intelligence in the classroom, and you’ll see little difference between those parents and US parents who just want what’s best for their kids.

I can only hope that our government steps up and puts huge limits on AI in the classroom so our kids don’t end up monitored, tracked, and shamed as they go through the education system.

This call is being monitored (and used to discipline call center workers)

The premise of using affect as a job-performance metric would be problematic enough if the process were accurate. But the machinic systems that claim to objectively analyze emotion rely on data sets rife with systemic prejudice, which affects search engine results, law enforcement profiling, and hiring, among many other areas. For vocal tone analysis systems, the biased data set is customers’ voices themselves. How pleasant or desirable a particular voice is found to be is influenced by listener prejudices; call-center agents perceived as nonwhite, women or feminine, queer or trans, or “non-American” are at an entrenched disadvantage, which the datafication process will only serve to reproduce while lending it a pretense of objectivity.

Recorded for Quality Assurance

All of us are used to hearing the familiar phrase “This call is being monitored for quality assurance” when we contact customer service.

Most of us don’t give a second thought to what happens to the recording after our problem is solved.

The article above takes us into the new world of call center work, where your voice is monitored, scored by AI, and used to discipline workers.

Reps from companies claim their systems allow agents to be more empathetic, but in practice, they offer emotional surveillance suitable for disciplining workers and manipulating customers. Your awkward pauses, over-talking, and unnatural pace will be used against you.
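To make that concrete, here is a minimal sketch of how metrics like these could be computed from a diarized call transcript. Everything in it is an assumption on my part (the Segment shape, the thresholds, the flag names); vendors don’t publish how their scoring actually works.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One diarized stretch of speech from a call recording."""
    speaker: str      # "agent" or "customer"
    start: float      # seconds from the start of the call
    end: float
    word_count: int

# Hypothetical thresholds; real vendors don't publish theirs.
MAX_PAUSE_S = 2.0   # silence longer than this counts as an "awkward pause"
MIN_WPM = 110.0     # agent pace outside this band gets flagged as "unnatural"
MAX_WPM = 180.0

def score_call(segments: list[Segment]) -> dict:
    """Flag awkward pauses, over-talking, and pace from diarized segments."""
    segments = sorted(segments, key=lambda s: s.start)
    pauses = overlaps = 0
    for prev, cur in zip(segments, segments[1:]):
        gap = cur.start - prev.end
        if gap > MAX_PAUSE_S:
            pauses += 1        # nobody spoke for too long
        elif gap < 0 and cur.speaker != prev.speaker:
            overlaps += 1      # both parties talking at once

    agent = [s for s in segments if s.speaker == "agent"]
    talk_time = sum(s.end - s.start for s in agent)
    wpm = 60.0 * sum(s.word_count for s in agent) / talk_time if talk_time else 0.0

    return {
        "awkward_pauses": pauses,
        "over_talking": overlaps,
        "pace_wpm": round(wpm, 1),
        "pace_flagged": not (MIN_WPM <= wpm <= MAX_WPM),
    }
```

The unsettling part isn’t the arithmetic, which is trivial; it’s that numbers this crude can end up on a manager’s dashboard as an “empathy score” and feed directly into discipline.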

The more I read about workplace surveillance, the more dystopian the future of work looks. Is this really what we want? Is this what managers and leadership want?

What if we used voice analysis on leadership? Why aren’t we monitoring and analyzing how leaders speak to their subordinates or peers in meetings? Granted, I don’t think that would actually produce a healthy work environment, but it only seems fair for the leaders who implement and use these algorithms in their organizations.

On a related note, there’s a collection of papers out from Data & Society that seek to “understand how automated, algorithmic, AI, or otherwise data-driven technologies are being integrated into organizational contexts and processes.” The collection, titled Algorithms on the shop floor: Data-driven technologies in organizational contexts, shows off the range of contexts in which new technology is fundamentally reshaping our workforce.

With companies racing to implement automated platforms and AI technology in the workplace, we need so much more of this research.

Didn’t see this coming: Facial recognition at summer camp

I spend a lot of time reading and writing about AI in the workplace, which means I spend a lot of time reading about AI in general. But I wasn’t at all prepared for this:

Now hundreds of summer camps across the United States have tethered their rustic lakefronts to facial-recognition software, allowing parents an increasingly omniscient view into their kids’ home away from home.

If you just yelled “what the fuckity fuck” when you read that quote, then you’re really not going to like the article, As summer camps turn on facial recognition, parents demand: More smiles, please. It details how summer camps are using facial recognition tech to keep parents up to date on their kids, often without the kids knowing it.

I spend a lot of time reading about AI products and their impact on society, but using facial recognition on teens at a summer camp (and a phone-free one at that), so companies can sell fear and anxiety to parents who then transfer that anxiety right back onto their kids, really caught me off guard.

If this fires you up, follow @ruchowdh and @hypervisible on Twitter.

Excellent analysis by @drewharwell – “Some of the kids… are so accustomed to constant photography that they barely notice the camera crew.” – we are acclimating the next generation to a surveillance state.
— Rumman Chowdhury (@ruchowdh) August 9, 2019

Then read Shoshana Zuboff’s new book, The Age of Surveillance Capitalism.