Didn’t see this coming: Facial recognition at summer camp

I spend a lot of time reading and writing about AI in the workplace, which means I spend a lot of time reading about AI in general. But I wasn’t at all prepared for this:

Now hundreds of summer camps across the United States have tethered their rustic lakefronts to facial-recognition software, allowing parents an increasingly omniscient view into their kids’ home away from home.

If you just yelled “what the fuckity fuck” when you read that quote, then you’re really not going to like the article, “As summer camps turn on facial recognition, parents demand: More smiles, please.” It details how summer camps are using facial recognition tech to keep parents up to date on their teens, often without the kids knowing it.

I spend a lot of time reading about AI products and their impact on society, but using facial recognition on teens at summer camp (and a phone-free one at that) so companies can sell fear and anxiety to parents, who then transfer that anxiety right back onto their kids, really caught me off guard.

If this fires you up, follow @ruchowdh and @hypervisible on Twitter.

Excellent analysis by @drewharwell – “Some of the kids… are so accustomed to constant photography that they barely notice the camera crew.” – we are acclimating the next generation to a surveillance state.— Rumman Chowdhury (@ruchowdh) August 9, 2019

Then read Shoshana Zuboff’s new book, The Age of Surveillance Capitalism.

The dystopian nightmare that’s coming for your international vacations

“I think it’s important to note what the use of facial recognition [in airports] means for American citizens,” Jeramie Scott, director of EPIC’s Domestic Surveillance Project, told BuzzFeed News in an interview. “It means the government, without consulting the public, a requirement by Congress, or consent from any individual, is using facial recognition to create a digital ID of millions of Americans.” – The US Government Will Be Scanning Your Face At 20 Top Airports, Documents Show

Facial recognition systems are headed to the airport, and it’s happening at a rapid pace, without public comment or guardrails around data privacy or data quality.

Consider this news alongside new reporting from ProPublica, which found that TSA’s body scanning technology discriminates against black women, regularly flagging them as security threats and subjecting them to increased screening. Then add the New York Times’ recent series, The Privacy Project. One of the most impactful pieces took footage from three cameras in Bryant Park and used it to build facial recognition tracking software for less than $100. In the end, it took only a few days’ work to identify one of the people in the park.
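To appreciate just how low that barrier has become, here is a minimal sketch of off-the-shelf face matching in Python using the open-source face_recognition library. The filenames are hypothetical, and this is only an illustration of how accessible the technique is, not a reconstruction of the Times’ actual setup:

```python
# A minimal sketch: matching faces in a camera frame against one known photo.
# Requires: pip install face_recognition (filenames below are hypothetical).
import face_recognition

# A known face: one labeled photo of the person you want to find.
known_image = face_recognition.load_image_file("known_person.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# A frame grabbed from a public camera feed.
frame = face_recognition.load_image_file("park_camera_frame.jpg")

# Detect every face in the frame and compute an encoding for each.
locations = face_recognition.face_locations(frame)
encodings = face_recognition.face_encodings(frame, locations)

# Compare each detected face against the known person.
for location, encoding in zip(locations, encodings):
    is_match = face_recognition.compare_faces([known_encoding], encoding)[0]
    if is_match:
        print(f"Possible match at frame coordinates {location}")
```

That is the whole pipeline: a free library, one reference photo, and a public camera feed. Run it over frames captured across a day and you have rudimentary tracking software for the cost of almost nothing.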

An AI dystopia, in which bias is encoded into the algorithms and marginalized communities are further marginalized, is hurtling toward us faster than the average person can keep up with.

The increased use of facial recognition in public spaces puts our society on track toward a system that’s not entirely different from China’s social credit system. From the BuzzFeed article quoted above:

The big takeaway is that the broad surveillance of people in airports amounts to a kind of “individualized control of citizenry” — not unlike what’s already happening with the social credit scoring system in China. “There are already people who aren’t allowed on, say, a high-speed train because their social credit scores are too low,” he said, pointing out that China’s program is significantly based in “identifying individual people and tracking their movements in public spaces through automated facial recognition.”

It all reminded me of a tweet I saw this week that captures my frustration with American journalists’ continued reporting on China’s social credit system while ignoring our own American AI nightmare, which is coming at us full steam ahead:

*whispers* the us invests in mass surveillance and social credit systems the same way china does and yet some of us only ever point to china with outrage and it’s getting tiring— a once blue haired enby from oakland | tired of it (@WellsLucasSanto) April 16, 2019

Consider this: just last month, landlords in NYC announced their interest in installing facial recognition technology in rent-subsidized apartments. Meanwhile in Beijing, 47 public housing projects deployed the technology last year.

The use of facial recognition technology isn’t limited to the government. Companies are already doing a bang-up job of deploying it in unexpected places.

All of this makes me wonder: how do average people, those outside of tech, academia, and spheres of influence, push back against these technologies? Can you opt out of facial recognition tech at the airport? How do you know to opt out if you didn’t know it was being used in the first place? What happens when you opt out? Will you be subjected to more invasive searches? Will opting out delay your next flight? So many questions, and sadly, zero answers.