It’s always a treat to guest on a podcast, but I think the treat is even sweeter when the podcast is hosted by someone with a British accent. I was thrilled to chat with Jane Barrett, Founder of Career Farm, all about our new world of work.
Nearly two-thirds of workers, 64 percent to be exact, trust robots more than their managers…
Notably, 45 percent of workers—less than half—said managers are better than robots at understanding their feelings. Thirty-three percent believe managers are better at coaching while 29 percent said they’re better at creating work culture. However, 26 percent believe robots are better at providing unbiased information and 29 percent said they were better at problem-solving.
I don’t even know what to write about this survey and really I just feel like typing WTF over and over again. I didn’t dive into the report to see the methodology or question phrasing, so I’m taking everything at face value here. But I’m still floored.
What the hell is happening with management? I mean I’ve worked for some absolutely terrible managers. In a previous job I had a manager who stole my work and passed it off as hers, bad-mouthed me to make herself look good, made my coworkers cry on the regular, and threatened to take away all the best parts of the job unless I did her pet project. She caused me all kinds of stress. And even then I didn’t wish to be managed by algorithm. I’m also firmly in the camp that AI will make managers worse.
It’s common knowledge that people leave their jobs because of bad bosses. Bad management is everywhere. But algorithms aren’t much better as bosses. Just ask the Uber and DoorDash workers how they feel about algorithms as managers. So why do so many workers think that algorithms > managers? That’s hella depressing news for managers in general.
I’m also curious who is working for robots that understand feelings. Is there some kind of virtual reality manager that’s more compassionate than a human?
I don’t have an answer to that. But workers in low wage jobs are seeing an increase in management by algorithm. From Axios:
Even the most vigilant supervisor can only watch over a few workers at one time. But now, increasingly cheap AI systems can monitor every employee in a store, at a call center or on a factory floor, flagging their failures in real time and learning from their triumphs to optimize an entire workforce.
First, the phrase “optimize an entire workforce” should strike fear into employees across workplaces. Workers are human; they aren’t designed to be optimized. They need breaks, moments to reflect, chances to engage and connect, and encouragement from other humans. They need to be human. Optimizing strips human needs from humans, and the term “optimizing” masks the brutality of it.
We’ve seen what’s happened to those working in the world’s most optimized workforce, Amazon, especially people working in warehouses and as delivery drivers. We don’t need more of it.
And yet leadership is proceeding as if optimization is the holy grail of the workplace. Again from Axios:
“How often is an employee going out to smoke a cigarette? How long a lunch are they taking? How long are they sitting in the lunchroom?” These are the questions clients want answered with AI software, says Kim Hartman, CEO of Surveillance Secure, a D.C.-area company that installs security systems.
Hartman says his company has put in video analytics for several area retailers and restaurants that wanted to monitor their employees’ productivity.
Employee surveillance isn’t just used to keep tabs on employees – it can also be used to discipline employees. This all happens first with low-wage workers because they have less power and less ability to push back. It’s harder to fight the system when you can’t miss a paycheck. Once these automated systems are tested, integrated, tweaked and finessed – and they’ve collected enough data – leadership will move on to automating middle-wage jobs.
I wonder what’s going to happen to all the middle managers who oversee these workforces. Where will they go? Will they be laid off? Retrained to use AI software to manage their workforce? What is a middle manager to do at this point?
At every discussion of automating workers, I wonder why we never talk about automating leadership. Here’s my proposal to push back: Automate the c-suite.
The premise of using affect as a job-performance metric would be problematic enough if the process were accurate. But the machinic systems that claim to objectively analyze emotion rely on data sets rife with systemic prejudice, which affects search engine results, law enforcement profiling, and hiring, among many other areas. For vocal tone analysis systems, the biased data set is customers’ voices themselves. How pleasant or desirable a particular voice is found to be is influenced by listener prejudices; call-center agents perceived as nonwhite, women or feminine, queer or trans, or “non-American” are at an entrenched disadvantage, which the datafication process will only serve to reproduce while lending it a pretense of objectivity.
All of us are used to hearing the familiar phrase “This call is being monitored for quality assurance” when we contact customer service.
Most of us don’t give a second thought to what happens to the recording after our problem is solved.
The article above takes us into the new world of call center work, where workers’ voices are monitored, scored by AI, and used to discipline them.
Reps from companies claim their systems allow agents to be more empathetic, but in practice, they offer emotional surveillance suitable for disciplining workers and manipulating customers. Your awkward pauses, over-talking, and unnatural pace will be used against you.
The more I read about workplace surveillance, the more dystopian the future of work looks. Is this really what we want? Is this what managers and leadership want?
What if we used voice analysis on leadership? Why aren’t we monitoring and analyzing how leadership speaks to their subordinates or peers in meetings? Granted, I don’t think that would actually produce a healthy work environment, but it seems like a fair deal for leaders who implement and use these algorithms in their organizations.
On a related note, there’s a collection of papers out from Data & Society that seek to “understand how automated, algorithmic, AI, or otherwise data-driven technologies are being integrated into organizational contexts and processes.” The collection, titled Algorithms on the shop floor: Data-driven technologies in organizational contexts, shows off the range of contexts in which new technology is fundamentally reshaping our workforce.
With companies racing to implement automated platforms and AI technology in the workplace, we need so much more of this research.
Whether you are a grocer, doctor, factory worker, or journalist, your job will soon be reshaped by automation. Some will benefit from the new work that emerges. Others will watch their jobs disappear with no clear path to another livelihood. Managing this transition will be the defining challenge of the decades ahead. And we need to be ready for it.
Employee surveillance is all the rage in 2019. Advancements in facial recognition technology, wearables and sensor data, data analysis and machine learning have created a rich product landscape that makes it easy for your employer to track you at work and outside of it.
The market for Employee (Automated) Monitoring Solutions is around $1.1 billion, but analysts expect it to grow to about $3 billion by 2023. That’s a whole lot of worker spying headed our way.
Amazon is the most enthusiastic and well-known employer to embrace employee surveillance technology. They routinely subject their warehouse employees to a brutal work environment in which everyone is tracked, measured, and pushed to meet ever-increasing metrics. The mindset seems to be that any moment spent not producing – whether it’s going to the bathroom, saying hello to a coworker, or taking a moment to think – is money stolen from the company. The result is a hellish place in which workers suffer from depression and injuries, and a corporate culture of distrust.
Employee surveillance tech is hot hot hot
If you don’t work in an Amazon warehouse it’s easy to think that surveillance technology is a world away from your workplace. But you’d be wrong. Gig economy workers are already managed by algorithm, with plenty of tracking and nudges to get workers to obey the algorithm and keep working.
In fact, companies’ use of employee surveillance technology is only growing:
Last year, the research firm Gartner found that more than 50% of the 239 large corporations it surveyed are using “nontraditional” monitoring techniques, including scrutinizing who is meeting with whom; analyzing the text of emails and social-media messages; scouring automated telephone transcripts; gleaning genetic data; and taking other such steps. That’s up from just 30% in 2015. And Gartner expects those ranks to reach 80% by next year. – Workplace tracking is growing fast.
Employee surveillance technology is going to make your worst manager even worse. Employers are collecting increasing amounts of data about you, both at work and outside of work. The data is fed into algorithms designed to categorize and analyze you. The result is delivered on a dashboard, accessible by your boss and leadership. The data you produce, and the decisions made based on that data, are rarely shared with you, the employee. Sometimes your data is shared with third-party companies.
Choose a company that trusts their employees and respects your private data
Now that employers are highly invested in monitoring their employees’ habits, it’s important to know just what kind of culture you’re headed into as you search for new employment. It’s unlikely employers will play up their use of employee surveillance tech on the About page (algorithms aren’t so photogenic, after all). Make sure you don’t end up in a company culture that breeds distrust or puts your personal data into the hands of a bad manager or third parties by asking the right questions.
We all know that asking questions at the end of the interview is a smart move. It makes you look informed and engaged. Use this time to ask the hard questions about employee monitoring.
Employee surveillance interview questions
Here are the top interview questions to guide you in your search for a company that both trusts their employees and cares about your data privacy.
What is the company’s position on employee monitoring?
The future of work is not set in stone. We don’t have to trade our personal data and privacy for a job. Asking questions about data privacy and surveillance helps us push back on invasive tech and data privacy violations in the workplace. You deserve to work in a place where you aren’t monitored continuously. Find those companies and champion them.
If this article is your jam you’ll definitely like my book. It’s jam-packed with upgraded career advice for navigating a new world of work. Sign up to get notified when it’s published.
“I want people to know how powerless you feel when your income comes from a faceless app and when you open it up one morning, things are just different and you’re earning less money and there’s no boss you can talk to, you weren’t told about it, you just see your income is lower today and you just have to deal with it.”
Management by algorithm and faceless bosses: this is the future of work. Consider last week’s report from Business Insider:
“A new report indicates that the company doesn’t just track worker productivity at its warehouses — it also has a system that can automatically generate the paperwork to fire them if they’re not meeting targets.”
Companies like Uber and Amazon are leading the way for workforces managed by algorithm. They’re experimenting with the most vulnerable workers first – contractors. But you can expect them to apply what they learn to the white collar workforce next.
Though Intel forecasts flat sales in 2019, people inside the company said this week’s layoffs don’t appear to be strictly a cost-cutting move. Rather, they said the cuts appeared to reflect a broad change in the way Intel is approaching its internal technical systems… Intel will now consolidate operations under a single contractor, the Indian technology giant Infosys.
Intel is laying off hundreds of their IT staff, according to the Oregonian. Unless you or a friend or family member is immediately affected, you’ve probably scrolled right past the news, and that’s no shame on you; stories of layoffs are a dime a dozen in our newsfeeds.
One of the things that sort of keeps us up at night is if you think about the way that we check that our current systems are fair in, say, criminal justice is that we have a system of appeals. We have a system of rulings. You actually have a thing called due process, which means you can check the evidence that’s being brought against you. You can say, “Hey, this is incorrect.” You can change the data. You can say, “Hey, you’ve got the wrong information about me.”
This is actually not how AI works right now. In many cases, decisions are gonna be made about you. You’re not even aware that an AI system is working in the background. Let’s take HR for a classic case in point right now. Now, many of you have probably tried sending CVs and résumés in to get a job. What you may not know is that in many cases, companies are using AI systems to scan those résumés, to decide whether or not you’re worthy of an interview, and that’s fine until you start hearing about Amazon’s system, where they took two years to design, essentially, an AI automatic résumé scanner. – How will AI change your life? AI Now Institute founders Kate Crawford and Meredith Whittaker explain.
Everyone who works on AI products needs to understand the ethical implications of their work. AI engineers and product managers need to understand their product’s impact on users. Business leaders and engineers need to bring in diverse voices and specialties to help ensure their product doesn’t have negative implications. Human resources leads need to hire interdisciplinary workers, who connect the dots between design, engineering, and business performance.
All of this is of course easier said than done. Judging by the many, many, many fails in AI product development, we aren’t even close to that point inside of AI organizations. These “fails” have a tremendous impact on people’s lives.
Ethics is a loaded term, and businesses aren’t quite sure what ethics in AI even looks like. Just look at the recent dissolving of Google’s AI ethics board. While many questioned who got to be on that board, many others questioned exactly how an ethics board translates into ethical business practices and products.
Thankfully there are several individuals and organizations working at the intersection of AI and ethics. My personal favorite is the AI Now Institute. I could have pulled so many other impactful quotes from their recent interview on the Recode Decode podcast. Have a listen to that episode to get your head around the many challenges of AI and ethics. And if you’re really into AI and ethics, check out this list of people to follow on Twitter.
Now that my first book on the future of work is moving forward, I’m turning my research towards AI and ethics, specifically how organizations train talent to reduce bias in AI products. So expect more of this type of content in the coming months.
I’m also speaking at Portland’s Machine Learning for All conference on how to have curious conversations. I’ll be teaching software and machine learning engineers how to hone their soft skills to build connections and work across disciplines, ensuring they bring the right voices into their work.
I just learned how to write a first draft of a book. I’m fresh off of four months of nearly daily writing to wrangle 60,427 words into a first draft. Actually I topped out around 68,000 but chopped it down before handing it over to a developmental editor who will cut it down even more. My first draft is quite the beast.
Wrangling your ideas and thoughts into a coherent narrative isn’t easy. It doesn’t come naturally to most. The funny thing about writing a book is that you don’t need to be good at writing to write a book. Instead, you need to be good at discipline. You need to commit to writing until it’s all out of your head and fight the urge to quit because your words look so awkward outside of your head. You also have to slay the procrastination monster on the regular.
I’ve spent the last four months learning how to write a first draft. I learned how to manage the logistics of writing at volume. I learned how to build an outline, find a pace that worked for me, and manage my writing time. Most importantly, I learned how to overcome doubt.
Writing a book is a BFD and it’s pretty overwhelming at the start. So I started the process by breaking it down into manageable parts. I’ve conquered the first part: writing a first draft.
Here’s what worked to get me there and what might work for you if you’re trying to write your first draft.