Is it right or productive to watch workers?
As remote working has become more common, sales of monitoring technology have boomed. But using surveillance tools carries a cost.
It goes without saying that the biggest shift in the workplace over the last two years has been its disappearance. Or rather, its retreat from the physical world and its reemergence in the work-anywhere digital limbo of Zoom meetings and Slack channels. And, with a few caveats, the robust conclusion has been that, yes, remote employees can still get their work done from their kitchen table, their spare bedroom, their shed, or the patio of their favorite coffee shop.
But let’s be honest: the office isn’t merely a place to do work—it can also be the place to be observed doing work. Which is why a significant number of companies whose workforce has recently gone remote have enlisted the help of surveillance software, also known as “tattleware” or “bossware,” to know what their employees are doing.
There’s an old saw in business: what gets measured gets managed. That has been elevated to gospel when it comes to raw materials, waste, energy use, emissions, and so forth. Viewed this way, surveillance tech may not be an altogether bad idea. There is value in measuring what your employees are doing and how productive they are. What makes surveillance challenging is connecting it to management, or even control.
In January 2021, reports emerged that one in five companies were using surveillance software to remotely monitor their employees—in some cases without the employees’ knowledge or consent. Where monitoring software had once been a relatively small market, populated by benign-sounding products like Hubstaff, ActivTrak, Workpuls, and Time Doctor, it’s grown. A lot.
Concrete numbers are difficult to come by, but, according to analysis from Top10VPN released in August, US demand for employee surveillance software is up 58% since 2020. The same report noted that in April 2020, as the full implications of lockdowns and work-from-home orders were realized, demand for employee monitoring software soared 87% and fell only slightly, to 71% above the pre-pandemic average, a month later.
Since then, employee monitoring software has remained a booming business. And for people who like privacy and employers who want to have a good relationship with their employees, that may not be a good thing.
“When the pandemic hits, you suddenly see the reality, which is that [organizations] don’t trust employees, they never did,” Ben Laker, professor of leadership at Henley Business School, near London, told me. “Suddenly organizations are panicking—how can we control [our workers] if we can’t see them? At that very core, there’s just no trust.”
Surveillance tech can include taking screenshots of an employee’s computer at regular intervals, tracking what websites they visit during company hours, monitoring their keystrokes and mouse movement, and even noting their remote location, allowing employers to know whether their workers are at their desks in their home offices, getting lunch, or picking up their children from school. The ostensible purpose of monitoring is often “increased productivity.” But the big question is: is it worth it?
Studies demonstrate that being watched reinforces positive socially normative behaviors and inhibits negative behaviors. If you think you’re being observed, for example, you’re more likely to donate to charity and less likely to litter, steal a bike, or take too much Halloween candy. And there are, theoretically, valid positive reasons for monitoring, including to safeguard employees from internal discrimination and harassment; employees can also use the collected data to, as one academic put it, “stare back” at employers and expose problematic or even dangerous practices through whistleblowing.
But even back in the 1980s, with the minimal electronic surveillance available, employees whose performance was monitored perceived their working conditions as more stressful and reported higher levels of job boredom, fatigue, anger, anxiety, and even depression and other health complaints. Observers, including in places like the Wall Street Journal, worried that electronic surveillance would turn modern offices into "fishbowls" and "sweatshops."
Workplace surveillance, however, continued, despite evidence that it tended to undermine trust between employee and employer. And now, the shift to remote working has meant that surveillance that was once limited to the office is happening, well, anywhere the employee is. Almost nowhere is safe from Big Brother.
In some cases, though, there is evidence that monitoring doesn’t promote productivity or curb negative behaviors. For one thing, remote workers are already more productive, monitoring or no. An April 2021 Bloomberg report found that working from home during the pandemic lifted productivity 5% across the US. What’s more, monitoring can backfire. In a 2011 study, computer monitoring that employees felt violated their privacy increased employees’ destructive behavior. Henley Business School’s Laker suggests that employees who are over-monitored are robbed of a sense of freedom and autonomy, which can in turn undercut their performance. “Without autonomy, [employees] won’t master [new skills] and they won’t have purpose,” he told me.
Any discussion of surveillance, however, must take into account the world we inhabit now. The concerns that we had about physical and digital privacy even five years ago are not the same ones we have currently. Conversations about what it means to be private are shaped by the convenience and ubiquity of social media, big data, and what Australian Roger Clarke, a consultant and research professor, described in 2019 as the “digital surveillance economy.”
The recent rise in employee surveillance accelerated during the pandemic, largely because it had to, but the bottom line is that we are now more than ever accustomed to being watched. We accept the intrusion of cameras in novel spaces under the promise of increased safety; doorbell cameras spring to mind, but so too do webcams and smartphones; we accept data tracking to prove we’re “not a robot” on websites; we accept that our information, our clicks, and our preferences are observed and noted.
We seem to be primed now to accept that companies have a reasonable expectation to protect their own safety, so to speak, by monitoring us. One recent survey by media researcher Clutch of 400 US workers found that only 22% of 18- to 34-year-old employees were concerned about their employers having access to their personal information and activity from their work computers. Meanwhile, in a pre-pandemic survey of US workers by US media group Axios from August 2019, 62% of respondents agreed that employers should be able to use technology to monitor employees.
And yet, despite the possibility that increased acceptance might mitigate some of monitoring’s negative impacts, it’s hard to get past the inherently icky nature of surveillance. “Show me someone who wants to be surveilled,” Laker said.
So, is there a way to ethically, appropriately monitor workers? A lot of that comes down to how the employees themselves feel about being monitored. Amy Vatcha of the London School of Economics wrote in a 2020 paper that employee acceptance of workplace monitoring “depends on these factors—transparency on data collection from employers, clarification of data usage for system security or for hiring and firing decisions, and the avenues available for employee privacy concerns to be heard.”
These sound like sensible measures, but it’s hard to imagine that all the companies that have rushed to install the technology have thought through these protocols. So maybe the solution to maintaining increased productivity and keeping the remote office bumping along is to trust employees and leave the spyware alone.