Technology that monitors your location and the time spent in the bathroom at work. A program that takes random screenshots of your laptop screen. A monitoring system that detects your mood during your shift.
These are just some ways employee surveillance technology — now enhanced, thanks to the rapid growth of artificial intelligence — is being used.
Canada’s laws aren’t keeping up, experts warn.
“Any working device that your employer gives you, you can assume it has some way of monitoring your work and productivity,” said Valerio De Stefano, Canada research chair in innovation law and society at York University.
“Electronic monitoring is a reality for most workers.”
Artificial intelligence could also be deciding whether someone gets, or keeps, a job in the first place.
Automated hiring is already “extremely widespread,” with nearly all Fortune 500 companies in the United States using AI to hire new workers, De Stefano said.
Unlike traditional monitoring, he added, AI is making “autonomous decisions about hiring, retention and discipline” or providing recommendations to the employer about such decisions.
Employee surveillance can look like a warehouse worker with a mini-computer on their arm that’s tracking every movement they make, said Bea Bruske, president of the Canadian Labour Congress.
“They’re building a pallet, but that particular mini-computer is tracking every single step, every flick of the wrist, so to speak,” Bruske said.
“They know exactly how many boxes are being placed on that pallet, how much time it’s taking, how many extra steps that worker might have taken.”
There is little data documenting how widespread AI-powered worker surveillance might be in Canada. Unless employers are up front about their practices, “we don’t necessarily know,” Bruske said.
In a 2022 study by the Future Skills Centre, the pollster Abacus Data surveyed 1,500 employees and 500 supervisors who work remotely.
Seventy per cent reported that some or all aspects of their work were being digitally monitored.
About one-third of employees said they experienced at least one instance of location tracking, webcam or video recording, keystroke monitoring, screen grabs or employer use of biometric information.
“There is a patchwork of laws governing workplace privacy which currently provides considerable leeway for employers to monitor employees,” the report noted.
Electronic monitoring in the workplace has been around for years. But the technology has become more intimate, taking on tasks like listening to casual conversations between workers.
It’s also become easier for companies to use and more customized to their specific needs — and more normalized, said McGill University associate professor Renee Sieber.
De Stefano said artificial intelligence has made electronic monitoring more invasive, since “it is able to process much more data and is more affordable.”
“Employer monitoring has skyrocketed” since AI has been around, he added.
Those in the industry, however, insist there’s also a positive side.
Toronto-based FutureFit AI makes an AI-powered career assistant that CEO Hamoon Ekhtiari says can help people navigate workplaces being rapidly reshaped by technology.
The assistant can search for jobs, offer career advice, find training programs and build plans for the future. During hiring, it can also give applicants quick feedback on where their applications fall short, Ekhtiari said.
As artificial intelligence spreads through Canadian workplaces, lawmakers are moving to regulate it.
The federal government has proposed Bill C-27, which would establish requirements for “high-impact” AI systems.
That includes systems involved in “decisions related to employment, such as recruitment, referral, hiring, compensation, promotion, training, apprenticeship, transfer, or termination,” Innovation Minister François-Philippe Champagne has said.
Champagne has warned that AI systems could perpetuate bias and discrimination in hiring, affecting who sees job advertisements and how candidates are ranked.
Critics, however, say the bill does not specifically address worker protections, and it will only take effect once the regulations implementing it are developed.
In 2022, Ontario began requiring employers with 25 or more workers to have a written policy describing any electronic monitoring and the purposes for which the information collected may be used.
Neither the proposed federal legislation nor the Ontario law provides sufficient protection for workers, De Stefano said.
He noted that practices such as reading employee emails and tracking workers’ time remain permitted, as long as the employer has a policy and tells employees about it.
“It’s good to be aware, but if I have no way to address the use of these systems, some of which can be very problematic, then the protection is not very meaningful.”
Ontario has also proposed requiring employers to disclose their use of AI in hiring. If passed, it would make the province the first in Canada to have such a rule.
In theory, provincial and federal privacy laws should offer some safeguards. But Canada’s privacy commissioners have warned that existing privacy laws fall far short.
In October, they said “the recent increase in employee monitoring software” has underscored that workplace privacy laws are outdated or missing entirely.
Regulators elsewhere have been taking action. In January, France’s privacy regulator fined Amazon $35 million for using an overly intrusive system to monitor employees.
Unions are watching the issue, too. The Canadian Labour Congress is not satisfied with Bill C-27, and Bruske said workers and their unions have not been adequately consulted.
Governments should not let employers unilaterally decide to adopt these systems, De Stefano said. Workers should be fully informed and given the opportunity to voice their concerns.
Sieber added that governments should strive to distinguish between monitoring performance and surveillance, with bathroom-break timing falling into the latter category.
Some technologies could arguably be banned outright, she said, such as tools that use “emotional AI” to determine whether a worker is happy in front of a computer or on an assembly line.
Emily Niles, a senior researcher at the Canadian Union of Public Employees, said AI systems rely on data such as time logs, the number of tasks completed during a shift, email content, meeting notes and cellphone use.
“AI cannot function without data, and it’s actually our data that it operates on,” Niles said.
“This is a significant area for the union to get involved in, to emphasize the voices and control of workers over these technologies.”