UK bill waters down protections against ‘robo-firing’ in gig economy, say experts

London: Protections for gig economy workers will be watered down should ministers succeed in pushing a controversial bill through parliament, experts have said.

The new law would weaken a relatively little-known right to force app-based firms to explain themselves when they make automated decisions, a practice known as “management by algorithm”, before many workers have even realised they have it, the experts said.

Campaigners have criticised it as a “deregulatory race to the bottom” that will further disadvantage gig economy workers during the cost of living crisis.

Currently, people are able to see when companies such as Uber use the vast quantities of data at their disposal to automate decision-making. That was reaffirmed by a court in April, which found in favour of several drivers who were “robo-fired” by the taxi firm, then denied an explanation.

Yet few workers know about it, according to the Institute for the Future of Work (IFoW). And a bill that has already cleared several Commons hurdles would make it impractical for many to actually pursue it in future, the research group said.

The campaign group Connected by Data said: “This bill is part of a deregulatory race to the bottom that removes many of the current controls and safeguards over automated decision-making (ADM), data access and usage that protect UK consumers, workers and patients.”

It added: “In practical terms … gig-economy and big tech companies will be further empowered to subject workers to non-transparent ADM without human review safeguarding, more easily refuse workers access to data held on them by the company, and avoid consultation with workers around data-driven systems that affect them.”

The data protection and digital information bill proposes to make it easier for firms to charge people for access to the data used to automatically reach management decisions, and to refuse requests altogether on the grounds that the firms deem them vexatious, an IFoW spokesperson said.

He added that the spread of management by algorithm had “created real problems because, when management functions are replaced by automation, workers lose any sense of agency or redress”.

“Where it becomes complicated is where a person or groups of people are let go but, because there is no transparency, they do not know why.”

He referred to a ruling by the court of appeal in Amsterdam in the case of the UK-based Uber drivers. The judge found in favour of the workers, who claimed they had been “robo-fired” over what they called “spurious allegations of ‘fraudulent activity’” that were not meaningfully overseen by a human. Moreover, they said, they were “stonewalled” by the firm when they tried to find out how it had used its data to make the decision.

The campaign group Worker Information Exchange, which helped bring the case, said this week it received confirmation Uber would not appeal. But it claimed the firm was still not abiding by the ruling.

“The reason Uber is unlikely to go with the ruling is that they don’t want people looking at their algorithm for commercial reasons. That is a problem for workers,” the IFoW spokesperson said. “Or Uber might offer data to an individual worker that is not in a form that would allow them to understand if any bias has been suffered. We have argued for collective access given, for example, to a union representative who is trained to read the data.”

Uber declined to comment on the passage of the bill. Instead it reissued the statement it offered when the Amsterdam court ruled, which read: “Uber has robust processes in place, including meaningful human review, when making a decision to deactivate a driver’s account due to suspected fraud.” This was despite the court finding its human review was not “much more than a purely symbolic act”.