
Ban the use of opaque algorithmic decision-making technology to dynamically set variable pay and assign work. As platforms mature and growth slows, they have turned to harmful dynamic pay and pricing systems to increase their margins. Opaque algorithms set hyper-variable pay rates from hour to hour and from worker to worker, leaving workers unable to anticipate how much they will earn or how much work, if any, they will be offered. This creates discriminatory wage outcomes amongst workers and leaves many earning below the minimum wage. In our litigation against Uber, the court found that its dynamic pay system denied UK workers the algorithmic transparency they are entitled to. Despite this, none of the platform employers are yet providing the necessary transparency on automated pay-setting systems.


Ban the use of algorithmic decision-making to dismiss, discipline or sanction workers. Platform workers are often dismissed by means of an algorithmic decision without notice or explanation. They may be issued generic messages, and their efforts to communicate with the platform employer are usually ignored. Workers' first recourse to support is via an in-app chatbot which routinely misunderstands queries, fails to process critical information, gives circular answers and limits routes to further support or the opportunity to appeal. One worker was only able to speak to a human after threatening suicide. Workers are provided with no appeals procedures to follow, and they are often told decisions are final without any chance to discuss the allegations made against them. This is already unlawful under the UK GDPR, but these protections must also be entrenched in employment law. The Court of Appeal in Amsterdam agreed with workers that they had been robo-fired: the judge dismissed the limited human involvement as "nothing more than a purely symbolic act" and rejected Uber's argument that its lack of transparency was necessary to protect its trade secrets. UK platform workers should not have to go to Amsterdam to protect their rights; such rights should have a statutory footing in UK employment law. No worker should ever be fired by a machine.


Ban intrusive surveillance systems that violate the dignity of workers. In digitally mediated work, vast amounts of personal data are collected from and about workers, including when they are logged off, to surveil them and to create secret performance profiles. Much of this personal data collection is intrusive, disproportionate and unnecessary. We have seen across all platforms the growing use of predictive AI fraud detection systems to unjustly control and criminalise workers. Workers must not be surveilled when they are not working. The use of facial recognition systems and the collection of biometric data from workers must be prohibited. Platform employers must not record, infer or predict the emotional state or the physical or psychological health of workers. The monitoring or prediction of workers' trade union activities or the exercise of their legal rights must also be banned. The continuous, automated collection of personal data from workers at work must be restricted to only what is strictly necessary to perform the work.


Workers must have on-demand access to their personal data and to a meaningful explanation of algorithmic decision-making affecting them. Platform employers must inform and consult workers about the use of automated decision-making (ADM), including profiling, affecting pay and work allocation, performance and behaviour monitoring, contractual terms, working conditions or health and safety. There is a huge asymmetry in information power between global platform employers and their individual workers. Platforms have not cooperated in providing workers with all of their personal data when requested, and algorithmic explanations, when provided, are generally neither meaningful nor useful. For workers to have control over their digital working lives, they must have full knowledge of the data collected from them, how it is used, how they may be profiled and the algorithmic decisions they are subjected to.


Establish a single status of employment for all workers except the genuinely self-employed, with the burden of proof placed on the employer, not the worker. Algorithmic control is management control. Platform workers who are subject to algorithmic control over access to their work and pay, or who are subject to performance supervision, must have the statutory protection that comes with employment. Likewise, workers who are genuinely self-employed must have the full freedom of self-employment and must not be forced to work under controlling management conditions.


Working time with full statutory protection must run from worker log-on to log-off, including waiting time. Current minimum wage promises from platforms such as Uber and Deliveroo only recognise working time from dispatch to passenger drop-off or package delivery, with waiting time excluded. But for on-demand platforms to meet instant response times, they rely on a huge excess of labour standing by unpaid. Workers can spend up to 50% of their working day unpaid, despite being required to remain available on the platform app, where they are subordinate to management controls.


Guarantee the right of trade unions to be informed and consulted on workplace technologies. Unions must have a regular say in the design, deployment and oversight of AI and ADM in the workplace. They must also be consulted before any technology is introduced or changed that has a material impact on working conditions and terms of employment. Specifically, this includes systems that monitor performance, determine pay and work allocation, or pose a health and safety risk. Recognising the distributed nature of a digitally managed workforce, employers should provide secure and private channels of communication via the employer's core technology platform for the purposes of engagement and the convening of union members. Trade unions must have the right to seek collective data access and redress on behalf of workers. Trade unions must be allowed to conduct any ballot for strike action by electronic means so that platform workers may fully enjoy the trade union rights readily accessible to other workers.


Compel platforms to share anonymised journey data on an open-standards basis with the public data stores of the cities in which they operate. This should apply in defined major metropolitan areas where a single platform provides more than 10,000 passenger or delivery services per day, or 5,000 per day in regional cities with a population of less than 1 million people. Journey data should include anonymised location data, distance, duration, fare and pay for every journey. Platform employers and their high-volume business models have a significant negative impact on workers and their communities. Precarious employment and an over-supply of vehicles relative to demand spill over to exacerbate local problems of poverty, congestion and air pollution. Where local cities have obtained and published anonymised journey volume data for passenger transport and food delivery services, such information has empowered workers and communities to fight for fair pay, community development and a lower environmental impact.


Empower local authorities to limit the number of autonomous and human-operated vehicles platforms may deploy on our streets for passenger and delivery services. Local authorities should set capacity limits in line with what is reasonable for demand, fair pay for workers and the environmental carrying capacity of the local area. The on-demand platform business model for transport services, with its attendant unpaid working time and low rates of vehicle utilisation, is very inefficient. This is set to worsen with the arrival of autonomous vehicles. Currently, local licensing authorities have no clear powers to limit licensing capacity for autonomous or human-operated taxi and private hire services. Department for Transport data show that passenger journeys in England and in London have roughly halved per head of population over the last 20 years, while licensing of taxi and private hire vehicles has roughly doubled in the last decade. It's time to take back control and place limits on the unnecessary over-supply of high-volume passenger and food delivery vehicles on our streets.


Platform employers must be held accountable for violations of worker and data protection rights. Criminal prosecution, prison sentences and loss of licence provide the necessary deterrence. Platform employers found to be engaging in bogus self-employment arrangements, denying workers their rights and proper payment, or violating the data protection rights of workers must retroactively pay all due wages, damages, national insurance and VAT. They must also face unlimited fines per worker and, in common with other jurisdictions, company directors should personally face criminal prosecution with custodial sentences upon conviction. The government has been swift to criminalise the unlawful employment of undocumented workers but all too slow to enforce against and penalise wage theft and misclassification. As a result, the government is not only affording disproportionate leniency to employers, but is also missing out on huge amounts of uncollected national insurance and VAT. A new government must send a message to giant digital platforms that employment and data protection violations will not be tolerated.


Endorsed by:

[ADCU logo]