The Fairwork project uses three methods to measure the fairness of working conditions on digital labour platforms: desk research, worker interviews and surveys, and interviews with platform management. Through these three methods, we seek evidence on whether platforms act in accordance with the five Fairwork Principles.
We recognise that not all platforms use a business model that allows them to impose contractual terms on service users and/or workers in a way that meets the thresholds of the Fairwork Principles. However, all platforms have the ability to influence how users interact on the platform. For platforms that do not set the terms on which workers are retained by service users, we therefore look at a number of other factors, including published policies and/or procedures, public statements, and website/app functionality, to establish whether the platform has taken appropriate steps to meet the criteria for a point to be awarded under the relevant principle.
In the case of a location-based platform, we seek evidence of compliance with our Fairwork Principles for location-based platform work; in the case of a cloudwork platform, with our Fairwork Principles for cloudwork (online platform work).
Each annual Fairwork ratings cycle starts with desk research to map the range of platforms to be scored, identify points of contact with management, develop suitable interview guides and survey instruments, and design recruitment strategies to access workers. For each platform, we also gather and analyse a wide range of documents, including contracts, terms and conditions, and published policies and procedures, as well as digital interfaces and website/app functionality. Desk research also surfaces publicly available information that could assist us in scoring platforms, for instance the provision of particular services to workers, or the existence of past or ongoing disputes.
The second method involves approaching platforms for evidence. We interview platform managers and request evidence for each of the Fairwork Principles. This provides insights into the operation and business model of the platform, while also opening up a dialogue through which the platform could agree to implement improvements to working conditions. In cases where platform managers neither agree to interviews nor share evidence with us, we base our scoring on evidence obtained through desk research and worker interviews.
Worker Surveys and Interviews
The third method involves collecting data directly from platform workers. This allows us to see workers’ contracts and learn about platform policies and practices that pertain to their working conditions. Workers who participate in our research are fairly compensated for their time and efforts, and their survey and interview responses are kept entirely confidential.
Gig Work Platforms: We employ a diverse worker recruitment strategy that incorporates both on- and off-platform methods. We recruit workers by hiring their services via the platform (e.g., by ordering an Uber), approaching workers at known meeting points (e.g., spots where delivery couriers congregate), through social media sites and online forums (e.g., Facebook, WhatsApp or Reddit), and through snowball sampling (where workers we interview refer us to their colleagues). We interview 6-10 workers on each platform. These interviews do not aim to build a representative sample; instead, they seek to understand the processes of work and the ways in which it is carried out and managed. In other words, data from workers help us understand whether problems exist, not how prevalent those problems are. Worker interviews also help us understand how platform policies work in practice (for which representative samples are not needed).
Cloudwork Platforms: We employ on- and off-platform recruitment methods here as well. Our approach varies depending on how the platform is organised and how much information about workers is visible on it. Sometimes we invite workers to participate by listing our interviews and surveys on the platform as a ‘job’. In addition to recruiting workers via the platform, we approach workers via social media sites and cloudworker forums, and use snowball sampling. We aim to survey at least 10 workers on each continent in which a platform operates. In this manner, we seek to develop a purposive sample that is attentive to the types of tasks workers do, workers’ physical locations, and how highly ranked a worker’s profile is on the platform. Accordingly, we selectively filter the survey responses and interviews to construct a suitable sample. This filtering process is one way we protect the identity of workers who take part in our research and ensure they cannot be linked or traced through the final Fairwork scores that platforms receive.
From Data to Ratings
This threefold methodological approach allows us to cross-check the claims made by platform management, while also providing the opportunity to collect evidence from multiple sources. Final scores are decided collectively by the Fairwork team on the basis of all three forms of information gathering. The scores are decided through a rigorous peer review process that involves the core team of researchers who planned and executed the research (e.g. the Fairwork Germany team in the case of the German platform ratings), the central Fairwork team based in Oxford, and two reviewers from other Fairwork country teams. This ensures consistency and scientific rigour in the scoring process. Points are only awarded where clear evidence exists for each threshold.