By Fabian Ferrari and Mark Graham
Algorithms have become actors in society that shape how people live, love, and work. They direct and redirect economic processes, affect high school grades and college admissions, and influence how people get jobs and how those jobs are managed. Algorithms, in other words, exert and mediate power, enabling and constraining social action in myriad realms.
In this new paper published in Cultural Studies, Fairwork researchers Fabian Ferrari and Mark Graham contend that digital labour platforms are essential objects of analysis to grasp the contested nature of algorithmic power. Platforms utilise a double articulation of algorithmic power to govern spatially dispersed workforces in both material and discursive ways. However, algorithms do not have hegemonic outcomes, and they do not entirely strip away agency from workers.
Through manipulation, subversion, and disruption, workers bring fissures in algorithmic power into being. In the paper, the authors define fissures in algorithmic power as moments in which algorithms do not govern as intended. While these moments do not always result in positive outcomes for workers, they show that algorithmic power is only ever partial.
The paper is part of a special issue of Cultural Studies on Infrastructural Politics, which was initiated by Blake Hallinan and James N. Gilmore in 2018. As they put it, “the issue is animated by a commitment shared among founding figures of Cultural Studies, activists, and abolitionists: the capacity to critically engage infrastructure in order to improve the lived conditions of culture.” The Fairwork project, in its work to understand and counter some of the forces that disempower gig workers across the world, shares this commitment.
Algorithms shape our societies, mould our politics, and direct our economies. As they are infused into the governance of ever more life practices, we need to ensure that accountability, transparency, and user participation are built into those systems through regulation. Until then, it is worth remembering that the fissures described in this paper show that the power algorithms exert and mediate is far from hegemonic. Fissures, and the creative, playful, and powerful forms of resistance that bring them into being, rarely render algorithms impotent, but they do open up possibilities for agency, freedom, and control to be exerted beyond their reach.
Read the paper (“Fissures in Algorithmic Power: Platforms, Code, and Contestation”) in Cultural Studies, published by Taylor & Francis.
Read the open access version. (This is an Accepted Manuscript of an article published by Taylor & Francis in Cultural Studies. It is deposited under the terms of the Creative Commons Attribution-NonCommercial Licence (CC BY-NC), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited.)