ECPR

In the Wake of YouTube’s ‘Ad-Pocalypse’: Biased Algorithms and Securitisation

Governance
Policy Analysis
Critical Theory
Internet
Social Media
Technology
Bronwyn Miller
University of New South Wales

Abstract

This paper examines the algorithmic practices of YouTube after ‘Ad-pocalypse’ through a Data Justice lens, evaluating the potential threats to social justice underlying the reactive changes made to Google’s policies and data management. Based on my Master’s thesis research, it examines changes in YouTube’s algorithmic forms of governmentality and their impact on user (in)equality. Overall, this research examines two issues: firstly, how the withdrawal of advertisers in 2017 changed YouTube’s distribution of information and capital by significantly strengthening mechanisms of control for both advertisers and YouTube’s artificial intelligence content identifiers (Google Brain), and the equality and fairness of these policy changes; and, secondly, how YouTube’s response to ‘Ad-pocalypse’ by participating in the ‘fight against terrorism’ highlights the platform’s global forms of premediative securitization in its regulation of a biopolitical body. This research uses an interdisciplinary methodology, combining aspects of multi-modality (Benson, 2017; Kress, 2010; Jewitt, 2009; Kress & Van Leeuwen, 2001) and critical analysis, and using constellation analysis (Ohlhorst & Schön, 2015) to visualize the relations between these modes.

Ethical data management is becoming increasingly vital to the maintenance and promotion of a democratically principled global society, particularly within the context of the declining popularity of traditional media. Consequently, as platforms move from systems of governance to systems of control, their continual examination is increasingly important. Under a system of governance, a user whose actions violated policies or guidelines could have their content or account removed from the website/platform; under an online system of control, by contrast, the ability to act in a manner contrary to a platform’s wishes is removed entirely.
This shift towards control systems has a profound impact on a user’s ability to speak freely, or even to refute or critique a platform’s data management practices. YouTube/Google’s crucial role in people’s access to information means that its algorithm-driven decisions have already become a naturalized form of organizing and displaying information. This paper highlights how intervening in this naturalization is critical, as Google’s automated categorization of users directly impacts people’s ability to participate in society, some more than others; it also examines the algorithmic bias that promotes inequality and intensifies surveillance among specific demographics.