Research Forum 15 May: Scott Wark (CIM), Correlating Race: On Machine Learning and Digital Culture, 12.15-13.45 in Social Sciences S1.50
Abstract:
To treat race as technology, as Wendy H. K. Chun suggests, is to acknowledge that race is something that ‘one uses, even as one is used by it’. To use and to be used in turn neatly describes our relationship with distributed online services, whose free use comes at the cost of the data we produce. In this presentation, I want to draw on recent theories of media and race to explore the role that race plays as category in this unequal exchange by focusing on machine learning.
Machine learning algorithms are inductive: rather than processing data according to a pre-created ‘recipe’, they develop their parameters out of the data sets on which they’re ‘trained’. This process of training is probabilistic and it can induce categories—including race. Machine learning doesn’t use such categories to directly exclude. Instead, the social reality of inhabiting such categories gets translated—through extant training data—into correlations that create exclusions indirectly.
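To make this indirect, correlational mechanism concrete, here is a minimal sketch in Python (not an example from the talk: the lending scenario, the 'postcode' proxy feature, and the logistic-regression model are all hypothetical). A protected attribute is withheld from the model entirely, yet a correlated proxy in the training data lets the classifier reproduce the historical exclusion.

```python
# Hypothetical sketch: a classifier trained on synthetic historical lending
# decisions. The protected attribute 'group' is never given to the model,
# but a correlated proxy feature ('postcode') lets it reconstruct and
# reproduce the historical pattern of exclusion.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Protected attribute (withheld from the model): 0 or 1.
group = rng.integers(0, 2, size=n)

# Proxy feature: postcode cluster matches group membership 80% of the time.
postcode = np.where(rng.random(n) < 0.8, group, 1 - group)

# A genuinely relevant feature, independent of group.
income = rng.normal(loc=50, scale=10, size=n)

# Historical outcomes encode past exclusion: group 1 was approved less
# often at the same income level.
logit = 0.08 * (income - 50) - 1.5 * group
approved = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Train only on ostensibly neutral features: income and the postcode proxy.
X = np.column_stack([income, postcode])
model = LogisticRegression().fit(X, approved)

# The model never saw 'group', yet its predicted approval rates diverge by
# group, because the proxy carries the historical correlation.
rates = [model.predict(X)[group == g].mean() for g in (0, 1)]
print("Learned weights (income, postcode):", model.coef_[0])
print("Predicted approval rate by withheld group:", rates)
```

The sketch is only illustrative: the exclusion re-emerges as a learned correlation on the proxy, not as an explicit racial rule.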
This presentation will discuss two issues. First, drawing on the work of other critics, I want to outline how machine learning lends historical racism a veneer of objectivity by using past exclusions to validate future ones. Second, and more speculatively, I want to argue that machine learning uses data to shift the kind of work that race can be made to do, as category and in fact.
By replacing distinctions—like ‘white/other’—with inductive categories, machine learning makes the category of race something to be correlated rather than something that’s borne. It revitalises race’s existence as category and technique. It puts race to new uses that are, I want to argue, crucial sites of digital cultural production and algorithmic politics. Asking how race is used as an inductive category is essential to understanding its role as contemporary algorithmic technology and as opaque instrument of control. Or: how it might be put to better uses.