Last week I attended a workshop held by the Alan Turing Institute under the general theme of “Algorithm Society” – this proved really useful in crystallising some of the ideas I’d been having around the PhD, and I think it has brought me closer to having a defined topic. Having previously approached this area of study only from a “big data” perspective, I found the “algorithm society” concept, which looks at the incorporation of machine learning, “algorithmic” decision-making and automation into social processes, particularly useful in bringing together some of my thoughts around cybercrime and the PhD.

The workshop included presentations and discussion groups on various topics – although crime and surveillance were not focused on in their own right, they had undeniable relevance to many of the areas discussed, and I felt able to make useful contributions from a criminological perspective. The “work” group, which discussed the consequences of incorporating algorithmic and machine learning technologies into the labour market and the workplace, was particularly relevant, touching on issues of discrimination, “sorting” bias and the changing nature of work and social interaction. I was also interested in some of the discussion around how people “gamed” or subverted algorithmic systems – for example, Mechanical Turk workers forming groups to discuss how to get the best jobs, or businesses trying to artificially increase their standing on TripAdvisor.

Much of the “work” discussion had relevance from a criminological perspective – it split, broadly, into the effects of algorithmic/machine learning processes on the labour market and the incorporation of these processes into the work people do. The first strand, discussing how these technologies were being used to make decisions about hiring, allocating work and shaping labour processes, situated the human subject as “within” the algorithm, bound up in the social world which the algorithm sorted and shaped – whether that be by choosing which Uber drivers were selected for fares or by micromanaging and surveilling workflow in an Amazon shipping warehouse. There was also broader discussion of how this affected work and class on a macro scale, with the potential creation of an “algorithmic working class” of workers with little to no labour rights or capacity for communal organisation. Is this just an extension of managerialism, or a new “social order”?

In the second strand, algorithms were treated more as a tool, augmenting the labour of professional and skilled workers and removing the “grunt” or “bulk” elements of their work in order to reduce error or to allow them to focus on higher-order processes. This tied into an earlier discussion with Donald MacKenzie around how these systems affect where “power” is located in organisations. In many cases the algorithm did not make the “final decision”; rather, its role was in structuring and presenting information to a final decision-maker who could authorise action (or not). This had the effect of concentrating decision-making power in that individual, where previously the “grunt work” done by the algorithm would have been the product of a wider group of people who could influence elements of the decision chain.

I’m keen to write more on the workshop, but I’ll finish here for now with some potential questions which this poses for the PhD. I began my research with a broad “ANT and cybercrime” scope (ANT being actor-network theory), in particular reacting to the existing theoretical literature on cybercrime and proposing that ANT might provide a starting point for investigating the role and importance of non-human actors in cybercrime. One of the conceptual problems with this was the bracketing-off of “cybercrime” as a phenomenon in its own right – it is a very broad and nebulous term which encompasses a lot of very different phenomena. In some sense, any crime committed in a high-tech society will have some “cyber” element, so it might be more useful to look at a particular novel phenomenon associated with the rise of high-tech infrastructure in late-modern social spaces. “Non-human technological actors” is itself a broad and non-homogeneous group; however, this does suggest a potential phenomenon – the automation of social and human processes and the insertion of non-human “algorithmic” or “machine learning” actors into decision-making processes. As this pertains to “cybercrime”, one of the most obvious applications of this kind of technology is in the incorporation and analysis of massive information flows in surveillance and policing.

Distilling this down into some bullet-pointed research questions:

  • How does the presence of “algorithmic” intermediaries in the decision-making chain affect the work of surveillance and policing? What effects does this have on the experience of those making use of these systems? To what extent is this a process of “automation”?
  • How do these algorithms work and develop, and what are the consequences for justice and surveillance? How do they learn/encode values and norms in their sorting and ranking processes, and are there any unintended consequences? Are political or organisational decisions important for the function of these algorithms in their social/work context? If a machine learning algorithm can end up a “black box” whose operation is difficult or impossible to understand, even for its creators, what are the processes for accountability?
  • What are the consequences of these systems interacting with whole populations on a “databody” or “dataperson” level? Is there a “social sorting” effect?
  • How do people subvert these algorithms? Identity management? Malicious “tricking” of the algorithms to increase the risk scores of a target? “Air gap” work? “Systemic” subversion/attack using botnets, DDoS, etc.? How does this affect the day-to-day use of the internet (and broader social interaction) by people who practise “deviant” behaviours?
  • How does this interact with the increasing automation of many types of cybercrime?

Current reading: various research papers, and Surveillance as Social Sorting, edited by David Lyon.

Fiction: Just finished The Good Terrorist by Doris Lessing and now on the excellent Embed with Games by Cara Ellison.
