
Data Intermediation: Towards transparent public automated media


I have just completed a whirlwind European tour, giving lectures at some of the best media institutes this side of the planet. Thanks to all the folks who made this possible and took the time to promote my work. In this post I’d like to reflect on that work and the discussions I’ve had with many great people, as I prepare this thinking for my next book – namely: who should be facilitating and innovating transparent automated media systems? I argue it should be Public Service Media (PSM).

The thrust of this latest research was to problematise the concept of the ‘black box’, which so many scholars have framed as something we cannot see into and are almost helpless to control.

I think some of the most important work in this space was undertaken by Frank Pasquale in his book The Black Box Society, which highlights the role algorithms play in society from finance, legal and economic perspectives. His account of how algorithms control not only finance, but our digital lives, is a call for increased transparency and accountability from those who facilitate these technologies.

I also appreciate the work of the many scholars who contribute to and develop this arena of scholarship. Safiya Noble has done amazing work here, and her book Algorithms of Oppression is a landmark piece of scholarship that brings to bear the real-world implications of how algorithms are not only biased, but racist and oppressive.

Noble’s book leads well into Taina Bucher’s equally groundbreaking book If…Then, which further develops the politics of algorithms and automated systems and offers media academics a framework to help think through some of the implications of these socio-technical constructs.

I also find Christian Sandvig’s work incredibly inspiring here. While Sandvig’s broader work on algorithms and discrimination is super interesting, his piece on Auditing Algorithms is what sparked my interest in how to research algorithms.

But what I have found through most of this literature are two things, and this is perhaps where my ‘application brain’ is most curious. Firstly, most scholars tend to ignore user agency in these relationships, as if we are helplessly at the mercy of mathematical equations that are determining our society. Many people are aware of the algorithm these days, and know how to work alongside it, if our interactions with platforms like Netflix, Spotify and YouTube are anything to go by. Secondly, no one talks about who should be responsible for facilitating a better system. Should we simply make more policy that tries to rein in today’s dominant digital tech companies, or should we be thinking five to ten years ahead about how that technology can be used for society’s benefit (and not in a Chinese Social Credit System sense, either)?

So that is what I have been talking about over the last few weeks, and I think it is really important to include in the automated media conversation. I have been developing a digital intermediation framework that incorporates a number of these actors, and I am trying to understand how the intermediation process occurs. Check this out:

[Figure: Digital intermediation actors, as part of the intermediation process]

This is a first pass at what will become an important tool for the facilitating organisation that should be leading and innovating in this space: public service media.

Work has already commenced in this space, and we can draw on the thoughts of Bodó et al. (2019):

Public service media have charters that oblige them to educate, inform, and sustain social cohesion, and an ongoing challenge for public service media is interpreting their mission in the light of the contemporary societal and technological context. The performance metrics by which these organizations measure the success of their algorithmic recommendations will reflect these particular goals, namely profitability, loyalty, trust, or social cohesion.

Bodó, B., Helberger, N., Eskens, S., & Möller, J. (2019). Interested in Diversity. Digital Journalism, 7(2), 206–229.
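
To make that concrete, here is a minimal sketch in Python – my own illustration, not anything from the Bodó et al. paper – of how a PSM-style success metric could blend ordinary recommendation accuracy with exposure diversity. Every function name, topic label and weight here is a hypothetical stand-in for whatever a charter would actually specify.

```python
import math
from collections import Counter

def exposure_diversity(slate, topic_of):
    """Normalised Shannon entropy of topics in a recommendation slate:
    1.0 when topics are evenly represented, 0.0 when only one appears."""
    counts = Counter(topic_of[item] for item in slate)
    if len(counts) <= 1:
        return 0.0
    n = sum(counts.values())
    entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
    return entropy / math.log(len(counts))

def psm_success_score(accuracy, slate, topic_of, diversity_weight=0.4):
    """Blend click-based accuracy with exposure diversity.
    diversity_weight is the organisational choice the quote points to:
    a public service charter would set it higher than a purely
    commercial service might."""
    return ((1 - diversity_weight) * accuracy
            + diversity_weight * exposure_diversity(slate, topic_of))

# A hypothetical four-item slate spanning three topics.
topic_of = {"a1": "politics", "a2": "politics", "a3": "sport", "a4": "culture"}
print(psm_success_score(0.72, ["a1", "a2", "a3", "a4"], topic_of))
```

The point is not this particular formula, but that the weighting becomes an explicit, inspectable organisational decision rather than an unstated by-product of optimising for engagement.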

So then, how does PSM do this? One way is to embed algorithmic transparency in its editorial policies, to ensure PSM employees operate accordingly. Another is to take up the PSM innovation remit and start teaching users how to work with algorithms effectively.

I don’t think ‘cracking open the black box’ is all that useful to operationalise. These systems are often complex algorithmic formulas that require specialist expertise to design and interpret. But affording a control mechanism that enables users to ‘tweak’ how the algorithm performs may be not only possible, but crucial.
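
As a sketch of what that ‘tweak’ could look like, assume a recommender exposes a single re-ranking slider to the user. The names and scores below are hypothetical, not any platform’s actual API.

```python
def rerank(candidates, exploration=0.0):
    """Re-rank candidates with a user-controlled 'exploration' slider.

    candidates: list of (item_id, personal_score, novelty_score),
                both scores assumed pre-computed in [0, 1].
    exploration: 0.0 ranks purely on predicted personal relevance;
                 1.0 ranks purely on novelty.
    """
    def blended(item):
        _, personal, novelty = item
        return (1 - exploration) * personal + exploration * novelty
    return sorted(candidates, key=blended, reverse=True)

# Nudging the slider visibly changes the slate the user sees.
candidates = [("doc1", 0.9, 0.1), ("doc2", 0.6, 0.8), ("doc3", 0.4, 0.95)]
print([i for i, _, _ in rerank(candidates, exploration=0.0)])  # ['doc1', 'doc2', 'doc3']
print([i for i, _, _ in rerank(candidates, exploration=0.8)])  # ['doc3', 'doc2', 'doc1']
```

What matters here is where the parameter lives: in the user’s hands, not buried in the platform’s objective function – exactly the kind of agency most of the black-box literature leaves out.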

This is my focus for the last few weeks of my time as a Research Fellow here in Hamburg.