Tag Archive for: digital intermediation

Recently, I was invited to deliver a keynote for a joint session with the News and Media Research Centre and the Centre for Deliberative Democracy to explore the ideas and concepts of digital intermediation.

The blurb:

How might generative artificial intelligence (AI) and automation be harnessed to produce social good? In an increasingly automated digital media world, user agency is challenged through the loss of interaction functionality on the platforms, technologies and interfaces of everyday digital media use. Instead, algorithmically designed decision-making processes help users make sense of these environments, assisting them to seek out content that is relevant, interesting and entertaining. However, if the last five years are anything to go by, these sorts of recommendations, particularly across social media, have produced anything but social cohesion and unity amongst users, and have instead spread misinformation, vitriol and hurtful media. Would our society be different had we designed systems that, while still entertaining, place the wellbeing of humans ahead of content that is, for the most part, merely popular?

This presentation uses the lens of digital intermediation to explore how civic algorithms might be designed and implemented in digital spaces to improve social cohesion. By unpacking the technologies, institutions and automation surrounding the cultural production practices of digital intermediation, it becomes clearer how these levers can be adjusted to nudge and encourage platforms, users and content creators to engage in improved civic processes. As a digital intermediation challenge, creating and working with civic algorithms presents a potentially useful approach towards improving the cornerstone of our democracies by ensuring citizens have access to accurate information, are engaging in the discussions that are important and relevant to them, and are operating within digital environments that value social good alongside commercial gains.

And here’s the recording of the session, slides included:

It is with great pleasure I can share the publication of my new book, Digital Intermediation: Unseen Infrastructure for Cultural Production.

https://www.taylorfrancis.com/books/mono/10.4324/9781003177388/digital-intermediation-jonathon-hutchinson

This book offers a new framework for understanding content creation and distribution across automated media platforms – a new mediatisation process. The book draws on three years of empirical and theoretical research to carefully identify and describe a number of unseen digital infrastructures that contribute to predictive media (algorithmic platforms) within the media production process: digital intermediation. The empirical field data are drawn from several international sites, including Los Angeles, San Francisco, Portland, London, Amsterdam, Munich, Berlin, Hamburg, Sydney and Cartagena. By highlighting the automated content production and distribution process, the book responds to a number of regulatory debates emerging around the societal impact of platformisation. It describes and highlights the importance of key developments that help shape the production and distribution of content, including micro-platformization and digital first personalities. The book explains how digital agencies and multichannel networks use platforms strategically to increase exposure for the talent they manage, while providing inside access to the processes and requirements of developers who create algorithms for platforms. The findings in this book provide key recommendations for policy makers working on digital media platforms, based on the everyday operation of content production and consumption within automated media environments. Finally, this book highlights user agency as a strategy for consumers who seek information on automated social media content distribution platforms.

As with all new publications, Routledge have provided a 20% discount for all purchases – please use code AFL03.

Also, a series of book launches is underway from August through to October in Australia, so I am looking forward to seeing those who can travel to the following locations:

  • 9 August – News and Media Research Centre, University of Canberra
  • 20 September – Digital Media and Research Centre, Queensland University of Technology
  • 27 September – AI Governance and Trust in Digital Societies, University of Sydney
  • 19 October – RMIT University
Public media and automation

I’m super happy to announce a book chapter, co-authored with my colleague Jannick Sørensen, in The Values of Public Service Media in the Internet Society. Our chapter is titled Can Automated Strategies Work for PSM in a Network Society? Engaging Digital Intermediation for Informed Citizenry.

2020 was a tough year for everyone, all round. It was also tough in the research output space as reviews slowed, research focus was redirected, conferences stopped, and the overall productivity of our research space ground to a crawl – at times driven by an increased demand on our skills in the teaching space.

What I think we will see is a slowing of research output in the next few years as we all took a hit in research access, fieldwork and overall ability to keep researching during 2020. But it is nice to see colleagues still publishing for the moment and getting back on track in 2021.

One of those outputs for 2021 is our co-authored chapter that explores the role automation plays in public service media. To approach this we have used the lens of digital intermediation to understand how user visibility plays into the increasing use of automation within public service media.

As always, please get in touch if you have issues with access to the book chapter.


We performed our academic FIFO (Fly In Fly Out – thanks for the insights here Jolynna) duties recently at the first University of Sydney and Hong Kong University symposium, expertly crafted by Professor Heather Horst and Dr Tom McDonald.

During the one-day symposium, all researchers were asked to respond to the somewhat broad theme of cross-border media flows and social imaginaries – thinking through these two areas is a lovely way to bring sociology and media studies (communication, if you will) together:

Media of various forms, and the infrastructures and communities that are associated with them, have often been strongly determined by national boundaries. This is particularly the case in different countries dispersed across the Asia-Pacific region, where media organisations are often owned by government entities and/or large companies. Such media organisations also frequently have political or commercial roles that, arguably, make them less susceptible to the kinds of disruption that have been witnessed by their European and American counterparts in recent years. At the same time, the movement of people, goods, capital, information and ideas are undergoing shifts and intensifications, owing to broader geopolitical changes, state-led infrastructure projects and the aspirations of individuals and communities shaped by such regional transformations.
 
Against this context, media flows are being created, worked and reworked, facilitated by new infrastructures, imaginaries and understandings. These flows frequently cross, circumvent or come up against borders, both domestic and international. For instance, countries such as China and the US increasingly compete to export infrastructures across the region through the promotion of platforms, technologies and services. Online shopping, logistics, blockchain and fin-tech are fostering new cross-border flows of goods and money. Media content is increasingly consumed internationally, posing new opportunities and challenges for media companies, regulators and governments. Users and consumers of the media are also witnessing the reworking of their media environments because of these changes, and are adopting inventive responses to and adaptations of the media in return.
 
This symposium, and the planned journal special issue that will result from it, explores these changing circuits of media in the Asia Pacific region. We ask contributors to consider: How are media flows redefining understandings of borders? What kinds of novel communities are being created by cross-border media flows? What forms of social imaginaries accompany the emergence of new infrastructures from “outside”? How are boundaries and borders being made, unmade or remade within and across the Asia-Pacific region?

Personally, it was a unique opportunity to apply my recent thinking around digital intermediation to the concept of social imaginaries, to understand how geopolitical borders are constructed, deconstructed, enforced and reimagined – and there is no better place in the world than Hong Kong to do that sort of thinking.

If you are interested in the research I have started in this space, you can access my presentation here:

But enough about me, the better work was all around! Here are some notes and reflections from the research presented:

Sylvia Martin – Imagin(eer)ing peace: Simulations and the state

  • Holograms and military uses of them
  • USC and Shoah Foundation
  • Hologram shown in front of young students and they ask him questions
  • Filmed in a multi-camera environment
  • Statistical classifier to find the best answer to the questions
  • The Girl and the Picture
  • IBM Watson to do the classifier for the woman filmed in The Girl and the Picture
  • What enables the production of survivors who have crossed the borders?
  • There is a close connection between the state and industry – building larger goals into the process
  • There are a number of agencies involved in this process
  • Leads to the ‘Imagineering’ of content – this is the link to the hologram
  • The industry in Hollywood has shifted to military content –
  • The emergence of the Silicon Beach – the increase of tech etc in Venice Beach
  • Institute for Creative Technologies (ICT) – military, academia and entertainment

Joyce Nip – Friends and foes: China’s connections and disconnections in the Twitter sphere

  • While much of the social media is blocked, “foreign hostile networks taking over the regions”
  • @XHNews – one of these ‘blocked’ Chinese Twitter accounts
  • CGTN, SCMP, Xinhua News
  • Looking at #SouthChinaSea
  • Interestingly @XHNews have set the frames around “Aircraft Carrier”
  • There may not be artificial warfare, but there are other computational forces at work
  • Hub account – I think this means accounts with large betweenness centrality
  • @9DashLine and @AsiaMTI758 are the most retweeted accounts
  • What is the correlation to the US based news services then picking up the ‘new’ framing of the events?
  • Hub accounts are super important
  • So are Russians more interested in global news than other countries?

 Heather Horst – From Kai Viti to Kai Chica: Debating Chinese influence in Fiji

  • Chinese aid has been welcomed in Fiji, in anticipation of APEC 2018
  • Cable net offer from Oz around the islands, to ward off Chinese influence
  • Strong connection with the last coups between China and Fiji
  • Fiji states it is a relationship, not influence
  • The 28 WG Friendship Plaza building has difficult Chinese/Fiji relations
  • First instance of fake news in Fiji – China will take the island of Kadavu to recover the $500m debt
  • Fiji has an informal censorship process in its media system
  • The Wikipedia page has been adjusted to say a ‘Province of China’ but was changed back ‘quickly’
  • Oz support is participatory government (aid cultures), Chinese support has been infrastructure
  • Often
  • A common thread between all papers of influence through infrastructures and countries?
  • What is the broader impact of social media on the Chinese influence?

Discussion

‘Great Power Rivalry’ – some nation states are more important than others. This prompts the question: what are we missing? What if you don’t have a ‘state’ formed around you? The Jewish context and the Chinese massacres contexts. Non-state actors (not ISIS, but the anarchist forms).

China is not one – there are a number of Chinas (Mainland, New Territories, Hong Kong)

 Bunty Avieson – Minority language Wikipedias for cultural resilience

  • Privilege has moved online, through connected communication
  • Cognitive justice – beyond tolerance is something that we need
  • Localised knowledge practices contribute to cultural production – this is a form of resilience
  • Pharmakon – a cure and a killer
  • Wikipedia paints one aspect of the unity of users, knowledge,
  • Wikipedia is drawing information from Wikipedia
  • Anyone can edit is a myth – Wikipedians are white, global north, Christian, under 30, technically competent
  • Oral cultures – only 7% have been written down
  • Positional superiority (Said), long tail of colonialism

Tom McDonald – One Country, two payment systems: Cross-border digital money transactions between Hong Kong and Mainland China

  • WeChat Advertising campaign that rolled out across Hong Kong during the time of protest
  • Immigration has increased significantly during this period
  • One country/two systems – the border remains constant
  • There is a focus to engage communication technologies to secure the future
  • 2016 the Monetary Authority gave the right to five operators to launch digital wallets (Alipay, WeChat, Octopus, OlePay, TapnGo)
  • Users are using WeChat and/or Alipay to transfer funds and then purchase things for cheaper (better rates) in Hong Kong
  • WeChat groups are emerging for money transfer

Discussion

  • Culture is always changing, cultural dynamism is a better term
  • More explanation of microplatformization, and digital intermediation
  • Can oral Wikipedia help solve the Bhutan problem?

Jolynna Sinanan – Mobile media and mobile livelihoods in Queensland’s coal mining industry

  • What access do miners have when away from home?
  • Three areas of contestation: they are not allowed to have mobiles while working, they are often in remote areas with low coverage, and connection to home is no one’s responsibility
  • mobilities and families – digital media characterized by mobilities
  • Literature says: Digital media is how families do everything together, this is how users make sense of each other and their context while they are apart from each other
  • Social transformations are under-developed
  • How mobilities make sense through ‘work’ and ‘home’
  • Drops ‘cashed-up bogan’ as a term to describe the impact of the stress on the workers
  • FIFO Life as a producer of memes
  • How is this different to pilots? They fly in and out and have similar digital media tools, but are vastly different in how they interact with their families

Tian Xiaoli – No escape: WeChat and reinforcing power hierarchy in Chinese workplaces

  • WeChat users often think about superiority online – who is senior? Who is younger? This is reflective of offline lives
  • Hierarchy and behaviour studies as a background for the workplace

Jack Linchuan Qiu (Chung Minglun & Pun Ngai) – The effects of digital media upon labor knowledge and attitudes: A study of Chinese vocational-school students

  • School students from poorer backgrounds – being trained for vocational jobs (blue collar)
  • Effects study on the rights
  • The border between social classes
  • A study on human capital (Becker, 1964) – the internet economy, the knowledge economy,
  • Is the schooling process outdated or distracting, or does it add to the education process?
  • Passive use of internet versus active use (net potato (Kaye, 1998))
  • A process that leads to individualistic usage (Ito), hyper-individualistic
  •  Village well (Arora, 2019)
  • Increased consumerist activity does not necessarily relate to decreased labour subjectivity
  • Media literacy encourages reflective thinking
  • Is consumerist worry an elitist position?
  • What is the labour subjectivity if the user is Reflective/individualistic? for example

Tommy Tse – Dream, dream, dream: The interwoven national, organisational, and individual goals of workers in China’s technology sector

  • Sociology pays more attention to the practice beyond the theoretical
  • Cultural practices and how they play out in labour practices
  • Chinese dream versus Alibaba Dream versus individual dream

I have just completed a whirlwind European tour, giving lectures at some of the best media institutes this side of the planet. Thanks to all the folk who made this possible, and took the time to promote my work. I’d like to reflect on that work and the discussions I’ve had with many great people within this post as I prepare this thinking for my next book – namely, who should be facilitating and innovating transparent automated media systems? I argue Public Service Media (PSM).

The thrust of this latest research was to problematise the concept of the ‘black box’, which so many scholars have argued is something we have no control over and are almost helpless against.

I think some of the most important work in this space was undertaken by Frank Pasquale and his book The Black Box Society, which highlights the role algorithms play in society from finance, legal and economic perspectives. His argument about how algorithms control not only finance, but our digital lives, is a call for increased transparency and accountability for those who facilitate these technologies.

I also appreciate the work of many scholars who contribute to and develop this arena of scholarship. Safiya Noble has done amazing work here, and her book Algorithms of Oppression is a landmark piece of scholarship that brings to bear the real-world implications of how algorithms are not only biased, but racist and oppressive.

Noble’s book leads well into Taina Bucher’s also groundbreaking book If…Then, which further develops the politics of algorithms and automated systems to offer media academics a framework for thinking through some of the implications of these socio-technical constructs.

I also find Christian Sandvig’s work incredibly inspiring here. While Sandvig’s work on algorithms and discrimination is super interesting, this particular piece on Auditing Algorithms sparked a particular interest in me on how to research algorithms.

But what I have found through most of this literature are two things, and this is perhaps where my ‘application brain’ is most curious. Firstly, most scholars tend to ignore user agency in these relationships, as if we are helplessly at the mercy of mathematical equations that are determining our society. Most (some) people are aware of the algorithm, and how to work alongside it these days, if our interface with platforms like Netflix, Spotify and YouTube is anything to go by. Secondly, no one talks of who should be responsible for facilitating a better system. Should we simply make more policy that tries to tame the overlord digital tech companies of now, or should we be thinking five to ten years ahead about how that technology can be used for society’s benefit (and not in a Chinese Social Credit System sense, either)?

So that is what I have been talking about in the last few weeks, and I think it is really important to include in the automated media conversation. I have been developing a digital intermediation framework that incorporates a number of these actors, and trying to understand how the intermediation process occurs. Check this out:

Digital intermediation actors, as part of the intermediation process

This is a first pass at what will become an important tool for a facilitating organisation that should be leading and innovating in this space: public service media.

Work has already commenced in this space, and we can draw on the thoughts of Bodó et al. (2019):

Public service media have charters that oblige them to educate, inform, and sustain social cohesion, and an ongoing challenge for public service media is interpreting their mission in the light of the contemporary societal and technological context. The performance metrics by which these organizations measure the success of their algorithmic recommendations will reflect these particular goals, namely profitability, loyalty, trust, or social cohesion.

Bodó, B., Helberger, N., Eskens, S., & Möller, J. (2019). Interested in Diversity. Digital Journalism, 7(2), 206–229.

So then, how does PSM do this? One way is to embed it in editorial policies to ensure PSM employees are operating accordingly. Another is to act on the PSM innovation remit and start teaching users how to work with algorithms effectively.

I don’t think ‘cracking open the black box’ is all that useful to operationalise. These are often complex algorithmic formulas that require specialist expertise to design and interpret. But affording a control mechanism that enables users to ‘tweak’ how the algorithm performs may be not only possible, but crucial.
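To make that concrete, here is a minimal sketch (entirely hypothetical, not any broadcaster’s actual system) of what a user-facing ‘tweak’ could look like: the recommender’s internals stay closed, but the audience is handed a small set of named dials that re-weight what surfaces. The item attributes, weights and catalogue are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    relevance: float    # how closely the item matches the user's viewing history (0-1)
    diversity: float    # how far it sits from what they usually consume (0-1)
    civic_value: float  # an editorially assigned public-interest weighting (0-1)

# User-adjustable dials, e.g. exposed as sliders in the interface.
user_weights = {"relevance": 0.5, "diversity": 0.3, "civic_value": 0.2}

def score(item: Item, w: dict) -> float:
    """Combine the signals using the user's own weighting rather than the platform's."""
    return (w["relevance"] * item.relevance
            + w["diversity"] * item.diversity
            + w["civic_value"] * item.civic_value)

catalogue = [
    Item("Local council budget explainer", relevance=0.4, diversity=0.7, civic_value=0.9),
    Item("Celebrity shopping haul", relevance=0.9, diversity=0.1, civic_value=0.1),
    Item("Regional election debate", relevance=0.5, diversity=0.6, civic_value=0.8),
]

# Re-ranking is immediate and legible: nudge 'civic_value' up and watch the feed change.
for item in sorted(catalogue, key=lambda i: score(i, user_weights), reverse=True):
    print(round(score(item, user_weights), 2), item.title)
```

The point is not the arithmetic; it is that the dials are visible and adjustable, so a user does not need to decode the underlying model in order to change their relationship with it.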

This is my focus for my last few weeks while I am working as a Research Fellow here in Hamburg.


The following passage is a thought moment, and by no means exhaustive in placing the idea within existing theories/fields. It would be interesting, and probably the published version of this will do so, to align it with media and cultural studies, queer theory or perhaps discrimination studies. That said, here is a thought process…

I have been undertaking substantial research into artificial intelligence (AI) and automation since arriving here at Hans Bredow. I am beginning to think that perhaps automation/AI isn’t the best or most appropriate way to frame our contemporary media lives. Those concepts certainly are a part of our media lives, but there may be a better way to describe the entire environment or ecosystem as I have previously written.

What I do understand at this point is that media curation/recommendation is responding to us as humans, but we are also responding to how that technology is responding and adapting to us. This is a human/technology relationship, and one that is constantly being refined, modified, adapted and changed – not by either agent alone, but collectively as any two agents would negotiate a relationship.

This type of framing, then, suggests we should no longer be thinking about algorithmic media, or automated media alone. Perhaps what we should be thinking about is the relationship of processed and calculated digital media with its consumers – for this I will use the term predictive media.

I will attempt to explain how I have arrived at predictive media.

Artificial Intelligence (AI) Media

Media certainly isn’t in an AI moment – I’m not entirely sure I align with AI to be honest (or at least I am still working through the science/concept and implications). Beyond its actual meaning, it feels as though it is the new business catchphrase – “and put some AI in there with our big data and machine learning things”. If artificial intelligence is based on machine learning, the machine requires three phases of data processing: to interpret external data, to learn from those data, and to achieve a specific goal from those learnings. This implies that the machine has the capacity for cognitive processing, much like a human brain.

AI is completely reliant on data processing: producing a baseline, incorporating constant feedback data after decisions have been made, and recalculating information to continue to improve its understanding of the data. Often, there is a human touch at many of these points, placing a cloud of doubt over the entire machine learning capacity. While this iterative process is very impressive when done well, there will always be data points that are indistinguishable to a computer.

We should instead be thinking about these processes as a series of decision points, of which we also have input data.

Say, for example, you are deciding which bus to board to travel into town. AI would process data like distance, timetable and the number of people on the bus, and recommend which bus you should catch. What it can’t tell is whether the bus driver is drunk and driving erratically, or that the bus carries advertising you fundamentally disagree with, or that you have 10 students travelling with you. In that scenario, it is the combination of AI processes along with your human decision making that delivers the best interpretation of which bus to catch into town.
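As a toy illustration of that bus scenario (with invented routes and numbers), the ‘AI’ half of the decision ranks buses on the data it can see, while the human half applies vetoes the model has no data for:

```python
# The 'AI' ranks buses from observable data; the human applies judgements the model cannot see.
buses = [
    {"route": "431", "wait_min": 4,  "crowding": 0.9, "distance_to_stop_m": 150},
    {"route": "433", "wait_min": 11, "crowding": 0.3, "distance_to_stop_m": 150},
    {"route": "440", "wait_min": 7,  "crowding": 0.5, "distance_to_stop_m": 400},
]

def machine_score(bus):
    # Lower is better: the model only knows wait time, crowding and walking distance.
    return bus["wait_min"] + 10 * bus["crowding"] + bus["distance_to_stop_m"] / 100

def human_veto(bus, context):
    # Things no timetable feed captures: an erratic driver, objectionable advertising,
    # or that you have ten students in tow and need the emptier service.
    if bus["route"] in context["avoid_routes"]:
        return True
    if context["travelling_group"] > 5 and bus["crowding"] > 0.7:
        return True
    return False

context = {"avoid_routes": {"431"}, "travelling_group": 10}

candidates = [b for b in sorted(buses, key=machine_score) if not human_veto(b, context)]
print("Catch the", candidates[0]["route"])  # the decision is the combination, not the model alone
```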

As I see it, we are not in a pure Algorithmic Media moment – and this will be a long way away, if it manifests at all.

Algorithmic Media

We have also seen the rise of algorithmic media, which often presents itself as recommender systems or the like, essentially suggesting you should consume a particular type of media based on your past viewing habits or your demographic data.

Algorithmic media can be very useful, given our media saturated lives that have Netflix, Spotify, blogs, journalism, Medium, TikTok, and whatever else makes up our daily consumption habits. We need some help to sort, organise and curate our media lives to make the process possible (efficient).

Think of a Google search. It is often the case that we search for specific information based on our needs. Google knows the sorts of information we are interested in and will attempt to return information that is relevant and useful. Of course, this result has a number of levers in operation behind the mechanics of the ranking, for example commercial priorities, legislation, trends, etc. Further, we have also seen how algorithms can be incredibly racist, selective, indeed chauvinistic.
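A hedged sketch of what I mean by ‘levers’: in this toy ranking, where a result lands depends not just on query relevance but on how much weight the operator gives commercial and trending signals. The URLs, scores and weights are invented, and this in no way reflects how Google’s actual ranking works.

```python
# Illustrative only: a result's final position is a weighted blend of several 'levers'.
results = [
    {"url": "gov-health-advice.example",    "relevance": 0.9, "commercial": 0.0, "trending": 0.2},
    {"url": "sponsored-supplement.example", "relevance": 0.6, "commercial": 0.9, "trending": 0.5},
    {"url": "viral-listicle.example",       "relevance": 0.5, "commercial": 0.3, "trending": 0.9},
]

# The levers: change these weights and the 'best' answer changes with them.
levers = {"relevance": 0.6, "commercial": 0.25, "trending": 0.15}

def rank(result):
    return sum(levers[k] * result[k] for k in levers)

for r in sorted(results, key=rank, reverse=True):
    print(round(rank(r), 2), r["url"])
```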

In some areas, developers have started addressing these problems, given the algorithms are developed by humans. But there is still a long way to go with this work.

So in that sense, I’m not sure algorithmic media makes a whole lot of sense, due to the problems associated with it. It could be that by the time the algorithmic issues are entirely addressed, we will have moved on to our next media distribution and consumption phenomenon.

Predictive Media

So if this is our background (and I understand I have raced through media and technology history, and critical studies here – I will flesh this out in an upcoming article), humans have altered their relationship with technology.

Heather Ford and I are about to (hopefully!!) have an article published that explores the human/technology relationship in detail through newsbots, but I think it is broader than bot conversations alone.

Indeed, content producers adapt and shift their relationship with algorithms daily to ensure their content remains visible. But I think consumers are now beginning to shift their relationship with how technology displays information. If not shift, we are definitely recognising these digital intermediary artefacts that impact, suspend, redirect, or omit our access to information.

Last week, Jessa Lingel published this cracking article on Culture Digitally, The gentrification of the internet. She likened our current internet to urbanisation, and made the argument that the process of gentrification is clearly in operation:

an economic and social process whereby private capital (real estate firms, developers) and individual homeowners and renters reinvest in fiscally neglected neighborhoods through housing rehabilitation, loft conversions, and the construction of new housing stock. Unlike urban renewal, gentrification is a gradual process, occurring one building or block at a time, slowly reconfiguring the neighborhood landscape of consumption and residence by displacing poor and working-class residents unable to afford to live in ‘revitalized’ neighborhoods with rising rents, property taxes, and new businesses catering to an upscale clientele

Perez, 2004, p.139

In her closing paragraphs, Jessa made a recommendation that is so obvious and excellent, why haven’t we done this before?

Be your own algorithm. Rather than passively accepting the networks and content that platforms feed us, we need to take more ownership over what our networks look like so that we can diversify that content that comes our way. 

Lingel, 2019, n.p.

It made me think about food and supermarkets – certainly in Sydney we have two (maybe three) major supermarkets. But there is a growing trend to avoid them and shop local, shop in food co-ops, join food co-ops, and change our food consumption habits entirely. If those major chains want to push inferior products and punish their suppliers to increase the bottom line, as consumers we (in the privileged Australian context) have the option to purchase our food elsewhere.

Why wouldn’t we do the same with our digital internet services? Is this a solution to biased, mismatched, commercially oriented media algorithms and the so-called AI?

Is Predictive Media the Solution?

I think we can apply the same approach towards predictive media.

We cannot consume the amount of media that is produced, suggesting we may be missing crucial information. We cannot trust automated media because it has proven to be incredibly biased. But perhaps by changing our relationship with technology and understanding how it works a little better, we might find a satisfactory middle ground.

It is not only greater transparency that is required to address our problems of automated and algorithmic media, but also proactive engagement with those machines to train the programs to understand us better. But changing that relationship is difficult if you don’t know it is an option. So perhaps the real call here is to establish alternative and transparent digital communication protocols that are easily accessible and decipherable for users. Through education, change is possible, and this may be a defence against the current trajectory of digital media.

The combination of both increased understanding/transparency and more active engagement with training ‘our’ algorithms could be the basis for predictive media, where predictive media helps us beyond a commercial platform’s profit lines, and exposes us to more important and critical public affairs.

Original Image by Hello I’m Nik.


EDIT: It is worth noting that News UK has teamed up with The Fifth to undertake exactly the point of this article. Read the Digiday article here.

In around 2017, Mike Williams and I had a few beers (can I say that?) in one of the studios at the ABC with a view to thinking through what was happening in the media at that time.

Instagram was ‘blowing up’, YouTube was going nuts, and a swag of micro platforms such as Vine, Musical.ly, and others were fuelling the rise of branded content producers – otherwise known as solo content producers, otherwise known as influencers.

Both Mike and I had, and still have, our favourite content producers, as I’m sure many readers do, and we often refer to their channels to see what they are doing, how they are reacting to certain global events, or what the latest trends might be.

But what we were interested in that night was understanding how this exploding creative industry was running alongside the existing media organisations, if it was at all – and we both had a keen interest in how the ABC was shaping up in comparison.

One of the concepts we started throwing around was this idea that social media content producers now make their celebrity-ness online, build these massive audiences (or highly engaged audiences), and then often make the jump to traditional media. At the time, #7dayslater had just finished season 1 and I thought it was going to be a new production model the ABC would indeed pursue (but then, funding cuts).

What #7dayslater did represent however, was the praxis between online content producers and media organisations such as the ABC. And so was born the first concept of the Digital First Personality.

Of course this concept only raised more questions that night, like:

  • Why would a content producer become popular with their own style, and then switch over to somewhere like the ABC (with a remit for public service)?
  • How could they maintain their platform salary if they were to go off brand with their audience (suddenly start talking about the ABC as part of their suite of everyday-ness)?
  • Should online content producers be trained by media organisations, and if so does that mean traditional celebrities should ‘learn’ social media?
  • Does the digital first personality become the new cultural intermediary?
  • Now that we have finished several beers, shall we go and have dumplings?

I’ve been thinking, researching and developing these questions for the last few years (beyond the call for dumplings), and have developed the concept of the digital first personality significantly. I first took it for a test drive with my MECO3602 Online Media students who bought into it and then also pulled the idea apart. I have presented the idea at a few conferences and have received some great feedback from colleagues along the way. Recently, I have resubmitted an article with major revisions to an A ranked journal, and am hopeful it will be published soon.

The last round of revisions with that journal really pushed me to think through some of the fundamental and theoretical concepts of the digital first personality. More broadly, I am beginning to draw connections between the digital first personality and microplatformization as part of the Digital Intermediation research project – how online content producers craft their skills as cultural intermediaries that are both experts at social influence and understanding platform automation, i.e. recommender systems. This is now starting to feed into the infrastructure work I am undertaking within the automated media space.

Here’s a basic introduction to how I am approaching the framework of the digital first personality:

Intermediation has traditionally been undertaken by a number of stakeholders, including institutions, humans and non-human actors, to transfer information from one group of individuals to another. Recently, two new actors have emerged within the digital media ecology through cultural intermediation: social media influencers and automated media systems engaging algorithms. Cultural intermediation as a framework is a useful way to understand emerging social and cultural forms that result from new media technologies. Cultural intermediation (Bourdieu, 1984), which describes how social capital can be exchanged between different stakeholder groups, also incorporates market economics (Smith Maguire and Matthews, 2014) and expertise exchange. The latest iteration of cultural intermediation includes the agency of platforms, social media influencers and, increasingly, algorithms. Understanding this new form of cultural intermediation is crucial to enable items of public importance to remain visible.

Social influencers, who have previously been referred to as microcelebrities (Marwick, 2013; Senft, 2013) and digital influencers (Abidin, 2016), are a particular subset of cultural intermediaries. Through their developed expertise in identifying ‘cool’ boundary objects, they are able to engage in multiple media production practices to demonstrate the value of those objects to their large audiences. Examples of this practice include Zoella, who often engages her audience with the products from her latest shopping haul (revealing the contents of one’s shopping bag), Evan’s Tube, who engages his younger audience with an ‘unboxing’ of the latest Lego kit, or Fun for Louis, who is often travelling to exotic locations to reveal their most appealing side. In each instance of these social influencers producing content, they engage in high levels of media literacy to transfer the value of the chosen product or service to their large fan base: a trustworthy, word-of-mouth news sharing technique. They will typically do this across a number of social media platforms, including their TikTok channel for behind-the-scenes content, the Instagram platform for the ‘hype’ photo or Insta-Story, and a YouTube video to engage their largest audience.

The second emerging aspect of cultural intermediation is the algorithmic arena, which to a large extent describes how automation is undertaken across digital media platforms. As Gillespie (2014: 167) notes, algorithms “are encoded procedures for transforming input data into a desired output, based on specified calculations”. Within a media ecology that sees significantly more content produced than can be consumed, algorithms, in one sense, are seen as mechanisms to assist users in finding and consuming content that is relevant to their interests. In most cases, this manifests as a recommender system, which is represented as ‘Recommended for you’, ‘Up Next’ or ‘You will Like’ types of automated mechanisms. However, there is an increasing body of literature, which is described in detail below, that challenges the bias, power and relationships with content, society and culture that are represented by automated media systems.
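Gillespie’s definition can be made concrete with a deliberately tiny, hypothetical ‘Recommended for you’ procedure: input data (a watch history) is transformed into a desired output (a ranked list) by a specified calculation (tag overlap). Real platform recommenders are of course vastly more complex, but the shape of the procedure is the same. All titles and tags here are invented.

```python
# Input data: what the user has already watched.
watch_history = ["shopping haul", "travel vlog", "unboxing"]

# A toy catalogue, each item labelled with a set of tags.
catalogue = {
    "Lego unboxing":        {"unboxing", "toys"},
    "Budget travel tips":   {"travel vlog", "how-to"},
    "Election explainer":   {"news", "public affairs"},
    "Autumn shopping haul": {"shopping haul", "fashion"},
}

def recommend(history, items, top_n=3):
    seen = set(history)
    # Specified calculation: score each item by how many of its tags overlap the history.
    scored = {title: len(tags & seen) for title, tags in items.items()}
    return sorted(scored, key=scored.get, reverse=True)[:top_n]

print(recommend(watch_history, catalogue))
# The 'Election explainer' never surfaces: nothing in the input data points towards it,
# which is precisely the visibility problem for public-interest content.
```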

Cultural intermediation that combines both social influencers and algorithms, then, acts as a process for media visibility across emerging networked platforms. What has become the process of blending private with public media (Meikle, 2016) has, as Turner (2010) highlights through the demotic turn, enabled ordinary folk to become key influential media producers. However, these key actors within cultural intermediation are typically engaging with the content production and distribution process for the social media entertainment (Cunningham and Craig, 2017) benefits, such as increased social and economic capital. This cultural intermediation process is operationalised by what I argue is the digital first personality: those individuals who produce digital content for maximum visibility by engaging social influencer publication strategies that appease platform algorithms. In many cases, their media production focus is on commercial products and services to increase their social and economic capital. Within the social influencer genre – setting aside fake news and disinformation – public issues, public affairs, news and current affairs are often ignored in favour of highly profitable alternatives.

So here is a beginning for a new area of research. I feel as though I have completed my fieldwork in digital agencies for now, but I can see a new space opening up that looks at the intersection of microplatformization and digital first personalities as the backbone of digital intermediation.

Original photo by Dean Rose on Unsplash


Social media audiences consume approximately three percent of the entire amount of content published across platforms (Bärtl, 2018). Of this three percent, a small number of popular digital influencers – for example Casey Neistat, Logan Paul, or Zoella – create that content, which, arguably, leads to media homogenisation through the limited focus of popular themes and topics. Moreover, platform providers, such as YouTube and Instagram, operate algorithmic recommender systems such as ‘trending’ and ‘up next’ mechanisms to ensure popular content remains highly visible. While platforms in the digital era exercise a social and political influence, they are largely free from the social, political and cultural constraints applied by regulators on the mass media. Beyond vague community guidelines, there remains very little media policy to ensure that the content produced by digital influencers and amplified by platforms is accurate, diverse enough to include the public interest, or indeed beneficial.

This project will research the content production process of automated media systems that engage digital influencers, or leading social media users, who interact with extraordinarily large and commercially oriented audiences. The evidence base will assist in developing theory on contemporary digital media and society, which will consequently shape how communities access public information. Instead of harnessing this knowledge for commercial imperatives, this research project will examine the findings in the context of socially aware digital influencers who occupy similar roles to those found in traditional media organisations. Further, this project will examine how algorithms are making decisions for media consumers based on commercial executions, which are often devoid of the social awareness associated with public affairs and issues.

At a time when mass media comes under scrutiny for its involvement in perpetuating misinformation around public issues, accurate media becomes increasingly crucial to the provision of educative material, journalistic independence, media pluralism, and universal access for citizens. At present, media organisations are attempting to repurpose traditional broadcast content on new media platforms, including social media, through automation built on somewhat experimental algorithms. In many cases, these organisations are failing in this new environment, with many automated media attempts appearing more as ‘experimental’. This should be an opportunity for media organisations to rethink how they produce content, and how new informed publics might be brought into being around that content. 

Instead of thinking of automation as a solution to their increasing media environmental pressures, media organisations should be looking toward algorithms to curate and publish informative media for their audiences. This moment provides a unique opportunity to research the contemporary social media environment as media organisations experiment with automated media processes. It also challenges our understanding of automated media through popular vanity metrics such as likes and shares, in what Cunningham and Craig (2017) call ‘social media entertainment’. Under this moniker, these scholars highlight the intersection point of social media platforms, content production, and entrepreneurial influencers who commercialise their presence to develop their own self-branded existence. Abidin (2016) refers to these users as digital influencers, to include YouTube and Instagram superstars who demonstrate an unprecedented capacity to manifest new commercially oriented publics. Digital influencers are typically young social media users who commercially create content across a host of social media platforms, which is liked, commented on and shared by millions of fans. It is estimated the top ten 18-24 year old YouTubers are worth $104.3 million collectively (Leather, 2016), indicating a burgeoning new media market. This model of exercising digital influence within automated media systems has the potential to translate into the support of an informed public sphere amid a chorus of social media communication noise.

The research is innovative in a number of ways. Firstly, it is groundbreaking through its approach of collecting and comparing datasets of contemporary social media practice from within the commercial and non-commercial media sectors. Secondly, it theoretically combines media studies, science and technology studies, sociology and internet studies to bolster the emerging field of contemporary life online: an interdisciplinary approach to everyday social media. Thirdly, methodologically it combines traditional qualitative methods such as interviews and focus groups, and blends these with contemporary digital ethnography techniques and emerging social network analysis. Fourth, this research contributes to the emerging field of automation and algorithmic culture, by providing a groundbreaking exploration of data science with traditional audience research: a field of particular importance for media organisations. Finally, the outcomes will provide innovative insights for digital agencies and leading media organisations. 

Aims and Outcomes 

The aims of the project are:  

  1. to understand how digital influencers operate across social media, in both commercial and non-commercial media environments;  
  2. to document how digital media agencies enable digital influencers to create large consumer based publics; 
  3. to examine and understand how algorithms are operating within large-scale media content production; 
  4. to identify how global media is incorporating digital influencer roles and automation (if at all) into their production methodologies; and 
  5. to provide a new theoretical framework, recommendations and a policy tool that enables media organisations to manifest and engage with their audiences on critical public issues.  

The aims will be met by engaging in digital ethnography methods that document how digital influencers produce content and actively engage with their audiences in an online community. These users are responsible for creating discussion around a number of issues they deem to be important, yet are typically driven by commercial imperatives. The conversations inspired through influencer content production are then compounded by the digital agencies who operate as amplifying agents for those messages, especially by ‘gaming’ the exposure mechanisms of YouTube and Instagram. However, this research will ask: if this model can work in the commercial media environment, can socially aware digital influencers adopt the same techniques?

The primary research question is:  

  1. how do digital influencers operate to create large consumer based publics?  

The research subquestions are: 

  1. how does automation operate in media content production and distribution? 
  2. how do automated media systems make content distribution decisions based on behavioural assumptions? 
  3. how can media organisations incorporate the successful methods of automation and digital influencers in their publishing practice? 

Background 

Digital influencers are social media users, typically ‘vloggers’ or video bloggers, who create content about products or lifestyles on popular themes including toys, makeup, travel, food and health, amongst other subject areas. Increasingly, digital influencers are using a number of social media platforms to build their brand and publish content to their niche yet considerably large audiences. This process of content production and distribution is emblematic of digital intermediation: social media platforms afford individuals the ability to operate in a media ecology, while being determined through algorithmic processes. As Gillespie (2014, p.167) notes, algorithms “provide a means to know what there is to know and how to know it, to participate in social and political discourse, and to familiarize ourselves with the publics in which we participate”. At the heart of these algorithmic platforms distributing trending and popular content are the digital influencers, who create popular, entertaining media and represent the flow of traffic and information between increasingly large audiences.

Media organisations have been experimenting with both digital influencers and automation to create and distribute their branded content. In many cases, commercial media have employed the services of digital influencers to boost their traditionally produced media content, while deploying, in many ways, crude experiments in automation. Media brands consistently send digital influencers products and services to integrate into their ‘lifestyle’ videos and images. Recommender systems (Striphas, 2015), such as those used by distribution platforms like Netflix, have proved most popular, where content is suggested based on an audience member’s past viewing habits. Recommendation systems have been adopted across a number of media services including Spotify, Apple iTunes, and most news and media websites. The integration of chatbots is also rising, where the most interesting experiment has emerged from the public media sector through the ABC News Chatbot. Ford and Hutchinson (forthcoming) note that the ABC News Chatbot is not only an experiment in automated media systems, but also a process of educating media consumers on how to access crucial information from within a cornucopia of media.

The key theoretical problem demonstrated in these examples is an asymmetric distribution of agency when automated systems make ‘decisions’ that can be based on flawed normative or behavioural assumptions (Vedder 1999). At worst, there is no possibility to override the automated decision. That is why algorithmic recommendations are sensitive matters and should be explained to users (Tintarev & Masthoff 2015). But explaining and understanding recommendation systems requires deep technical knowledge as the results are produced by a series of complex and often counter-intuitive calculations (Koren et al 2009). Furthermore, recommendations are often the result of more than one algorithm applied in the online and offline processing of consumer behaviour data (Amatriain & Basilico 2015). The asymmetrical relationship this creates between users and media content providers is especially problematic due to the public complexion and social responsibility obligations that should be demonstrated by media organisations. 
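To illustrate why such recommendations resist explanation, and why the asymmetry matters, consider a toy blend of two models – one scored offline from historical behaviour, one scored online from the current session – with invented item names, scores and weights. Even a ‘transparent’ read-out of the calculation gives the user nothing intuitive to contest and no hook to override the outcome.

```python
# Illustrative sketch: the final ranking blends an offline (batch) model with an online
# (session) signal, so no single, intuitive reason exists for any one recommendation.
offline_scores = {"item_a": 0.82, "item_b": 0.45, "item_c": 0.61}  # precomputed from history
online_scores  = {"item_a": 0.10, "item_b": 0.75, "item_c": 0.40}  # from the current session

BLEND = 0.7  # weight given to the offline model

def final_score(item):
    return BLEND * offline_scores[item] + (1 - BLEND) * online_scores[item]

def explain(item):
    # Even this 'explanation' reduces to a weighted sum of two opaque models,
    # and there is no mechanism here for the user to contest or override the result.
    return (f"{item}: {BLEND}*{offline_scores[item]} + "
            f"{1 - BLEND:.1f}*{online_scores[item]} = {final_score(item):.2f}")

for item in sorted(offline_scores, key=final_score, reverse=True):
    print(explain(item))
```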

Digital influencers, as cultural intermediaries, are tastemakers who operate across traditional media platforms such as television and radio, and have become even more effective at this work of translation across social media platforms such as Instagram, Twitter and Vine. Digital intermediation is the next phase of this research, which builds on cultural intermediation yet focuses on its relationship with automated media systems.

Original by Ari He on Unsplash


After a hiatus in publishing, I now have a brand new research blog!

For those of you who remember, I had my old blog at jonathonhutchinson.com. But after accidentally letting that domain name expire, some internet soul purchased that space immediately and then wanted to sell it back to me for money that it just wasn’t worth (maybe). So unfortunately, I lost about ten years worth of public research.

Some of that work included my early steps in blogging in about 2009, some foundational work with Adrian Miles on QuickTime multi-video work and hypertextual media (now called vlogging), all of my Honours research, all of my PhD research, and some of my post-PhD work.

An expensive lesson, but one that I have now justified.

For a while, I thought I might switch to a YouTube Research Channel instead of a research blog, but the work in putting out research vlogs is huge (big shout outs to Zoe Glatt for her ongoing work on that format).

So…

This blog is a space where I will publish research updates, methodology insights, conference undertakings, and some teaching and learning activity.

Most importantly, I am currently working on the research project, Digital Intermediation: A study of automated media influencers. Rising from the ashes of a failed DECRA application, I am now on a Research Fellowship at the Hans Bredow Institute in Hamburg, Germany. I will work on my research as part of the Algorithmed Public Sphere, alongside Cornelius Puschmann and Jan Schmidt. I will record all of the work from that research here.

Of course, if you have questions or would like to get in contact, please drop me a line.

Looking forward to many conversations in this space!

Feature photo by Danielle MacInnes on Unsplash.