In recent years, news organisations like the BBC and the New York Times have worked to offer more personalised news through their websites and apps, based on the topics people have indicated they are interested in or, in some cases, on automatically generated recommendations informed in part by the content they have used in the past.
The German publishing group Axel Springer has launched Upday, a mobile news aggregation app developed in partnership with Samsung that tracks users’ reading habits and automatically selects a personalised stream of content from across the web. The Washington Post is just one organisation developing a chatbot to automatically deliver personalised news to individual users through messaging apps. More widely, digital intermediaries including Google, Facebook, and Twitter offer algorithmically generated personalised search results, news feeds, and timelines, whether we use them for news or other purposes.
Clearly, the era in which news was a fairly stable product, published or broadcast in a single fixed form which people could pay attention to or not, is gradually coming to an end. A rapidly developing combination of responsive design, automated display decisions, and increasingly detailed individual-level data allows both publishers and platforms to deliver more tailored content and services. The news you get will increasingly depend on who you are, where you are, and what the publisher or platform you access news via knows about you, people like you, and people around you.
This development worries some journalists and public intellectuals. Personalised recommendations may undermine editorial control. Many journalists think they, not recommendation engines or social filters, ought to determine what news people get. Automated display decisions may lead to filter bubbles (where we only get news that confirms our existing views) and algorithmic discrimination (where news is customised in ways that mean people miss out on important information, for example poor people getting little or no financial news), and they raise privacy concerns (as they are based on collecting individual-level data about users).
Our 2016 survey shows that ordinary people across the world share these concerns, but at the same time embrace automatically generated personalised recommendations.
Strikingly, when asked to rate three different ways of getting news – (1) having it selected by editors and journalists, (2) having it automatically selected on the basis of an individual’s past consumption, and (3) having it automatically selected on the basis of what friends consume – in almost every country, personalised recommendations based on a person’s own past consumption come out on top.
PROPORTION WHO THINK THAT EACH SELECTION METHOD IS A GOOD WAY TO GET THE NEWS (SELECTED COUNTRIES)
Country | Editors and journalists | Automatically based on past consumption | Automatically based on friends’ consumption |
---|---|---|---|
Germany | 36% | 36% | 23% |
Austria | 33% | 37% | 23% |
Spain | 32% | 40% | 23% |
Korea | 24% | 29% | 29% |
USA | 22% | 37% | 17% |
UK | 20% | 31% | 13% |
Base: Total sample in each country.
More people regard personalised recommendations as a good way to get news than say the same of editorial selection or social recommendations. Algorithms are more popular than journalists, as long as the algorithms are based on a person’s own past consumption. Editors are consistently preferred to social recommendations in almost every country. The differences are not huge, but they are consistent and significant across most countries. There are generational differences that vary somewhat from country to country, but broadly speaking, younger people are more likely to agree that personalised recommendations and social recommendations are good ways to get news online.
When it comes to getting news, most people, in short, trust themselves more than they trust journalists, and trust journalists more than they trust their wider circle of friends and acquaintances.
At the same time, most of our respondents are concerned about the consequences of a move to a more personalised news environment. Across all 26 countries surveyed in 2016, people tend to be worried that more personalised news may mean that they miss out on important information and challenging viewpoints. Concerns over both filter bubbles and algorithmic discrimination are clearly widespread, even as people say that personalised news is a good way to get news. Similarly, when asked, people also worry about privacy; in Germany, Austria, and South Korea they worry about it more than about the other issues at stake.
PROPORTION THAT ARE CONCERNED ABOUT EACH POTENTIAL CONSEQUENCE OF PERSONALISATION
Country | Concerned about missing out on important information | Concerned about missing out on challenging viewpoints | Concerned about privacy |
---|---|---|---|
Spain | 70% | 67% | 54% |
UK | 65% | 61% | 49% |
USA | 60% | 59% | 49% |
Germany | 44% | 42% | 46% |
Austria | 43% | 44% | 51% |
Korea | 42% | 38% | 45% |
Base: Total sample in each country.
In this way, people’s engagement with personalised news recommendations reflects a broader trend – many of us rely every day on digital services that deliver extraordinary functionality and great ease and convenience, even if we may have doubts and reservations about how they actually work.
This illustrates the very pragmatic way most of us are engaging with digital media, media that we use and enjoy even when we do not understand them and are aware that they are in some cases premised on trade-offs that upon closer scrutiny can seem worrying. We may not trust algorithms very much when we stop to think about how they work, but the services they enable are amazing and we would not want to be without them.
For news organisations like the BBC, the New York Times, Axel Springer, and many others working on their own automated personalised recommendation services, the results of our 2016 survey are both encouraging and worrying. Encouraging because they suggest that, as people grow accustomed to algorithmically generated search results and to ‘if you like this, you might also like’ recommendations more widely, many want similar kinds of personalised recommendations for news, based on their own past consumption. Worrying because editorial selection – a key part of the identity and brand of news organisations – is not an especially highly regarded way of getting news, and because the standards by which we judge the quality of automated personalised recommendations tend to be set by large search engines, social media companies, and others with more data, more engineering talent, and more resources than any news organisation. What works for Amazon may not work as well for the Arkansas Democrat-Gazette.
This suggests that news organisations that want to build personalised recommendation systems – for their websites, apps, and perhaps in the future chatbots – will need to find a balance between editorial selection, personalised selection, and an element of the generally less highly regarded social selection, as they sort out data and technology. The challenge is to deliver content to people that speaks to their individual interests and also connect them with the wider world while simultaneously demonstrating the value of getting this content from, specifically, a news organisation – rather than one of the giant digital intermediaries that are constantly advancing the frontiers of automated content and service delivery. The challenge is to do this in a world where people certainly have reservations about algorithms, but many have even more reservations about journalism. As one participant in one of our UK focus groups said: ‘I think that I would probably trust an algorithm more. If this was an editor that I didn’t know or an algorithm I didn’t know, an algorithm won’t have an agenda and it will offer a more wide range of sources’ (20–34 year old, UK Focus Group).
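One way to think about the balance described above is as a weighted blend of the three selection signals, with social selection weighted lowest to reflect its lower standing among respondents. The sketch below is purely illustrative: the function names, the weights, and the signal values are assumptions for the sake of the example, not anything described in the survey or used by any of the organisations named here.

```python
# Hypothetical sketch of blending editorial, personalised, and social
# selection signals into a single ranking. All names, weights, and
# values are illustrative assumptions.

def blended_score(editorial, personal, social,
                  w_editorial=0.40, w_personal=0.45, w_social=0.15):
    """Combine three normalised relevance signals (each in [0, 1])
    into one ranking score; social selection gets the lowest weight."""
    return (w_editorial * editorial
            + w_personal * personal
            + w_social * social)

# Made-up candidate stories with made-up signal values.
stories = [
    {"title": "Budget analysis", "editorial": 0.9, "personal": 0.3, "social": 0.2},
    {"title": "Local sport",     "editorial": 0.2, "personal": 0.9, "social": 0.6},
    {"title": "Viral clip",      "editorial": 0.1, "personal": 0.4, "social": 0.9},
]

# Rank stories by the blended score, highest first.
ranked = sorted(
    stories,
    key=lambda s: blended_score(s["editorial"], s["personal"], s["social"]),
    reverse=True,
)
for s in ranked:
    print(s["title"])
```

In this toy configuration a story the reader personally favours can outrank a story editors favour, while a purely viral story ranks last; tuning the weights is exactly the editorial-versus-algorithmic trade-off the text describes.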
News organisations that believe in the value of editorial selection and have built their brand and identity around this belief thus face a twofold challenge. How can they leverage the power of personalised recommendations without undermining the value of their editorial work, and how can they convince more people that getting news from editors and journalists can actually be better than having it filtered by algorithms?