
Survey Methodology for the 2018 Digital News Report

This study has been commissioned by the Reuters Institute for the Study of Journalism to understand how news is being consumed in a range of countries. Research was conducted by YouGov using an online questionnaire at the end of January/beginning of February 2018.

  • The data were weighted to targets based on census/industry-accepted data on age, gender and region to represent the total population of each country. The sample is reflective of the population that has access to the internet.
  • As this survey deals with news consumption, we filtered out anyone who said they had not consumed any news in the past month, to ensure that irrelevant responses did not adversely affect data quality (a minimal sketch of this screening step follows this list). The proportion screened out was below 1% in Finland, averaged around 3%, and was as high as 8% in the United States.
  • A comprehensive online questionnaire was designed to capture different aspects of news consumption.
  • Face-to-face focus groups were held in the US, UK, Germany, and Brazil to explore issues relating to social media and messaging apps. These were conducted by Kantar Media.
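As an illustration of the screening step described above, the sketch below drops respondents who report consuming no news in the past month and computes the share screened out per country. It is a minimal illustration only: the data frame, the column names (country, news_last_month) and the figures are assumptions, not the actual survey variables or data.

```python
import pandas as pd

# Hypothetical raw responses; column names and values are illustrative only.
raw = pd.DataFrame({
    "country": ["Finland", "USA", "USA", "UK", "Finland", "USA"],
    "news_last_month": ["yes", "no", "yes", "yes", "yes", "no"],
})

# Share of respondents screened out per country (the report cites below 1% in
# Finland, around 3% on average, and up to 8% in the United States).
screened_out = (raw["news_last_month"] == "no").groupby(raw["country"]).mean()
print(screened_out)

# Keep only respondents who consumed some news in the past month.
sample = raw[raw["news_last_month"] == "yes"].copy()
```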

Our survey was conducted using established online panels run by our polling company, YouGov, and its partners. Because this is an online survey, the results will under-represent the consumption habits of people who are not online (typically older, less affluent, and with limited formal education). Where relevant, we have tried to make this clear within the text. The main purpose is to track activities and changes over time within the digital space, as well as to understand how offline and online media are used together. A fuller description of the methodology and a discussion of non-probability sampling techniques can be found below.

Along with country-based figures, throughout the report we also use aggregate figures based on responses from all respondents across all the countries covered. These figures are meant only to indicate overall tendencies and should be treated with caution.

In fourteen countries, respondents this year were only able to take the survey using a desktop or laptop computer. Although all other quotas were met (e.g. age, gender, region), it is possible that the figures for device use in those countries were affected: computer/laptop figures may be inflated, and in some countries smartphone and tablet numbers may be lower than expected. It is important to keep in mind that many people use multiple devices, and the dataset still contains smartphone and tablet users who also use a computer. The countries affected were Italy, Japan, the Netherlands, Brazil, Spain, Canada, the Czech Republic, Poland, Portugal, Austria, South Korea, Switzerland, Hungary, and Turkey. We have flagged this issue on country pages where appropriate.

More Detail on YouGov Research Methods

By David Eastbury, Director (International Research), YouGov

Internet surveys administered through panels have become a feature of the research landscape in the last decade and a half. YouGov pioneered the use of online research in the UK in 2000 and, particularly in the early stages of its adoption, was pretty much a lone voice in advocating the use of the internet for social research. YouGov conducts its public opinion surveys online, using a method called Active Sampling, for the overwhelming majority of its commercial work, including all nationally and regionally representative research. The emphasis is always on the quality of the sample, rather than the quantity of respondents.

When using Active Sampling, restrictions are put in place to ensure that only the people contacted are allowed to participate. This means that every respondent who completes a YouGov survey has been selected by YouGov from its panel of registered users, and only those selected from the panel are allowed to take part in the survey.

YouGov’s Panel

Over the last fifteen years, YouGov has carefully recruited a panel of over one million British adults and over six million people worldwide to take part in its surveys. Panel members are recruited from a host of different sources, including standard advertising and strategic partnerships with a broad range of websites. These people cover a wide range of ages, genders, social grades, ethnicities and tenures. The panels are large enough to enable YouGov to select both nationally representative samples that reflect the actual breakdown of the population on key demographics (such as age, gender, region etc.) and targeted samples (such as legal service users, cat owners etc.).
When a new panel member is recruited, a host of socio-demographic information is recorded. For nationally representative samples, YouGov draws a sub-sample of the panel that is representative of the population in terms of a number of different demographic variables (generally age, gender and region as a minimum), and invites this sub-sample to complete a survey. To reiterate, with Active Sampling only this sub-sample has access to the questionnaire, via their username and password, and respondents can only ever answer each survey once.
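To make the idea of drawing a demographically proportional sub-sample concrete, the sketch below invites panellists in proportion to assumed population shares for a single variable (age band). The panel structure, column names and share figures are illustrative assumptions, not YouGov's actual panel data or selection software.

```python
import pandas as pd

# Hypothetical panel with an age-band column; figures are illustrative only.
panel = pd.DataFrame({
    "panelist_id": range(10_000),
    "age_band": pd.Series(["18-34", "35-54", "55+"]).sample(
        10_000, replace=True, random_state=1).values,
})

# Assumed population shares for the age bands, and a target number of invites.
population_shares = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}
invite_total = 2_000

# Invite each age band in proportion to its assumed share of the population.
invited = pd.concat(
    panel[panel["age_band"] == band].sample(int(invite_total * share), random_state=1)
    for band, share in population_shares.items()
)
print(invited["age_band"].value_counts(normalize=True).round(2))
```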

Quality Sampling

Obtaining good-quality samples is a challenge for all methodologies. Response rates for telephone polls, for example, have been declining in recent years, typically to below 10%, and are often much lower in inner-city areas. The ability to extrapolate from the fewer than 10% of telephone respondents that pollsters can reach to the more than 90% they cannot is clearly a challenge, leading to concerns over the quality of achieved samples, whether telephone or face-to-face. There are, of course, some areas where an online approach is unsuitable, and YouGov would always alert its clients to this. For example, it would be inappropriate to use an online survey to estimate the incidence of paying income tax online or to identify service needs on a disadvantaged housing estate.
However, it would be unfair to say that online is ‘biased’ in a way that offline is not; there are simply different biases for which all approaches have to account. Online research can be used for a wide variety of topics, and given that the Digital News Survey asks about usage of and attitudes towards digital news, and screens out those who do not access it, online is an appropriate methodology. This is especially true because the main purpose is to track activities and changes over time within the digital space, as well as to understand how offline and online media are used together.

Analysis of the Data

Once the survey is complete, the final data are statistically weighted to the national profile of all adults aged 18+ (including people without internet access) or, in the case of non-representative surveys, to whatever the target sample has been defined as. All reputable research agencies weight data as a fine-tuning measure, and almost all surveys involve weighting, whether they are conducted online, face-to-face or by telephone. This ensures that the published results properly reflect the population they seek to measure. For example, men comprise 48% of the electorate and women 52%. The raw figures in a well-conducted survey will be close to this, but will not necessarily match these numbers exactly. Suppose the raw figures contain 50% men and 50% women. YouGov’s computer would slightly “downweight” the replies given by the men (so that the replies of 500 men count as if they were 480) and slightly “upweight” the replies given by women (so that the replies of 500 women count as if they were 520).
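A minimal sketch of that single-variable adjustment, assuming a sample of 1,000 respondents split 50/50 by gender and population targets of 48% men and 52% women; the variable names and code are illustrative, not YouGov's actual weighting software.

```python
# Illustrative single-variable weighting: weight = target share / sample share.
sample_counts = {"men": 500, "women": 500}      # raw respondents
target_shares = {"men": 0.48, "women": 0.52}    # assumed population targets

n = sum(sample_counts.values())
weights = {g: target_shares[g] / (sample_counts[g] / n) for g in sample_counts}

# Effective (weighted) counts: 500 men count as 480, 500 women as 520.
effective = {g: sample_counts[g] * weights[g] for g in sample_counts}
print(weights)    # {'men': 0.96, 'women': 1.04}
print(effective)  # {'men': 480.0, 'women': 520.0}
```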

In practice, the task is more complex than this, as other demographic variables, as well as gender, have to be considered simultaneously. This is a task for YouGov’s computer, which adjusts the raw data to take account of all these factors. The exact demographics vary by country (largely dependent on publicly available figures); in the UK, for example, YouGov weights by age, gender, social class, newspaper readership and region, as well as level of education, how respondents voted at the previous election, how respondents voted at the EU referendum and, on occasion, their level of political interest. Targets for the weighted data are derived from a number of sources, including the following (a generic sketch of this multi-variable weighting appears after the list):

1. The census
2. Large-scale random probability surveys, such as the Labour Force Survey, the National Readership Survey and the British Election Study
3. The results of recent elections
4. Official ONS population estimates
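When several variables have to hit their targets at once, a standard approach is iterative proportional fitting, often called raking: weights are adjusted to match each variable's marginal targets in turn until they settle down. The sketch below is a generic illustration of that idea, using made-up respondents and targets for gender and region; it is not YouGov's actual procedure or software.

```python
import pandas as pd

def rake(df, targets, n_iter=50):
    """Iterative proportional fitting (raking): adjust weights so that the
    weighted marginal distribution of each variable matches its target shares."""
    w = pd.Series(1.0, index=df.index)
    for _ in range(n_iter):
        for var, shares in targets.items():
            current = w.groupby(df[var]).sum() / w.sum()   # weighted shares now
            adjust = pd.Series(shares) / current           # per-category factor
            w = w * df[var].map(adjust)                    # rescale each respondent
    return w

# Made-up respondents and targets, purely for illustration.
df = pd.DataFrame({
    "gender": ["m", "m", "f", "f", "f", "m"],
    "region": ["north", "south", "north", "south", "south", "north"],
})
targets = {
    "gender": {"m": 0.48, "f": 0.52},
    "region": {"north": 0.40, "south": 0.60},
}
weights = rake(df, targets)

# Weighted marginals now match both sets of targets simultaneously.
print((weights.groupby(df["gender"]).sum() / weights.sum()).round(3))
print((weights.groupby(df["region"]).sum() / weights.sum()).round(3))
```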

Active Sampling ensures that the right people are invited in the right proportions. In combination with YouGov’s statistical weighting, this ensures that the results are representative of the country as a whole: not just those with internet access, but everyone. While it is true that not everyone has access to the internet, independent academic research shows that its widespread uptake means the views of those with internet access are now mostly indistinguishable from the views of those without.

Online Approach – Additional Considerations

Interviewer bias and social distance

Another key advantage of online for this survey is the neutrality of the interview mode. Independent research has found that respondents modify their answers in the presence of an interviewer, including when the interviewer is on the other end of the phone. This lack of ‘social distance’ can mean that respondents feel compelled to give a ‘safe’ answer.
Online surveys increase social distance, so respondents are more likely to disclose important and sensitive information. In addition, this enables respondents to give answers free from embarrassment and, therefore, a truer reflection of their actual feelings.
The influence of questionnaire design on measurement error has received attention in a number of publications. Chang and Krosnick (2010) conducted an experiment in which respondents were randomly assigned either to complete a questionnaire on a computer or to be interviewed orally by an interviewer. They found that respondents assigned to the computer condition showed less non-differentiation and were less susceptible to response-order effects. In other words, the computer-administered surveys were more likely to elicit a truer response.

Pace of interview

Online research is more convenient for respondents: they can fill in the survey in their own time, at their own pace, and can formulate more considered answers. The Digital News Report survey is fairly complex and requires a good deal of time and thought on the respondents’ part, so an online approach is well suited to this study.

YouGov Accuracy

YouGov is a leading player in UK media polling and one of the most frequently quoted research agencies. The only way to demonstrate the accuracy of attitudinal research is to compare predictions with actual outcomes, which is why YouGov has consistently published pre-election polls, even in difficult-to-call contests such as local government elections.

YouGov has a strong history of accurately predicting actual outcomes across a wide range of different subjects, including national and regional elections, political party leadership contests and even the results of ITV talent show The X Factor.

YouGov is also a member of ESOMAR; full details of YouGov’s answers to the ESOMAR 28 questions can be viewed here.