7 common questions when moving offline research online

Here are the seven most common questions we are asked when organisations are looking to transition their offline research to online channels.
19 October 2020
Ketan Sanwal
Manager, Modern Survey Design, India


Many businesses are moving their offline research online this year. Whether driven by time and cost savings or by the direct impact of the COVID-19 pandemic, the pace of transition has accelerated.

However, transitioning offline research online brings specific considerations and unknowns. At Kantar we are committed to supporting our clients in this area and aim to help answer the concerns that arise. Here are seven of the most common questions we hear.

1. What types of project work online?

Most projects that are conducted face-to-face can be transitioned online with some simple tweaks. In our experience, surveys under 15 minutes are the easiest to move online. While the transition does involve adapting existing research for self-completion of the questionnaire and for respondent engagement, there are tangible advantages in cost, time savings and the overall quality of the final data. Packaging tests, ad tests, brand health, U&A and pricing studies also perform well through online methodologies, with the screen (mobile or desktop) allowing respondents to review visual stimuli.

2. How do I ensure online respondents are who they say they are?

The breadth of the online space often raises concerns about whether genuine people sit behind each respondent ID. Choosing a partner with a robust set of quality measures applied during recruitment, pre-survey and in-survey should give you confidence that only validated respondents are allowed into your survey and that your research is being conducted compliantly.

Kantar has unrivalled and proprietary tools across a range of methods that assess the validity of sources, respondents and their responses to keep your project data clean. These include CAPTCHAs, identity validation, anti-bot testing and machine learning to take a proactive, 24/7 approach to removing fraudulent panellists in an always-on world, as well as our uniquely patented statistical ‘Honesty Detector’ approach, which removes people previously identified as over-reporters before they enter a new survey. This ensures that all the data we collect is not only gathered compliantly but also reflects the real world.

3. Is online sample representative in Asia?

Five years ago, less than half the countries in Asia had internet penetration above 50%, the threshold normally needed to reach representative samples with online research. Now, over 90% of countries do, with an average internet penetration of over 70% across Asia. As the gap reduces and tailwinds in mobile adoption boost internet penetration, the ecosystem is conducive for offline to online transition.

We do recognise that there is still lower internet penetration among certain consumer groups in certain markets. In India we still hover around the 50% mark with certain demographics having access to the internet more readily than others. This doesn’t make online unachievable but does mean our approach here has to be balanced and considered when it comes to those we can reach online and how we go about this. This is where working with your supplier is important. They should highlight the challenges and work with you to devise a suitable project plan and, if the need arises, work on hybrid data sources to balance the sample.

4. How do I accurately target the right respondents for my online survey?

Sampling technology has become increasingly sophisticated in recent years at managing quotas through the sample invitation process. If you want 50 men and 50 women to complete the survey, the sample engine, using machine learning, will send out exactly the number of invites it estimates it needs to achieve those completes. Combined with carefully managed panels that are pre-profiled and recruited from a wide range of sources, you can be confident of targeting the audience suited to your specific needs.
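As a rough illustration of the underlying arithmetic (this is not Kantar's proprietary sample engine; the quota cells and completion rates below are hypothetical), an invite calculation might look like this:

```python
# Illustrative sketch of quota-based invite allocation (hypothetical figures,
# not Kantar's proprietary sample engine).
import math

def invites_needed(target_completes: int, expected_completion_rate: float) -> int:
    """Estimate how many invitations to send so that, at the expected
    completion rate, the quota cell fills without heavy over-sampling."""
    return math.ceil(target_completes / expected_completion_rate)

# Hypothetical quota cells: 50 male and 50 female completes, with assumed
# completion rates that an engine might learn from past fieldwork.
quotas = {
    "male":   {"target": 50, "completion_rate": 0.20},
    "female": {"target": 50, "completion_rate": 0.25},
}

for cell, q in quotas.items():
    n = invites_needed(q["target"], q["completion_rate"])
    print(f"{cell}: send ~{n} invites to expect {q['target']} completes")
```

In practice the engine refines these expected rates continuously, which is what keeps over-sampling and quota overshoot to a minimum.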

5. How can I design my survey to engage respondents?

How you write your questions and script your surveys is key to garnering accurate, honest and considered responses. A narrative that keeps respondents engaged and challenges them to think through their answers is critical, as there is no interviewer to maintain interest throughout. Survey length also matters when engaging an online audience (under 10 minutes is ideal). Adapting to a different style of survey may seem daunting, but once done it is easy to replicate and tweak for future waves or new studies.

It is also important to recognise that respondents take surveys on a range of devices. Screen size is real estate that should be used wisely, but it also changes depending on the device a respondent picks up that day. We recommend using scripting tools that are device agnostic, so your survey can be taken on any device in any orientation.

6. Are there ways to ensure honesty and accuracy in what online respondents report?

There can be discrepancies between what people say they do and what they actually do. Some respondents answer in ways that maximise their chances of qualifying for the survey: we call this “survey mindset”. Others are unaware of the biases in their own reactions or are driven by social acceptance and what might sound better. These effects can be addressed by understanding the impact of human behaviours and methodological factors, and by adopting appropriate survey design techniques to minimise them.

7. What differences can I expect in my data?

As with any change in methodology, we expect to see some data variation when converting offline research to online. As a pre-transition activity, we suggest conducting pilots to identify contributing factors and gauge the magnitude of variation. Once you start to field your survey, you will then need to think about cross-calibrating your data. To do this effectively, it is useful to understand all the underlying reasons why data might differ between offline and online methodologies, as many stem from the shift from human-administered surveys to self-completion.
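As a minimal sketch of what a pilot comparison could look like (the measures and figures below are hypothetical, and this is not a prescribed calibration method), you might tabulate the gap between parallel offline and online results for each key metric:

```python
# Hypothetical parallel-pilot comparison: percentage agreeing with each key
# measure in an offline pilot versus an online pilot (illustrative figures).
offline = {"brand awareness": 62.0, "purchase intent": 48.0, "ad recall": 35.0}
online = {"brand awareness": 58.0, "purchase intent": 41.0, "ad recall": 33.0}

# A simple per-measure gap; the actual calibration approach should be agreed
# with your research partner once the underlying causes are understood.
for measure, offline_score in offline.items():
    gap = offline_score - online[measure]
    print(f"{measure}: offline {offline_score:.0f}%, "
          f"online {online[measure]:.0f}%, gap {gap:+.0f} pts")
```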

The absence of an interviewer in the surveying process can also reduce social desirability bias. People tend to overstate socially acceptable behaviour when an interviewer is present, so self-completion can yield more genuine responses, yet this still appears as variation versus offline data.

We can help map out all the most common underlying factors that cause data variations when planning your offline to online transition. This ensures you are prepared and that there is no delay or confusion during the analysis phase.

Learn more

Online research offers more streamlined, rapid access to respondents and is a future-proof way of collecting consumer opinion. If you are new to online research, Kantar can help you navigate the necessary steps. For more support, download our checklist for successfully transitioning face-to-face research online using the form below.

Get in touch
Related Solutions
Survey programming and fieldwork services, from fully customised research designs to sample-only fieldwork.
Our Modern Survey Design techniques help you understand people better, by asking the right questions in the right way.
We’ve simplified, streamlined and scaled how you access respondent profiles in compliant ways.