Long Surveys are here to stay – It's the economy, stupid!

Vivek Bhaskaran
President and CEO – Survey Analytics
As head of privately held Survey Analytics, Vivek is responsible for all aspects of strategy and direction. @surveyanalytics  |  facebook.com/surveyanalytics

It’s become fashionable to say – “researchers who administer long surveys have their heads up their you-know-what” — hell, I’ve said it in the past. The Future of Insight blog predicts that long surveys are definitely NOT part of the future. Sandra Rathod and Andrea la Bruna concluded in 2004 (yes, that is six years ago) that respondents “check out” after 15–20 minutes. Our buddies from Survey Sampling said the same thing again a couple of months ago.

So, I thought, WHAT IS THE DEAL here? Two questions:

a) Are we still administering long surveys?
b) If so, why?

As it turns out, I actually have access to some real data. Across the group of services and companies I am a part of, we cumulatively conduct about 2 million surveys a month across thousands of clients. So I thought I’d mine the data for some insight into how long the surveys actually are.

In March we conducted 1.96 million surveys. Median survey length: 35 questions.

Now, let’s think about that – in the age of 140-character tweets, we are asking 2 million users to fill out a 35-question survey?

Now, let’s think about some of the reasons. From our vantage point, there are four audiences for surveys:

a) Market Research
b) Customer Satisfaction
c) Public Opinion / Social Science Research
d) Employee Satisfaction

Of the four broad categories, Market Research has traditionally been the culprit. We know Customer Satisfaction has pretty much undergone the transformation from overly long surveys to “One Question” – aka Net Promoter Score – so the issue of long-running customer sat surveys is out of the question. Even if you are not using Net Promoter, most companies I know use a variant of it — Secure Customer Index, or the ACSI Index — all of which are very short: three questions max.
I also happen to know empirically that public opinion, social science research and employee sat studies account for less than 20% of our customer base – so, in effect, they are non-participants here, at least not in any significant way.

This leads us to Market Research. Traditional Market Research consists of four players in the value chain:

a) Client – Pays the bills. Has the problem. Has a checkbook. Think – BP, P&G etc.
b) MR Agency – Big agencies traditionally outsource parts of the work to smaller ones. Think GFK, Kantar, IPSOS and then over 14,000 smaller MR firms that service them.
c) Sample Provider – Provides the list of participants to the survey. SSI, e-Rewards, PeanutLabs etc.
d) Technology Providers / Data Collectors – QuestionPro, SurveyMonkey, Zoomerang, Survey Analytics, Vovici, Confirmit etc.

So, let’s think about this purely along one axis – the length of the survey.

Client:
Market Research studies generally have budgets on a temporal basis – they can do X projects in a year and have budgeted for that. So it’s natural to cram as many questions as possible into each project – natural bloat. There is no significant cost differential between running a 20-question survey vs. a 40-question survey vs. an 80-question survey. The incremental costs are not 2x or 4x.
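
To make that concrete, here is a minimal sketch of the arithmetic, assuming a purely hypothetical cost structure in which fixed costs (design, programming, sample, analysis, reporting) dominate and each extra question adds only a small marginal fieldwork cost. Every dollar figure is an illustrative assumption, not real pricing from any client or agency.

    # Illustrative only: a hypothetical fixed-plus-marginal project cost model.
    # None of these figures come from real pricing; they are assumptions used
    # to show why doubling the question count does not double project cost.

    FIXED_COST = 20_000        # design, programming, sample, analysis, reporting
    COST_PER_QUESTION = 75     # assumed marginal fieldwork cost per question

    def project_cost(num_questions: int) -> int:
        """Total cost of one study under the assumed cost structure."""
        return FIXED_COST + COST_PER_QUESTION * num_questions

    for n in (20, 40, 80):
        total = project_cost(n)
        print(f"{n:>2} questions: total ${total:,}, per question ${total / n:,.0f}")

    # 20 questions: total $21,500, per question $1,075
    # 40 questions: total $23,000, per question $575
    # 80 questions: total $26,000, per question $325

In this toy model, going from 20 to 80 questions raises the total bill by roughly a fifth while the cost per question drops by about two thirds – which is exactly the bloat incentive described above.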

MR Agency / Sub Agency / Consultant:
Typically, there is a base cost for conducting research; beyond that, MR agencies, like most service-oriented businesses, are fundamentally dependent on time. Everyone knows the agency business is ultimately an arbitrage – pay X and charge Y – for time. The more complicated and lengthy a project is, the better for business. Duh! So there is no compelling reason for any agency to try to shorten a study – unless they felt the study was too large to succeed. In effect, there is absolutely no financially compelling reason why ANY MR agency would advise a client to shorten the study or break it into multiple parts. It would have the direct effect of simply reducing their billings.

Sample Provider:
OK – this entity in the value chain is probably hurt the most. However, since all panel providers price based on the length of the survey, they are (in theory) covered. The two axes that determine price for any sample provider are a) incidence — what kind of respondents you are looking for — and b) completion rates — directly a function of survey length. I have talked at great length with many sample providers about how much they HATE long surveys and non-interactive surveys. I understand — this is core to their business. They are about the only entity that actually cares about the respondent. However, keep in mind that relationship is not a typical vendor–client relationship. Lost revenue on incorrectly priced projects, as well as the bad taste respondents are left with vis-à-vis the panel provider, are compelling enough reasons for most panel providers to hate long studies. However, since they have factored in time, this should be reflected in the price they quote. But then competition kicks in, and we all have to do what we need to do to get a deal done!
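
As a rough illustration of how those two axes might flow into a quote, here is a sketch of a hypothetical cost-per-complete calculation. The base rate, incidence, and completion-rate numbers are assumptions made up for the example, not any sample provider's actual rate card.

    # Illustrative only: a hypothetical cost-per-complete calculation.
    # Real panel pricing is more nuanced; this just shows the direction of the
    # two levers mentioned above: incidence and completion rate.

    BASE_RATE = 3.00  # assumed cost per panelist who starts the survey

    def cost_per_complete(incidence: float, completion_rate: float) -> float:
        """Price per completed interview when only a fraction of starters
        qualify (incidence) and only a fraction of qualifiers finish."""
        return BASE_RATE / (incidence * completion_rate)

    # A short survey: broad audience, most starters finish.
    print(f"${cost_per_complete(incidence=0.50, completion_rate=0.85):.2f}")   # ~$7.06

    # A 35-question survey: same audience, more respondents bail out.
    print(f"${cost_per_complete(incidence=0.50, completion_rate=0.55):.2f}")   # ~$10.91

Under these assumptions the provider is covered only if the quote reflects the lower completion rate up front; misjudge the length, and the difference comes out of their margin.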

Technology Providers / Data Collector:
I can obviously attest to this part personally. Frankly, we don’t really care. The cost of storing more data (more questions) is insignificant. When we started our business, it used to matter, but once you achieve scale, it really does not. So, technically, we are ambivalent about this – from a pure economic standpoint, it does not cost us much more to run a 30-question survey vs. a 10-question survey. Since it does not cost us (or, for that matter, any other technology service provider) more, we are not in a position to charge more. So, in effect, we are enabling this phenomenon. I’ll admit to that. Some tech providers, and some of our own services, charge on a per-completed-response basis; again, this has absolutely no effect on the number of questions or the length of the survey.
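
For a back-of-the-envelope sense of why storage is a non-issue, here is a quick calculation using the volume and median length quoted earlier in the post; the bytes-per-answer figure is purely an assumption.

    # Illustrative only: back-of-the-envelope storage math.
    # Survey volume and median length come from the post above; the
    # bytes-per-answer figure is an assumption for the sake of the estimate.

    surveys_per_month = 1_960_000   # responses collected in March (from the post)
    questions_per_survey = 35       # median survey length (from the post)
    bytes_per_answer = 200          # assumed average storage per answer

    answers_per_month = surveys_per_month * questions_per_survey
    gb_per_month = answers_per_month * bytes_per_answer / 1e9

    print(f"{answers_per_month:,} answers/month = {gb_per_month:.1f} GB/month")
    # 68,600,000 answers/month = 13.7 GB/month

At those rough numbers, even a year of long surveys adds storage measured in the low hundreds of gigabytes – a rounding error next to the rest of the business.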

So, when we net it out, there is actually no one in the value chain who, from a strict economic standpoint, cares about the length of the survey. Cost does not even scale linearly with length — in fact, if you model it out, the cost per question falls as the survey gets longer. It is cheaper to run longer surveys (on a per-question basis).

Customer sat studies used to be overly long and terrifying – until Bain, McKinsey and BCG told all the CEOs of the world that the only thing they should care about is the single question “How likely are you to recommend?” Will Kantar or IPSOS post a “You shall not ask more than 20 questions at a time” rule? I doubt it. Long surveys are here to stay – and we all like it – that is, unless someone asks us to take one!

6 comments on “Long Surveys are here to stay – It's the economy, stupid!”
  1. Ivana Taylor says:

    As I read this, I’m reminded of a similar comment that’s said about marketing copy — “No one wants to read long copy.” Yet the data tell a different story. Truth is people will read the right long copy — as long as it’s about them and what they care about.

    Maybe survey length is the same way. People will answer good questions. I honestly don’t know if that’s true. If anyone else has ideas, please share.

  2. Thanks for starting the conversation. Length of survey is important for sure, but I don’t think the number of questions is really an adequate determinant. Number of question parts is a good starting point, a question part being defined as a single question or an option in a multi-part question (select all that apply or matrix / grid). One of my recent surveys had about 39 questions, 197 question parts, and took about 18 minutes to complete. Longer than I’d recommend, but it had a good response rate. When I evaluate a survey prior to programming, or to encourage a client to shorten the questionnaire, I use a secret formula based on number of question parts, complexity of the question, number of open-ends, and a few other subjective elements. My prediction is usually within 10% for those going all the way through the survey in one pass. But the results are complicated by those who stop and start the survey in the middle, whether using a capability to pause then restart from the survey link, or just multi-tasking.

    Question complexity is really important in determining time, and in general for good results. If respondents are scratching their heads over the language, they may guess, or just bail out completely. Peer review and testing the survey should help head off this issue, but sometimes you just can’t tell until the survey is piloted. In this case it is helpful to have a fall-off report, or some other way to determine which questions are causing people to terminate early. Of course, sometimes you expect terminations (as with panel disqualifications and over quotas), so don’t be fooled. Even if overall response rate looks good, reviewing the results at the end of the project might help you to improve question wording for next time.

    Complex question types can also cause problems even if the language is fine. It might look nice to have a ranking question that allows the survey taker to use arrows or drag options around, but ranking any more than a few choices is very time consuming. And it is easy to make matrix questions too long. Ask yourself if all those options are different enough to be worth including. My second example is a survey that had 45 questions but only about 170 question parts (with no open-ends), yet took over 28 minutes to complete because of too many matrices and complex wording.

    Finally, number of pages has some bearing on effective survey length, particularly if the survey takers don’t have broadband. Sitting in our well-connected offices, it is easy to forget that every new page takes its toll. For more on optimizing the number of pages, there’s an article on my site at http://www.5circles.com.

    Idiosyncratically,
    Mike Pritchard, 5 Circles Research
