Adaptive Questions®

Get More Answers with Fewer Questions.

The Case for Shorter Surveys

Three-fourths of market researchers admit that surveys are too long, according to the 2014 Annual Survey of Market Research Professionals sponsored by Market Research Careers.

About 20% of professional survey takers surveyed in 2015 say that a survey of 15 to 19 questions is “getting too long,” and the same percentage won’t even consider starting a survey with 20 questions. Virtually none of these survey takers think a survey under 10 questions is too long.

A typical Adaptive Survey® contains fewer than five questions.

Adaptive Survey® Defined

An Adaptive Survey® is a market research technology that lets researchers get more answers from fewer questions. The technology addresses the length concerns that frustrate respondents, the cost concerns of market researchers, and the quality concerns of the decision makers who consume market research reports.

Adaptive Survey® Benefits
Adaptive Survey® Process

The overall planning and implementation process is the same as any other online survey. For many market researchers, the process includes…

  1. Discovery and planning: Business decision definition, information needed to make that decision, sample characteristics, segments of particular interest, available data
  2. Project Development: Questionnaire design, sampling design, online survey development, testing, approval
  3. Fielding: Invitations and monitoring returns
  4. Results: Analysis, reporting and presentation
Respondent Experience

Respondents see an online survey that looks much like any other survey – except much shorter. The Adaptive Question® can be any broadly worded question and includes a randomized list of answers submitted by other respondents; the list mixes high-ranking answers with new answers. Respondents are asked to select the answers they agree with and drag them into priority order. They can also specify answers not on the list, so the question looks familiar – similar to an ordinary multi-select question with an other-specify option.

Example Adaptive Question® Before Respondent Answers

Example Adaptive Question® After Respondent Answers

How it Works

While the process looks simple to respondents, there are multiple complex issues resolved behind the scenes.

Overall Results

CloudMR™ automatically sorts out the answers and presents the top answers on a 2x2 matrix. The priority answers that are most popular are located in the upper right quadrant. Niche answers that are important among a smaller group are in the upper left.
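As a sketch of the sorting step, the 2x2 placement can be reproduced from each answer's Agree and Priority percentages. The median cut points below are an assumption for illustration only; the thresholds CloudMR™ actually uses are not published.

```python
from statistics import median

def quadrant_map(answers):
    """Place answers on a 2x2 matrix: the horizontal axis is Agree %
    (popularity) and the vertical axis is Priority %.

    `answers` maps answer text -> (agree_pct, priority_pct).
    The cut points (medians of each axis) are an assumption for
    illustration; the product's actual thresholds are not published.
    """
    agree_cut = median(a for a, _ in answers.values())
    priority_cut = median(p for _, p in answers.values())
    placed = {}
    for text, (agree, priority) in answers.items():
        vertical = "upper" if priority >= priority_cut else "lower"
        horizontal = "right" if agree >= agree_cut else "left"
        placed[text] = f"{vertical} {horizontal}"
    return placed
```

High-priority answers with broad agreement land in the upper right, while niche answers (high priority, lower agreement) land in the upper left, matching the quadrants described above.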

Adjustment for Number of People Who Reviewed the Answer

Answers with a large base are more certain than answers with a small base. Agree and Priority percentages are therefore adjusted for the number of people who saw each answer, using the exact confidence interval described by C. J. Clopper and E. S. Pearson. CloudMR™ computes a two-sided 90% confidence interval on the raw Agree and Priority percentages and plots the lower bound of that interval, so we are 95% confident that the actual values are at least as high as the values shown.
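The adjustment can be sketched with the standard Clopper–Pearson formula, whose lower bound comes from the inverse Beta CDF (here via SciPy). Beyond the 90% interval stated above, any further parameters CloudMR™ applies are not published.

```python
from scipy.stats import beta

def clopper_pearson_lower(successes, trials, confidence=0.90):
    """Lower bound of the two-sided Clopper-Pearson (exact) interval.

    Plotting this bound means being 95% confident (for a 90% two-sided
    interval) that the true proportion is at least this high.
    """
    if successes == 0:
        return 0.0
    alpha = 1.0 - confidence
    return float(beta.ppf(alpha / 2.0, successes, trials - successes + 1))
```

For example, 8 of 10 and 80 of 100 both give a raw 80% Agree rate, but the larger base produces a higher adjusted value, so well-reviewed answers rank above thinly reviewed ones.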

Crowd-Sourced Grouping of Top Answers

Sometimes respondents add answers that are duplicative. For example, one respondent might say they are concerned about ‘the color’ and another might say ‘offer it in blue.’ Researchers may want to combine these answers since both concern color. Various researchers call this process ‘coding,’ ‘netting,’ or ‘building themes’ – a fairly tedious prospect when dealing with open-ended questions. Our system groups all suggestions using a patent-pending crowd-sourcing technology that identifies the general areas that need more resources.

Adaptive Survey® Technology Identifies Top Themes Even If Only One Person Mentions it

It is interesting to note that the 2nd most actionable answer was mentioned by only one person, but other respondents who saw it voted it into the top ten. This contrasts with traditional open-ended coding, where an answer mentioned only once is usually overlooked.

In this example, the top ten themes are the same after coding the top 25 answers and after coding all 223 answers. Coding the top 50 answers produces an identical result as coding all comments – the same top 10 answers in the same order.

CloudMR™ develops an Action Score™ for every segment based on information in your survey or from uploaded data. The Action Score™ is used to sort answers in priority order.
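The Action Score™ formula itself is not published in this document, so the sketch below uses a hypothetical score – a simple product of the (already base-adjusted) Agree and Priority percentages – purely to illustrate how scores can drive the priority sort.

```python
def action_score(agree_pct, priority_pct):
    """Hypothetical Action Score on a 0-100 scale.

    The real formula is not published; multiplying the base-adjusted
    Agree and Priority percentages is one plausible combination that
    rewards answers scoring high on both dimensions.
    """
    return round(agree_pct * priority_pct / 100.0, 1)

def prioritize(answers):
    """Sort answers (text -> (agree_pct, priority_pct)) best first."""
    return sorted(answers, key=lambda text: action_score(*answers[text]),
                  reverse=True)
```

Under this sketch, an answer with 80% agreement and 90% priority scores 72.0 and outranks one with broad agreement but low priority, matching the behavior described above.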

Action Score™ Presentation

Action Scores™ are presented in a heat-map style. Scores are sorted by the Total column, with the best (bright green) answers at the top. Bright green ‘best’ cells indicate both high agreement and high priority, which together produce a high Action Score™; lower scores indicate low agreement, low priority, or both. If all segments behaved identically, the entire chart would look similar to the Total column, so the display makes it easy to spot disagreement between segments: look for cells that stand out or seem out of place. In this example, notice the two red cells in the top row among new customers and among those who are less likely to recommend this company, and the green cells near the bottom among those who are more likely to recommend now.

Results of an Independent Evaluation of Adaptive Technology™

An independent evaluation was conducted by the W. P. Carey School of Business at Arizona State University. Professor Raghu Santanam ran two parallel surveys – one traditional survey using 20 rating scales and 6 other questions, and one Adaptive Survey® consisting of one Adaptive Question® and 5 other questions. Both surveys were designed to determine demand for improvements to the 20 top smartphone features. The respondent profiles for the two surveys are statistically identical.

One key finding is that three of the top ten answers (#1, #3, & #7) identified by the Adaptive Questions® were not even anticipated by the traditional method.

In the summary of results, Professor Santanam says…

Sign Up for Free