The case began with Jim Kohl, Director of Consumer Insights at Career Education Corporation (CEC), a postsecondary education provider with campus-based and online curricula. Over the past several years, Kohl had conducted an internal "Voice of the Customer" research project and found that while his students had a great online educational experience, their online research experience was subpar. Additionally, as CEC moved forward with a tracking study, Kohl was concerned whether a solely online/mobile research study would be representative of his student population. One of Kohl's main objectives was to ensure the insights were indicative of CEC's customer base.
To address these issues, Kohl partnered with Added Value, a full-service Top 50 Honomichl research company. At FOCI14, Brian Kushnir (EVP, Managing Director) and Wai Leng Loh (VP) of Added Value presented three key sampling and methodological takeaways from CEC's current online/mobile tracking study:
First, KNOW THY SAMPLE.
The mobile and online respondents sampled for the tracking study were representative of students who used CEC's services in terms of demographics and online behavior. The online/mobile tracking study therefore rested on a representative sample that did not exclude core customers, and any insights derived from it would accurately reflect the target audience's opinions. Interestingly, the in-progress study found that the demographics of iPhone users tend to skew slightly higher than those of Android users (higher income, higher age).
Second, BE DESIGN AGNOSTIC.
Survey takers should be able to participate in surveys seamlessly, anywhere, anytime, regardless of platform (online, smartphone, tablet, etc.).
Third, LET IT GO.
As researchers, we often like to ask tons and tons of questions in order to gather as much data as possible, so that our results "stick." However, new evidence suggests mobile device users are more engaged with their devices and, consequently, less willing to spend as much time taking surveys on them. As such, we as researchers need to "Let It Go" when conducting mobile research by: (1) shortening survey questions, (2) limiting buttons and images within the survey, and (3) reducing the questionnaire to basic common-denominator questions within select categories.
In other words, a traditional survey will suffice in the online world; with mobile, however, the same survey should be streamlined and broken down to its basic elements to enhance completion rates while keeping both the online and mobile portions of the study intact.
Online vs. Mobile Survey Design
Case in point: although CEC's study is still in the field, the mobile research methodology described above has already enhanced the user experience and improved completion rates. It will be interesting to see the final results of the online/mobile tracking study as it moves forward.
Chris Ruby is an award-winning Marketing Research & Consumer Insights Executive with Fortune 500 consulting experience. His niche is the ability to turn complex data into compelling stories that induce a call for action among key decision-makers. His work has been featured by MRA, MRIA, IIR, Norstat Times, Chadwick Martin Bailey & the Optimization Group. Keep up with Chris Ruby by following him on Twitter @ChrisRubyMRX or by reading the Chris Ruby Market Research Blog.