
Here’s the Score for In-App Mobile vs. “Mobile Optimized” Surveys

 


 

Imagine a basketball team that beat a rival 131 to 26. Which would you bet on the next time they played?

 

In fact, 131 to 26 is the real-life score in the showdown between mobile app use and mobile web use. According to eMarketer, the average U.S. adult smartphone user spent 131 minutes a day accessing content with an app during 2016, and just 26 minutes using the phone to connect to the web. The blowout is expected to get even worse: the estimated engagement score for 2017 is mobile apps, 145; mobile web, 26.

 

Why does this matter for market research? Because it’s an important distinction insights professionals need to understand as they decide what to do about mobile research. As you plunge into mobile, you’ll need to choose between two approaches: in-app mobile surveys, or online surveys in which a smartphone user clicks a link to a survey housed on the web. The latter approach is commonly referred to in the industry as “mobile optimized” research.

 

So why does the scoreboard read mobile apps 131, and “mobile optimized” 26? There are a number of reasons, all of them having to do with how frustrating it is for mobile consumers to use their phones to connect to the web. For now we’ll focus on just one common problem: slow survey downloads.

 

In basketball terms, mobile apps enjoy a huge advantage in team speed, and it’s reflected in that 131 to 26 usage gap between in-app and mobile web.

 

It’s a truism that today’s consumers want instant experiences and hate waiting for anything. In-app survey performance gives them the speed they demand. “Mobile optimized” surveys too often leave them twiddling their thumbs while their phones try to connect to an online survey site. When they finally get there – if they wait out a slow download – the wasted time is apt to undermine their engagement with the survey.

 

SOASTA, a provider of performance testing services, examined how load delays affect mobile shoppers’ engagement when they access retail websites. The data are as relevant to consumer surveys as they are to e-commerce shopping: being forced to wait is a big turn-off, regardless of the specific task you’re trying to accomplish.

  • 53% of mobile visitors to online sites will leave a page that takes longer than 3 seconds to load.
  • 28% won’t return to a slow site.
  • 47% of consumers browse retail sites on their phones, but only 20% use them to complete purchases – with slowed mobile-web transaction speeds a key factor.
  • The “bounce rate,” a strong indicator of engagement, is 51.8% on the mobile web (a user is said to “bounce” if he or she doesn’t click further after reaching a site).

When it comes to speed, the study concludes, “user expectations are extremely high … [and the] user patience threshold is low … even milliseconds can matter.” The report’s takeaway: “You need to understand how real users are experiencing your site, and how even small or intermittent slowdowns could be hurting your business.”

 

The same dynamic of high expectations and low patience seen among mobile shoppers applies to survey-takers as well. The outcome you’re seeking is a high response rate with high engagement – one that yields reliable data without undue angst for researcher and survey-taker alike. To make up for their lack of efficiency and engagement, online survey suppliers are compelled to sacrifice quality for the sake of volume: they resort to panel-sharing to achieve the completes and demographic quotas a study requires. For the client, that often means poorly validated respondents who’ve been recruited in a non-transparent, catch-as-catch-can fashion. Surveys that don’t force respondents to wait naturally achieve far better response rates – which means faster projects, better engagement, and more accurate, reliable data.

 

If you’re experiencing angst over how your online surveys are being sourced, and whether the panel you get can truly represent what consumers actually think and feel, it could be time to look into in-app mobile. You already know the score: 131 to 26. For details on how innovative in-app, offline mobile research solutions can meet your specific needs, just get in touch at solutions@mfour.com.