In consumer research, double-dipping is a common type of fraud that inflicts a double indignity on clients. When one panelist takes the same online survey twice, the duplicate answers have no validity and they taint the entire data set. The client is being gamed by dishonest survey-takers, and let down (to put it politely) by a panel provider whose loose recruitment methods allowed the dishonesty to occur.
It should go without saying that consumer panels must consist of unique individuals whose collective responses reliably reflect the overall population whose behaviors and attitudes toward a product or experience are under study. But some providers (and, apparently, some research firms and brands) seem to think that cutting corners on panel security and panel quality is no big deal. Duplicate responses are treated as a reasonable cost of filling sample needs for a price that’s nice and cheap. But what’s the cost when unwitting executives are led to rely on analysis and recommendations based on distorted survey data? Brands deserve quality data because they need to make quality business decisions.
A common recruitment method called “panel routing” opens the door wide to errors and abuses. It arises from widespread desperation to fill quotas for online surveys in the face of the boredom or alienation of a public that wants to conduct its personal business conveniently on mobile instead of being glued to a desktop or laptop. That desperation is evident in the title of one article that recommends panel routing: “Can Someone Please Complete My Survey? Routing Makes it Easier.”
“Finding willing survey participants continues to be more and more difficult,” the author wrote, and routing “brings the respondent to the survey or surveys.” Putting it less delicately, routing is a cattle call that recruits survey-takers without vetting them in any serious way. Panelists who enter an online routing funnel provide, at best, some quick information about themselves before being sent to the first desktop or “mobile-optimized” survey for which it’s assumed they might qualify. And since panel routers serve many clients, they often invite respondents to keep coming back indiscriminately for more surveys, creating bias from overuse.
In a striking recent post on the GreenBook blog, Brian Lamar, an experienced data quality consultant, said he’s become so tired of sloppy panel recruitment, among other poor practices, that he had to speak out bluntly. “I see a lot of bad research, unfortunately, both in my day job [evaluating] data quality as well as when I take surveys in my spare time,” Lamar writes. “Respondents routinely answer the same question over and over as they’re routed from sample provider to sample provider. And this bad research isn’t from [the less well-known] companies you would expect – they’re from names all of you have heard of: big brands or big market research companies…It makes me sad, and…you should be sad or angry as well.”
The good news is that there is a way forward: offline, in-app mobile survey technology, combined with a unique, dedicated panel whose members take surveys solely on their smartphones. That keeps the entire process out of the uncontrollable online environment. A smooth-functioning survey app such as Surveys on the Go® attracts a large and diverse array of panelists because it meets them in the offline, in-app mobile zone that today’s consumers favor (Americans now average about 2 ¼ hours a day using mobile apps – 3 hours for Millennials). Here are a few of the specific safeguards that in-app, offline mobile provides to prevent duplicate responses and other types of panel fraud:
- Each mobile device has its own unique identification code, and the survey technology won’t allow duplicate attempts from the same phone.
- Panelists suspected of deliberate attempts to game the system will be bounced from the panel for good, along with those who give answers that are thoughtless and mechanical rather than honestly responsive. Data quality is too important to tolerate poor respondents for the sake of quota-filling numbers.
- Also unique to an in-app mobile panel is your ability to verify its members’ engagement simply by checking users’ unsolicited public ratings and reviews of the app at Apple’s App Store and Google Play.
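The first safeguard above — blocking duplicate attempts from the same phone — can be sketched in a few lines of code. This is a minimal, hypothetical illustration of device-level deduplication, not the actual Surveys on the Go® implementation; the `SurveyGate` class and its method names are invented for this example:

```python
# Hypothetical sketch of device-level duplicate blocking.
# Not the actual Surveys on the Go implementation.

class SurveyGate:
    """Tracks which device IDs have already completed each survey."""

    def __init__(self):
        # Maps survey_id -> set of device IDs that have responded.
        self._completed = {}

    def try_submit(self, survey_id, device_id):
        """Record a submission; return False if this device already responded."""
        seen = self._completed.setdefault(survey_id, set())
        if device_id in seen:
            return False  # duplicate attempt from the same phone is rejected
        seen.add(device_id)
        return True


gate = SurveyGate()
print(gate.try_submit("survey-42", "device-abc"))  # True  (first attempt)
print(gate.try_submit("survey-42", "device-abc"))  # False (duplicate blocked)
print(gate.try_submit("survey-42", "device-xyz"))  # True  (different device)
```

In practice the device identifier would come from the mobile OS (each phone exposes an app-scoped unique ID), and the seen-set would live in a server-side store rather than in memory, but the gating logic is the same: one device, one response per survey.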
There’s a lot more to say about data quality and how it’s endangered by duplicate responses and other forms of online panel fraud. To continue the conversation, just contact us at firstname.lastname@example.org.