Research and sample quality

In this article, you'll find information about:

  • Where does Zappi source sample from?
  • What are Zappi’s primary considerations when sourcing sample?
  • How does Zappi ensure high-quality insights?

Where does Zappi source sample from?

At Zappi, we know our insights are only as good as the data we extract them from. We take data quality very seriously, sourcing respondents for our surveys from world-class, ESOMAR-certified panel providers. We partner with both global and local providers: some manage pure double opt-in panels, while others use a mix of double opt-in panels and affiliate networks/sources.

Because we work through a network of panel partners, we rely on those partners to handle all aspects of panel management, including panelist recruitment and validation, incentive management, and panelist interactions. In return, our partners ensure that all Zappi work is managed to meet their panel best practices and follows the terms and policies agreed with their panelists.

We conduct regular research-on-research to evaluate the stability of the results our suppliers offer, particularly when entering new markets.

What are Zappi’s primary considerations when sourcing sample?

Some of our typical considerations are:

  • Minimum quality requirements, both in terms of the quality of the responses we receive and in terms of the continuity and stability of results offered
  • Market availability
  • Feasibility
  • Targeting capabilities
  • Depth of profiling measures
  • Cost & fees
  • Support SLAs
  • Project management SLAs (if applicable)
  • Fit with Zappi requirements and typical study types
  • Team synergy and working relationships
  • Technical aptitude (if applicable)
  • Technology commitment to the roadmap (if applicable)

How does Zappi ensure high-quality insights?

Our operating model is built on three pillars: data quality, sample consistency, and survey design. We pride ourselves on producing research that has a positive impact, so we build quality into your data from the start.

Data Quality

Data Quality measures are present on every single survey we run.

We start with ensuring our partners provide the highest quality sample, then we take it a step further with our own quality measures:

  • Vetting: Our suppliers meet ESOMAR and ISO certification levels
  • Bot detection strategies: Is the respondent real?
  • GEO-IP fingerprinting: Is the respondent being honest about their location?
  • Deduplication and respondent quarantine: Is this a unique respondent?
  • Speeder analysis: Is the respondent engaged, not just speeding through?
  • Adherence to all regional regulations, including GDPR
  • Evaluation of open-ended responses to identify “gibberish” verbatim responses or repetition. We dynamically detect poor quality verbatims as our surveys complete. Examples of poor quality verbatims include:
    • Gibberish answers (where a respondent has just smashed their keyboard)
    • Illogical responses (where the words make sense, but they don’t form a coherent answer to the question)
    • Repetitive answers (where the respondent types the same thing to every response across the survey)
  • All respondents deemed poor quality are referred back to our providers with the appropriate behavioral flag.
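To make the checks above concrete, here is a minimal sketch of how speeder detection and verbatim screening can work. Everything in it is illustrative: the heuristics, thresholds (such as the one-third-of-median speeder cutoff), and function names are hypothetical, not Zappi's production implementation.

```python
import re
from collections import Counter

def is_speeder(duration_s: float, median_duration_s: float) -> bool:
    """Hypothetical speeder rule: flag completes faster than one third
    of the median survey duration. The cutoff is illustrative only."""
    return duration_s < median_duration_s / 3

def looks_like_gibberish(text: str) -> bool:
    """Heuristic: keyboard-mash answers tend to have very few vowels
    or long runs of consonants. Thresholds are illustrative only."""
    letters = re.sub(r"[^a-z]", "", text.lower())
    if not letters:
        return True
    vowel_ratio = sum(c in "aeiou" for c in letters) / len(letters)
    return vowel_ratio < 0.2 or bool(re.search(r"[bcdfghjklmnpqrstvwxyz]{5,}", letters))

def is_repetitive(answers: list[str]) -> bool:
    """Flag respondents who type the same thing to most open-ended questions."""
    normalized = [a.strip().lower() for a in answers if a.strip()]
    if len(normalized) < 2:
        return False
    most_common_count = Counter(normalized).most_common(1)[0][1]
    return most_common_count / len(normalized) > 0.8

print(is_speeder(90, 600))                        # True
print(looks_like_gibberish("asdfgjkl"))           # True
print(is_repetitive(["good", "good", "good"]))    # True
```

A real system would combine several such signals before quarantining a respondent, rather than acting on any single flag.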

Exclusions

Due to the high-volume nature of online research, our standard recommendation is to set exclusions (sometimes known as “lockouts”) on a project-by-project basis. Longer or more complex lockout requirements may be added within a program, or wave of work, on request at launch.

All surveys within an order on Zappi are automatically placed in an exclusion group. This can be tailored further, e.g. based on the campaign, through manual intervention by our Solution Architects team. Customers must request this from their Customer Success representative before projects launch.
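Conceptually, an exclusion group is just a shared lockout list: a respondent who has completed any survey in the group is screened out of the others. The sketch below is a hypothetical illustration of that idea (the identifiers and data structures are invented, not Zappi's actual system).

```python
# Hypothetical lockout registry: exclusion group id -> respondent ids
# who have already completed a survey in that group.
completed: dict[str, set[str]] = {}

def record_complete(group_id: str, respondent_id: str) -> None:
    """Register a completed interview against its exclusion group."""
    completed.setdefault(group_id, set()).add(respondent_id)

def is_locked_out(group_id: str, respondent_id: str) -> bool:
    """A respondent may only enter surveys in groups they have not completed."""
    return respondent_id in completed.get(group_id, set())

record_complete("order-123", "resp-a")
print(is_locked_out("order-123", "resp-a"))  # True: already completed this group
print(is_locked_out("order-123", "resp-b"))  # False: still eligible
```

Tailoring exclusions by campaign, as described above, amounts to choosing which group id a new survey is registered under.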

Sample consistency

To ensure consistency and quality while utilizing multiple sources, we maintain a set sample blend by product and country. These blends are determined when launching any new product or market and are based on several factors including quality, capacity, speed, and automation ability.  

We deliver consistent sample by utilizing quota sampling, managed by our automated sample platform and integrated panel partners.

To ensure a representative and consistent sample frame, our standard practice is to set quotas by product, including controls for variables such as provider blend, respondent device type, age, and gender. These quotas along with in-survey screening questions are determined in collaboration with our clients when defining and setting up their target audiences. 

  • We use the same supplier within each combination of client, product, and country, to make sure that our customers get consistent results
  • Consistency for comparison: a consistent audience enables you to compare results across multiple surveys
  • Flexibility without sacrificing scale: With our consistent screening questions and quotas, you get personalized audiences without putting longitudinal data quality at risk.
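Quota sampling as described above can be pictured as a set of counters, one per quota cell, that stop admitting respondents once a cell's target is reached. The sketch below is purely illustrative: the quota variables and targets are hypothetical examples, not Zappi's actual blends.

```python
from collections import Counter

# Hypothetical quota targets per cell (variable, value) -> required completes.
targets = {
    ("gender", "female"): 50,
    ("gender", "male"): 50,
    ("device", "mobile"): 60,
    ("device", "desktop"): 40,
}
filled: Counter = Counter()

def try_admit(respondent: dict) -> bool:
    """Admit a respondent only if every quota cell they fall into is still open."""
    cells = [(k, respondent[k]) for k in ("gender", "device")]
    if any(filled[c] >= targets[c] for c in cells):
        return False  # at least one of their cells is already full
    for c in cells:
        filled[c] += 1
    return True

print(try_admit({"gender": "female", "device": "mobile"}))  # True: all cells open
```

Holding the same targets (and the same provider blend) constant across waves is what makes results comparable from one survey to the next.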

Survey Design

Survey Design and respondent experience are crucial first steps in ensuring great quality, engaged responses. When designing surveys, we ensure the following aspects of survey design are incorporated:

  • Short length: The most engaged responses from our panels come in the first 10-12 minutes of a survey, so our surveys take no longer than 10 minutes
  • Low dropout: We mandate a 20% minimum inclusion rate in all surveys
  • Mobile optimized: All our surveys are mobile-optimized, so respondents can access the survey and have a consistent experience, regardless of the device.

We regularly ask respondents how they find their experience of completing our surveys, and spend a lot of time improving both the enjoyment and ease. Our latest research shows that, in terms of enjoyment, 68% of respondents gave a score of at least 8 out of 10, and in terms of finding our surveys easy to complete, 72% of respondents gave a score of at least 8 out of 10.