1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

mypinio independently operates its own panel, which offers a high-quality global sample. Our proprietary panel is the single technology behind all of the market research services we offer. mypinio focuses solely on market research and does not provide other services, such as direct marketing. We have been providing innovative solutions since 2022 and continue to deliver high-quality respondents to our clients.

2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?

We work with data scientists and mathematicians with strong backgrounds in statistics who continuously monitor and update our sampling algorithms to ensure clients receive the high-quality, unbiased sample they need. We are also driven to provide the best panelist experience, defined by an efficient, transparent, and rewarding experience every time. Our operational teams have a comprehensive understanding of research processes and requirements. Our board of directors also includes an AI expert who advises on implementing the best and smartest practices at each step of our business, including sampling algorithms and related automated functions.

3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

We provide our clients with sample sourced from our global inventory of panelists. We also provide dynamic sampling in 100+ countries through API/programmatic sampling.

4. Using the broad classifications above, from what sources of online sample do you derive participants?

Our sample source is our own Research Panel (mypinio) which offers users rewarded surveys. We build our panel from a wide array of high-quality recruitment sources.

5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a Buyer?

mypinio (www.mypinio.com) is our proprietary panel, and our network of partners is private, with exclusive and deep relationships. The percent share of each in the total sample depends on a variety of factors, including the country, sample size, target audience, demand, and time in the field. This proportion continues to evolve as our global reach grows via both our panel and partner network. While we do not break out sample percentages on a project basis, we can maintain sample source composition for data consistency. Our sample sources are leveraged according to client/project needs to ensure the right audiences are interviewed and the highest value insights are extracted. The overall sample composition depends on the project specifications.

6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

We built our panel from a broad range of high-quality recruitment sources to ensure a diverse composition of representative sample as well as audiences that are hard to reach. We use a variety of channels to recruit panelists, ranging from integrations with a wide range of apps and websites to social media, display advertising, affiliate marketing, and influencer marketing. Across all recruitment channels, we operate an ‘open to all’ process to recruit a diverse and representative composition of panelists. We regularly gauge the depth and diversity of our panelists to ensure we offer clients the best sample, capable of fulfilling both general and niche audience requests.

7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organization and the technologies you are using. Please try to be as specific and quantify as much as you can.

  • Pre-panel: vetting of recruitment sources & quality screening on sign-up
  • Unique emails identification: Panelists are not able to create multiple accounts with the same email address.
  • Unique contact details detection: At panel registration, we collect data points such as first name, last name, postal code, phone number, and email address. Using this data, we can identify panelist accounts that are very likely to be duplicates and act automatically. Accounts must be active and in good standing to take surveys.
  • CAPTCHA security code validation: Prevents automated registration. It is applied at panel registration.
  • Double-opt in email confirmation: This ensures validity of the email address provided. Respondents who want to join our panel receive a confirmation link on the email address provided. Once they click on the confirmation link, they are allowed to continue.
  • Email domain validation: “Disposable email” providers (websites that generate email addresses that are available for only a few minutes or a limited number of messages) are not allowed; accounts using these domains cannot register. We use multiple sources to detect emerging fraudulent email domains and continuously update our validation.
  • Detection of anomalies and patterns in panel registration data: Accounts having multiple elements in common are blocked for abuse. The data are aggregated at the recruitment-source level and combined with checks from third-party partners.
  • Duplicate devices detection with digital fingerprinting: a third-party digital fingerprinting tool gathers many data points from a respondent’s device, such as operating system version, browser version, plug-in, etc., and assigns a relative weight to each data point. The data gathered is put through machine learning models and algorithms to create a unique digital fingerprint of each computer. We do not allow respondents with the same digital fingerprint into the same survey.
  • Suspicious activity check (including cross-industry activity): We also deploy a pre-survey fraud probability analysis to determine if the respondent is suspicious based on device and IP address-linked data points. Respondents with high-risk scores are denied entry into a survey. The data is aggregated and continuously monitored for other suspicious data patterns such as country hopping, user agent masking, IP address masking, and click farm activity. Once the system detects such patterns, the respondent is removed from the panel.
  • Country Geo-IP validation: A respondent connected from an IP outside the survey country is not allowed to participate in the survey.
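As an illustrative sketch only (not our production code; the function name, signature, and rejection reasons are assumptions), two of the entry checks above, duplicate-device detection and country geo-IP validation, might be combined like this:

```python
# Hypothetical sketch of two survey-entry checks described above.
# Names and logic are illustrative assumptions, not production code.

def check_entry(fingerprint: str, ip_country: str, survey_country: str,
                seen_fingerprints: set) -> tuple:
    """Return (allowed, reason) for a survey-entry attempt."""
    # Duplicate-device detection: the same digital fingerprint
    # may not enter the same survey twice.
    if fingerprint in seen_fingerprints:
        return False, "duplicate device"
    # Country geo-IP validation: the connecting IP must resolve
    # to the survey's target country.
    if ip_country != survey_country:
        return False, "geo-ip mismatch"
    seen_fingerprints.add(fingerprint)
    return True, "ok"
```

In practice these checks run alongside the fraud-probability scoring and third-party fingerprinting described above; the sketch shows only the gating order, with device uniqueness checked before geography.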

8. What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.

Our global brand is mypinio (www.mypinio.com). Our single-panel technology enables us to track a panelist as a unique user irrespective of the product they use to access our panel. 80%+ of our respondents access using a mobile device.

9. Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

mypinio offers an API integration to access its high-quality global sample inventory.

10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?

mypinio uses only its proprietary panel to source sample for our clients, and we do not buy sample from third-party sources. We recruit our panelists through a wide range of high-quality recruitment sources to ensure a diverse pool of respondents. We can maintain consistency across studies by recreating a sample frame according to the requirements defined by our clients.

11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop only questionnaires? Is it suitable to recruit for communities? For online focus groups?

Our panels consist of a large number of diverse members suitable primarily for online quantitative research. We can recreate sample and recontact individual respondents as requested by our clients. Our sample is suitable for surveys of up to 25 minutes in length and can be targeted by device and other criteria according to our client’s requirements.

Yes, it is suitable to recruit for communities and for online focus groups.

12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?

If the client is looking for a general population sample, we recommend that our clients develop quotas matching official census statistics for the country. We recommend using sub-quotas to generate representative quotas within a specific target population, which can be used to obtain a balanced sample. We control the number of completes for any sub-sample by setting quotas on the criteria defined together with the client. Upon achieving the quota target, the survey automatically closes in real time for respondents within the criteria of that quota.
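The real-time quota close described above can be sketched as follows; the `QuotaController` class, quota keys, and target values are illustrative assumptions, not our platform's implementation:

```python
# Minimal sketch of real-time quota control: each quota cell accepts
# respondents until its target count of completes is reached, then closes.
# Class and cell names are hypothetical.

from collections import Counter

class QuotaController:
    def __init__(self, targets: dict):
        self.targets = targets          # e.g. {("female", "18-24"): 50}
        self.completes = Counter()

    def is_open(self, cell) -> bool:
        """A cell accepts new respondents only while under target."""
        return self.completes[cell] < self.targets.get(cell, 0)

    def record_complete(self, cell) -> None:
        """Count a complete; once the target is hit, is_open() flips to False."""
        self.completes[cell] += 1
```

Sub-quotas (e.g. gender within region) would simply be additional cells keyed on the combined criteria agreed with the client.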

13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

Accurate profiling information is critical to our business. We collect the most relevant, up-to-date profiling information not only for the benefit of our clients, but also so we can provide our panel members with a more personalized, relevant experience. 

For these reasons, we take great care in how profiling is messaged to respondents and how frequently we collect this data.

At least 80% of our panel members are profiled on the following—all of which are derived from questions we ask them directly: 

  • Basic demographic information (i.e. age, gender, post/zip code) 
  • Computed GEO regions in each country (i.e. states, provinces, departments, MSAs, DMAs, districts, metro areas, etc.) 
  • Computed segmentations (Social Grade, Generations, Parents)

Top-10 essential profiling attributes (i.e. Education, Employment, Work Position, Ethnicity and Race where applicable, Marital Status, Monthly and Annual HHI, Household Composition, Number of Children under 18, Primary Grocery Shopper).

Beyond the Top-10 essential profiling attributes, 50% or more of our panel members are profiled on our Top-25 attributes.

Each routable profiling attribute is assigned an expiration date, in number of days. We assign these expiration dates based on the importance of the question to our clients’ needs, as well as the nature of the question itself (e.g. for “Have you purchased X product in the past 3 months?” we set expiration at 90 days). 

When a data point expires, we re-ask the question via our router or make it available for answering again in our Profiling Surveys.
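The expiry logic above can be sketched as follows. The 90-day window for the purchase-recall question is the example from the text; the other attribute name, its value, and the function name are assumptions for illustration:

```python
# Sketch of profiling-attribute expiry: each routable attribute carries a
# time-to-live in days; once exceeded, the question is re-asked.
# Attribute names and non-90-day values are hypothetical.

from datetime import date, timedelta

EXPIRY_DAYS = {
    "purchased_x_past_3_months": 90,   # matches the 3-month recall window
    "household_income": 365,           # assumed refresh cadence
}

def needs_reask(attribute: str, answered_on: date, today: date) -> bool:
    """True when a stored profiling answer has passed its expiration date."""
    ttl = EXPIRY_DAYS.get(attribute)
    if ttl is None:
        return False        # attributes without a TTL are not re-asked here
    return today > answered_on + timedelta(days=ttl)
```

When `needs_reask` returns True, the attribute would be queued for the router or for a Profiling Survey, as described above.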

14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

We need the following:

  • Country(ies) needed
  • Number of completed surveys 
  • Length of survey 
  • Incidence rate 
  • Qualification criteria 
  • Demographics Quota

Any deviations within +/- 10% on the factors above are acceptable for feasibility to hold true.

15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

We have developed predictive models and algorithms that allow us to make accurate assumptions about the feasibility of a project. We only take projects that we are confident we can deliver. We do not use third-party sources or sub-contractors.

16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.

mypinio developed a smart survey router that fetches and ranks surveys from marketplace inventories based on a selected set of 15 KPIs (e.g. CPI, LOI, IR). The router assigns a unique score to each survey and ranks it accordingly.

Respondents are either directed automatically to the top-ranked survey programmatically or, if they log in to our platform, they can pick and choose from the surveys that fit their demographic criteria on our offerwall.
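The ranking step can be sketched as below. The real router uses 15 KPIs; the three KPIs shown (CPI, LOI, IR), their weights, and the function names are illustrative assumptions only:

```python
# Hypothetical sketch of KPI-based survey ranking: each survey gets a
# single score, and respondents are routed to the highest-ranked one.
# Weights are assumed values, not production parameters.

def score_survey(cpi: float, loi_minutes: float, ir_percent: float) -> float:
    """Higher score = ranked higher: rewards pay and incidence, penalizes length."""
    return 2.0 * cpi + 0.5 * ir_percent - 0.3 * loi_minutes

def rank_surveys(surveys: list) -> list:
    """Sort survey dicts (keys: cpi, loi, ir) best-first by score."""
    return sorted(
        surveys,
        key=lambda s: score_survey(s["cpi"], s["loi"], s["ir"]),
        reverse=True,
    )
```

Under this sketch, the programmatic path would take `rank_surveys(...)[0]`, while the offerwall would display the full ranked list filtered by the respondent's eligibility.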

17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

Yes, the limit is a maximum of three routing attempts.

18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

Information shared with the panelist about the project prior to survey initiation includes the survey length and anticipated reward that would be distributed upon completion.

19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

Yes, we allow participants to choose a survey from a selection of surveys. We only show panelists surveys they are eligible for based on their profiling information, survey qualifications, and quotas. We share the reward and the expected survey length with panelists.

20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

For each project, incentive levels are determined based on subject matter, time commitment (i.e. length of survey), and incidence rate. In exchange for their time and opinions, panelists are free to select their preferred rewards from the available offers, and these are delivered in a timely manner to ensure satisfaction. We can adjust incentives during a survey and can flag this at the participant level in the dataset. Typically, this would be done to fairly reward our respondents if we notice a survey is taking longer than expected. 

21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

Yes, we do measure participant satisfaction. Completion and dropout rates are essential factors in measuring respondent satisfaction. After a drop-out, we ask participants why they did not complete the survey so we can better understand how to avoid negative experiences. After each survey, respondents may rate their survey experience. Upon request, we may share a respondent satisfaction report.

22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

We prepare a report of the respondents and results according to the client’s requirements.

23. How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

Our panel respondents cannot participate in the same survey more than once and are limited to completing five surveys in a single day. This is true across all sample sources and made possible through technology that we developed to ensure each respondent is unique and recognizable, even if they attempt to take a survey from a different source. This is aligned with ESOMAR industry practices guidelines.

24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?

For individual participants, we maintain records on: 

  • Date of registration 
  • Last login 
  • Last survey started 
  • Survey participation history 
  • Incentive History 
  • Site activity 
  • Profiling 

We can provide project-level analysis of this data upon request. Profiles of our panelists are appended to participant records, and third-party data can also be appended upon request. 

25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

We continuously assess the quality of our respondents, from the time of recruitment to each point of entry into a survey. We constantly monitor more than 30 data points to confirm our participants’ identity, including:

  • Meta-data analysis: connection type, IP-geo proximity, IP-hopping monitoring, user agent, device ID, and digital fingerprinting 
  • User-data analysis: test questions, profile consistency, CR/LOI deviation, and response speed 
  • Pattern recognition: similar profiles, similar behavior, data correlation, and data clusters 
  • Third-party tools such as Research Defender

In addition to these automated processes, we have a dedicated quality assurance team that works solely on monitoring quality aspects on all levels.

26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

Feasibility is the key to managing sample source consistency. At the start of a project, feasibility is used to determine how we can consistently deliver completed surveys from a specific source. If there are additional waves of research, each wave is run with the same methodology as the initial assessment in order to maintain source consistency. We have data about blends and sources and can share each with clients upon request. Source data can also be appended to participant profiles. 

27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

We continue to monitor each panelist actively. Each panelist maintains an individual Quality Score which is based on several factors, including:

  • Meta-Data and User-Data Analysis 
  • Third-Party-Data (such as MaxMind and Research Defender)
  • User Activity (panelists who are substantially more active than the average are considered a risk, even if other data does not indicate fraudulent characteristics or behavior)
  • Client Feedback (each validation and reconciliation provided by clients is matched to the panelist’s Quality Score)

Panelists with a low Quality Score are permanently removed from our panel and no longer have access to surveys. In ambiguous cases, the panelist is temporarily blocked from taking surveys and marked for manual review by our dedicated quality assurance team, which reviews all data points before making a decision. We continuously monitor our panelists’ engagement and sincerity in real time, and panelists violating our strict quality rules are permanently removed from our panel. We eliminate undesired behaviors as follows: 
  • Detecting random responding: We ask closed and open-ended test questions to verify the respondent’s attentiveness and sincerity. 
  • We use natural language processing to verify that respondents give logical answers in the context of each question. 
  • Detecting illogical or inconsistent responding: We monitor the respondent’s answers for illogical combinations and re-ask the same questions in a different context to verify consistency. 
  • Detecting speeding: We monitor the time a participant takes to give responses and compare it with the mean time of other respondents on similar questions to detect unusual response times.
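The speeding check in the last bullet can be sketched as follows; the 30% cutoff ratio and the function name are assumed values for illustration, not our production thresholds:

```python
# Sketch of speeding detection: compare a respondent's completion time
# with the mean time of other respondents on similar questions.
# The default cutoff_ratio is an assumed threshold.

from statistics import mean

def is_speeder(respondent_seconds: float, peer_seconds: list,
               cutoff_ratio: float = 0.3) -> bool:
    """Flag respondents who answer far faster than the peer mean."""
    if not peer_seconds:
        return False                    # no baseline yet, nothing to compare
    return respondent_seconds < cutoff_ratio * mean(peer_seconds)
```

A flagged respondent would then feed into the Quality Score described above rather than being rejected outright on a single signal.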

28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., “Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

mypinio does not offer programming and hosting services.

29. Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. (Note: If your company uses different privacy notices for different products or services, please provide an example relevant to the products or services covered in your response to this question).

Our Privacy Policy is available under the following link: https://mypinio.com/privacy-policy/

Respondents must agree to our Privacy Policy before participating in any survey. We ensure every user is educated about what data we collect, what the data is used for, and the rights a user has related to their data.

30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?

The protection of personal data is very important to us. We comply with data protection legislation such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). We ensure users are aware of which data we collect, why we collect it, and which rights they have related to this data. mypinio has implemented an internal procedure for any potential data breach incident. In case of a suspected or identified data breach, all affected parties are notified immediately. mypinio has appointed a data protection officer (DPO) who can be reached at [email protected] 

31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?

mypinio has employed a cookie consent tool, whereby a respondent can elect to allow or opt out from cookies. 

Additionally, we have set up a dedicated link for panelists to make requests to manage their data, including: 

  • Requests for data erasure 
  • Requests to opt out 
  • Requests for data access 
  • File a complaint 

In addition to this dedicated link, any respondent—including those from third-party sources—may contact our Data Protection Officer directly at [email protected]

32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

mypinio works with legal and compliance experts to ensure we comply with the laws and regulations of all countries that we operate in. We continuously monitor applicable laws and regulations to ensure we are up to date on the most current information available. We then pass any new or relevant regulatory knowledge to our teams when necessary.

33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

mypinio does not conduct surveys with children or people under the age of 16 years. We adhere to the standards provided by ESOMAR and comply with all relevant laws and regulations.

34. Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.

Data protection is an integral part of mypinio’s business. We have implemented and continue to maintain a “privacy by design” process. All new processing, systems, and products are reviewed prior to their creation and/or launch to ensure respondents’ privacy is protected. Additionally, we provide training for our staff to ensure that privacy is accounted for at the very beginning of all new processing. Special consideration is given to the principles under Article 5 of the GDPR, which provides guidance on the legal basis of processing and ensures that the rights of individuals are respected.

35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

mypinio adheres to the principles defined in information security frameworks such as ISO 27001. We regularly conduct internal training and audits to verify security compliance.

mypinio ensures that respondents are not associated with any personal identifying information they might provide, and our data center is redundant to make sure that data is quickly recoverable in the event of a natural disaster. 

Overall, we have an uptime rate of 99.92 percent. Scheduled maintenance and feature upgrades are announced at least 48 hours in advance and are completed during off hours. 

36. Do you certify to or comply with a quality framework such as ISO 20252?

Our holistic approach to quality is baked into all aspects of the process, including: 

  • Respondent Recruitment & Vetting 
  • Member Engagement, Maintenance & Analytics 
  • Study Design & Quality 
  • In-Field Best Practice 
  • Post-Fieldwork Quality Control 

We are constantly reviewing, investing in, and evolving this process to provide the highest quality insights for our partners. 

We also adhere to the best practices outlined in ISO 20252. 

37. Which of the following are you able to provide to buyers, in aggregate and by country and source? 

  1. Average qualifying or completion rate, trended by month
  2. Percent of paid completes rejected per month/project, trended by month
  3. Percent of members/accounts removed/quarantined, trended by month
  4. Percent of paid completes from 0-3 months tenure, trended by month
  5. Percent of paid completes from smartphones, trended by month
  6. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
  7. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
  8. Average number of paid completes per member, trended by month (potentially by cohort)
  9. Active unique participants in the last 30 days
  10. Active unique 18-24 male participants in the last 30 days
  11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview
  12. Percent of quotas that reached full quota at time of delivery, trended by month