Using Third Party/Crowdsourcing Platforms for Participant Recruitment and Study Distribution
Purpose: To guide Ohio University researchers in submitting research protocols in Cayuse’s Human Ethics module when using third-party/crowdsourcing platforms for participant recruitment and study distribution.
Scope: This guidance applies to use of third-party/crowdsourcing platforms such as Prolific, CloudResearch, MTurk, ResearchMatch and others for the conduct of human subjects research at Ohio University.
General Recommendations:
- A vendor/technology review may be necessary to use a third-party/crowdsourcing platform. Please see Technology Review for more information.
- A submission in Cayuse Human Ethics is required for all human subjects research. Approval or acceptance by a third-party/crowdsourcing platform is not a substitute for ORC/IRB review.
- Researchers must ensure that all necessary details requested in Cayuse are included in their submission for adequate and efficient review.
- Researchers are responsible for knowing platform functionality, requirements, and data security and confidentiality measures of the platform they wish to use.
- Researchers are responsible for ensuring that the research study complies with the Terms and Conditions of the platform.
- The ORC/IRB reserves the right to withhold approval from submissions involving third-party/crowdsourcing platforms that do not adequately meet the security and ethical standards and policies set by the ORC/IRB.
- Do not finalize your submission to the platform until your Cayuse submission has received a final determination and/or approval from the ORC/IRB.
- After the ORC/IRB has completed a final review determination, the researchers must follow the approved Cayuse protocol. If changes are needed to accommodate a third party platform, a modification must be submitted and receive approval before the changes may be implemented.
Human Ethics Module Sections in Cayuse: Below is guidance specific to completing the sections of your Cayuse submission that relate to the information provided to, and the use of, the third-party/crowdsourcing platform. All sections and queries within each section of your Cayuse submission, whether or not addressed below, must be completed for proper and efficient ORC/IRB review.
Section 7 Research Sites: Select “online/virtual.” In the respective text box, describe where research activities will take place. E.g., “A survey via Qualtrics will be distributed via [name of platform, e.g., Prolific].” If the platform offers multiple versions or “products,” specify which will be used.
Section 8 Research Overview and Methodology: Upload all data collection instruments to be used in this section where requested, including any eligibility screening questionnaire/survey provided to the platform.
Section 10 Recruitment and Eligibility: Even though the third-party/crowdsourcing platform will recruit potential participants, it is still necessary to provide adequate and sufficient recruitment and eligibility information in this section of Cayuse for the ORC/IRB to review.
- Subject Inclusion/Exclusion Criteria: List the inclusion/exclusion criteria or characteristics that were provided to the platform and/or selected in the platform. If a screening questionnaire was provided to or requested by the platform, please upload it to Section 8. Direct the platform to exclude international participants or anyone located outside of the United States if the research does not involve international participants. Otherwise, international policies and regulations may apply.
- How will potential subjects be identified and contacted/approached for recruitment: Please note the existing instructions in Cayuse. Select “other” when asked for type of recruitment method(s).
- When asked to provide additional details about recruitment: Provide a description of the platform that will be used and how the platform will identify and/or recruit potential participants. Specify the version or “product” of the platform that will be used.
- Uploading Recruitment Material: Provide as much of the following information as possible so potential participants can determine their interest in, and eligibility for, participating:
Include appropriate researcher contact information, the purpose of the study, the basic procedures subjects would participate in (survey, interview, etc.), key eligibility criteria (gender, age range, specific medical condition, etc.), subjects’ expected time commitment, and the location of the research. Include the IRB# on all recruitment communications and identify yourself as an Ohio University researcher. Instruct potential participants to direct all questions about the consent process and/or research study to the research team directly.
When the platform does not request all of the information above, most platforms provide a study details message or similar field where the researcher can enter a description of the research. To the extent possible, include in that message any of the above information not already requested by the platform so that potential participants can view it.
Upload the study message/details that potential participants will first see when contacted by the platform. Most platforms allow you to preview the study details/message that will be shown to potential participants. Capture this message and upload it as a PDF.
Section 12 Informed Consent: Any informed consent forms or language provided by third-party/crowdsourcing platforms cannot substitute for the approved consent form submitted in Cayuse.
- Do not refer to your research as “work” or a “job.”
- In the Confidentiality and Records section of the consent form, inform participants about how the platform protects and secures participant and data confidentiality. In addition to identifiers collected by the study, inform participants what identifiers the platform itself will collect. For example, when a platform collects and provides an ID number (e.g., Prolific ID, MTurk ID), language in the confidentiality section of the consent form should clearly explain what information may be collected that could be linked to identifiable information, even temporarily. See the following example language:
“We will not ask your name or any other identifying information in this survey. For research purposes, Prolific will assign you a Prolific ID number that may be linked to your responses. However, the link between your Prolific ID number and your survey data will be deleted as soon as possible.”
- A note on returns and rejections: Platforms may allow participants to return a task or survey if they no longer want to complete it. Also, researchers can reject participation for reasons such as not completing the task/survey, timing-out, skipping questions, or failing attention checks.
Rejections can harm participants’ ratings. For example, rejections in MTurk harm “workers’” (i.e., participants’) approval ratings, which may impact their ability to complete other “work” on the platform. Prolific will remove participants who have a high number of rejections from the platform.
Researchers are strongly discouraged from rejecting participation at the platform level, including through any automatic rejection parameters the platform offers. Instead of penalizing participants’ ratings by rejecting their participation via the platform, researchers should withdraw the participation/data from their own database(s) during the data cleaning process.
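As an illustration of the recommended approach, low-quality responses can be excluded during data cleaning in the research team’s own dataset rather than rejected on the platform. This is a minimal sketch only; the field names (participant_id, completed, passed_attention_check) are hypothetical and not from any specific platform export.

```python
# Hypothetical data-cleaning step: withdraw low-quality responses from the
# researcher's own dataset instead of rejecting them on the platform.
# Field names below are illustrative, not from any specific platform export.

def clean_responses(responses):
    """Split response records into (retained, withdrawn) lists."""
    retained, withdrawn = [], []
    for record in responses:
        if record.get("completed") and record.get("passed_attention_check"):
            retained.append(record)
        else:
            withdrawn.append(record)  # excluded from analysis, not rejected
    return retained, withdrawn

responses = [
    {"participant_id": "A1", "completed": True, "passed_attention_check": True},
    {"participant_id": "A2", "completed": True, "passed_attention_check": False},
    {"participant_id": "A3", "completed": False, "passed_attention_check": True},
]
retained, withdrawn = clean_responses(responses)
print(len(retained), len(withdrawn))  # 1 retained, 2 withdrawn
```

Because the exclusion happens only in the researcher’s dataset, participants’ platform ratings are unaffected.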
Section 13 Confidentiality and Records:
- If no identifiable information is asked for or collected by the researcher or the data collection tool (e.g., a survey), but the platform assigns participants a user ID (e.g., a Prolific ID), the data cannot be considered as collected anonymously. Select option number 2 (i.e., data is not collected anonymously, but no identifiers will be recorded) if the platform ID cannot readily be linked to identifiable data (e.g., Prolific ID).
- Note for MTurk users: Amazon’s survey creation tool is not designed to maintain privacy and confidentiality, and MTurk has limited security measures. Collection of data by Amazon is subject to Amazon’s Privacy Policy and Terms of Service. Researchers should be aware of any policies or procedures Amazon has in place should a data breach occur.
MTurk IDs are considered identifiable information because they can be linked to the participant’s Amazon profile. Do not collect IDs unless necessary (e.g., to debrief participants). If the study team is receiving or collecting MTurk IDs, the research cannot be considered anonymous. If you need to collect participants’ IDs, ensure that the consent language describes the reason for their collection and the confidentiality measures in place (e.g., “Your MTurk ID will be deleted after use and will not be linked to your survey data.”).
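One way to honor consent language like the examples above is to strip platform IDs from the stored dataset once they are no longer needed (e.g., after debriefing). This is a minimal sketch under that assumption; the field name platform_id is hypothetical.

```python
# Hypothetical de-identification step: remove platform IDs (e.g., MTurk or
# Prolific IDs) from records once they are no longer needed, so the stored
# data can no longer be linked back to a platform profile.

def strip_platform_ids(records, id_field="platform_id"):
    """Return copies of records with the platform ID field removed."""
    return [{k: v for k, v in r.items() if k != id_field} for r in records]

data = [
    {"platform_id": "WORKER123", "q1": 4, "q2": "agree"},
    {"platform_id": "WORKER456", "q1": 2, "q2": "disagree"},
]
deidentified = strip_platform_ids(data)
print(deidentified[0])  # {'q1': 4, 'q2': 'agree'}
```

Whatever tool is used, the point is the same: once the ID has served its purpose, delete it from every copy of the dataset so responses can no longer be linked to a platform profile.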
Section 14 Costs and Compensation:
- Discuss the plan for participant compensation as described by the platform. Some platforms do not specify the exact type or amount of compensation; in these cases, provide the range of compensation types and amounts participants may receive. Also, briefly discuss:
- Whether researchers have control over how much participants will be compensated.
- Whether researchers have control over the conditions for when a participant will be compensated.
- Whether the participant will receive compensation regardless of survey completion.
- Discuss the conditions under which participants will not be paid (or will be partially paid) by the platform, such as failing attention checks, discontinuing participation, skipping survey questions, or failing to complete study tasks. For example, can participants skip individual survey questions and still receive the full amount of compensation, so long as they reach the end of the survey?
- Remember to also discuss the compensation type and amount or range and the conditions under which participants will not be paid in the Compensation section of the consent form.