Case Study
Optimizing Facebook Ads for Survey Recruitment
Hire Insights Project
Survey Respondent Recruitment via Facebook Ads
Research Goal
To understand whether and how companies were changing their hiring processes in response to the evolving political landscape. The research was intended for academics and practitioners studying how external events affect company practices, particularly concerning equity and bias in hiring.
The Challenge
Our research required insights directly from experienced hiring professionals. We aimed to collect 5,000 survey responses to ensure robust analysis. The core challenge was determining whether Facebook Ads could effectively reach this specific, often hard-to-reach population and deliver qualified responses within our budget ($1,867, July-November 2024), especially given early recruitment difficulties and the ambitious target.
Systematic Testing & Adaptation
Method
We used Facebook's advertising platform to drive traffic to our online survey, hosted on Qualtrics. We chose Facebook because we believed experienced professionals were active on the platform, unlike typical survey-taking pools (such as Prolific or MTurk), which often skew toward less experienced individuals. Our approach involved an iterative cycle: launch campaigns, analyze performance, adjust strategy based on data. Qualtrics did not integrate with Facebook's tracking pixel, limiting our ability to optimize directly for completions with certain ad types.
Process & Tools
We set up a project website ("Hire Insights Project") and a page on an affiliated site ("CBS"). Our online survey required ethical review board approval (IRB) to ensure participant protection. We ran 9 distinct Facebook Ad campaigns between July and November 2024.
Analysis
We tracked ad clicks and cost-per-click (CPC) via Facebook Ads Manager, along with survey completions (tracked as "Started Survey" in the final step on Qualtrics), and compared how each change affected these metrics.
Chronology
Initial Test (Campaign 1): We started with a small budget ($100), targeting US adults (18+) interested in Human Resources. The goal was 'Awareness' (maximizing reach). Result: No survey responses, limited performance data.
Learning & Adjusting (Campaigns 2-3): Following recommendations from Facebook's business support, we switched the campaign goal to 'Traffic' (optimizing for clicks) to get better metrics. We moved to daily budgets and focused targeting geographically on the Top 10 most populous US cities.
Hypothesis Testing (Campaigns 4a-c): We tested several variables:
Survey: Shortened it by removing individual demographic questions (to focus on company details and reduce length) and explicitly stated the "8-minute" completion time.
Ads: Added more images and enabled Facebook's automatic ad generation.
Budget: Tested different daily spends ($30 vs. $100) and time-of-day adjustments.
Result: Still very low engagement - only 2 completed surveys.
Major Pivot (Campaigns 5-6): Facing poor results and recognizing the impending 2024 presidential election might overshadow the original survey's focus, we made significant changes in a last-ditch effort to test Facebook's viability:
Survey: Drastically shortened to ~5 minutes, topic shifted to focus specifically on anticipated hiring changes post-election, and removed all initial screening questions.
Ads: Added more new photos.
Audience: Broadened significantly by removing the HR interest requirement and expanding geographically (again, per Facebook advice) to the Top 20 Designated Market Areas (DMAs).
Result: A marked improvement - 28 completed surveys.
Key Findings
Finding 1: Initial Strategy Was Ineffective
Combining specific HR targeting with a longer, screened survey on political impacts failed to gain traction (0-2 responses per campaign set).
Finding 2: Cost-Per-Click Stabilized but Wasn't the Key
After initial adjustments, the CPC settled around $0.45. While efficient for clicks, this stable (and relatively reasonable) CPC showed that click cost wasn't the barrier to completed surveys. The issue wasn't getting people to click, but getting them to finish.
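The funnel math behind this finding can be sketched in a few lines (a rough sanity check only: it assumes the full $1,867 budget was spent near the stabilized $0.45 CPC, which overstates clicks from the earlier, costlier campaigns):

```python
# Rough funnel estimate: a stable CPC did not translate into completions.
# Assumption: most spend occurred near the stabilized $0.45 CPC.
total_spend = 1867.00   # total ad budget, USD (July-November 2024)
cpc = 0.45              # stabilized cost-per-click
completions = 33        # completed surveys across all campaigns

est_clicks = total_spend / cpc
click_to_completion = completions / est_clicks

print(f"Estimated clicks: {est_clicks:.0f}")                       # ~4149
print(f"Click-to-completion rate: {click_to_completion:.2%}")      # ~0.80%
```

Under these assumptions, roughly 4,100 clicks yielded 33 completions, a sub-1% click-to-completion rate, which is why the barrier clearly sat after the click, not before it.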
Finding 3: Survey Length & Screeners Were Critical Barriers
The most significant factor impacting completions was the survey itself. The dramatic improvement after shortening it to ~5 minutes and removing all screeners strongly suggests these were the primary reasons for participant drop-off.
Finding 4: Broader Audience & Simpler Topic Increased Volume
Removing the niche HR focus, broadening the location, and simplifying the topic to a more timely (election-focused) question coincided with the jump in responses. This suggests the initial approach was likely too complex and targeted for effective recruitment via Facebook ads in this context.
Recommendations
Prioritize Brevity: For cold outreach via platforms like Facebook, surveys must be extremely short (ideally under 5 minutes).
Eliminate Friction: Remove screening questions if possible; they severely hinder completion rates in this context.
Targeting Requires Care: While Facebook offers detailed targeting, reaching niche professional groups for survey participation can be difficult. Starting broad might be more effective.
Consider Platform Limitations: Without conversion tracking (like Facebook Pixel integration), optimizing ads for actual completions is challenging. 'Traffic' goals optimize for clicks, which don't guarantee completed surveys.
Platform Fit is Key: Facebook ads may not be ideal for recruiting highly specific professional samples for complex surveys without significant prior audience building. Direct outreach in niche communities (e.g., LinkedIn groups, professional forums) might be necessary.
Facebook Requires Investment: Successfully using Facebook for this type of recruitment likely requires a long-term strategy focused on building a dedicated follower base for the project page, rather than relying solely on short-term ad campaigns.
Project Impact & Next Steps
Critical Flaws Identified
The project successfully pinpointed why the initial Facebook strategy failed, providing crucial insights into the limitations of the platform for this specific research goal and audience. The 33 responses gathered were ultimately insufficient for the planned analysis.
High Cost Per Response
The resulting average cost of ~$57 per completed survey was deemed unacceptable for this type of data; such costs might be justifiable only for more in-depth qualitative data like interviews.
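The ~$57 figure follows directly from the totals reported above (a simple check using the $1,867 spend and 33 completed surveys):

```python
# Cost per completed survey, from the project's reported totals.
total_spend = 1867.00   # total ad budget, USD
completions = 33        # completed surveys

cost_per_response = total_spend / completions
print(f"Cost per completed response: ${cost_per_response:.2f}")  # prints $56.58
```

At roughly $56.58 per response, reaching the original 5,000-response target would have required a budget orders of magnitude larger than the one available.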
Informed Pivot
Based on these findings, we concluded that Facebook Ads were not a viable primary recruitment tool for this project under the given constraints. We subsequently pivoted our recruitment strategy to using Lucid, a specialized panel provider.
Demonstrated Analytical Rigor
This project showcased a methodical approach to testing, data analysis, and strategic decision-making in response to performance, ultimately leading to a well-justified change in recruitment tactics.