Taking a Data-Driven Approach to Digital Marketing at the Peace Corps
What exactly does being “data-driven” mean for digital marketing and communications practitioners in the federal government? It is easy to be awed by the shiny new platforms and services that promise to change how things are done in government. Fundamentally, though, being data-driven means approaching analytics not just with a descriptive mindset but with one of experimentation, optimization and measurement. It is about culture change, so it is not as glamorous and not as easy, but it is necessary if you really want to be data-driven.
There are many advanced techniques, like data mining and predictive modeling, that are all the rage in the industry, but before you head down that path, pick the low-hanging fruit, develop a framework that makes sense for you and your agency, and go from there.
To that end, I want to share a sliver of how the Peace Corps takes an evidence-based approach to digital marketing and communications, along with results from a content experiment tied to a recent media campaign our team conducted. Our overarching goal is straightforward: to find ways to do more with less and maximize the ROI of our investments through experimentation. The example below is a simple one, but it demonstrates how we conduct tests to achieve our goal, and how we optimize and measure changes to our Web properties and our marketing campaigns, which aim to raise awareness and guide potential volunteers to apply.
The Experiment
We had two research questions:
- Will the type of messaging frame included on the landing page affect the conversion rate, everything else being equal?
- Will the inclusion of iconography on the landing page affect the conversion rate, everything else being equal?
We ran an A/B/n test that included eight variations, testing four messaging frames with and without iconography. The experiment was set at a 95% confidence level, 100% of traffic from the paid media was directed to the test, and it ran until a winner was selected.
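As an illustration of what selecting a winner at a 95% confidence level can involve, here is a minimal sketch of a two-proportion z-test comparing one variation against the control. The visitor and conversion counts are hypothetical, and this is not the testing tool we used; a full A/B/n analysis would also correct for comparing multiple variations at once.

```python
# Minimal sketch: two-proportion z-test for one variation vs. the control.
# All counts are hypothetical; a real A/B/n analysis would also apply a
# multiple-comparison correction across the eight variations.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, visitors_a, conv_b, visitors_b):
    """Return the z statistic and two-sided p-value for the difference
    between two conversion rates."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pooled rate under the null hypothesis that the pages convert equally.
    p_pool = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts for one challenger page and the control page.
z, p = two_proportion_z_test(conv_a=530, visitors_a=10_000,
                             conv_b=460, visitors_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 would meet a 95% confidence threshold
```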
The Campaign
- The campaign ran from May to June 2015 and included sponsored content on Buzzfeed and Spotify, as well as paid social, search and display ads.
- Our publishers were given UTM codes that tracked the source of visitors to our campaign landing page; the landing page directed visitors into our experiment, which randomly selected which page variation they saw and acted on (a sketch of a UTM-tagged URL follows this list).
- The publishers themselves were optimizing the ad copy on their end, so our experiment was focused on what we could control and optimize (i.e., the campaign landing page itself).
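For readers less familiar with UTM codes, here is a minimal sketch of how a publisher-specific tagged URL can be built so analytics can attribute each visit to its source. The base URL, campaign name and parameter values are hypothetical placeholders, not the actual codes we issued to publishers.

```python
# Minimal sketch: building a UTM-tagged campaign URL for one publisher.
# The base URL, campaign name, and parameter values are hypothetical.
from urllib.parse import urlencode

def utm_url(base_url, source, medium, campaign):
    """Append standard UTM parameters so analytics can attribute the visit."""
    params = {
        "utm_source": source,      # which publisher sent the visitor
        "utm_medium": medium,      # e.g., sponsored_content, paid_social, cpc
        "utm_campaign": campaign,  # name of the media campaign
    }
    return f"{base_url}?{urlencode(params)}"

print(utm_url("https://www.example.gov/landing-page",
              source="buzzfeed",
              medium="sponsored_content",
              campaign="summer_2015_awareness"))
```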
The Findings
- We found that the type of messaging frame did in fact affect the conversion rate, while the inclusion of iconography did not.
- At the end of the campaign, the winning page recorded a 5% lift in conversions compared with the control page. We received an estimated 1,800 more conversions than we would have received had we not conducted the experiment (a back-of-the-envelope version of this kind of estimate is sketched below).
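For context on how an incremental-conversion figure like this can be estimated, here is a back-of-the-envelope sketch. The traffic volume and baseline conversion rate below are purely illustrative, not the campaign's actual numbers, and the calculation assumes the 5% lift is relative to the control page's conversion rate.

```python
# Back-of-the-envelope sketch: estimating additional conversions from a lift.
# The visitor count and baseline rate are illustrative, not actual figures.
visitors = 500_000        # hypothetical campaign traffic to the landing page
baseline_rate = 0.04      # hypothetical conversion rate of the control page
relative_lift = 0.05      # a 5% relative lift over the control

extra_conversions = visitors * baseline_rate * relative_lift
print(f"Estimated additional conversions: {extra_conversions:,.0f}")  # 1,000
```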
Best Practices
When taking a data-driven approach, it is important not to lose sight of the forest for the trees. Experimentation, optimization and measurement are important for maximizing ROI and tweaking your approach, but they should not replace strategy and bold new ideas. With that said, here are a few best practices to consider when running an A/B test:
- Clearly define your research questions.
- Always test simultaneously and randomly select which variation a new visitor sees.
- Run the test long enough to collect a statistically significant sample.
- Make sure repeat visitors see the same variation (one common way to satisfy this and the previous point, deterministic bucketing, is sketched after this list).
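Here is a minimal sketch of deterministic, hash-based bucketing: it assigns a visitor to a variation pseudo-randomly, yet the same visitor always lands in the same bucket on return visits. The experiment name, variation labels and visitor IDs are hypothetical, and this is not the specific testing tool we used.

```python
# Minimal sketch: deterministic, hash-based assignment of visitors to variations.
# Hashing a stable visitor ID (e.g., from a first-party cookie) together with the
# experiment name spreads visitors randomly across buckets while guaranteeing a
# returning visitor sees the same variation. All names here are hypothetical.
import hashlib

# Eight hypothetical variations: four messaging frames, with and without icons.
VARIATIONS = [f"{frame}_{icons}"
              for frame in ("frame_a", "frame_b", "frame_c", "frame_d")
              for icons in ("icons", "no_icons")]

def assign_variation(visitor_id: str, experiment: str = "landing_page_test") -> str:
    """Map a visitor to one variation, the same way on every visit."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    return VARIATIONS[int(digest, 16) % len(VARIATIONS)]

# The same visitor ID always produces the same assignment.
assert assign_variation("visitor-123") == assign_variation("visitor-123")
print(assign_variation("visitor-123"))
```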
Our experiment focused on changes to our campaign landing page, but this approach can be applied to other channels, including advertising copy, email content and more. Remember, there are limitations to these types of experiments, but as long as you are cognizant of those limitations and cautious, you can start taking incremental steps toward making your agency data-driven.

Chris Rottler leads digital analytics at the Peace Corps.