In many ways, traditional campaigns haven’t changed in 30 years or more. Anyone who has ever run any kind of political campaign knows that there are fairly consistent pillars of contemporary campaigns: brochures, signs, print/TV/radio ads, phone banks, canvassing and (more recently) digital. As someone who advocates for a robust digital presence, I am constantly asked to provide evidence that money spent on online campaigning will produce concrete, measurable results.
That is understandable. Politics is about managing scarcity, so it is no surprise that those managing a campaign want to ensure that allocated funds are spent wisely. And one of the more compelling reasons to spend on digital is the ability to track and test everything you do. With standard analytics you can determine – with a great degree of accuracy – what does and does not work.
Yet more traditional campaign tactics are not held to the same standard. In most campaigns I engage with, the efficacy of these voter contact methods is assumed rather than questioned – despite mounting evidence that phone calls and TV ads don’t have much impact. A 1998 study tested the impact that phone calls and direct mail pieces had on voter turnout in the US:
“The experiment found that voters called on the phone or sent postcards were not noticeably more likely to vote than those sent nothing.”
But every election cycle, brochures keep getting printed and paid phone banks keep getting hired. If anything, the impact of both of these voter contact tools has gone down since the study was published. At both the provincial/state and national level, the “air war” takes up quite a bit of resources. With fewer volunteers each cycle, campaigns turn to broadcast (or “shotgun”) methods to reach the widest number of people in a short timeframe.
But do those TV and radio spots work? Some data tells us that they do not. Here is a quick video on the research from Vox, which is worth your time:
The TL;DR version? Voters who receive the heavy volume of TV advertising associated with presidential campaigns are no more likely to vote than voters who see barely any. And that is data from a high-profile presidential contest. It only gets worse in lower-profile races.
Despite data demonstrating that more traditional campaign methods don’t really do much (save for door-to-door canvassing, which is incredibly effective), a substantial piece of the campaign budget continues to be dedicated to those methods. Meanwhile, digital campaigning must be “proven” before money is allocated.
Like most things, campaigns benefit from regularly reviewing tactics and strategies with an eye toward identifying what works and what doesn’t. A/B testing, a very common practice in the digital space, has been around for decades: change a single variable in an experiment to isolate that variable’s impact. Only then will you know whether a certain action (or the lack of one) has a measurable positive or negative effect.
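To make the A/B testing idea concrete, here is a minimal sketch of how a digital campaigner might check whether the difference between two variants is real or just noise. The scenario and all the numbers are hypothetical – two versions of a page shown to equal groups – and the check is a standard two-proportion z-test:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's rate meaningfully
    different from variant A's, or within random variation?"""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled rate under the null hypothesis of "no difference"
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: 10,000 visitors see each version of a page;
# version A converts 420 of them, version B converts 480.
z = two_proportion_z(conv_a=420, n_a=10000, conv_b=480, n_b=10000)
print(round(z, 2))  # |z| > 1.96 means the gap is unlikely to be chance at 95% confidence
```

With these made-up numbers the z-score comes out just above 1.96, so a campaigner would treat version B as the genuine winner rather than a fluke – exactly the kind of evidence digital tools make routine.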
This is a common technique for measuring the efficacy of digital campaigning. With tight budgets and dwindling volunteers, it’s time more traditional methods were held to the same standard.