6 May 2015 09:33am

What part do pollsters play in the general election?

As the UK goes to the ballot boxes in the most uncertain general election for a generation, Xenia Taliotis finds out what part the pollsters play in taking the public’s political temperature

Nate Silver’s FiveThirtyEight.com blog famously predicted the results of the 2008 US presidential election (calling 49 of 50 states correctly) and the 2012 election (50 out of 50). Silver now interprets data to predict such things as Oscar winners and the geographic distribution of support for gay marriage, and is on Fast Company’s most creative people in business list. He even has a website – isnatesilverawitch.com – devoted to him. “While we on the Is Nate Silver a Witch editorial board are strict rationalists, Mr Silver’s performance has been uncanny enough to raise small but significant doubts as to whether his methodology is entirely of this world,” states the home page.

Ahead of the general election in the UK this May, the newspapers were awash with polls. In the previous parliament alone, more than 2,000 political surveys were conducted, with hundreds more expected before 7 May. They indicated an electorate grown weary of the mainstream parties and looking for alternatives. In the past this would have resulted in tactical or protest voting, meaning nothing more than lost votes for the big two (three, if you want to be kind to the Liberal Democrats). Now, though, voting for the Greens, the SNP, UKIP and Plaid Cymru will have cost the big parties seats.

“George Gallup, the father of modern polling, came up with the soup analogy. He said if you stir a pan of soup properly, you’ll only need a taster to find out whether it’s too salty”

This fragmented political landscape has made polling a far trickier game to be in. It was never easy – not even in the good old days when the BBC’s swingometer only oscillated between two colours – but now that there are more parties grabbing a share of the pie the whole business has become fraught with conundrums. No one has been predicting results Silver-style.

“It will be one of the most unpredictable elections we’ve ever had,” said Dr John Curtice, professor of politics at Strathclyde University, president of the British Polling Council and a world authority on electoral behaviour. “If, as seems likely, no party wins a majority, all manner of horse-trading will take place before we find out who’ll end up behind that black door, no matter what Cameron and Miliband are saying now. There are so many possibilities – Labour and SNP; Labour and SNP and the Greens; Conservatives and UKIP; Conservatives and the Democratic Unionist Party – that it’s impossible to say who will end up with whom after 7 May.”

When economia asked leading pollsters YouGov, Populus, ComRes and Ipsos Mori – all of which poll for newspapers – to call the result, they were reticent. Speaking at Impact, the Market Research Society’s conference in March, Andrew Cooper, founder of Populus (a man considered by the late Philip Gould to be among the best political pollsters of his generation), said the only thing we could be certain of was an “extraordinary car crash”.

He added: “We don’t have a party with broad enough appeal to win a majority, which will make for a very complicated election. And we’ve got marginal seats where the winning candidate is likely to have only about 20% of the vote; that’s not good. Our next prime minister, whoever it is, will end up leading a messy government.”

Modelling an outcome from such a muddle is complicated: there are too many variables to consider, including even the fact that we’ve been told to expect a hung parliament. This, says Adam McDonnell, research executive at YouGov, will throw some of us into a flap on election day because we won’t know how to vote in response to that information. “We’re in uncharted waters – we can’t use the past to forecast the future because we’ve not gone into an election like this one before.”

The best that our pollsters can do – indeed the only thing they ever promise to do – is provide us with snapshots of general trends and public mood swings in response to, say, the Budget, or a big issue story. No pollster in the country can give a definitive answer to the question of whether points in the polls can translate into winning new seats. It is, says Laurence Stellings, associate director of Populus, “the million dollar question”.

When really pressed on who’ll end up with the keys to Number 10, Stellings says we’re likely to end up with “a tangled Labour-led government”. This is not because Miliband has the edge with voters (although in April he had overtaken David Cameron as the most popular political leader for the first time) but because he has a bigger pool of potential coalition partners to negotiate with. “There’s no chance of either of them winning a majority, so it will come down to who can get enough support from the minority parties,” says Stellings. “Miliband could go to the SNP, the Greens, even the Lib Dems, whereas Cameron’s looking more isolated. Even if he holds onto the Lib Dems, their share of the vote is likely to be so greatly reduced that he’ll need the support of a third or even fourth party to give him a majority. It’s going to be tough.”

The prognosis from the other polling companies is much the same, which is hardly surprising given how similar their methodologies for collecting data are.

“Pollsters acquire their samples through one of two methods: random sampling or quota sampling,” says Curtice. “Random sampling is just that – selecting people by random digit dialling or the random selection of postal or email addresses and asking those selected to take part in the survey. Quota sampling involves setting quotas that match the profile of the population and then instructing interviewers to find respondents who match that profile. We know from the most recent census, for instance, that 16% of the population is aged 65+, 83% live in England, 8% in Scotland, 5% in Wales and 3% in N Ireland, so polling companies use those percentages as quotas and ensure, by careful selection and by weighting, that their samples reflect those percentages.”
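The weighting Curtice describes can be illustrated with a toy calculation: each group’s weight is simply its share of the population divided by its share of the sample. This is a minimal sketch, not any pollster’s actual code, using the census percentages quoted above and a hypothetical, slightly skewed raw sample:

```python
# Census shares quoted in the article (they sum to 99% after rounding).
population_share = {"England": 0.83, "Scotland": 0.08,
                    "Wales": 0.05, "N Ireland": 0.03}

# Hypothetical raw sample of 1,000 respondents, slightly skewed
# towards England and away from the smaller nations.
sample_counts = {"England": 860, "Scotland": 70,
                 "Wales": 45, "N Ireland": 25}
n = sum(sample_counts.values())

# Weight = population share / sample share for each region.
weights = {region: population_share[region] / (count / n)
           for region, count in sample_counts.items()}

for region, w in sorted(weights.items()):
    print(f"{region}: weight {w:.2f}")
```

Respondents from under-represented groups end up counting slightly more than once (weights above 1), and those from over-represented groups slightly less. Real pollsters weight on many variables at once, but the principle is the same.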

YouGov is one of the many companies that use quota sampling. With five voting intention surveys per week, it was the most frequent assessor of public opinion; it is also the only solely online pollster. You’d think this would create problems of bias, that YouGov’s samples wouldn’t be sufficiently representative because they don’t include the 14% who, according to the latest (2013) figures released by the Office for National Statistics, have never used the internet. But McDonnell says highly sophisticated weighting methods compensate for this, and also for the fact that its panel is made up of people who have chosen to be surveyed, rather than people selected at random.

“Our samples are specially designed to give us a representative sample,” says McDonnell. “We know who our members are, so we can select the correct proportion of men and women, young and old, Scottish and Welsh, and so on, to take part and then we weight accordingly.”

YouGov has a UK panel of 400,000 people and samples about 1,500 of them each time, which is about average – the range being somewhere between 1,000 and 2,000. It’s an interesting fact about sampling, says Ben Page, chief executive of Ipsos Mori, that 1,000 responses, providing they are random, can accurately represent the views of a nationwide, or even worldwide, population.

“What makes your poll accurate is not how many respondents you’ve got but how good your sample is and how good you are at weighting the data to match the profile of the population,” says Page. “If you’ve done your job well, have got a representative sample, have asked the right questions in the right order, your 1,000 people will give you an accurate – within a margin of error of between 1% and 3% – idea of how the other X million or X billion people think. George Gallup, the father of modern polling, came up with the best way of explaining this with his soup analogy. He said that if you stir a pan of soup properly, you’ll only need a taster to find out whether or not it’s too salty.”

Like YouGov, Ipsos Mori uses quota sampling, but based on 1,000 monthly telephone interviews that are selected through random digit dialling. “We used to do our voting intention surveys face-to-face until 2008,” says Page, “but made the switch to telephone polling because of cost, time – your data is almost out of date by the time you’ve finished and reported your field studies – and reliability of response. People tend to be more honest when they’re on the other end of the phone than when they’re talking to you in person and will more readily admit to behaviour that is perhaps considered ‘anti-social,’ such as saying they’ve no intention of voting, or are going to vote for an unpopular party.”

Speed and cost are also why more and more companies are running internet polls. “Most polls are commissioned by newspapers or broadcasters,” says Tom Mludzinski, head of political polling at ComRes, “which need headline stats very quickly, so there’s no way we can do face-to-face interviews.” ComRes was doing fortnightly surveys, polling 2,000 people online and 1,000 on the telephone, but went weekly closer to the election. “Broadly speaking we use the same methods as our competitors. We use random-digit dialling for our telephone polls and contact people who are on our panel or through random emailing for our online surveys.

“We ask about voting intention but also aim to find out about the broader picture. It’s important to see how people are responding to pertinent issues of the day and how far those issues are likely to influence voting behaviour. Top-line concerns include the NHS and immigration – so you’ll find both of those on every party’s agenda.” How you frame your questions has a huge impact on the answers you’ll get: “There’s an art, a science almost, to writing and sequencing questions for polls,” says Stellings. Like Lord Ashcroft Polls and YouGov, Populus (which was doing twice-weekly telephone and internet polls, with a sample of 2,000) was prompting for UKIP in the primary list of political parties, rather than first filing them under ‘other’.

“Our business is all about getting as close to the right result as possible, which you’re not going to get if you ask leading questions. For instance, you must never give the impression that there’s a wrong or right answer or that it’s reprehensible not to vote or you’ll get results based on lies. We stake our reputation on our accuracy. Our polls are checked against a range of demographic criteria to ensure they’re representative and we weight by sex, age, social class, work status – even how many cars the household has.”

It’s partly because their reputation depends on their accuracy (and therefore integrity) and partly because British market research is one of the best and most closely regulated in the world that no polling company can have any party bias or allegiance.

“Many polling companies do separate private surveys for the parties,” says Jane Frost, CEO of the Market Research Society, “but what goes into the public domain is completely neutral. It has to be because they stake their reputation on being right. Political polling is actually a very small part of what each of these companies – YouGov, Ipsos Mori, etc – does, but it’s a part that grabs all the headlines and gets their name into the papers each day, and into public awareness.

“That’s why the people you spoke to were reluctant to say what they thought was going to happen after 7 May. They have too much to lose by getting things wrong and they’re too professional to let their own politics skew the results in any shape or form.”


Political polling is actually a very small part of business for analysts such as Populus, Ipsos Mori, ComRes and YouGov. It is regarded across the industry as a useful marketing tool and good for branding, but doesn’t contribute much to the revenue stream.

The big, lucrative contracts are with corporates, where the work is diverse and technically groundbreaking. As Hannah Millard at Ipsos Mori explains: “We pride ourselves on constantly innovating and this includes finding new ways to explore society and business. We use neuroscience, implicit reactions, ethnography, online communities, social media monitoring and analysis, focus groups and a wide range of survey techniques.”

Laurence Stellings at Populus says the company uses data to help organisations understand their reputations, how they are perceived, what the risks to them are, and how they can better manage them. “Lots of what I do is in applying political research techniques to the commercial world, and taking the commercial research techniques to the political world,” he says. “We’re now regularly producing ‘warbooks’, which are typically used in political campaigns, for corporate clients. They summarise huge volumes of research, data and conclusions for senior management audiences, and provide the evidence base for stakeholder and reputation management.”

*This issue of economia went to press on 24 April

Xenia Taliotis
