Number Watch Web Forum

This forum is about wrong numbers in science, politics and the media. It respects good science and good English.

Election Maths: Paul Matthews has definitive background

Numberwatch regulars might be interested to know that Paul Matthews' blog has the definitive background on the stories of how the pollsters (AND the BBC's hundreds of on-the-ground 'journalists') called the election wrong.

Re: Election Maths: Paul Matthews has definitive background

On this bit from Paul Matthews' blog post:

"The first thing to note of course is that everyone got it badly wrong, greatly underestimating the Conservative support. Reasons for this include
(a) the “Closet Conservative” factor – there is a tendency for people not to own up to supporting the Conservative party, and
(b) incorrect sampling by the pollsters – perhaps quiet conservatives stay at home, don’t answer the phone much and aren’t as eager as some others to express their opinions.
However, I thought that the pollsters were well aware of these factors, particularly since the 1992 election when something very similar happened, and compensated for it."

The above two reasons are pretty much all it can be: either the pollsters aren't doing their job competently (b), or the public is being dishonest to some extent (a). I think the pollsters would prefer people to believe the closet Conservative explanation, and are more inclined towards it themselves, because it makes the pollsters look a lot more competent.

But I think you can throw out reason (a) on the basis that if there were a closet Conservative effect, you would expect an even bigger closet UKIP effect, as it is even less politically correct to support UKIP than it is to support the Conservatives. The pollsters did get the UKIP share of the vote right, predicting 13% in the "BBC Poll of Polls" on the day before the election, compared with the actual result of 12.6%, according to the David Spiegelhalter blog (one of the links given in the Paul Matthews blog post).
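To put that comparison in concrete terms, here is a minimal Python sketch of the kind of check being made: compute the gap between a final poll-of-polls share and the actual vote share for each party. The UKIP figures (13% predicted, 12.6% actual) are the ones quoted above; the Conservative pair is a rough illustrative placeholder, not a number taken from the blog posts.

```python
# Minimal sketch: compare final poll-of-polls shares with actual vote shares.
# UKIP figures are those quoted above; the Conservative pair is an
# illustrative placeholder, not taken from the blog posts.
predicted = {"UKIP": 13.0, "Conservative": 34.0}  # final poll-of-polls share, %
actual = {"UKIP": 12.6, "Conservative": 37.0}     # actual vote share, %

for party in predicted:
    error = actual[party] - predicted[party]
    print(f"{party}: predicted {predicted[party]:.1f}%, "
          f"actual {actual[party]:.1f}%, error {error:+.1f} points")
```

A small error for UKIP sitting alongside a large underestimate for the Conservatives is exactly the pattern that makes the closet Conservative explanation hard to sustain: if shyness about admitting party support were the cause, the UKIP figure should have been at least as far out.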

Another strong argument for ruling out (a) in favour of (b) is that the Conservatives apparently had their own private polling operation which successfully predicted the result, according to their chief election strategist, Lynton Crosby:

link1

According to Crosby, the private polling operation predicted the Conservatives would win between 306 and 333 seats; the actual result was 331.
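As a quick sanity check on the figures quoted for Crosby's operation (a trivial sketch, using only the numbers given above):

```python
# Seat range reportedly predicted by the Conservatives' private polling,
# and the actual 2015 result, as quoted above.
low, high = 306, 333
actual_seats = 331

print(f"Within predicted range: {low <= actual_seats <= high}")    # True
print(f"Seats below the top of the range: {high - actual_seats}")  # 2
```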

If I had to guess why the pollsters employed by the mainstream news media (MSM) were getting it wrong, I would say the incorrect sampling is probably a side effect of the massive increase in cold calling, or 'nuisance calls', in the UK during the 21st century. This article gives some idea of the possible scale of the problem:

link2

According to a survey by BT, the UK's biggest telephone company, nuisance calls are regarded as more annoying than queue-jumpers, noisy neighbours and rude commuters; half of Brits avoid answering a landline phone during the day, and a quarter go as far as unplugging their phone. [Note that this survey may itself not be accurate: it could well be an internet poll with self-selected participants.]

So the problem for MSM pollsters is that they tend to ring people at random on landline phones to carry out their polling, and they are having to do this in the face of a substantial shift in the public's attitude, and possibly that of Conservative supporters in particular, towards being cold-called.

Politicians don't want to crack down on nuisance calls because about a million people work in the UK call centre industry, and a crackdown might be seen as anti-business, and even anti-charity, since charities do quite a bit of cold calling.

Crosby's polling operation must have been doing something that the MSM pollsters were not; it could be that it avoided telephone polling when gathering its data.

Re: Election Maths: Paul Matthews has definitive background

It might be worth reviving this old thread to point out that the biggest consequence of the blunder by UK polling organisations in predicting the 2015 UK General Election result is probably that we ended up getting the EU Referendum out of it.

It seems clear from David Cameron's behaviour in 2016 that he never really wanted to hold the referendum on the EU in the first place. He appears to have been much less of a Eurosceptic than UK political commentators had for years been telling the public he was.

The promise to hold a referendum on the EU came at the start of 2013, as I recall, as a means of reducing the threat of Conservative votes being lost to UKIP. The critical question was whether the idea of holding a referendum would survive through to the Conservative party's 2015 election manifesto, written a few months before the election. The hung parliament that the pollsters were widely predicting probably led to the commitment making it into the manifesto: Cameron's best prospect in the 2015 election, according to the pollsters, seemed to be another coalition or alliance with the Lib Dems, and he could then rely on his Lib Dem colleagues to throw out the commitment to hold a referendum as part of any agreement to provide their support.

The pollsters didn't do particularly well in predicting the referendum result in 2016, but they did perform slightly better than they did for the 2015 General Election. They seemed to generate enough duff information about the referendum vote to make the Remain side overconfident.