‘House effects’ and how to read the polling tea leaves…

Mark Pickup, Will Jennings and Robert Ford

We are living through a period of tremendous political uncertainty and volatility. At the time of the European Parliament elections in May – which now seems a lifetime ago – the Conservative and Labour parties contrived to win less than 25% of the vote between them, an all-time low. At that point the governing party was polling as low as 17% in Westminster voting intentions; now, under a new Prime Minister, the Conservatives find themselves polling in the 30s with a lead over Labour. These uncertain times have also seen considerable disagreement in the numbers reported by pollsters. In the week before the European Parliament elections, the Conservatives were put as low as 21% (Panelbase) and as high as 28% (Survation) in Westminster polling, while support for Labour was estimated at between 25% (YouGov) and 33% (Survation). Meanwhile the Brexit Party were polling as high as 25% (Opinium) and as low as 12% (Survation). European Parliament elections often shake up national polling, as voters use them to express dissatisfaction with the main parties, but this period was marked by even more volatility than usual, owing to the failure of Theresa May’s Brexit deal and the establishment of the Brexit Party. In the period since, the polls have continued to tell a mixed story about the state of party support.

Our latest Polling Observatory estimates of voting intentions continue to show the fragmentation of the British party system – but with the Conservative Party, under a new leader, having bounced back from its low at the time of the European Parliament elections, which were only held because of the previous Prime Minister’s repeated failure to secure parliamentary support for her Brexit deal. As of the end of August, the Polling Observatory puts support for the Conservatives at 35.5% (16.9 points above where the party stood at the end of May), Labour at 24.5% (just one point higher than in May), the Liberal Democrats at 18.0% (0.2 points down), the Brexit Party at 12.1% (10 points down) and the Green Party at 5.3% (one point down) – with UKIP support statistically indistinguishable from 0%.

The Conservative rebound has clearly come at the expense of the Brexit Party, while support for the other parties has remained rather more stable. Our estimates reveal distinct trends for Labour and the Lib Dems: both parties saw a substantial shift in their support around the time of the European elections, with the Lib Dems’ vote more than doubling and Labour’s falling by over 10 points. Unlike the Conservatives’ slump, which has since been reversed at the Brexit Party’s expense, this shift has been sustained. The two parties are now in a new holding pattern, with Labour polling much lower than before and the Liberal Democrats much higher. Changing voter perceptions of the parties’ Brexit stances – with stronger Remainers losing faith in Labour and switching to the Lib Dems – have likely played a role. These dynamics have been reinforced by the Liberal Democrats’ election of a new leader, Jo Swinson, who has attracted considerable media attention and sharpened her party’s clear Remain message, and by a string of Remain-supporting MPs defecting to the party in recent months from both the Conservatives and Labour.

How this might translate into seats in Westminster remains highly uncertain. (Our estimates assume that the polling industry as a whole will not be biased in a particular direction – which of course was not the case in 2015, when Conservative support was underestimated, or in 2017, when Labour support was, unusually, underestimated.)

[Figure: Polling Observatory vote intention estimates, 4 September 2019]

One of the useful features of our approach to ‘pooling’ the polls is that we are able to calculate the ‘house effect’ of each polling company for every party. That is, we can say whether a pollster tends to show higher or lower numbers for a particular party, relative to the vote intention figures we would expect from the average pollster. This does not indicate accuracy, since it is only on Election Day that we will know whether individual pollsters, or the industry as a whole, have got it right. It could be that pollsters at one extreme or the other are giving the more accurate picture of voters’ intentions.
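
For readers who want a concrete sense of what a ‘house effect’ is, the sketch below (in Python, with entirely made-up numbers) illustrates the basic idea: take the polls published in the same short window, compute the all-pollster average, and measure how far each pollster’s own readings sit above or below it. Our published estimates come from a statistical model that pools polls over time rather than from a simple average, so this is an illustration of the concept rather than our actual method.

```python
# Illustrative only: a crude approximation of 'house effects' using made-up
# numbers. Each pollster's effect is taken as the gap between its average
# reading and the all-pollster average over the same window.
from collections import defaultdict

# Hypothetical Conservative vote-intention readings (%) from the same fortnight.
polls = [
    ("YouGov", 34), ("YouGov", 36),
    ("Survation", 33), ("Survation", 34),
    ("Kantar", 37), ("Kantar", 36),
]

window_average = sum(share for _, share in polls) / len(polls)

readings = defaultdict(list)
for house, share in polls:
    readings[house].append(share)

# Positive values mean the pollster tends to show the party higher than the
# average pollster; negative values mean lower.
for house, shares in readings.items():
    house_effect = sum(shares) / len(shares) - window_average
    print(f"{house}: {house_effect:+.1f} points vs. the average pollster")
```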

In the table below, we report all current polling companies’ ‘house effects’ for each of the parties. We also report whether each pollster’s mode is telephone or internet-based, and the notable adjustments pollsters use to calculate their headline figures. This should provide a helpful guide for interpreting the latest polling tea leaves, since one can factor in whether seemingly good numbers for the Conservatives, Labour or some other party are a product of ‘house effects’ related to the choices a pollster makes in conducting its survey, or reflect a real shift in electoral preferences. As Anthony Wells has noted, prompting for the Brexit Party and controlling for past vote currently appear to have significant impacts on poll numbers. In the former case, pollsters that prompt for the Brexit Party in their surveys tend, unsurprisingly, to report higher numbers for the party. There was a similar methodological debate over whether to prompt for UKIP in 2012-15. The use of past vote (i.e. how people voted in 2017) to weight samples to make them representative is a longstanding practice in the polling industry. However, it can introduce error when people misreport their past vote, leading supporters of a party to be over-represented in the poll. Recent analysis by YouGov suggested that ‘false recall’ – people forgetting that they voted Labour in 2017 – would, if not adjusted for, leave Labour’s estimated vote around 3 percentage points higher.
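
To see how past-vote weighting and ‘false recall’ interact, the toy example below (again in Python, with invented respondents and invented target figures) weights a small sample so that recalled 2017 vote matches a chosen target distribution, then compares the headline figures produced by weighting to the official 2017 result with those produced by weighting to a recall-adjusted target. It is a deliberately simplified sketch of the general technique, not YouGov’s or any other pollster’s actual procedure.

```python
# Illustrative only, with invented numbers: weighting by recalled 2017 vote.
# Respondents are re-weighted so that their recalled past vote matches a chosen
# target distribution; the choice of target changes the headline figures.
from collections import Counter

# (recalled 2017 vote, current vote intention) for a toy sample of 10 people.
sample = [
    ("Con", "Con"), ("Con", "Con"), ("Con", "Brexit"), ("Con", "Con"),
    ("Lab", "Lab"), ("Lab", "LD"), ("Lab", "Lab"),
    ("LD", "LD"),
    ("DidNotVote", "Lab"), ("DidNotVote", "Con"),
]

def weighted_vote_intention(sample, target):
    """Headline vote intention (%) after weighting recalled past vote to `target`."""
    recalled = Counter(past for past, _ in sample)
    weights = {group: target[group] / (recalled[group] / len(sample))
               for group in recalled}
    totals = Counter()
    for past, current in sample:
        totals[current] += weights[past]
    total_weight = sum(totals.values())
    return {party: round(100 * share / total_weight, 1)
            for party, share in totals.items()}

# Target A: hypothetical official 2017 shares, including non-voters.
official_2017 = {"Con": 0.35, "Lab": 0.35, "LD": 0.06, "DidNotVote": 0.24}
# Target B: a recall-adjusted target assuming some 2017 Labour voters no longer
# report having voted Labour (they show up as 'DidNotVote' instead).
recall_adjusted = {"Con": 0.35, "Lab": 0.30, "LD": 0.06, "DidNotVote": 0.29}

print(weighted_vote_intention(sample, official_2017))   # Labour comes out higher here...
print(weighted_vote_intention(sample, recall_adjusted)) # ...than with the adjusted target
```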

Our analyses cannot discern the precise methodological reasons why particular pollsters tend to show better numbers for one party or another, but do provide a useful guide.[1]

Some pollsters tend to show Labour higher: most notably, Panelbase show the party 2.4% higher than the average pollster, Survation 1.8% higher and Deltapoll 1.1% higher. In contrast, YouGov – not least due to the false recall adjustment discussed earlier – show the party 2.4% lower than the average pollster. The Conservatives, on the other hand, tend to poll better with ICM (+1.2%) and Kantar (+1.0%), and worse with ComRes (-1.7%) and BMG Research (-1.0%). The Lib Dems tend to do better with BMG (+2.0%), Ipsos MORI (+1.0%) and YouGov (+0.9%). Pollsters are quite divided over the Brexit Party – perhaps unsurprisingly, given there are so many unknowns about support for the brand-new party. Panelbase (+2.8%) and YouGov (+1.7%) tend to report higher numbers, while Kantar (-0.9%), Survation (-0.8%), Opinium (-0.7%) and Deltapoll (-0.7%) tend to show the party lower.

The picture in terms of house effects is quite mixed, then, not least because the pattern is not simply symmetrical between the Conservatives and Labour. If a pollster tends to show one of the two main parties doing better than the polling industry average, it does not automatically mean its estimate for the other will be lower than average.

The crucial point is that each pollster’s latest estimate of voting intentions reflects the latest information about voters’ preferences, plus some noise, plus the particular methodological decisions that pollster has made about how to measure the electorate.
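
Put in stylised form (a simplification rather than the full specification of our model or of any pollster’s methodology), a single published figure for a party can be thought of as

$$y_{p,t} = \alpha_t + \delta_p + \varepsilon_{p,t}$$

where $y_{p,t}$ is the headline number pollster $p$ publishes at time $t$, $\alpha_t$ is underlying support for the party at that moment, $\delta_p$ is the pollster’s house effect, and $\varepsilon_{p,t}$ is sampling noise.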

Table 1. ‘House effects’ of pollsters against the industry average

| House | Mode | Con | Lab | Lib Dem | Brexit Party | UKIP | Green |
| --- | --- | --- | --- | --- | --- | --- | --- |
| YouGov | Online | 0.2% | -2.4% | 0.9% | 1.7% | -0.4% | 0.3% |
| ComRes | Online | -1.7% | 1.0% | -0.9% | -0.7% | 0.6% | -0.5% |
| Ipsos MORI | Telephone | 0.3% | -0.8% | 1.0% | -0.6% | -0.4% | 0.4% |
| Survation | Online | -0.0% | 1.8% | 0.3% | -0.8% | -0.4% | -1.0% |
| BMG Research | Online | -1.0% | -0.8% | 1.9% | -0.1% | 0.5% | 1.0% |
| Panelbase | Online | -0.5% | 2.3% | -0.9% | 2.8% | 0.2% | -0.5% |
| Kantar Public | Online | 1.0% | -0.9% | 0.6% | -0.9% | -0.35% | 0.2% |
| Opinium | Online | -0.6% | 0.1% | -1.5% | -0.7% | 1.0% | 0.2% |
| ICM Research | Online | 1.2% | -0.4% | 0.2% | n/a | -0.3% | 0.4% |
| Deltapoll | Online | 0.4% | 1.1% | -1.6% | -0.7% | -0.2% | -0.7% |

How might the polls perform in a general election in the near future? It is difficult to tell, and the salutary lesson of 2015 and 2017 is that pollsters can face distinct challenges even in elections held only a short time apart, given a highly volatile political environment in which voters are internally conflicted and the context of the choice is rapidly changing. The most recent benchmark we have of polling performance is the May 2019 European Parliament elections. European elections tend to have lower turnout than general elections, and people may use such ‘second order’ elections to defect from their traditional party loyalty and register a protest vote. Nevertheless, the polls may tell us something about whether pollsters are more or less likely to reach the supporters of particular parties. Keep in mind that this was a context in which the Conservatives won 9.2% of the vote and Labour 14.1% (record lows for both parties in European Parliament elections).

Nearly all pollsters overestimated both Labour (the exception being YouGov, which put the party 1.1% below its final vote share) and the Conservatives (again, YouGov was the only pollster to put the party significantly below the final result, at -2.1%) in the May European Parliament elections. By contrast, all the pollsters underestimated the Liberal Democrats, by anywhere from 0.3 points (Ipsos MORI) to 7.6 points (Survation). Most pollsters (with the exception of Kantar and Panelbase) also overestimated the Brexit Party’s vote share.

Table 2. Error of final polls against result of 2019 European Parliament Elections

| House | Con | Lab | Lib Dem | Brexit Party | UKIP | Green |
| --- | --- | --- | --- | --- | --- | --- |
| BMG | 2.9% | 3.9% | -3.3% | 3.4% | -1.3% | -4.1% |
| Ipsos MORI | -0.1% | 0.9% | -0.3% | 3.4% | -0.3% | -2.1% |
| YouGov | -2.1% | -1.1% | -1.3% | 5.4% | -0.3% | -0.1% |
| Kantar | 3.9% | 9.9% | -5.3% | -4.6% | 0.7% | -4.1% |
| Panelbase | 2.9% | 10.9% | -5.3% | -1.6% | -0.3% | -5.1% |
| ComRes | 2.9% | 7.9% | -6.3% | 0.4% | -0.3% | -5.1% |
| Survation | 5.2% | 9.4% | -7.6% | 0.5% | -0.2% | -4.8% |
| Opinium | 3.2% | 3.4% | -4.6% | 7.5% | -1.2% | -4.8% |

While we cannot be sure how the campaign for the next general election will play out, these estimates highlight the importance of reflecting on where new polling information comes from. The different methodological designs and weighting procedures of pollsters can lead their headline numbers to differ in important and significant ways. There is no way of knowing which pollster is right before election day, but some caution is warranted in how these sorts of numbers are interpreted by those in politics, the media and the wider public.

 

[1] What our analysis does not do is model whether these house effects are time-varying – i.e. whether the methodological decisions taken by a pollster inflate a party’s estimated support at one point in time and depress it at another.