Erin Reddy, Dee Campbell and Helen McCarthy at the Central Count Centre in Dublin Castle, Dublin, as votes continued to be counted in the referendum on same-sex marriage.

The Republic of Ireland polling companies did well with the same-sex marriage referendum – their predictions were pretty good, and one polling company, RedC, predicted the result spot-on. Here we look at their methodology and show how LucidTalk also use the same methodology with our Opinion Panel polls.

Congratulations to the Republic of Ireland polling companies who forecast the same-sex marriage referendum result correctly.

This helps get the polling industry back on the rails again after the ‘wobble’ of the UK election poll predictions (NB not our Northern Ireland poll predictions, which were 95% correct!).

However, in terms of the southern referendum – although the standard polling did slightly overestimate the ‘Yes’ vote, it’s interesting to note that one company, RedC, got the result exactly spot-on with their ‘Wisdom of Crowds’ polling approach.

This approach, which was used in an effort to uncover ‘shy No’ voters, meant they were able to accurately predict a Yes – 62% to No – 38% result the day before the referendum (matching the final result). This suggested that there was indeed a ‘shy No’ voter effect in the standard poll analysis used by the other polling companies.

The ‘Wisdom of Crowds’ approach is based on the idea that the collective estimate of a random crowd is superior to that of even the smartest people within it. RedC did this by asking their representative sample of voters not only how they were voting themselves, but also what they thought the final result would be. This aims to take into account the conversations that voters are having with family and friends about the topic, and how they see people voting on election day. Once the don’t-knows were excluded, this analysis predicted the final Republic of Ireland referendum result with 100% accuracy!
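To make the arithmetic concrete, here is a minimal sketch of that aggregation step: average each respondent’s predicted result, drop the don’t-know share, and renormalise Yes vs No. The numbers and the equal-weight averaging are illustrative assumptions – RedC’s actual data and weighting scheme are not published in this article.

```python
# Illustrative 'Wisdom of Crowds' aggregation (hypothetical numbers,
# not RedC's actual data or weighting).

def crowd_forecast(predictions):
    """Average each respondent's predicted (yes, no, dont_know) shares,
    exclude the don't-know share, and renormalise Yes vs No to 100%."""
    n = len(predictions)
    avg_yes = sum(p[0] for p in predictions) / n
    avg_no = sum(p[1] for p in predictions) / n
    decided = avg_yes + avg_no  # drop the don't-know share
    return (round(100 * avg_yes / decided, 1),
            round(100 * avg_no / decided, 1))

# Each tuple is one respondent's guess at the final result:
# (Yes share, No share, Don't know share).
sample = [(0.55, 0.35, 0.10), (0.60, 0.30, 0.10), (0.50, 0.40, 0.10)]
print(crowd_forecast(sample))  # → (61.1, 38.9)
```

Note how, even with made-up inputs, renormalising away the don’t-knows produces a Yes/No split of the kind RedC published – the crowd’s collective guess, not any individual’s stated vote, is what gets reported.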

I mention this because we use a similar methodology with our Opinion Panels here in NI, which accurately predicted last year’s European election (May 2014) and this year’s Westminster election results. Basically we ask three questions: (a) who the respondent is planning to vote for themselves; (b) thinking about relatives, friends, and co-workers, who they think those people will mostly vote for; and (c) who they think will win the election (or seat etc.) overall.

In last year’s Euro election Opinion Panel polling we often got replies to (b) and (c) like ‘I think a lot of my work colleagues will vote Jim Allister (TUV), because he’s very good isn’t he, but as I said I’ll be voting Alliance’. Yes, you’ve got it – that means this person probably voted Jim Allister as well, i.e. we’d found a ‘shy TUV’ voter. We still record this person’s own vote as Alliance (they said that, so we have to go with that), but we also build the other answers (i.e. (b) and (c)) into our models. This allowed us to predict the Jim Allister surge in last year’s Euro election – which all the other pundits missed. We also got all the other results reasonably correct and some spot-on, e.g. Sinn Fein – our forecast: 26.2%, actual result: 25.5%; and DUP – our forecast: 20.8%, actual result: 20.9%. However, even with our modelling we still ended up overestimating the UUP Euro election performance, indicating there was a large ‘shy TUV’ vote within those who said they were voting UUP.
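One simple way to picture how answers to (b) and (c) can lift a ‘shy’ party above its stated own-vote share is a weighted blend of the three questions. The weights, party labels, and shares below are entirely hypothetical – LucidTalk’s actual model is not published here – but the sketch shows the mechanism: a party that few admit to voting for, yet many expect their friends to back or expect to win, gets nudged upward.

```python
# Hypothetical blend of the three Opinion Panel questions into one forecast.
# Weights and shares are illustrative assumptions, not LucidTalk's model.

def blended_forecast(own, friends, winner, w=(0.6, 0.25, 0.15)):
    """Weighted blend per party of: (a) own-vote shares, (b) perceived
    friends'/colleagues' shares, and (c) expected-winner shares,
    renormalised to percentages."""
    raw = {p: w[0] * own[p] + w[1] * friends[p] + w[2] * winner[p]
           for p in own}
    total = sum(raw.values())
    return {p: round(100 * raw[p] / total, 1) for p in raw}

# A 'shy TUV' pattern: few admit voting TUV (a), but more expect friends
# to vote TUV (b) and more again expect a strong TUV showing (c).
own     = {"TUV": 0.08, "Alliance": 0.12, "Other": 0.80}
friends = {"TUV": 0.15, "Alliance": 0.10, "Other": 0.75}
winner  = {"TUV": 0.20, "Alliance": 0.08, "Other": 0.72}
print(blended_forecast(own, friends, winner))
```

With these made-up inputs the blended TUV share comes out above the 8% who admitted voting TUV – the same direction of correction that flagged the Jim Allister surge.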

The probable reason for this is that people are more comfortable and honest about saying what other people close to them think, and also more honest about general political-conversation-type questions. Once you ask them who THEY are going to vote for themselves, the question feels more personal. Try this out yourself with your relatives and friends – they’ll talk freely to you about Stormont (probably how bad it is!), welfare reform etc., but then ask them directly who they voted for in the recent election and the shutters come down. However, there’s a good clue in their conversation as to who they voted for, or would vote for, and it is these clues (e.g. from questions (b) and (c) above) that we build into our forecast modelling.

There are two Gold Standard poll questions that the big UK polling companies regularly poll in Great Britain: (1) which political party would be best for the economy, and (2) which party leader would make the best Prime Minister. These two questions are polled 2-3 times per week, every week of every year, by the big London polling companies. Since Miliband became Labour leader in 2010 he was always behind Cameron on No. 2, and Labour were always behind the Conservatives on No. 1 – right up to the May 7th election day. No party or leader has ever won an election with both these measures running against them, and this rule of the ‘Two Gold Standard Poll questions’ has never been broken in any election since the war. So instead of % vote and seat predictions, why for goodness’ sake don’t the big GB polling companies get this point across to people more? I don’t know.

The key point is that these two gold standard poll questions are general ‘conversation-type’ questions, and not the more direct ‘Who are you voting for?’. They put people at their ease in a normal conversation environment, i.e. the type of thing you’d talk about with your friends in a coffee shop, or with your work colleagues.

So watch out for the big-name GB polling companies increasingly using our Opinion Panel Index polling method and the ‘Wisdom of Crowds’ approach in future polling. If it had been used in GB at the recent general election it might have connected with those ‘shy Tories’ who came out on election day and put Cameron back in Downing Street. It’s a methodology that has proved its accuracy and its worth.