It’s no secret that polls can be wildly off the mark. The day or time at which a poll is conducted, the decision to emphasize landlines or cellphones, the phrasing of questions and the pollsters’ ability to interpret the responses — all these and more can dramatically affect the results.
Among Israeli political pollsters, the gap between the findings and voters’ actual behavior on Election Day can be significant. In the month leading up to Election Day in the last three national elections (in 2006, 2009 and 2013), the polling firm Teleseker gave the Labor Party, on average, 2.5 Knesset seats more than what the party would go on to win at the ballot box — an error equal to 17% of the party’s final showing. The polling firm Hagal Hahadash overcounted support for Likud over the same period by an even higher average of 4 seats. And these are only averages.
In fact, pollsters pretty consistently predict too many seats for Likud, Labor, centrist parties (such as Kadima, Yesh Atid and Hatnua) and Meretz — and just as consistently predict too few for ultra-Orthodox and Arab parties.
These remarkable findings come from Project 61, an effort by the analyst Nehemia Gershuni-Aylho to paint a better portrait of the electorate’s views than Israeli pollsters seem to manage. The math can be complicated (Hebrew link), but the principle is simple enough. Gershuni-Aylho tracks polls (Hebrew link) as they are published in the Israeli press, adjusts their results according to each pollster’s average per-party error over the last three elections, and then publishes averages of these adjusted results on the project’s Facebook and Twitter pages. The result is a snapshot of the political situation that is closer to the likely reality than any standalone poll.
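The arithmetic behind this principle is straightforward: measure each pollster's average over- or under-count per party, subtract it from new polls, and average the corrected numbers. Here is a minimal sketch of that idea — the party names and figures are invented for illustration, and this is only the principle as described above, not Gershuni-Aylho's actual model:

```python
# Sketch of the Project 61 principle: correct each poll by the
# pollster's historical per-party bias, then average the corrected polls.
# All party names and seat counts are invented for illustration.

def historical_bias(past_polls, past_results):
    """Average (predicted - actual) seats per party over past elections."""
    bias = {}
    for party in past_results[0]:
        errors = [poll[party] - result[party]
                  for poll, result in zip(past_polls, past_results)]
        bias[party] = sum(errors) / len(errors)
    return bias

def adjust_poll(poll, bias):
    """Subtract the pollster's typical over- or under-count per party."""
    return {party: seats - bias.get(party, 0.0)
            for party, seats in poll.items()}

def project61_average(adjusted_polls):
    """Average the bias-corrected polls, party by party."""
    parties = adjusted_polls[0].keys()
    n = len(adjusted_polls)
    return {p: sum(poll[p] for poll in adjusted_polls) / n for p in parties}
```

For example, a pollster who gave a party 31 and 27 seats in elections where it actually won 27 and 25 has an average bias of +3 seats, so a new poll showing 30 would be corrected down to 27.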
Gershuni-Aylho, 28, has already been compared to the American number-crunching wunderkind Nate Silver, whose models ahead of the 2012 elections put veteran political pollsters from the major political parties to shame. And in fact, Gershuni-Aylho does not hesitate to give Silver credit. The Israeli’s method for rating pollsters, the bedrock of his process for adjusting the poll results, is drawn from his American counterpart, “with some adjustments for a multi-party system.” Silver explains how it’s done here.
A recent conversation with Gershuni-Aylho began with the obvious question: Some of the pollsters tend to overestimate support for the parties that hire them, The Times of Israel noted. Does he suspect the numbers are “improved” for the benefit of the clients?
“There are anomalies; the pollsters all make mistakes in particular directions,” Gershuni-Aylho agreed.
But, he added emphatically, “I don’t believe there are serious pollsters who will twist the results.”
For one thing, it’s bad for business. “If I’m a pollster [hired by Likud] who gives Likud five extra seats,” and the findings don’t bear out on Election Day, “is that good or bad for Likud? For small parties, favorable polling can lead people to consider them viable and worth their vote. But for large parties, overly optimistic polls can make voters feel they have the luxury to vote for smaller, more targeted parties.”
Indeed, one of Gershuni-Aylho’s key findings is that polling errors consistently favor large parties and underestimate support for small or sectoral ones — that is, pollsters’ errors probably hurt their largest clients at the ballot box. That’s the opposite of what one might reasonably expect to find if the polls were rigged to find favor with the clients.
So how does he explain the sometimes dramatic gaps between the polls and the ballot-box results?
“Some problems are structural” — that is, they arise from the very act of polling or from the population being polled — “and some are related to methodology,” he says.
For example, when a respondent “isn’t sure who they will vote for, there’s a natural inclination to give a better-known large party” as a way of signaling one’s general political tendencies. That means many respondents may be saying “Labor” to indicate a general left-wing leaning, but this preference may manifest itself at the ballot box as a vote for Meretz. On the other side of the aisle, one might say “Likud,” but waver between the ruling party, Yisrael Beytenu, Jewish Home and other parties that are part of a broad spectrum deemed as “right-wing.”
Such behaviors may help explain why some pollsters tend to predict an overly optimistic showing for the parties or political camps that hire them. If a pollster is hired by Labor, it is likely that the poll they conduct will ask about Labor, its policies and its leaders in far greater detail and at greater length than about any other party. In a 12-way race, the simple act of thinking more about the party that commissioned the survey increases the chance that a respondent will express support for it.
The under-representation of Arabs and the ultra-Orthodox in polls flows from structural problems. It is a simple fact of Israeli political life that Arabs and Haredim are less willing to answer telephone polls.
But polling firms have tried to compensate for these behaviors. “The Arab public doesn’t answer polls, so every few weeks [polling firms] do big, serious polls [of Arab opinion] and correct their daily polls according to those findings,” explains Gershuni-Aylho.
Such “corrections” are also used for undecided voters.
“Anywhere from 20 to 50 percent of [a poll’s] respondents may say they aren’t sure who they will vote for. We know that in past elections about seven seats go in unexpected directions. For example, Tzipi Livni’s [Kadima party] rose from 23 to 28 seats at the ballot box [in 2009], the Pensioners’ Party from two to seven [in 2006], and Yesh Atid [from roughly 14-15 in polls] to 19 [on Election Day in 2013].”
Pollsters attempt to compensate for the gaps caused by uncooperative or undecided respondents with statistical models that estimate their likely answers from other factors, such as gender, age and geographic location.
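One common version of this compensation is post-stratification: weight each respondent so that the sample's demographic mix matches the known mix of the population. The sketch below is a generic illustration of that idea under invented group names and shares — real pollsters' models are more elaborate, and nothing here describes any particular firm's method:

```python
# Generic post-stratification sketch: reweight respondents so the
# sample's demographic mix matches known population shares.
# Group names, shares and parties are invented for illustration.

def demographic_weights(sample_counts, population_shares):
    """Weight = population share / sample share for each group.
    Groups under-represented in the sample get weights above 1."""
    total = sum(sample_counts.values())
    return {g: population_shares[g] / (sample_counts[g] / total)
            for g in sample_counts}

def weighted_support(responses, weights):
    """Weighted share of respondents supporting each party.
    `responses` is a list of (demographic_group, party) pairs."""
    tally, total = {}, 0.0
    for group, party in responses:
        w = weights[group]
        tally[party] = tally.get(party, 0.0) + w
        total += w
    return {party: v / total for party, v in tally.items()}
```

So if a group makes up 20% of the electorate but only 10% of the sample, each of its respondents counts double — which is, in effect, what the periodic "big, serious polls" of hard-to-reach publics are used to calibrate.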
With so many barriers to getting a comprehensive picture of the electorate, and so much reliance on assumptions and statistical modeling, it is no wonder that so many errors creep into the polling.
“What are your expectations for accuracy?” Gershuni-Aylho asks. “If someone makes an error of 12 seats — one seat off the mark for each of 12 parties — that’s an excellent survey.”
So which polling firm is the best? Alas, Israel’s most accurate pollster in the last three elections, according to Gershuni-Aylho’s calculations, retired in 2013: the venerable 79-year-old Prof. Mina Tzemach, who had led the polling operations of Dahaf since 1980. Panels Politics and Meno Geva (where Tzemach is now a partner) take second and third place in the Project 61 ranking of the ten major pollsters (Hebrew link).
Project 61 is independent of any political party or polling firm, Gershuni-Aylho affirms. “I’m not a pollster myself,” he adds, “I just analyze the numbers.”
His analyses have already earned him some fame. Over the past month, he has appeared on most of Israel’s major television and radio news outlets.
Was that his motivation for the project? Can his assertion that he’s in it for “the numbers” be taken at face value? The more one follows Gershuni-Aylho’s fascinating work, the more one begins to suspect that there is something else at play here — that the whole immense undertaking is an elaborate excuse to indulge his real love: making beautiful charts.