In the years before Defense Minister Ehud Barak abandoned the Labor Party to form his short-lived Independence party in 2011, he chose not to employ his brother, Avinoam Brog, who does political surveys and analysis for a living, as Labor’s official pollster. The only time he used Brog’s services was before the 2008 municipal election, when Labor commissioned surveys in 30 cities. Three of them were given to Brog’s company, Market Watch, and the rest to Labor’s regular pollster, Telesekker.
When the media got wind of this ostensible conflict of interest, the story caused a little scandal, with journalists accusing Barak of illicitly using his position as Labor leader to help his pollster brother. “Besides the NIS 1,600 (less than $400) we were paid, we also got a big headline,” Brog recalled sarcastically this week in an interview. “In hindsight,” he added more seriously, “this just shows how justified it was the many times we didn’t get involved with Labor.”
It’s a worn-out cliché that statistics are worse even than damned lies. But in Israel right now, where political polls are a daily feature ahead of January 22’s elections, the numbers have to be approached with special caution. And potential conflicts of interest are only a small part of the problem. According to several experts interviewed for this article, public opinion polls here are often conducted sloppily and superficially and could seriously mislead voters. While polls might be fun to read, many are so poorly done they actually say remarkably little about the likely election results.
A comparison of poll results versus actual results in previous elections confirms that voters shouldn’t believe everything they read on the statistical charts in the papers. No one foresaw the seven-seat success of the Pensioners’ party in 2006. Three years later, in the last elections in 2009, not one of half a dozen prominent forecasters correctly predicted the number of mandates of the three largest parties.
The experts cite several serious problems with the way pre-election surveys are executed and presented. Let’s start with the methodology.
In Israel, samples of 500 or even 400 respondents are often treated as large enough. Mina Tzemach, one of Israel’s best-known and most trusted pollsters, said she agrees that the bigger the sample size the better, but adds that a recent poll she did with 1,250 respondents had results that were “so close to the one I did with 500” that it really doesn’t make a difference. Since several polls with 500 respondents get similar results, they can’t be too inaccurate, she argued.
Yet in other Western nations, it is extremely rare to see such low numbers surveyed. In the United States, for example, most national polls have 1,000 interviewees or more (it’s the absolute sample size that counts, not the size relative to total population), and several Israeli experts said that 500 might just be enough for a superficial poll but does not allow for a serious analysis.
The number of questions asked and their exact formulation are also very important. Most polls in Israeli media only speak of how many seats a given party is projected to receive, or how the public feels about a certain issue. But how the pollster arrived at his or her conclusions is almost never revealed.
“Without knowing the exact question, it is impossible to understand what the results mean,” said Evans Witt, the president of the Washington-based National Council on Public Polls. The NCPP’s principles of disclosure require disclosure of question wording for all polls discussed publicly, he said. The American Association for Public Opinion Research also calls on pollsters to release the entire questionnaire. In Israel, no law or voluntary agreement requires such transparency in polling.
Why is it important to know the exact language of a survey question? Because poor phrasing could lead respondents in a certain direction. And according to industry insiders, many Israeli pollsters rush and write questions carelessly.
If a pollster asks, for example, “Do you think the prime minister is doing a bad job?” he or she implants in the respondents’ heads the idea that the prime minister might indeed be performing poorly. A better formulation would be: “How do you feel about the prime minister’s performance?”
“In the US, when quantitative polls are released, no one is scared to show you the questions, how they were asked and the order in which they were asked,” said Stephan Miller, an American-Israeli public opinion research analyst and communications strategist with campaign experience on three continents. (Full disclosure: Miller is working on an election survey for The Times of Israel.) “What is disturbing here in Israel is that there is no transparency. You have no idea, when you read the output of a poll online or in a newsmagazine, about the background of the methodology.”
Even the finest subtleties can have an impact.
“Do you support Benjamin Netanyahu’s policy of building more settlements?” is not a good way to ask a question, noted Mitchell Barak, the founder of the Keevoon Research, Strategy & Communications firm. Rather, it should be “Do you support or oppose…”
“Are you calling it the settlements, or the West Bank, or Judea and Samaria? That’s also important,” he added.
If a pollster suspects the difference between these terms could lead to different results, he or she should split the sample and ask half of the sample about “Judea and Samaria,” half about the “West Bank” and compare the results, Barak said. “But newspapers aren’t interested in that.”
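The split-sample design Barak describes is simple to implement: each respondent is randomly assigned one of the two wordings, and the halves are tallied separately. A minimal sketch (the respondent list and wordings here are illustrative, not from any actual poll):

```python
import random

def split_sample(respondents, wordings=("West Bank", "Judea and Samaria"), seed=42):
    """Randomly assign each respondent to hear one of two question wordings."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    groups = {w: [] for w in wordings}
    for r in respondents:
        groups[rng.choice(wordings)].append(r)
    return groups

groups = split_sample(list(range(500)))  # 500 respondent IDs, as in a typical Israeli poll
```

Comparing the answers of the two halves then shows whether the terminology itself, rather than the underlying opinion, is moving the numbers.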
The root cause of poor methodology is the high cost of high-quality surveys, according to industry insiders. Since research institutes and polling companies charge for every question asked, media outlets — many of which are struggling financially — often commission superficial polls with as few questions as possible, especially in the busy election season.
Most of the polls Israelis have been seeing in the last few weeks only asked respondents if they are eligible to vote but not if they are also likely to vote. Since many eligible voters might stay at home on Election Day, a professional pollster should ask all respondents how likely they are, on a scale of 1 to 10, to actually cast a ballot. Only the answers of those who responded with 7 or higher should be considered when presenting the results, the experts say. But that entails another costly question.
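The likely-voter screen the experts describe amounts to one extra question and a simple filter. A sketch with hypothetical respondents (the parties and intent scores are invented for illustration):

```python
from collections import Counter

# Hypothetical mini-sample: each respondent names a party and rates,
# on a scale of 1 to 10, how likely they are to actually vote.
respondents = [
    {"party": "A", "intent": 9},
    {"party": "A", "intent": 3},
    {"party": "B", "intent": 8},
    {"party": "B", "intent": 7},
    {"party": "C", "intent": 2},
]

def likely_voters(sample, cutoff=7):
    """Keep only respondents who rate their likelihood of voting >= cutoff."""
    return [r for r in sample if r["intent"] >= cutoff]

all_eligible = Counter(r["party"] for r in respondents)
likely_only = Counter(r["party"] for r in likely_voters(respondents))
# Party C's lone supporter (intent 2) drops out of the likely-voter tally,
# and party A loses one of its two supporters.
```

Presenting `likely_only` rather than `all_eligible` is what separates a likely-voter poll from a poll of everyone merely eligible.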
“It’s a disgrace that public polling in Israel doesn’t ask likely voters, just because asking that extra question costs more money,” Miller said.
However, Tzemach told The Times of Israel that, thus far, it was “too early” to ask respondents whether they really intend to vote. Starting next week, she said, she will include that question in her polls.
As for conflicts of interest, many well-known pollsters conduct internal surveys for several political parties as well as for the media — which critics say can lead to skewed results — without the public knowing this.
‘I don’t think they will twist the numbers intentionally. But if you work with someone, it is very difficult not to be in a way biased in favor of the client’
A recent poll conducted for Israel Radio, for example, predicted 15 seats for the far-right Jewish Home party and 13 for Shas, slightly more than other polls published at the same time gave these parties.
Most people who heard about this poll would have had no idea that the pollster also happens to work for the same two parties.
“In the US, pollsters typically work for one of the two [political] parties, or an independent media outlet,” said John Della Volpe, director of polling at Harvard University’s Institute of Politics. “Transparency is incredibly important as subtle changes in design and questions can make significant difference in results — and then public perception.”
“You have people who are doing quantitative public opinion research for newspapers, which is supposed to be an independent analysis. But then they’re also being paid by a party and often don’t release this information. This should be a crime,” said Miller. “How can you give an independent analysis if you are being paid by a party?”
Sometimes the potential conflicts of interest are more subtle, critics charge. Every Israeli pollster has friends in politics, and when Company X has a good relationship with Party Y, chances are that X will make Y look surprisingly strong, they say.
The public is being misled, agrees Prof. Tamar Hermann, the academic director of the Israel Democracy Institute’s Guttman Center for Surveys, who demands that the law be changed to force pollsters to reveal whether they are conducting polls for both the media and political parties.
“Of course they will tell you that they’re doing them separately,” she said. “And I don’t think they will twist the numbers intentionally. But if you work with someone, it is very difficult not to be in a way biased in favor of the client. It makes much more sense that if you work for a party, you can’t [also] do public opinion polls.”
But Tzemach, whose surveys are featured in Yedioth Ahronoth and who also does internal polls for the Labor, Kadima and Otzma Leyisrael parties, denies that any conflict of interest distorts her results.
“The people asking for polls want accurate polls,” she said. “And I want my polls to be accurate.” In her last poll, Kadima, one of her clients, did not pass the electoral threshold, she noted. She also makes sure to work for both left- and right-wing parties, to further dispel any notion of impropriety, she said.
Full transparency is actually a mixed blessing, Tzemach posits. When a newspaper states that a poll was done by a company that also works for certain parties, it implies the company artificially inflated the results to favor its clients. “And that’s really not true,” she said.
So how reliable are the polls?
Most everybody accepts that opinion polls say a lot about the general trends of an election campaign. In the current campaign, for instance, nobody disputes that the joint Likud-Yisrael Beytenu list stands to become the next Knesset’s largest faction, although it has been bleeding seats to Jewish Home. The left-wing Meretz party will not overtake Labor.
But headlines about this party losing a seat here and that one gaining two there, the experts say, are, statistically speaking, insignificant. Polls with 500 respondents usually have a margin of error of about 4 to 4.5%, which renders the smaller fluctuations between weekly polls statistically irrelevant.
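The “4 to 4.5%” figure follows from the standard margin-of-error formula for a simple random sample. A quick sketch, assuming the worst case of a 50/50 split and a 95% confidence level:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error, in percentage points, for a simple random sample
    of n respondents, at an assumed response proportion p."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

print(f"n=500:  about {margin_of_error(500):.1f} points")   # roughly 4.4
print(f"n=1000: about {margin_of_error(1000):.1f} points")  # roughly 3.1
```

On a 120-seat Knesset, a swing of 4.4 percentage points corresponds to roughly five seats, which is why a two- or three-seat difference between two weekly polls of 500 respondents says essentially nothing.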
“People go crazy when a certain party moved from 12 seats to 15 seats. But within the margin of error that’s no movement at all,” Miller said. A recent Mina Tzemach poll showed that Likud-Beytenu lost two seats, which was within the margin of error, he noted. “Yet that’s a headline in this country. It’s embarrassing.”
Tzemach agrees there’s little statistical relevance to small movements within her weekly polls. “If I were to write a scientific paper, I would look at the level of significance and check whether the gap is statistically relevant or not. But these are [surveys for] newspapers, and newspapers go after the trends.” And overall, she added, the trend of Likud-Beytenu losing ground to Jewish Home is confirmed by dozens of other polls.
Many industry insiders agree that the polls offer readers only the most superficial conclusions.
“What the papers do is some sort of informational entertainment,” said Brog, the president of Market Watch. “Not because the people doing it don’t take it seriously. But rather because the media outlet pays very little. Therefore, it’s not in the pollster’s interest to invest a lot into his poll.”
For Hermann, of the Israel Democracy Institute’s Center for Surveys, polls in the newspaper are a “kind of a game.”
“I wouldn’t rely at all on polls,” she said. “I would rely only on the trend that we do know — that most people think of themselves as right of center.”
Most people reading the papers and watching the evening news don’t take public opinion polls very seriously, Hermann argued. “I hardly think that people would change their electoral preferences based on such a poll,” she said. “Public opinion is rather immune to polls in the papers.”
This assertion, however, is open to question. Seeing a particular party ostensibly rise or fall, even within a poll’s margin of error, might lead a floating voter toward a final choice of party. And Hermann allowed that the surveys may well have an impact on support for small parties on the verge of the 2% electoral threshold — the level of support required to enter the Knesset — since people hate to waste their vote on a list that will not end up making it into parliament.
In other words: Accurate or not, the polls right now have the power to kill parties such as Kadima, Am Shalem, Otzma Leyisrael or the Greens.
Most pre-election polls are totally off, history shows
Historical precedent underlines the dubious value of public polls in Israel. In 2006, most pre-election polls shortly before election day predicted two seats for the Pensioners’ party; four polls said the party wouldn’t get any seats at all. It won seven — a substantial showing given that the Knesset has only 120 members.
The final polls ahead of the 2009 elections — published four days before voting day — were all wrong to a greater or lesser degree. Four out of six polls predicted 23 seats for Likud, which ended up winning 27. Four surveys forecast Yisrael Beytenu winning 19 mandates, four more than it actually received.
According to Haaretz, the final polls before the 2006 and 2009 elections erred by an average of 18 and 19 Knesset mandates, respectively. Nate Silvers, they were not.
Still, not all of this is necessarily the pollsters’ fault, the experts say. Israeli law forbids publishing surveys in the final five days of the campaign, during which voters presumably shift around a fair bit. In 2009, for example, rather than pollsters’ error, it may be that Kadima managed to convince a chunk of the public at the eleventh hour that it’s “Either Bibi [Netanyahu] or Tzipi [Livni],” achieving a shift to Kadima from voters who had previously thought of supporting Labor or Meretz.
A similar scenario is possible in 2013. Voters who told pollsters they would vote for the Jewish Home — an Israel Radio survey of the usual 500 respondents on Thursday showed a further rise in Jewish Home support to 18 seats (with Livni’s Hatnua plummeting to 6, and Yair Lapid’s Yesh Atid slumping to 5) — could reconsider given last-minute developments, or merely because they suddenly realized they’d rather support a larger party to grant the prime minister a freer hand in coalition negotiations. Equally, the Likud-Beytenu effort to encourage voters to “come back” to Netanyahu’s list, and to discredit Jewish Home, could have the opposite effect in the final days of the campaign, after the last polls are published.
In other words, until the evening of January 22, when the first exit polls tell us who respondents actually voted for, we will not really know how the 19th Knesset is going to look. And maybe not even then. In 1996, most notoriously, Shimon Peres went to bed after seeing the election night TV exit polls, reassured that he’d still be prime minister in the morning. When he woke up, it turned out that Netanyahu had won the election.