How many polls are conducted in Canada each year?
Thousands, when you include political polls, market research polls, social behaviour polls and Statistics Canada surveys on issues such as family makeup and the unemployment rate. Determining the exact number of polls is difficult because individual surveys don't have to be registered anywhere. As well, companies sometimes conduct one "omnibus" survey that asks a number of questions on behalf of different clients. To give an idea of the scope of the Canadian industry, though, the Marketing Research and Intelligence Association says almost three-quarters of a billion dollars changes hands annually for market research activities in this country.
Chris Waddell, a professor at the Carleton University School of Journalism, says political parties will commission polls in ridings they think are undecided to see how opinion is developing so that they can formulate on-the-ground strategies. In a general election, he says, "national polls mean nothing to them." They'll poll heavily within key battleground provinces such as Ontario, Waddell says, "but I can't imagine the Conservatives are polling much in Alberta."
How much does it cost to do a poll?
"The cost varies quite a bit," says Donna Dasko, senior vice-president at the Environics Research Group. "One question on an omnibus poll of 2,000 people can cost $1,500." A more complex, customized poll with more people surveyed at greater length can cost up to $70,000.
Are all political polls alike?
No. There are at least three common types:
1. Typical random phone surveys poll people over a set period of time, until the desired sample size is reached, reflecting the gender, age and geographic breakdown of the population at large. Then the poll ends and the results are reported.
2. Rolling tracking polls give results every day during the course of a campaign. In those polls, say 400 new people are polled every night. On the fourth day, pollsters compile the results from the first three days and come up with a set of numbers using the sample size of 1,200 people. Another 400 people are interviewed on the fourth night. On the fifth day, the polling company drops the first night's results and adds the results from the fourth night. The company still has a sample size of 1,200, but the results may have changed because of events on the campaign trail. The process is repeated until the end of the election, allowing parties and voters to quickly detect possible changes in support patterns.
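The sliding window behind a rolling tracking poll can be sketched in a few lines of code. This is a minimal illustration only: the nightly sample sizes and support figures below are made up, not taken from any real poll.

```python
from collections import deque

def rolling_poll(nightly_samples, window=3):
    """Combine the most recent `window` nights of interviews into one
    published estimate, dropping the oldest night as each new one arrives."""
    recent = deque(maxlen=window)  # maxlen makes deque discard the oldest night
    published = []
    for night in nightly_samples:
        recent.append(night)
        if len(recent) == window:
            total = sum(n["respondents"] for n in recent)
            support_a = sum(n["party_a"] for n in recent)
            published.append(round(100 * support_a / total, 1))
    return published

# Hypothetical nightly results: 400 respondents interviewed each night
nights = [
    {"respondents": 400, "party_a": 140},
    {"respondents": 400, "party_a": 150},
    {"respondents": 400, "party_a": 160},
    {"respondents": 400, "party_a": 180},  # support jumps after a campaign event
]
print(rolling_poll(nights))  # → [37.5, 40.8]
```

The first published number pools nights one to three (1,200 respondents); the second drops night one and adds night four, which is why a single strong night moves the average only partway.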
3. Exit polls are conducted as voters leave polling stations on election day. Interviewers ask them for whom they voted. If the voters tell the truth and vote in the same patterns throughout the day, exit polls can give an accurate picture of the election winner by the time the polls close.
Waddell says exit polls aren't a big factor in Canadian election coverage because our ballots tend to be counted very quickly. "Also, exit polls work in an environment where people are voting for the same people everywhere - in the United States, for example." Canada's riding-based system and many time zones mean that such polls wouldn't tell you much about who's going to be prime minister before the actual vote results come in, Waddell adds.
How do political pollsters get your phone number?
Most of them use automated equipment and software that randomly calls all the possible land-line numbers in the area they're polling. (They typically don't poll cellphone users.) Employees at a bank of phones in a central office sit ready to grab a particular phone line when someone answers. When a residential customer picks up the phone, there may be a moment or two of silence at first, a giveaway that a pollster or some kind of telemarketer or charity solicitor is on the other end. When the polling company's outgoing call-display panels indicate that the phone system has reached a business number, that call is terminated before it's picked up. This approach lets pollsters reach even those people whose home phone numbers are unlisted.
Is there a do-not-call list for polls?
No. The do-not-call registry legislation aimed at telemarketers that passed through the House of Commons and Senate this fall carries exemptions for charities, businesses with existing client relationships, newspaper circulation departments, political parties and polling companies.
How many people get called to collect 1,000 responses?
It varies. Big polling companies say they get between 20 and 35 usable responses for every 100 calls they make. Waddell says that number has gone down sharply since the advent of answering machines and call-display screens, thanks to Canadians' desire to escape telemarketing calls. "The response rate has gone down from at least 50 or so per 100 calls," he says.
For voter intention surveys, polling companies screen out people who aren't eligible to vote, people who work in the market research industry, members of the media and people who work on political campaigns. Because the sample has to be representative, pollsters sometimes reach people in a demographic group - men over 50 or women 30 to 49, for example - whose quota of responses has already been filled. If nobody else in the household fits a group they're still seeking, they have to keep calling other numbers.
To increase the chances of getting more usable responses quickly, polling companies tend to call in the evenings (until 10 p.m. or so) and in the daytime on weekends.
What kind of people tend to co-operate with polls?
Older people and those who live outside major urban areas such as Toronto, Vancouver and Montreal tend to be easier to reach and more willing to answer pollsters' questions.
What kind of people do pollsters find it hard to reach?
Young people 18 to 25 who don't spend a lot of time at home, people who have only cellphones, people with call display who don't want to answer unsolicited calls, workers on odd shifts, homeless people or individuals who can't afford phones, and those who live in institutions such as university residences and seniors' homes.
To make up for this imbalance, pollsters often "weight" the results from the people they can reach in an underrepresented age, gender or geographic group.
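The weighting described above is simple arithmetic: each respondent's answer counts in proportion to how under- or over-represented their group is in the sample compared with the population. Here is a minimal sketch with invented numbers, in which young voters make up 30 per cent of the population but only 15 per cent of the sample.

```python
def weighted_support(groups):
    """Estimate party support after weighting each demographic group.
    A group's weight = its population share / its sample share."""
    total = 0.0
    for g in groups:
        weight = g["pop_share"] / g["sample_share"]
        # sample_share * weight collapses to pop_share: each group ends up
        # counting at its true population proportion
        total += g["sample_share"] * weight * g["support"]
    return round(100 * total, 1)

# Hypothetical groups: young voters under-represented in the raw sample
groups = [
    {"name": "18-34", "sample_share": 0.15, "pop_share": 0.30, "support": 0.40},
    {"name": "35+",   "sample_share": 0.85, "pop_share": 0.70, "support": 0.30},
]
print(weighted_support(groups))  # → 33.0
```

Unweighted, the same raw numbers would give 31.5 per cent; weighting pulls the estimate up because the under-sampled young group leans more heavily toward the party.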
What impact do political polls have on election results?
The experts argue about this. Some say there's no measurable impact and others say polls can and do change the course of political history, especially in close campaigns.
Those in the latter camp talk about three types of effect:
1. A "bandwagon effect," in which people perceive a party is going to be elected and decide to vote the same way, either to ensure they have an MP on the government side or because they think, "If all those people are voting that way, they must have good reasons."
2. An "underdog effect," in which voters decide to buck the trend and cast ballots in sympathy with the second- or third-place candidate or party.
3. Strategic voting, in which voters decide to vote for their second-choice candidate in an effort to make sure a disliked frontrunner doesn't get elected.
What's a margin of error, and what factors can affect it?
"In the real world, there is no perfect random sample anywhere. The margin of error is in fact based on that assumption," says Dasko.
Say a poll has a margin of error of plus or minus three percentage points, 19 times out of 20. The same poll suggests Party A has 35 per cent support, compared to 29 per cent for Party B. The margin of error means that 19 times out of 20 (95 per cent of the time), Party A's actual support would be expected to fall between 32 and 38 per cent, and Party B's between 26 and 32 per cent. The parties could actually be tied, in other words.
What about the other five per cent of the time? Thanks to the random nature of polling, the pollsters just happened to talk to too many people whose views don't reflect the views of the population at large.
The larger the number of people a polling company talks to, the smaller the margin of error will be. Regional breakdowns from national polls usually have a higher margin of error because they involve the opinions of fewer people.
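The familiar "19 times out of 20" figure comes from the textbook formula for a simple random sample: roughly 1.96 times the square root of p(1-p)/n, where p is the reported proportion and n is the number of respondents. A quick check, under that simple-random-sample assumption:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from n respondents,
    assuming a simple random sample (z = 1.96 for 19 times out of 20)."""
    return z * math.sqrt(p * (1 - p) / n)

# A national sample of about 1,000 gives the familiar +/- 3 points
print(round(100 * margin_of_error(0.5, 1000), 1))  # → 3.1

# A regional breakdown of 250 people from that same poll is much less precise
print(round(100 * margin_of_error(0.5, 250), 1))   # → 6.2
```

Quartering the sample size doubles the margin of error, which is why regional breakdowns from national polls deserve extra caution.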
Why are polls so wrong sometimes?
Dasko says poor polling methods are usually to blame, rather than blips in statistical probability. "In the last campaign and so far in this one, the polls have not been very different," she points out. "They have been very consistent with each other."
She says "a famous rogue poll a few years ago in a Canadian election," which she would not name, was not a true rogue but the result of a polling company that did not take precautions to gather a truly random sample. Such problems can often be traced back to how respondents are chosen, inexperienced field workers, how questions are worded and the order in which questions are asked.
A careless polling firm, or a partisan one looking for a particular result, might craft a question or series of questions that nudges respondents toward a certain answer. For example, say a pollster asks whether you are aware of the tax-cutting, environment-protecting and health-care funding promises of Candidate A, then immediately asks whether you support Candidate A, B or C. You'd be more likely to answer "Candidate A" because you've just heard something positive about that person. Also, if pollsters read out a list of candidates and ask which one you support, your answer may be skewed by where the names fall in the list. So a good pollster will rotate the order of the names for every new person polled, to remove that kind of bias.
The other issue is making sure your poll doesn't miss an entire group of people. For example, conducting a poll only during daytime hours will lead to an under-representation of people who work traditional 9-5 shifts outside the home, and an over-representation of stay-at-home parents, retired people and shift workers.
There also have been polls with results that are overtaken by events by the time they're published. Polls are a snapshot of public opinion at a moment in time, but many factors can sway people's minds in a short time. Also, some voter preferences are "leaning" or "soft," meaning that people are more likely to change them before voting day.
Why isn't the CBC reporting most political polls any more?
For the last two elections, CBC and Radio-Canada have decided to concentrate on issues, platforms and personalities rather than day-to-day horse-race results. The public broadcaster's guidelines for election reporters therefore greatly restrict the reporting of polls released during the campaign, on the basis that "election campaigns are about much more than simply trying to predict who will win."
This time out, CBC commissioned and released a poll just before the election was called, surveying voters on party preference, but also on what issues would be most important to them. Those results have helped craft the broadcaster's coverage plans leading up to the Jan. 23 election.
There are some exceptions to the restrictions on reporting polls. We can report the results of reputable regional polls when they help illustrate meaningful trends, and we can mention poll results in wrap-up stories that put those trends in context.
What laws cover the reporting of polls?
The Canada Elections Act says media agencies have to include detailed information about the questions asked, methodology, sample size, margin of error, dates of polling and so on if they are releasing or reporting about a new poll within the first 24 hours of its release. They must also tell people how to get the full poll methodology, by going to the polling company's website, for example. Also, Section 328 of the act says nobody can release results of exit polls or other new polls before polling stations close on voting day.