FT, Lazard resist disclosure on poll facts
July 20, 2011 — The Financial Times recently reported, based on a poll sponsored by Lazard, a major international financial advisory and asset management firm, that U.S. voters cap their willingness to pay for renewable energy sources, such as wind and solar, at $10 per month. “The results,” wrote the Financial Times, “suggest that higher-cost forms of alternative energy…are likely to come under growing political pressure.” But can the poll results be trusted? Remapping Debate sought to find out, but neither the Financial Times nor Lazard would provide the basic information required to make this determination, a position seemingly at odds with the standards of professional practice of the American Association for Public Opinion Research (AAPOR) and similar organizations.
Transparency is key
“Disclosure is critical to establishing a measure of trust,” said Scott Keeter, president of AAPOR and the director of Survey Research at the Pew Research Center, in an interview with Remapping Debate. “More information about how the poll was conducted is going to be beneficial to you. It’s going to provide people with confidence that you aren’t hiding something from them, that you didn’t ask questions in the poll that you’re now suppressing, that you didn’t word the questions in a way that would tend to bias respondents toward one side of an issue or another, that you interviewed a fair, random sample of the public,” he explained.
For example, the Financial Times reported that the poll asked respondents to rank their “willingness” to pay more for alternative energy sources on a scale of 1 to 10, and noted only that 21 percent of respondents “reported a score of eight or more.” Did this mean that 21 percent were highly willing? Were people who reported rankings of five, six or seven “somewhat” willing? Did the poll have descriptors associated with different parts of the numerical range? The article did not say.
Thanks, but we’re not sharing
When we emailed Lazard for a copy of the poll, Monica Orbe, Lazard’s vice-president for global communications, replied, “We are not distributing the poll for the time being.” We followed up by clarifying that we were interested only in the information that related to the poll results reported in the Financial Times, including basic information such as the poll’s margin of error and the name of the firm that conducted the poll. We suggested that providing the language of the questions posed relating to green energy would be something that Lazard would want to share “so that readers would be able to assess how the reported result was yielded.”
Lazard would not provide any information, stating only that the Financial Times report was accurate.
Our efforts with the Financial Times were similarly unavailing. In the course of email exchanges with Ed Crooks, the author of the story, he acknowledged that, “It is of course very important in reporting a poll to know who was asked, what they were asked, and what they answered,” but said, “[Y]ou will understand that we can't hand out information like that.”
But AAPOR’s Keeter had a different view: “Once the story has been written, you really cannot claim exclusivity in the sense of not sharing the internal details of the report,” he said.
A follow-up email to the Financial Times' Crooks clarified that we were only looking for the subset of poll information that related to the published portion of the poll. We did not receive a response.
Disclosure is standard practice
The type of information that Remapping Debate sought is routinely released by pollsters in connection with the reporting of poll results. AAPOR’s Code of Professional Ethics and Practice states that, “Good professional practice imposes the obligation upon all survey and public opinion researchers to disclose certain essential information about how the research was conducted.” Information such as the sponsors of an opinion poll, the poll’s margin of error, sample size and design, and the exact wording of questions should, according to AAPOR, be made available “immediately upon release” of a poll’s results. Consistent with this, polling organizations such as Pew and Gallup post their full questionnaires and tabulated results online, and news reports commonly provide this information either in the body of the story or in a sidebar.
AAPOR is not alone in promoting the release of this information to the public: both the Council of American Survey Research Organizations and the British Polling Council have similar codes. All three organizations state that polls’ complete question wording must be released. “Question wording is probably number one,” explained AAPOR’s Keeter, “because that’s the easiest way to bias a survey.”
Tilting the scales
Micheline Blum, president-elect of AAPOR’s New York chapter and director of Baruch College Survey Research, noted that the context a pollster does or does not provide is critical to how respondents answer questions, and that the public therefore needs to know that context in order to assess how meaningful a poll’s results are. How respondents are made to understand the issues before them “can completely distort the results you get,” Blum said, adding, “You can…make it appear that people believe something very different from what is true.”
Remapping Debate was initially interested in the poll for just this reason. Were respondents told, directly or indirectly, that the economic costs of developing green energy sources were permanently fixed at a high level, or that they would come down as those sources became more commercially viable and widely used? Were they asked about their willingness to pay the “invisible” economic costs of the pollution and climate change caused by fossil fuels? What choices were offered in terms of how much money people were willing to spend? Was “willingness” to pay defined in any way?
Blum said that it is vital in opinion polling to make these things clear both to the poll’s respondents and to the general public upon the poll’s release. “We need to be very clear in what we’re asking,” she told Remapping Debate, “and try to keep the wording fairly simple and direct — not confusing.” A poll’s style of inquiry can lead to differing results, she explained; questions with too many potential responses can cause respondents to agree with the first or the last option without thinking through the others, for example, while asking respondents to agree or disagree with statements rather than to answer a question can lead to an agreement bias.
“The order of questions also matters,” Blum said, emphasizing how the previous questions asked in a poll can create a frame of mind for subsequent questions. That’s why disclosure of question wording and question order is so important for a reader's ability to assess the poll’s results. “It’s better for everybody if you disclose [this information],” Blum asserted. “The more detail [you] describe of [your] methodology, the better we can judge it.”
Journalistic duty
Blum cited a publication by the National Council on Public Polls (NCPP) called “20 Questions A Journalist Should Ask About Poll Results.” The NCPP advises journalists to report only on professional, scientific polls that disclose all relevant information. Among the criteria the NCPP cites are the name of the organization that conducted the poll, the reason the poll was undertaken, the margin of error, and the wording and order of questions, all of it information the Financial Times did not report and that it and Lazard refused to provide to Remapping Debate.
Gillian Tett, the Financial Times' U.S. managing editor, did not respond to Remapping Debate’s emailed request to explain the paper’s position, including what information the Financial Times believes should be released about a poll that is the subject of an article so that readers can assess the poll’s meaning and reliability for themselves.
Keeter’s assessment of the importance of transparency is straightforward: “I just wouldn’t attach any credibility to a survey that would not tell me the basic fundamentals.”