Financial Times, Lazard resist disclosure on poll facts

Original Reporting | By Alyssa Ratledge

Tilting the scales

Micheline Blum, president-elect of AAPOR’s New York chapter and director of Baruch College Survey Research, notes that the context a pollster does or does not provide is critical to how respondents answer questions, and that the public therefore needs to know that context to assess how meaningful a poll’s results are. How respondents are made to understand the issues before them “can completely distort the results you get,” Blum said, adding, “You can…make it appear that people believe something very different from what is true.”
Remapping Debate was initially interested in the poll for just this reason. Were respondents told, directly or indirectly, that the economic costs of developing green energy sources were permanently fixed at a high level, or that they would come down as those sources became more commercially viable and widely used? Were they asked about their willingness to pay the “invisible” economic costs of the pollution and climate change caused by fossil fuels? What choices were offered in terms of how much money people were willing to spend? Was “willingness” to pay defined in any way?

Blum said that it is vital in opinion polling to make these things clear both to the poll’s respondents and to the general public upon the poll’s release. “We need to be very clear in what we’re asking,” she told Remapping Debate, “and try to keep the wording fairly simple and direct — not confusing.” A poll’s style of inquiry can lead to differing results, she explained; questions with too many potential responses can cause respondents to agree with the first or the last option without thinking through the others, for example, while asking respondents to agree or disagree with statements rather than to answer a question can lead to an agreement bias.

“The order of questions also matters,” Blum said, emphasizing how the previous questions asked in a poll can create a frame of mind for subsequent questions. That’s why disclosure of question wording and question order is so important for a reader’s ability to assess the poll’s results. “It’s better for everybody if you disclose [this information],” Blum asserted. “The more detail [you] describe of [your] methodology, the better we can judge it.”


Journalistic duty

Blum cited a publication by the National Council on Public Polls (NCPP) called “20 Questions A Journalist Should Ask About Poll Results.” The NCPP advises journalists to report only on professional, scientific polls that disclose all relevant information. Among the criteria NCPP cites are the name of the organization that conducted the poll, the reason it was undertaken, the margin of error, and the wording and order of questions — all information the Financial Times did not report and that it and Lazard refused to provide to Remapping Debate.

Gillian Tett, the Financial Times’ managing editor in the U.S., did not respond to Remapping Debate’s emailed request to explain the Financial Times’ position, including our request to explain what information the Financial Times feels is important to release about a poll that is the subject of an article in order to allow readers to assess the poll’s meaning and reliability for themselves.

Keeter’s assessment of the importance of transparency is straightforward: “I just wouldn’t attach any credibility to a survey that would not tell me the basic fundamentals.”
