PEW: What are current global attitudes about climate change?

PEW: updated Oct. 12, 2010

Our international polling shows that publics around the world are concerned about climate change. In the recent spring 2010 Pew Global Attitudes survey, majorities in all 22 nations polled rate global climate change a serious problem, and majorities in ten countries say it is a very serious problem.

There are some interesting differences among the countries included in the survey. Brazilians are the most concerned about this issue: 85% consider it a very serious problem. Worries are less intense, however, in the two countries that emit the most carbon dioxide — only 41% of Chinese and 37% of American respondents characterize climate change as a very serious challenge.

Global Warming Poll

Polls should provide three basic pieces of information:

1. Were the participants randomly selected? Random selection reduces bias. If I surveyed my best friends or the people in my neighborhood, it is unlikely their views would represent the views of all the people in my state.

2. How many people participated? Did they talk to just 100 people or 1,000 people? Related to this is the response rate: how many of the people contacted actually participated. When only a small percentage of those asked complete the survey, it becomes a "volunteer" sample, and you should ask whether there might be some kind of bias in who chooses to participate. In a workplace survey, for example, if only 10 percent of the employees participate, you might worry that only the most unhappy people answered, giving a more negative view of management than if everyone had answered the survey.

3. How accurate are the results? Because researchers want to make inferences about what people in general believe about global warming, they rely on the magic of statistics to calculate something called the margin of error (sometimes called sampling error). Assuming the participants were randomly selected, they can calculate the margin of error based in part on the number of people who participated. In polls, researchers typically report this as plus or minus 3 or 5 percentage points.
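The margin of error for a randomly selected sample can be sketched with the standard textbook formula for a proportion. This is a simplified illustration, not the pollster's exact method (real polls also apply design weights); the function name and the example numbers (72% of 1,001 respondents, from the poll discussed below) are mine.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Margin of error for an observed proportion p (0-1) from a
    simple random sample of size n, at the 95% confidence level
    (z = 1.96). A textbook sketch, not a full survey-weighting method."""
    return z * math.sqrt(p * (1 - p) / n)

# The survey's numbers: 72% of 1,001 respondents.
moe = margin_of_error(0.72, 1001)
print(round(moe * 100, 1))  # about 2.8 percentage points
```

Note that the result, about 2.8 points, rounds to the "plus or minus 3 percentage points" that pollsters report.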

What was reported in the stories?

I was surprised that none of the three sources I read (the Washington Post, Democracy Now! and the Christian Science Monitor) contained this basic information. Maybe they were all short on space that day.

Linking to the actual survey, the Washington Post gives the needed information:

This Washington Post-ABC News poll was conducted by telephone Nov. 12-15, 2009, among a random national sample of 1,001 adults including users of both conventional and cellular phones. The results from the full survey have a margin of sampling error of plus or minus three percentage points. Sampling, data collection and tabulation by TNS of Horsham, Pa.

OK. So they used standard procedures: random selection to obtain a sample of 1,001 adults, with an estimated margin of sampling error of plus or minus 3 percentage points.

What does the margin of sampling error mean in practical terms? Basically it means that if they had surveyed all adults in the United States, they are 95 percent certain (this is the confidence level, which is typically set at 95 percent) that between 69 percent (72 minus 3) and 75 percent (72 plus 3) would report that they believe global warming is happening.
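Turning a reported percentage and its margin of error into that 95 percent confidence interval is simple arithmetic; a minimal sketch (the function name is mine, and the published ±3 points is itself a rounded figure):

```python
def confidence_interval(pct, moe):
    """95% confidence interval for a reported percentage pct with
    margin of error moe, both in percentage points."""
    return (pct - moe, pct + moe)

# The poll's result: 72 percent, plus or minus 3 points.
low, high = confidence_interval(72, 3)
print(low, high)  # 69 75
```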

This becomes important when comparing data. Suppose a prior poll had found that 75 percent said they believed global warming was happening. That poll's margin of error would give a range of 72 to 78 percent (±3), which overlaps the current poll's range of 69 to 75 percent. In other words, there would be no statistical difference between the two polls, because the difference could be explained by the margin of error in the polls.

When the ranges defined by the margins of error do not overlap, the result is interpreted as statistically significant: there is a difference in the percentage of people reporting they believe global warming is happening, and that difference is not likely due to the error inherent in working with random sample data.
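The overlap rule of thumb described above can be sketched as a simple comparison of the two intervals. This is the article's informal check, not a formal two-sample test of proportions; the function name and the 75 percent prior-poll figure are hypothetical.

```python
def intervals_overlap(pct_a, moe_a, pct_b, moe_b):
    """True if the two polls' margin-of-error ranges overlap,
    i.e. the difference could be explained by sampling error.
    Percentages and margins are in percentage points."""
    return (pct_a - moe_a) <= (pct_b + moe_b) and \
           (pct_b - moe_b) <= (pct_a + moe_a)

# Hypothetical prior poll at 75% vs. current poll at 72%, both +/-3:
print(intervals_overlap(75, 3, 72, 3))  # True: not a significant change

# The actual prior poll at 80% vs. current 72%, both +/-3:
# the ranges [77, 83] and [69, 75] do not touch.
print(intervals_overlap(80, 3, 72, 3))  # False: a significant change
```

Applied to the real numbers, the drop from 80 to 72 percent clears both margins of error, which is why the decline was treated as a genuine change rather than statistical noise.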

“Poll: Less Americans Believe in Global Warming”

Democracy Now! led with that headline on November 25, 2009. It was a variant of the Washington Post’s “Fewer Americans believe in global warming, poll shows.”

As written, some readers might assume that fewer Americans believe in global warming than do not believe in it.

The Christian Science Monitor, reporting the same polling data, led with this headline: "Global warming: 72 percent of Americans say it's real, poll finds."

Does this give you a different picture of the polling results? Which headline do you think is a more accurate portrayal of the data results?

Headlines may reflect spin: telling the story in a way that serves a particular policy agenda. Sometimes, however, the media are simply trying to grab our attention. Other times the headline gets distorted when the English language is crammed into a soundbite. The first two stories made the apparent decline in belief in global warming the story, although readers would not know the actual percentages unless they read past the headline. The Washington Post's lead paragraph was:

The percentage of Americans who believe global warming is happening has dipped from 80 to 72 percent in the past year, according to a new Washington Post-ABC News poll, even as a majority still support a national cap on greenhouse gas emissions.

They are trying to make this a story with some drama and mystery, but if 72% of the people believe that global warming is happening, then it should be no surprise that a majority would favor a national cap on greenhouse gas emissions (assuming they believe that greenhouse gas emissions are a contributing factor in global warming).

What are the key questions sophisticated users should be asking?