Who Says the American Public Can’t Agree?

January 25, 2010

A Washington Post-ABC News poll, released Thursday, found that 73 percent of Americans would support “a special tax on bonuses over $1 million.” Support crosses party lines.

That same poll found that 79 percent of Americans believe that banks are to blame for the nation’s economic troubles (58 percent say “greatly to blame”).

By a 72-to-19 percent margin, according to a new CBS poll, Americans now feel that the federal bailout has benefited “mostly just a few big investors and people who work on Wall Street.” Most Americans think this is true regardless of party affiliation or income level.

Happy Employees Despite Economic Downturn?

“Workplace Glass Half-Full”

The results of a Washington State employee survey were on the front page of the January 25 Olympian: “Despite downturn, survey finds more satisfaction than in 2007.” The story noted, “Workers in general were slightly more satisfied working for the state last year than in 2007, a year when some workers got double digit pay increases and government was adding thousands of jobs in the midst of an economic expansion.” The context today is different. While the federal government did send some stimulus money as a temporary life raft, Washington faces a $2.8 billion deficit for 2010, and it is hard to see how the state will make cuts without cutting state employees.

The statewide average score for the 2009 survey was 3.84. On a 1-5 scale, this is a good score. And it was indeed higher than the 3.8 average in 2007, albeit a very slight improvement.

But there is something that does not quite add up in the way this story is framed: in the face of job insecurity, why would scores be higher? So we need to take a look at the details of what was measured and how the survey was conducted.

Read More: Washington Employee 2009 Survey

Global Warming Poll

Polls should provide three basic pieces of information:

1. Were the participants randomly selected? Random selection reduces bias. If I surveyed my best friends or the people in my neighborhood, it is unlikely their views would represent the views of all the people in my state.

2. How many people participated? Did they talk to just 100 people or 1,000 people? Related to this is the response rate: how many of the people contacted actually participated. When only a small percentage of those asked complete the survey, it becomes a “volunteer” sample, and you should ask whether there might be some kind of bias in who chooses to participate. In a workplace survey, for example, if only 10 percent of the employees participate, you might worry that only the most unhappy people answered; that would give a more negative view of management than if everyone had answered the survey.

3. How accurate are the results? Because researchers want to make inferences about what people in general believe about global warming, they rely on the magic of statistics to calculate something called the margin of error (sometimes called sampling error). Assuming the participants were randomly selected, they can calculate the margin of error based in part on the number of people who participated. In polls, researchers present this information as plus or minus 3 or 5 percentage points; a rough version of that calculation is sketched just after this list.
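To make that concrete, here is a minimal Python sketch of the standard approximation for a poll’s margin of error at a 95 percent confidence level. The articles do not describe the polling firm’s actual method (weighting adjustments, design effects, and so on), so treat this as an illustration of the basic formula, not a reproduction of their calculation.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate margin of error (as a proportion) for an estimate p
    from a simple random sample of n people, at 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# The most conservative case (p = 0.5) with about 1,000 respondents
# gives the familiar "plus or minus 3 percentage points."
print(round(margin_of_error(1001) * 100, 1))  # prints 3.1
```

Notice that the margin shrinks only with the square root of the sample size: quadrupling the number of respondents cuts the margin roughly in half.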

What was reported in the stories?

I was surprised that none of the three sources I read (the Washington Post, Democracy Now!, and the Christian Science Monitor) contained this basic information. Maybe they were all short on space that day.

Linking to the actual survey, the Washington Post gives the needed information:

This Washington Post-ABC News poll was conducted by telephone Nov. 12-15, 2009, among a random national sample of 1,001 adults including users of both conventional and cellular phones. The results from the full survey have a margin of sampling error of plus or minus three percentage points. Sampling, data collection and tabulation by TNS of Horsham, Pa.

OK. So they used standard procedures: random selection to obtain a sample of about 1,000 adults, with an estimated margin of sampling error of plus or minus 3 percentage points.

What does the margin of sampling error mean in practical terms? Basically, it means the researchers are 95 percent confident (this is the confidence level, which is typically set at 95 percent) that if they could have surveyed all adults in the United States, somewhere between 69 percent (72 minus 3) and 75 percent (72 plus 3) would report that they believe global warming is happening.
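A quick sketch of that arithmetic, using the 72 percent figure and the poll’s plus-or-minus 3 points:

```python
def poll_range(estimate, margin):
    """Return the (low, high) range implied by a point estimate and a
    margin of error, both expressed in percentage points."""
    return estimate - margin, estimate + margin

low, high = poll_range(72, 3)
print(f"{low}% to {high}%")  # 69% to 75%
```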

This becomes important when comparing data. If, in the prior poll, 75 percent had said they believed global warming was happening, the margins of error of the two polls would overlap: the range for that earlier poll would run from 72 to 78 percent (75 plus or minus 3), which overlaps the current poll’s range of 69 to 75 percent. In other words, there would be no statistical difference between the two polls, because the difference could be explained by the margin of error in the polls.

When the ranges defined by the margins of error do not overlap, the difference is interpreted as statistically significant: that is, the change in the percentage reporting that they believe global warming is happening is not likely due to the error inherent in working with random sample data. The sketch below applies this check to the numbers in this story.
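Here is a small sketch of that overlap check using the numbers in this story: the hypothetical 75 percent prior poll from the paragraph above, and the actual 80 percent prior-year figure quoted from the Washington Post further down. It assumes the earlier poll had a similar plus-or-minus 3 point margin, which the articles do not state.

```python
def poll_range(estimate, margin):
    """(low, high) range for a poll estimate, in percentage points."""
    return estimate - margin, estimate + margin

def ranges_overlap(a, b):
    """True if two (low, high) ranges share any values."""
    return a[0] <= b[1] and b[0] <= a[1]

current = poll_range(72, 3)             # this poll: 69 to 75
hypothetical_prior = poll_range(75, 3)  # the 75% example: 72 to 78
reported_prior = poll_range(80, 3)      # last year's 80%: 77 to 83

print(ranges_overlap(current, hypothetical_prior))  # True: no clear change
print(ranges_overlap(current, reported_prior))      # False: the drop looks real
```

Checking whether the ranges overlap is a conservative rule of thumb that mirrors the reasoning above; differences can sometimes be statistically significant even when the ranges overlap slightly, which is why pollsters also use formal tests for the difference between two proportions.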

“Poll: Less Americans Believe in Global Warming”

Democracy Now! led with that headline on November 25, 2009. It was a variant of the Washington Post’s “Fewer Americans believe in global warming, poll shows.”

As written, some people might assume the headline means that fewer Americans believe in global warming than do not believe in it, when the poll actually shows a decline compared with the previous year.

The Christian Science Monitor, reporting the same polling data, led with this headline: “Global warming: 72 percent of Americans say it’s real, poll finds.”

Does this give you a different picture of the polling results? Which headline do you think is a more accurate portrayal of the data results?

Headlines may reflect spin: a way of telling the story that serves a particular policy agenda. Sometimes, however, the media are simply trying to grab our attention. Other times the headline gets distorted when the English language is crammed into a soundbite. The first two stories wanted to make the apparent decline in belief in global warming the story, although readers would not know what that decline amounted to unless they read beyond the headline. The Washington Post’s lead paragraph was:

“The percentage of Americans who believe global warming is happening has dipped from 80 to 72 percent in the past year, according to a new Washington Post-ABC News poll, even as a majority still support a national cap on greenhouse gas emissions.”

They are trying to make this a story with some drama and mystery, but if 72 percent of people believe that global warming is happening, then it should be no surprise that a majority would favor a national cap on greenhouse gas emissions (assuming they believe that those emissions are a contributing factor in global warming).

What are the key questions sophisticated users should be asking?