Measuring Public Opinion and Political Information

6.3 Explain how polls are conducted and what can be learned from them about American public opinion.


Before examining the role that public opinion plays in American politics, it is essential to learn about the science of public opinion measurement. How do we really know the approximate answers to questions such as “what percentage of young people favor abortion rights,” “how many Hispanics supported Barack Obama’s 2012 reelection campaign,” or “what percentage of the public is looking for a job but cannot find one?” Polls provide these answers, but there is much skepticism about polls. Many people wonder how accurately public opinion can be measured by interviewing only 1,000 or 1,500 people around the country.19 This section explains how polling works so that you can become a well-informed consumer of polls.

How Polls Are Conducted

Public opinion polling is a relatively new science. It was first developed by a young man named George Gallup, who initially did some polling for his mother-in-law, a long-shot candidate for secretary of state in Iowa in 1932. With the Democratic landslide of that year, she won a stunning victory, thereby further stimulating Gallup’s interest in politics. From that little acorn the mighty oak of public opinion polling has grown. The firm that Gallup founded spread throughout the democratic world, and in some languages Gallup is actually the word used for an opinion poll.20

It would be prohibitively expensive and time-consuming to ask every citizen his or her opinion on a whole range of issues. Instead, polls rely on a sample of the population—a relatively small proportion of people who are chosen to represent the whole. Herbert Asher draws an analogy to a blood test to illustrate the principle of sampling.21 Your doctor does not need to drain a gallon of blood from you to determine whether you have mononucleosis, AIDS, or any other disease. Rather, a small sample of blood will reveal its properties.

In public opinion polling, a random sample of about 1,000 to 1,500 people can accurately represent the “universe” of potential voters. The key to the accuracy of opinion polls is the technique of random sampling, which operates on the principle that everyone should have an equal probability of being selected as part of the sample. Your chance of being asked to be in the poll should therefore be as good as that of anyone else—rich or poor, black or white, young or old, male or female. If the sample is randomly drawn, about 13 percent of those interviewed will be African American, slightly over 50 percent female, and so forth, matching the population as a whole.
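To make the principle concrete, here is a minimal sketch (added here, not from the text) using Python’s standard library and a made-up population; with a genuinely random draw, the sample’s composition tends to mirror the population’s.

```python
import random

# Hypothetical electorate of 200,000 people, about 13% of whom are
# African American (the size and proportions here are illustrative only).
population = ["African American"] * 26_000 + ["Other"] * 174_000

# Simple random sample: every person has an equal chance of being chosen.
sample = random.sample(population, 1_500)

share = sample.count("African American") / len(sample)
print(f"African Americans in sample: {share:.1%}")  # typically close to 13%
```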

Remember that the science of polling involves estimation; a sample can represent the population only to within a certain degree of accuracy. That accuracy is expressed as the sampling error (often called the margin of error), which depends on the size of the sample. The more people who are randomly interviewed for a poll, the more confident one can be of the results. A typical poll of about 1,500 to 2,000 respondents has a sampling error of ±3 percent. What this means is that 95 percent of the time the poll results are within 3 percentage points of what the entire population thinks. If 40 percent of the sample say they approve of the job the president is doing, one can be pretty certain that the true figure is between 37 and 43 percent.
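As a rough check on these numbers (a sketch added here, not from the text), the statistical rule of thumb is that the margin of error at 95 percent confidence for a simple random sample is about 1.96 times the square root of p(1 − p)/n. In the worst case (p = 0.5), roughly 1,000 interviews already yields about ±3 points; margins for somewhat larger polls are often still quoted as ±3 to allow for design effects.

```python
from math import sqrt

# Margin of error for an estimated proportion at 95% confidence,
# assuming a simple random sample (real polls also allow for design effects).
def margin_of_error(p: float, n: int) -> float:
    return 1.96 * sqrt(p * (1 - p) / n)

# The worst case is p = 0.5; about 1,000 interviews already gives ~3 points.
for n in (1_000, 1_500, 2_000):
    print(f"n = {n}: +/-{margin_of_error(0.5, n):.1%}")
```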

In order to obtain results that will usually be within sampling error, researchers must follow proper sampling techniques. In perhaps the most infamous survey ever, a 1936 Literary Digest poll underestimated the vote for President Franklin Roosevelt by 19 percent, erroneously predicting a big victory for Republican Alf Landon. The well-established magazine suddenly became a laughingstock and soon went out of business. Although the number of responses the magazine obtained for its poll was a staggering 2,376,000, its polling methods were badly flawed. Trying to reach as many people as possible, the magazine drew names from the biggest lists it could find: telephone books and motor vehicle records. In the midst of the Great Depression, the people on these lists were above the average income level (only 40 percent of the public had telephones then; fewer still owned cars) and were more likely to vote Republican. The moral of the story is this: accurate representation, not the number of responses, is the most important feature of a public opinion survey. Indeed, as polling techniques have advanced over the past 80 years, typical sample sizes have been getting smaller, not larger.

Computer and telephone technology has made surveying less expensive and more commonplace. In the early days of polling, pollsters needed a national network of interviewers to traipse door-to-door in their localities with a clipboard of questions. Now most polling is done on the telephone with samples selected through random-digit dialing. Calls are placed to phone numbers within randomly chosen exchanges (for example, 512-471-XXXX) around the country. In this manner, both listed and unlisted numbers are reached at a cost of about one-fifth that of person-to-person interviewing. There are a couple of disadvantages, however. A small percentage of the population does not have a phone, and people are substantially less willing to participate over the telephone than in person—it is easier to hang up than to slam the door in someone’s face. These are small trade-offs for political candidates running for minor offices, for whom telephone polls are the only affordable method of gauging public opinion.
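The mechanics of random-digit dialing can be illustrated with a small sketch (a toy example under stated assumptions, not any pollster’s actual software): fix an area code and exchange, then generate the remaining digits at random, so that listed and unlisted numbers are equally likely to be reached.

```python
import random

# Toy random-digit dialing: keep the area code and exchange from the text's
# example (512-471) and draw the last four digits at random.
def rdd_numbers(exchange: str = "512-471", count: int = 5) -> list[str]:
    return [f"{exchange}-{random.randint(0, 9999):04d}" for _ in range(count)]

print(rdd_numbers())  # e.g., ['512-471-0832', '512-471-9917', ...]
```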

However, in this era of cell phones, many pollsters are starting to worry whether this methodology will continue to be affordable. As of 2014, government studies showed that over a quarter of the population had cell phone service only. This percentage is significantly higher among young adults, minorities, and people who are transient. Because federal law prohibits use of automated dialing programs to cell phones, pollsters have to use the far more expensive procedure of dialing cell phone numbers manually. In addition, studies have shown that people are much less likely to agree to be interviewed when they are reached on a cell phone as compared to a landline. All told, Mark Mellman, one of America’s top political pollsters, estimates that it is 5 to 15 times as expensive to gather interviews from the cell-phone-only segment of the population as from landline users.22 Although big firms like Gallup have successfully made the adjustment so far, the costs of conducting phone polls are likely to further escalate as more people give up their landlines.

As with many other aspects of commerce in America, the future of polling may lie with the Internet. Internet pollsters, such as Knowledge Networks, assemble representative panels of the population by first contacting people on the phone and asking them whether they are willing to participate in Web-based surveys on a variety of topics. If they agree, they are paid a small sum every time they participate. And if they don’t have Internet access, they are provided with it as part of their compensation. Once someone agrees to participate, he or she is then contacted exclusively by e-mail. As Knowledge Networks proclaims, “This permits surveys to be fielded very quickly and economically. In addition, this approach reduces the burden placed on respondents, since e-mail notification is less obtrusive than telephone calls, and most respondents find answering Web questionnaires to be more interesting and engaging than being questioned by a telephone interviewer.”23

From its modest beginning, with George Gallup’s 1932 polls for his mother-in-law in Iowa, polling has become a big business. That it has grown so much and spread throughout the world is no surprise: From Manhattan to Moscow, from Tulsa to Tokyo, people want to know what other people think.

What are some of the best practices one should look for in a poll in order to establish that its results are reasonably accurate? What type of poll do you think is best and why would you choose this type?

The Role of Polls in American Democracy

Polls help political candidates detect public preferences. Supporters of polling insist that it is a tool for democracy. With it, they say, policymakers can keep in touch with changing opinions on the issues. No longer do politicians have to wait until the next election to see whether the public approves or disapproves of the government’s course. If the poll results shift, then government officials can make corresponding midcourse corrections. Indeed, it was George Gallup’s fondest hope that polling could contribute to the democratic process by providing a way for public desires to be heard at times other than elections. His son, George Gallup, Jr., argued that this hope had been realized in practice, that polling had “removed power out of the hands of special interest groups,” and “given people who wouldn’t normally have a voice a voice.”24

Critics of polling, by contrast, say it makes politicians more concerned with following than leading. Polls might have told the Constitutional Convention delegates that the Constitution was unpopular or might have told President Thomas Jefferson that people did not want the Louisiana Purchase. Certainly they would have told William Seward not to buy Alaska, a transaction known widely at the time as “Seward’s Folly.” Polls may thus discourage bold leadership, like that of Winston Churchill, who once said,

Nothing is more dangerous than to live in the temperamental atmosphere of a Gallup poll, always feeling one’s pulse and taking one’s temperature.... There is only one duty, only one safe course, and that is to try to be right and not to fear to do or say what you believe.25

Based on their research, Jacobs and Shapiro argue that the common perception of politicians pandering to the results of public opinion polls may be mistaken. Their examination of major recent debates finds that political leaders “track public opinion not to make policy but rather to determine how to craft their public presentations and win public support for the policies they and their supporters favor.”26 Staff members in both the White House and Congress repeatedly remarked that their purpose in conducting polls was not to set policies but rather to find the key words and phrases with which to promote policies already in place. Thus, rather than using polls to identify centrist approaches that will have the broadest popular appeal, Jacobs and Shapiro argue, elites use them to formulate strategies that enable them to avoid compromising on what they want to do. As President Obama’s chief pollster, Joel Benenson, said in 2009 about his team’s work for the president: “Our job isn’t to tell him what to do. Our job is to help him figure out if he can strengthen his message and persuade more people to his side. The starting point is where he is and then you try to help strengthen the message and his reasons for doing something.”27

Yet, polls might weaken democracy in another way—they may distort the election process by creating a bandwagon effect. The wagon carrying the band was the centerpiece of nineteenth-century political parades, and enthusiastic supporters would literally jump on it. Today, the term refers to voters who support a candidate merely because they see that others are doing so. Although only 2 percent of people in a recent CBS/New York Times poll said that poll results had influenced them, 26 percent said they thought others had been influenced (showing that Americans feel that “it’s the other person who’s susceptible”). Beyond this, polls play to the media’s interest in who’s ahead in the race. The issues of recent presidential campaigns have sometimes been drowned out by a steady flood of poll results.

Probably the most widely criticized type of poll is the Election Day exit poll. For this type of poll, voting places are randomly selected around the country. Workers are then sent to these places and told to ask every tenth person how he or she voted. The results are accumulated toward the end of the day, enabling the television networks to project the outcomes of all but very close races before more than a handful of votes have actually been counted. In some presidential elections, such as 1984 and 1996, the networks declared a national winner while millions on the West Coast still had hours to vote. Critics have charged that this practice discourages many people from voting and thereby affects the outcome of some state and local races.

In exit polls, voters are interviewed just after they have voted. These polls are used by the media to project election results as soon as the polls are closed, as well as to help the media understand what sorts of people have supported particular candidates.

Perhaps the most pervasive criticism of polling is that by altering the wording of a question, pollsters can manipulate the results. Small changes in question wording can sometimes produce significantly different results. For example, in February 2010, the New York Times/CBS News poll found that 70 percent favored permitting “gay men and lesbians” to serve in the military whereas only 44 percent favored military service by “homosexuals” who “openly announce their sexual orientation.” Thus, proponents of gays and lesbians in the armed forces could rightly say that a solid public majority favored their military service while opponents could rightly counter that only a minority favored lifting the ban on open military service by homosexuals. This example illustrates why, in evaluating public opinion data, it is crucial to pay close attention to how questions are posed. Fortunately, most major polling organizations now post their questionnaires online, making it easier than ever for everyone to scrutinize their work.

A nuts-and-bolts knowledge of how polls are conducted will help you avoid the common mistake of taking poll results for solid fact. But being an informed consumer of polls also requires that you think about whether the questions are fair and unbiased. The good—or the harm—that polls do depends on how well the data are collected and how thoughtfully the data are interpreted.

What Polls Reveal About Americans’ Political Information

Thomas Jefferson and Alexander Hamilton had very different views about the wisdom of common people. Jefferson trusted people’s good sense and believed that education would enable them to take the tasks of citizenship ever more seriously. Toward that end, he founded the University of Virginia. In contrast, Hamilton lacked confidence in people’s capacity for self-government. His response to Jefferson was the infamous phrase, “Your people, sir, is a great beast.”

If there had been polling data in the early days of the American republic, Hamilton would probably have delighted in throwing some of the results in Jefferson’s face. If public opinion analysts agree about anything, it is that the level of public knowledge about politics is dismally low. This is particularly true for young people, but the level of knowledge for the public overall is not particularly encouraging either. For example, in October 2008, the National Annenberg Election Survey asked a set of factual questions about some prominent policy stands taken by Obama and McCain during the campaign. On these questions, large percentages of respondents could not correctly identify which candidate had taken which stand.

If so many voters did not know about the candidates’ stands on these hotly debated issues, then there is little doubt that most were also unaware of the detailed policy platforms the candidates were running on.

Democratic theory presumes that in a democracy people are well informed enough to guide the policies that their government pursues. Yet much political science research in the U.S. has uncovered shockingly low levels of public information about politics. Do you think the American public is well informed enough to guide the policies of the U.S. government?

Point to Ponder

Pollsters sometimes ask people about policy issues with which they are largely unfamiliar.

What do you think—should one take the findings from such polls with a big grain of salt?

No amount of Jeffersonian faith in the wisdom of the common people can erase the fact that Americans are not well informed about politics. Polls have regularly found that less than half the public can name their representative in the House. Asking people to explain their opinion on whether trade policy toward China should be liberalized, or whether research on the proposed “Star Wars” missile defense system should be continued, or whether the strategic oil reserve should be tapped when gasoline prices skyrocket often elicits blank looks. When trouble flares in a far-off country, polls regularly find that people have no idea where that country is. In fact, surveys show that many Americans lack a basic awareness of the world around them; you can see one such example in Figure 6.3.

Figure 6.3 Many Americans Show Little Knowledge of World Geography

In 2002, a major study sponsored by National Geographic interviewed a representative sample of 18- to 24-year-old Americans to assess their knowledge of world geography. The average respondent got 46 percent of the questions right. Believe it or not, 11 percent of young Americans could not even find their own country on the map. Despite the American military campaign in Afghanistan after September 11, only 17 percent could correctly place that country on the map.

SOURCE: Based on the test administered in National Geographic’s cross-national survey.

As Lance Bennett points out, these findings provide “a source of almost bitter humor in light of what the polls tell us about public information on other subjects.”28 For example, slogans from TV commercials are better recognized than famous political figures. And in a Zogby national poll in 2006, 74 percent of respondents were able to name each of the “Three Stooges”—Larry, Curly, and Moe—whereas just 42 percent could name each of the three branches of the U.S. government—judicial, executive, and legislative.

How can Americans, who live in the most information-rich society in the world, be so ill informed about politics? Some blame the schools. E. D. Hirsch, Jr., criticizes schools for a failure to teach “cultural literacy.”29 People, he says, often lack the basic contextual knowledge—for example, where Afghanistan is, or what the provisions of the Affordable Care Act are—necessary to understand and use the information they receive from the news media or from listening to political candidates. Nevertheless, it has been found that increased levels of education over the past five decades have scarcely raised public knowledge about politics.30 Despite the apparent glut of information provided by the media, Americans do not remember much about what they are exposed to through the media. (Of course, there are many critics who say that the media fail to provide much meaningful information.)

The “paradox of mass politics,” says Russell Neuman, is that the American political system works as well as it does given the discomforting lack of public knowledge about politics.31 Scholars have suggested numerous ways that this paradox can be resolved. Although many people may not know the ins and outs of most policy questions, some will base their political behavior on knowledge of just one issue that they really care about, such as abortion or environmental protection. Others will rely on simple information regarding which groups (Democrats, big business, environmentalists, Christian fundamentalists, etc.) are for and against a proposal, siding with the group or groups they trust the most.32 And finally, some people will simply vote for or against incumbent officeholders based on how satisfied they are with the job the government is doing.

Why It Matters to You Political Knowledge of the Electorate

The average American clearly has less political information than most analysts consider to be desirable. While our democracy has managed to function with this level of information, survey data plainly show that citizens with above-average levels of political knowledge are more likely to vote and to have stable and consistent opinions on policy issues. If political knowledge were to increase overall, it would in all likelihood be good for American democracy.

How Much Do People Know about Politics?

Let’s explore how much people know about politics and consider what sorts of people are the best informed. In the American National Election Study of 2012, twelve factual questions were asked of a representative sample of the American public.


In the 2012 ANES study, the average respondent got 5.8 of the 12 questions correct, or about 48 percent. Differences in education, family income, age, gender, and race are all related to how much political knowledge people have.

The Decline of Trust in Government

Sadly, the American public has become increasingly dissatisfied with government in recent decades, as shown in Figure 6.4. In the late 1950s and early 1960s, nearly three-quarters of Americans said that they trusted the government in Washington to do the right thing always or mostly. By the late 1960s, however, researchers started to see a precipitous drop in public trust in government. First Vietnam and then Watergate shook people’s confidence in the federal government. The economic troubles of the Carter years and the Iran hostage crisis helped continue the slide; by 1980, only one-quarter of the public thought the government could be trusted most of the time or always. Since then, trust in government has occasionally risen for a while, but the only time a majority said they could trust the government most of the time was in 2002, after the events of September 11.

Figure 6.4 The Decline of Trust in Government, 1958–2012

This graph shows how people have responded over time to the following question: How much of the time do you think you can trust the government in Washington to do what is right—just about always, most of the time, or only some of the time? 

When this question was written in 1958, survey researchers could not imagine that anyone would respond “never,” so the traditional wording of the trust in government question omits this option. In 2012, about 5 percent of respondents volunteered that they never trusted the government. Some pollsters have experimented with including the option of “never” and have found that as much as 10 percent of their sample will choose it.

SOURCES: Authors’ analysis of 1958–2012 American National Election Study data. As there were no election studies for 2006 and 2010 we have used the following sources for those years: December 2006 Pew Research Center poll; February 5–10, 2010 New York Times/CBS News Poll.

Some analysts have noted that a healthy dose of public cynicism helps to keep politicians on their toes. Others, however, note that a democracy is based on the consent of the governed and that a lack of public trust in the government is a reflection of their belief that the system is not serving them well. These more pessimistic analysts have frequently wondered whether such a cynical population would unite behind their government in a national emergency. Although the drop in political cynicism after September 11 was not too great, the fact that it occurred at all indicates that cynicism will not stop Americans from rallying behind their government in times of national crisis. Widespread political cynicism about government apparently applies only to “normal” times; it has not eroded Americans’ fundamental faith in our democracy.

Perhaps the greatest impact of declining trust in government since the 1960s has been to drain public support for policies that address the problems of poverty and racial inequality. Mark Hetherington argues, “People need to trust the government when they pay the costs but do not receive the benefits, which is exactly what antipoverty and race-targeted programs require of most Americans. When government programs require people to make sacrifices, they need to trust that the result will be a better future for everyone.”33 Hetherington’s careful data analysis shows that declining trust in government has caused many Americans to believe that “big government” solutions to social problems are wasteful and impractical, thereby draining public support from them. Indeed, during the debate over health care reform, President Obama’s advisers argued that the primary obstacle they faced was not persuading the public of the need for health care reform but, rather, convincing them to put sufficient trust in the government’s ability to carry out the reform.34 Obama acknowledged the problem in his 2010 State of the Union address, saying, “We have to recognize that we face more than a deficit of dollars right now. We face a deficit of trust—deep and corrosive doubts about how Washington works that have been growing for years.” In the 2012 election, Republicans tried to exploit such doubts about the trustworthiness of the federal government, arguing that their values favoring free enterprise solutions over governmental programs were more in tune with Americans’ basic values.