Research answers questions about questions and answers

In Don Dillman’s world, there is no such thing as a simple question. Context matters. Context always matters.

How else do you explain why 70.2 percent of WSU students in one Web survey said they typically study 2.5 hours or less per day, while in answer to exactly the same question, only 28.9 percent of students in a separate Web survey conducted simultaneously said they study that much or less? A third Web survey of students, answering the identical question, revealed that 42.2 percent of respondents studied 2.5 hours or less.

The answer, says Dillman, an expert in survey methodology, is in the answer. Or rather, the choices respondents were given when deciding their answer. In the first survey, students were given six possible responses in a low range, from half an hour or less to more than 2.5 hours. In the second survey, respondents were also given six choices, but in a high range, starting at 2.5 hours or less and going up to 4.5 hours or more. In the third survey respondents were not given choices, but simply supplied their own answer.
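The mechanism the researchers describe can be sketched in a few lines: a respondent who believes their study time is "about average" tends to pick the middle of whatever scale is shown, so the same self-perception yields different reported hours on the two scales. The category values below are assumptions for illustration (half-hour steps), not the exact wording of the study instruments.

```python
# Hypothetical reconstructions of the two closed-ended scales (assumed
# half-hour steps; the actual survey categories may have differed).
# Each value is the upper bound, in hours, of a response category.
LOW_RANGE = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # sixth category = "more than 2.5"
HIGH_RANGE = [2.5, 3.0, 3.5, 4.0, 4.5, 5.0]  # first category = "2.5 or less"

def middle_category(scale):
    """Category a 'I'm about average' respondent gravitates toward:
    the middle of whatever scale they happen to be shown."""
    return scale[len(scale) // 2]

# The same self-perception maps to very different reported study time:
low_pick = middle_category(LOW_RANGE)    # 2.0 hours
high_pick = middle_category(HIGH_RANGE)  # 4.0 hours
```

In this sketch the low-range respondent reports 2.0 hours and the high-range respondent 4.0 hours, even though both simply chose "the middle" — which is the context effect the paper argues for.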

Range as an influence
In a 2005 paper titled “Context Effects in Web Surveys: New Issues and Evidence,” Dillman and his co-authors, graduate students Jolene Smyth and Leah Melani Christian, argue that the range of possible answers provided a context for students to determine their own answers. For instance, students who believed their study habits were about average would choose an answer in the mid-range, while students who believed they studied above or below average would choose answers at the high or low end of the scale, regardless of what the scale was.

“This study is about a fairly harmless type of activity,” said Smyth, but understanding how context affects the survey process is crucial to obtaining useful information. “It speaks to the science of surveying and the need to be more rigorous about it,” she said. “Are we getting the information we need?”

Open-ended questions
While asking open-ended questions would seem to elicit the least biased information, Smyth said that isn’t always practical.

Open-ended questions “are often incredibly difficult to code and summarize,” she said. If public opinion polls had relied on open-ended questions during the 2004 election season, she said, “The election would have been long gone before we ever figured out what was going on.”

Dillman, a Regents’ Professor and the Thomas S. Foley Distinguished Professor of Government and Public Policy, is one of the country’s leading authorities on the development of mail, telephone and Internet survey methods. During his 36-year career at WSU, Dillman has authored hundreds of books and publications, including what some consider the bible for conducting public opinion surveys: Mail and Telephone Surveys: The Total Design Method (1978).

But Dillman doesn’t consider his book immutable. There is always something else to consider.
