Developing Useful Questions

Objective

After completing this lesson, you will be able to adhere to quality standards when developing survey questions.

Differentiating question types

When using surveys to gather data on change management, the thorough development of questions deserves specific attention. Before you start, it is important to be aware of the different question types and their respective pros and cons.

Closed questions

Closed questions allow the respondent to select an answer among pre-defined options. The most common choice is between "yes" and "no" (e.g., "Do you work with SAP S/4HANA in your current job role?"). In addition, closed questions can be formulated as multiple-choice, for example by providing a checkbox list or a drop-down menu to select relevant options. Questions then often start with interrogatives such as "what", "which", "how much", or "to which extent".

Respondents don’t have the option to provide open feedback or explain their choices. By asking closed questions, the change manager gets structured data that is easy to sort, code, and quantify. However, this simplicity means losing out on some of the deeper insights that respondents could have provided.

Therefore, closed questions are especially suitable if:

  • your audience is only able or willing to invest little time in the survey,
  • you want to collect feedback on pre-defined answers and not require additional input,
  • you need quantifiable data for subsequent statistical analysis.
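
As an illustration of the last point, closed-question responses can be tallied and quantified with a few lines of code. The following is a minimal sketch; the question and responses are invented for illustration and not tied to any specific survey tool.

```python
from collections import Counter

# Hypothetical responses to the closed question
# "Do you work with SAP S/4HANA in your current job role?"
responses = ["yes", "no", "yes", "yes", "no", "yes", "no", "yes"]

counts = Counter(responses)                 # tally each answer option
share_yes = counts["yes"] / len(responses)  # fraction answering "yes"

print(counts)     # Counter({'yes': 5, 'no': 3})
print(share_yes)  # 0.625
```

Because every answer falls into a pre-defined category, aggregation like this is trivial, which is exactly what makes closed questions suitable for statistical analysis.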

Hint

Rating questions are the most common type of closed question. They ask respondents to evaluate a certain object or topic using a predefined scale along a dimension (e.g., likelihood, satisfaction, importance, agreement).

Another powerful and under-utilized type is the ranking question. Ranking questions ask respondents to order a list of items based on their preference, importance, frequency, or satisfaction. By forcing the respondent to prioritize the provided response options, ranking questions enable a direct comparison of different aspects. However, they impose a higher respondent burden, as respondents need to invest more time and cognitive effort in answering.
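
To see how ranking answers can be turned into a direct comparison, the sketch below aggregates them with a Borda count, a common scoring method in which the top-ranked item earns the most points. The communication channels and rankings are hypothetical examples, not data from the lesson.

```python
# Hypothetical rankings of three communication channels by four respondents,
# each ordered from most to least preferred.
rankings = [
    ["email", "intranet", "town hall"],
    ["intranet", "email", "town hall"],
    ["email", "town hall", "intranet"],
    ["email", "intranet", "town hall"],
]

scores = {}
for ranking in rankings:
    n = len(ranking)
    for position, item in enumerate(ranking):
        # Borda count: 1st place earns n points, 2nd place n-1, and so on.
        scores[item] = scores.get(item, 0) + (n - position)

# Sort items by total score, highest first
overall = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(overall)  # [('email', 11), ('intranet', 8), ('town hall', 5)]
```

The forced prioritization means the aggregate immediately shows a clear ordering, which a set of independent rating questions would not guarantee.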

Open questions

Open questions might begin with "how", "why", "describe…", or other conversational phrases. They encourage the respondent to provide specific insights, ideas, or concerns, for example "How can we improve the project communication?". The answers are usually given in an open text field in the respondent’s own words. Therefore, this type of question produces unstructured data, also called "natural language data".

The data captured with open questions is rich in information. However, due to its complexity and volume, manual data analysis is very time-consuming. Therefore, AI-supported analysis tools, such as natural language processing and sentiment analysis, are often applied to extract the full value from the respondents’ answers.
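
A full NLP pipeline is beyond the scope of this lesson, but the following toy sketch shows the basic idea behind keyword-based sentiment tallying. The keyword lists and answers are illustrative only; real projects would rely on a dedicated NLP library rather than hand-picked word lists.

```python
# Illustrative keyword lists; real sentiment analysis uses trained models.
POSITIVE = {"helpful", "clear", "useful", "great"}
NEGATIVE = {"confusing", "slow", "unclear", "missing"}

def sentiment(answer: str) -> str:
    # Normalize the free text and compare it against the keyword sets.
    words = set(answer.lower().replace(",", " ").replace(".", " ").split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

answers = [
    "The weekly updates are clear and helpful.",
    "The intranet page is confusing and always slow.",
    "No opinion so far.",
]
print([sentiment(a) for a in answers])  # ['positive', 'negative', 'neutral']
```

Even this crude approach shows how unstructured answers can be mapped back onto a structure that the change manager can quantify and report on.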

Overall, the use of open questions is indicated if:

  • you assume the respondents can provide new, unexpected insights and are motivated to provide the desired answers,
  • you want to dive deeper into a topic and understand the "why" behind the facts,
  • you wish to provide an option to add new categories to a pre-defined multiple-choice list (e.g., as "Other – please specify").

Open questions should be used very carefully and formulated as specifically as possible to keep the respondent burden low and limit the analysis effort.

The advantages and disadvantages of closed and open questions are summarized in the following chart.

Comparison chart of closed vs. open survey questions, listing pros and cons. Closed questions: advantages: fast to answer, generate quantitative data, support general conclusions; disadvantages: require predefined answer options, provide limited background information, prone to response biases. Open questions: advantages: provide rich qualitative insights, explain the ‘why’ behind results, support identifying implications and follow-up actions; disadvantages: higher respondent burden and dropouts, may receive superficial answers due to anonymity concerns, require more effort to analyze.

Reflecting on the phrasing of your questions

When developing survey questions, thinking about the formulation is time well invested.

Common mistakes

The following list provides an overview of formulations that can irritate the survey participants and cause incorrect or missing responses:

  • Leading questions: Questions like "Don't you think the new system is better than the current one?" already suggest a correct answer, which potentially biases the respondents and skews the survey results.
  • Double-barreled questions: Asking for feedback on two or more aspects in one question, especially in closed questions, makes it difficult for the respondent to select an answer and for the change manager to interpret the resulting data. For example, if you ask, "Do you find the new process useful and intuitive?", it is impossible to say if the respondent rates the aspects "useful" and "intuitive" equally or calculates the average of two distinct ratings.
  • Negative or judgmental questions: Avoid questions that imply blame or criticism, for example "Why haven't you adapted to the new system yet?". Most probably, the respondent will feel uncomfortable or even offended, which can cause incorrect or lacking answers.
  • Hypothetical questions: Surveys aim to gather insights that allow the immediate derivation of concrete activities. Hypothetical questions like "If process training were available, would you probably use it?" require participants to speculate on situations that are currently not present, thus diverting the focus from real issues that need addressing.
  • Overly personal questions: Invading privacy with questions like "How has the implementation of the new system affected your individual work-life balance?" creates discomfort and undermines trust in the entire survey. Ensure that all questions are relevant to the organizational context and don’t overstep personal boundaries.

For an overview of common mistakes when developing questions, take a look at the chart below.

Table summarizing common mistakes in survey question design, with examples and resulting risks: leading questions (bias, skewed results), double-barreled questions (hard to answer, difficult to interpret), negative/judgmental questions (uncomfortable or offensive, inaccurate answers), hypothetical questions (respondents speculate, distract from real issues), and overly personal questions (discomfort, reduced trust in the survey).

Professional wording

Considering the following advice helps you to further enhance the quality of your questionnaire:

  • Concise sentences: Make it short and avoid long, complex sentences, as this might unnecessarily confuse the respondents. Instead, use straightforward language that is easy to understand.
  • Positive phrasing: Negations are difficult to understand, and double negations in particular (e.g., "I would not suggest to not offer training for the new cloud solution") unnecessarily increase the respondent burden and can lead to misunderstandings and wrong answers.
  • Familiar terms: Try to step into the shoes of your target group when developing questions. A survey on the same topic will most probably look different if it targets IT experts or blue-collar workers. Avoid jargon and technical terms, unless you are certain that all respondents are familiar with them.
  • Consistent terminology: Ensure that you maintain consistent terminology throughout the survey. Don’t use different terms for the same topic or switch to abbreviations without explaining them, as this might irritate the respondents.
  • Neutral tone: Neutral wording helps prevent respondents from giving biased answers. For instance, instead of using the word "challenges", you might use "aspects to consider" to keep the tone balanced. Keep in mind that a question implying judgement not only impacts the respective answer, but might also influence the overall perception of the survey and the mindset of the respondent.
  • Action-oriented framing: Frame questions positively, for example ask for improvements and ideas rather than problems and pain points. This encourages actionable feedback and helps you to derive concrete activities from the collected data.
  • Specific frequency and time horizon: Generic words like "often", "rarely" or "lately" can be interpreted very differently. Therefore, it is advisable to use concrete indications, for example "at least four times a week", "twice per day" or "during the past three months".

Gaining expertise in survey scales

As Paul wants to finalize his change management survey on measuring the attitude towards the new cloud solution, he is curious to learn more about survey scales. Play the video and watch the discussion between Paul and Mira.

Have a look at the chart below to find some examples for common scales and their respective scale points and labels:

Table with examples of common survey rating scales and their labels. It compares 5-point and 7-point scales across four types: agreement (from strongly disagree to strongly agree), frequency (from never to always), satisfaction (from very dissatisfied to very satisfied, with additional ‘somewhat’ options in the 7-point scale), and intensity (from not at all to extremely/completely).
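
To show how such scale labels translate into numbers for analysis, here is a minimal sketch that codes a common 5-point agreement scale numerically and computes a mean score. The responses are invented for illustration.

```python
# Numeric coding for a 5-point agreement scale
# (labels follow the common "strongly disagree ... strongly agree" pattern).
SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "neither agree nor disagree": 3,
    "agree": 4,
    "strongly agree": 5,
}

responses = ["agree", "strongly agree", "neither agree nor disagree", "agree"]
scores = [SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)
print(mean_score)  # 4.0
```

A mean of 4.0 on this scale would indicate that, on average, respondents agree with the statement being rated.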