Your first employee communication survey is done and you’re staring at the data you’ve collected. Lots of data. Mountains of it!
This is when we usually get the call: “Can you analyze the results of a survey I just completed and help me decide what to do?”
Why do communicators get stuck at this point? After all, they’ve done the heavy lifting: getting approval to field the survey, drafting questions and revising them based on feedback, programming the survey and distributing it.
There’s no doubt that analyzing quantitative data is tough and takes time. But I’ve noticed the task can be more challenging because of problems with survey design.
Here are five things you can do differently to make analysis easier:
1. Improve your questions
One of the keys to good data is questions that are specific. (Questions are trickier to write than you may think.) After I draft a question, I ask myself, “Will I be able to take action when I get the result?”
Here’s a sample question: “The town hall was effective.” Even if the majority of employees tell you they strongly disagree with this statement, you won’t know what to change. But if you get a majority who disagree with, “I understand what I need to do after the town hall,” then you know the call to action wasn’t clear.
2. Ask about issues you can or want to change
This tip is all about real estate. The more questions in your survey, the more data you’ll have to analyze and prioritize.
Stay focused on questions that will help you reach your objectives. Delete those nice-to-know questions. For example, if you don’t have budget to change the intranet, don’t ask employees what they would change.
3. Use consistent questions
If you change the questions every time you run the survey, you lose the opportunity to understand if the changes you’re making are helping or hurting. And you’ll spend more time analyzing.
Of course, it’s always good to add a few new questions, so you can address late-breaking initiatives or updates to your communication program.
4. Use a consistent scale
It’s challenging to identify trends and gaps when your scale keeps changing. Be consistent with labels (such as “strongly disagree” to “strongly agree”) and the number of points. I like 4- or 5-point scales. Anything above that is noise.
5. Limit open-ended questions
Responses to open-ended questions take longer to analyze, so limit yourself to one or two. If you feel the urge to include more than two open-ended questions (a clue that you’re more interested in the why than in quantitative data), consider running a focus group instead.