17 best practices for employee listening surveys

Thousands of employers around the world understand that they can’t meet business goals unless employees understand those goals and are motivated to carry them out. The best way to align your team with your goals is to create an employee listening program; that is, survey your workforce with the right number and kinds of questions, so you get the data you need to strengthen your organization’s culture and close any gaps in its ability to deliver business performance.

Why surveys can flop

Employee survey best practices

Employee surveys have been around since the 1920s, and some estimates say that around 50-75% of organizations use them.

Whether they are getting the most out of those surveys is another question. According to one recent online poll, just 22% of HR leaders say they are getting good results from their engagement surveys. Poor survey design (including surveys that are too long), lack of leadership support, and lack of follow-up action can all be reasons why your survey fails to move the needle.

In this guide, CultureIQ Principal Strategist Paul M. Mastrangelo will lead you through best practices for creating an employee listening survey that answers the questions you want, so you can take the actions you need to improve your culture and boost business success.

A survey is not just a survey

Overall, the tricky part about creating an employee survey is in the details. You think a survey is just a survey, until you see the difference between what an expert creates and what a novice creates.

Ask questions carefully:

Whether you are writing a survey question or selecting one from another source, here’s what you need to do:

1. Carefully consider what conclusion you would make or what action you would take if most responses were very high, very low, or in the middle. In other words, think about what insight that question will give you at the end of the process, when you need to pull information that will be used to improve things.

2. Don’t word a question in such a way that, even if everyone has the same reaction, you still don’t know what it means. When everyone strongly agrees with the statement “My performance review is accurate, specific, and helpful,” life is great. However, if everyone strongly disagrees, you really don’t know how to make improvements without more information. Maybe people feel their managers don’t know their true performance, maybe people aren’t sure which specific behaviors are important, or maybe they don’t feel their effort will improve performance.

3. That’s why it’s often better to ask shorter questions that get at one aspect instead of asking about more than one aspect. A rule of thumb is to avoid the word “and” in the survey question. Remember, a short simple question is quicker for employees to answer and easier for managers to act upon.

Getting response rates right:

4. The biggest misunderstanding in the survey field is the notion that having a certain percentage of responses makes the survey scores “valid.” This is not true. The mathematics are hard to explain, but what really matters is having a sample of at least 385 respondents from the population they represent. Basically, getting 385 people to take a survey makes it nearly impossible to accidentally capture only the disgruntled or only the optimists. A sample of 385 represents a population of 10 billion so accurately that, even if you could survey all 10 billion, the score would be within 5 percentage points of the score you get from the 385. It’s the magic number for surveying.

5. But here’s the catch: every group that needs its own report is its own population requiring a sample of 385. If you want to know scores from men AND women, there should be at least 385 men and 385 women. If you need to compare sales representatives, call center staff, and installation crews, then each group should be represented by 385 people. Wait, you don’t have that many sales representatives? Ah, that’s when having a high response rate becomes important. If you have 100 sales reps, then a score that accurately represents all of them must be based on 80 participants, which is a response rate of 80%. If you have just 50 sales reps, then you need a 90% response rate to get that same level of accuracy.

6. When headcounts get smaller, you need to have a response rate that is even higher. So, what is the right response rate? It depends on how low you want to go. Shoot for 80% overall, but if you need to get really accurate scores from the 5 IT people in each of the 8 locations, then you need ALL OF THEM to respond. (If they don’t all respond, the scores are less likely to match what they would have been if everyone participated. So when that situation pops up, don’t withhold a bonus because a score was 14 points below the goal – that score is not accurate enough to use that way.)
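The numbers above all fall out of the standard sample-size formula for a proportion (at 95% confidence and a ±5-point margin of error), with a finite population correction for small groups. Here is a quick sketch of that math; the function name and headcounts are illustrative, not part of any survey tool:

```python
import math

def required_sample(population, margin=0.05, z=1.96, p=0.5):
    """Responses needed to estimate a proportion within `margin`
    at 95% confidence, with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # ~384.16 -> the "385" rule of thumb
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

for headcount in (1_000_000, 1000, 100, 50):
    n = required_sample(headcount)
    print(f"headcount {headcount:>9}: need {n} responses "
          f"({n / headcount:.0%} response rate)")
```

For a headcount of 100 this yields 80 responses (80%), and for 50 it yields 45 (90%), matching the figures in items 5 and 6; for very large populations the requirement flattens out at 385 regardless of size.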

7. Bear in mind that employees can be really poor evaluators of themselves, but far better reporters of what they see around them. Responses to “I am awesome” are far more biased than responses to “My work group is awesome.” Compare “I respect other people’s differences” with “In my work group, other people’s differences are respected.” It’s a subtle but important difference.

Take accurate pulses:  

8. Some people use the term “pulse” to refer to a survey that only goes to a portion of the whole company. If you want to ask a small number of employees to take a survey, then you should not promise that all managers will get a report of results and you should not promise to have a breakdown of demographic cuts. You only get the luxury of those features when you have more participants.

9. If you are inviting a small percentage of the company, then you can only make inferences about the overall company attitude. Here is another situation where a low response rate can mean trouble. Imagine you invite 25% of a full headcount of 1,000, and your response rate is 50%. That means just 125 people took the survey, well below the magic survey number of 385 – the ideal sample size. Your scores in this case will not be very accurate at all. Either push for a higher response rate or invite more people.
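To see how far off those 125 responses can be, you can run the margin-of-error calculation in reverse. This is an illustrative sketch using the same 95%-confidence formula as above; the function name is hypothetical:

```python
import math

def margin_of_error(n, population, z=1.96, p=0.5):
    """Half-width of a 95% confidence interval for a proportion,
    with finite population correction."""
    fpc = math.sqrt((population - n) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# 25% of 1,000 invited, 50% respond -> only 125 completed surveys
print(f"{margin_of_error(125, 1000):.1%}")   # roughly +/-8.2 points
print(f"{margin_of_error(385, 1000):.1%}")   # roughly +/-3.9 points
```

At 125 respondents the scores can swing by about 8 points in either direction purely by chance, which is why pushing the response rate (or invitation list) up matters.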

10. Some people use the term “pulse” to refer to a small set of survey questions, often asked repeatedly over time. The common practice is to pull a small set of questions from the larger set asked once a year or so, on the theory that the trend data will track changes. Well, okay, maybe, but most companies have trouble creating those changes, so the pulse scores show no change. That’s a missed opportunity. Why not treat each pulse survey as a different part of a conversation about the changes needed? Pulse 1 can ask whether managers talked with their teams about how they can support the innovation goal the company set. If scores are low, then it’s obvious what needs to happen to push this initiative. If scores are high, then what’s the next step? Pulse 2 can ask managers whether a specific performance target was created for the team to support the innovation goal. Again, low scores spark manager behavior to get to the next phase. Pulse 3 can ask whether managers are actively helping teams change their behaviors to reach their team goal. The pulse can be about the action, not just a measure of the wished-for outcome.

11. Don’t fall into the same-question survey rut. If you pulse frequently and notice scores go up for a while, then fall back to the baseline, it’s time to reframe your questions to get at the cause of the change. Pulsing on the same questions won’t fix anything. For instance, asking “I am encouraged to try new things” over and over is a repeated question about what the company hopes to achieve. Are we there yet? Are we there yet? Are we there yet? Why not ask where we are and how much further there is to go? With different survey content, the pulse could ask what managers did that resulted in a change. That type of survey is more directly actionable and could be dynamic (e.g., if scores went down, you ask what went wrong). A static survey risks banging people over the head with the same high-level questions about the immediate work experience, with nothing specific to the change process and nothing ever changing about the “conversation” with employees.

Predictive analytics need specifics:

12. Survey scores are most predictive of performance outcomes when the survey questions are specific to that outcome. If you want the employee survey to predict same store sales, then look at employee questions that specifically deal with the customer experience. Yes, it is true that when employees are more likely to recommend their company to friends, the company is more likely to be successful with customers. However, that correlation will be small and not very helpful to the company whose employees are NOT likely to make the recommendations.

13. Consider how much better it would be to examine a question like “When a customer has a problem, I can usually fix it on my own.” That is a very specific question about customer service, and it is far more related to the company’s ability to retain a satisfied customer. And if a company has low scores on this question, you know exactly how to improve things.

Don’t over-survey your workforce

14. Companies want fresh data, but they don’t want to over-survey their workforce. A good compromise is to think in terms of the four quarters of the year and run one survey per quarter.

15. One of those surveys should invite everyone: a census. The other three should each invite one third of the employees: a sample survey. Use the census to give as many managers as possible their own data and to produce as many demographic cuts as needed to understand the whole picture. Then use the sample surveys to ask about the topics that need the most attention.

16. Don’t just repeat the same questions in each sample survey; instead, ask some new questions that measure what changes have been made to improve that big attention-grabber from the census. Employees will only be asked to take two surveys per year, but managers will be pushed to do something about a frustration point – not just ask the same question over and over.

17. Consider adding an opt-in panel survey, where employees volunteer to be surveyed on a regular basis. If your organization wants focus-group-style feedback on issues raised in your surveys, a panel treats surveying as more of a conversation.

The survey stakes are high, so get a good partner

The way in which you conduct a survey is the critical first step in understanding what you need to do to use your culture to drive success – whether you are aiming for more engagement, retention, collaboration, innovation or ROI. Getting the survey right clears your path forward.

A large part of CultureIQ’s mission is to help our clients achieve that clear path to results. As part of our core Architect, Assess, Activate strategy, we conduct a quick overview survey to help clients define their ideal culture, then a deeper-dive survey to fully Assess where their culture is now and what should change. Data and analysis from both surveys lead to a plan our clients can Activate to make effective, sustainable changes that move the success needle forward.

CultureIQ’s Jon Izenstark contributed to this article.

MORE RESOURCES:

Whitepaper: How to create effective pulse surveys – download it here.

Demo: See how CultureIQ turns employee listening & insights into competitive advantage.