I’ll be moderating conversations about VoC at next week’s Operations Summit and it got me thinking… It’s practically a given that every company will issue a customer satisfaction survey as part of their VoC program. But it’s NOT a given that every survey will improve customer satisfaction.
Think about your own satisfaction survey for a moment. Are you collecting accurate data? Is the data actionable? Are you able to identify clear gaps and opportunities?
Customer listening programs often suffer from a host of flaws and biases. In fact, in our recent study of point-of-purchase surveys we found that the largest US retailers pack their surveys with tired, biased, and often irrelevant questions.
And when clients bring us their surveys, these are the flaws we see most often:
- Surveys so long they alienate customers.
- Surveys that force customers to choose from irrelevant multiple-choice options.
- Surveys whose customer comments never get properly analyzed.
Good surveys produce good data, and good data reflects the experiences your customers actually have with your company. Good data also shows where you need to improve.
This 6-step process will improve your VoC program by producing a customer satisfaction survey that gets to the heart of customers’ expectations, their perceptions, and how they feel about their experiences with you.
1. Evaluate your current survey(s) and map your unknowns.
2. Tailor your language. Think about your industry and customers. How would your customers describe their experiences with you? Ask your team the same question, and mirror that language in your survey.
3. Develop branching logic. Consider your customers. Have you done a persona study? Does each persona interact with different touchpoints? For example, don’t force an end user to click mindlessly through questions specific to distributors; it will only produce junk data.
4. Draft your questions. Iteratively. If you think a survey can be built in a day, you’re wrong. You’re asking customers to spend their valuable time taking your survey, so you’ll need to spend your valuable time building it. Questions should go through a detailed development and rigorous review process. Return to step 1 and vet your newly drafted questions against the list of common problems. Then edit, and edit again. In fact, we recommend getting internal AND external feedback on your survey questions—before you edit one last time.
5. Code and analyze the data. Once your survey responses are in, it’s time to find the signal in all that noise. Hopefully you have a large, statistically significant set of respondents, so your findings are predictive. As part of your survey analysis, it is critical to code the open-ended comments. And by code we don’t mean simply read them or make a word cloud. You need to scientifically parse and categorize the comments, because this is how you bring that data to life in meaningful, actionable ways.
6. Present your findings—graphically. To get your team on board with your VoC results, curate your metrics down to a simple few and incorporate infographics. Use a dashboard to get everyone involved with the data and next-step actions. Some great customer experience metrics that we advocate for include: Quality of Customer Interaction™, Customer Effort, Competitive Edge, and Persuasion Scores.
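To make step 3 concrete, here is a minimal sketch of persona-based branching in Python. The personas, question text, and `build_survey` helper are all hypothetical; in practice this logic lives in your survey platform’s skip-logic or display-logic settings, but the principle is the same: each respondent sees only the questions relevant to them.

```python
# A minimal sketch of persona-based branching logic.
# Personas and question banks below are illustrative assumptions only.
QUESTION_BANKS = {
    "end_user": [
        "How easy was the product to set up?",
        "How often do you use the product each week?",
    ],
    "distributor": [
        "How would you rate our order fulfillment speed?",
        "How clear are our wholesale pricing terms?",
    ],
    # Questions every persona should answer.
    "shared": [
        "Overall, how satisfied are you with your experience?",
    ],
}

def build_survey(persona: str) -> list[str]:
    """Return only the questions relevant to the respondent's persona."""
    if persona not in QUESTION_BANKS:
        # Unknown personas get just the shared questions rather than
        # irrelevant ones that would produce junk data.
        return list(QUESTION_BANKS["shared"])
    return QUESTION_BANKS[persona] + QUESTION_BANKS["shared"]
```

An end user never sees distributor questions, so every answer they give is meaningful.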
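Step 5’s comment coding can also be sketched in miniature. This Python example uses a hypothetical keyword codebook (`CODEBOOK`) to assign category codes to open-ended comments and tally the biggest themes; a production coding process is far more rigorous, with trained coders or a validated model and inter-rater reliability checks, but the output shape is the same: counts per category, not a word cloud.

```python
# A minimal sketch of coding open-ended survey comments into categories.
# The codebook below is an illustrative assumption, not a real taxonomy.
from collections import Counter

CODEBOOK = {
    "shipping": ["shipping", "delivery", "arrived", "late"],
    "pricing": ["price", "expensive", "cost", "cheap"],
    "support": ["support", "help", "agent", "service"],
}

def code_comment(comment: str) -> list[str]:
    """Assign every matching category code (a comment can carry several)."""
    text = comment.lower()
    codes = [code for code, keywords in CODEBOOK.items()
             if any(kw in text for kw in keywords)]
    # Comments matching nothing are flagged for manual review.
    return codes or ["uncoded"]

def tally(comments: list[str]) -> Counter:
    """Aggregate codes across all comments to surface the biggest themes."""
    counts = Counter()
    for comment in comments:
        counts.update(code_comment(comment))
    return counts
```

The resulting counts tell you where to act first: if "shipping" dominates the tally, that is your gap.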
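On step 5’s point about a statistically significant set of respondents, a quick back-of-the-envelope check uses the standard sample-size formula for proportions, n = z²·p·(1−p)/e². The defaults below (95% confidence, ±5% margin of error, worst-case p = 0.5) are illustrative assumptions, not figures from this article.

```python
# A minimal sketch of a sample-size check for survey proportions,
# using the standard formula n = z^2 * p * (1 - p) / e^2.
import math

def required_sample_size(margin_of_error: float = 0.05,
                         confidence_z: float = 1.96,
                         proportion: float = 0.5) -> int:
    """Respondents needed so a reported proportion falls within the
    margin of error at the given confidence level."""
    n = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n)
```

At the defaults this works out to 385 completed responses; tighten the margin to ±3% and the requirement roughly triples.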
Not all surveys are created equal. In fact, many customer satisfaction surveys disengage customers and produce inaccurate data. But our 6-step process gives you the framework you need for a stellar survey—one that collects accurate customer feedback, motivates teams to improve in specific ways, and shows customers their voices are heard.