
Survey Considerations

Updated: Apr 3, 2021

Surveys are hot right now. They're the single easiest way to reach large swaths of customers and stakeholders alike, and when it comes to gathering feedback, they require the least effort.


Or do they?

 

As I mentioned in my previous post, surveys can be easily constructed, deployed, and analyzed. But there are considerations that need to be made prior to deployment.

 

1). Explicit Questioning & Actionable Intelligence

 

The much-favored "How much do you agree with this statement?" convention is everywhere right now, accompanied by "Strongly agree", "Agree", and their inverses. While this is an acceptable format, it is far from the best. The problem is two-fold: the question isn't great, and as a result, neither are the available answers.

 

Look at it this way: what is the difference between "Agree" and "Strongly agree"? How do you quantify it? One recent example came from a retail establishment. The question was, "How much do you agree with the following statement: The associate was able to answer my question(s)." In my case, I agreed with the statement only because I had not interacted with an associate at all – I knew exactly what I needed at the store, found it, bought it, and left. So I marked "Strongly agree". But therein lies the problem: unless an "N/A" choice is provided, I am left to decide how much I agree with a statement that doesn't apply to my situation. This is one reason I believe organizations should move away from this line of questioning.

 

Now, suppose I did interact with an associate during my visit, and I only marked "Agree". Does that mean the associate answered my question(s) for the most part, but I was still somehow a little dissatisfied with something? There shouldn't be this type – or any type – of ambiguity in the responses your survey allows for. A good litmus test: if you find yourself contemplating the difference between "Strongly agree" and "Agree", there's too much ambiguity.

 

Possible alternatives to their question: "Did you leave the store completely satisfied with your experience?" Or maybe the store really wants to know whether customers are being offered help while they're in the store. In that case, ask, "Did an associate approach you and offer help at any time during your visit?" This question can be answered with a "Yes" or "No" and very explicitly answers the surveyor's question.


Survey software nowadays often allows for conditional questioning. In the case above, if an individual selects "No", they should be routed to additional questions that elicit more information. That "No" is your opportunity to really start to understand what's going on. This can be applied to any survey.
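To make the branching idea concrete, here's a minimal sketch of that conditional logic in Python. The question text, follow-up wording, and function name are all my own illustrations, not taken from any real survey platform:

```python
from typing import Optional

def next_question(answer: str) -> Optional[str]:
    """Return the follow-up question for a given answer, or None.

    A "No" routes the respondent to a deeper, open-ended question;
    a "Yes" ends this branch of the survey.
    """
    if answer.strip().lower() == "no":
        return "We're sorry to hear that. What could we have done better?"
    return None

# Usage: only the "No" path triggers a follow-up.
print(next_question("No"))
print(next_question("Yes"))
```

Real survey tools express this same rule through their skip-logic or branching settings rather than code, but the shape is identical: each answer either ends the branch or unlocks a more pointed question.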


One example came from a huge organization. I chatted with one of their customer service reps and was then given a survey whose answer choices didn't match the question being asked.


As a consumer, when I see these types of mistakes, I assume the organization isn't actually using my feedback for any legitimate purpose. After all, the answers don't match the question's convention, so what use would it be anyway?

Don't make these mistakes with your customers.

 

These are just a few examples of weak questions paired with equally weak answers. The point is to construct questions that really get to the heart of what you're trying to find out.

 

The term "actionable intelligence" refers to information that can be acted upon. You will derive actionable intelligence from your surveys only if you know what you're looking for and complement that knowledge with good questions and answers.

 

No survey initiative should be undertaken unless you know what you’re looking for. A hypothesis and/or a question needs to be established prior to creating the survey. Without one of these in place, as I’ve said in the past, your feedback is barren. In other words, it means nothing.

 

One thing I love about Amazon’s customer service survey is its simplicity. If you have an online chat with one of their customer service reps, you’ll receive an email shortly thereafter that goes like this:


Did I solve your problem?

 

Then you're given a "Yes" button and a "No" button. Unfortunately, I've never selected "No", so I can't speak to what happens after. But its simplicity and its explicit question-and-answer format are exactly what I was describing in my first point above. The primary goal of a customer service rep is, not surprisingly, to solve problems, so I assume Amazon wants to know whether that's happening. Their simple answering convention gives them that information with little to no ambiguity.

 

2). Who’s Who


The demographic to which you deploy your survey should be vetted prior to engagement. In a recent survey project, I had to determine if the group I had available would be the best respondents. After looking through our list, it became clear that our established group wouldn’t likely provide us with the best data. This was due in large part to the fact that those individuals – because of the nature of their positions – didn’t interact with a specific part of our software frequently enough.

 

So we amended the group by acquiring the contact info of those individuals’ subordinates, the folks that work on the front line. Sure enough, our final numbers showed two interesting trends:


-The frontline personnel responded in greater numbers than the other group normally does.

-They also responded more explicitly: they gave the most emotionally illustrative responses, while their supervisors – the initial demographic – gave the most neutral ones.

 

3). Let Your Response Rate Speak to You

 

I’ve personally deployed good surveys and bad ones. How do I know if they were good versus bad? The response rate.

 

A good survey will have a higher response rate than a bad one, largely because of its ease of use. The more complex and lengthy the survey, the more fatigued your respondents become, and the more likely they are to abandon it. Every surveyor has a different threshold, but I like keeping my surveys to fewer than 15 questions. I'm not sure there's a magic number out there; I just try to look at my surveys as if I were the respondent and ask at what point I'd be annoyed.
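The gap between how many people start a survey and how many finish it is one way to spot that fatigue. Here's a quick sketch; the function name and the numbers are invented for illustration:

```python
def response_rate(sent: int, started: int, completed: int):
    """Return (start rate, completion rate) as percentages."""
    start_rate = 100 * started / sent
    completion_rate = 100 * completed / started
    return start_rate, completion_rate

# Hypothetical deployment: 200 invitations, 80 opens, 60 completions.
start_pct, finish_pct = response_rate(sent=200, started=80, completed=60)
print(f"{start_pct:.0f}% started, {finish_pct:.0f}% of those finished")
```

A low start rate suggests a problem with the invitation or the audience; a low completion rate suggests the survey itself is fatiguing people partway through.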

 

Keep in mind, however, the fewer questions you have, the more pointed they must be for the survey to be effective.


Your response rate can also tell you something about your questions, format, and so on. If you have a poor response rate, don't be afraid to reach out to those who did respond and ask for feedback about the survey's format. You could even be extra annoying and send out a survey about that survey – like a Matryoshka doll.

 

4). Timing Has an Effect

 

I deployed a survey the week of Christmas last year, so I had to assume a limited response rate for at least the first couple of weeks after deployment. Sure enough, when I sent the link out, my inbox exploded with auto-replies from folks who were out of town. I set a follow-up date for a month later. Consider the length of your survey window and whether it will cross paths with any major holidays or events that may limit your responses.

 

Be sure to time-box your survey as well. Don't leave it open permanently: set a closing date and stick to it.

 

There’s also some anecdotal evidence that suggests Tuesday – Thursday afternoons are the best times to send out a survey. I haven’t personally seen any discernible difference in my response rate based on the days I’ve deployed them.

 

5). Preemptively Prototype

 

Send your survey out to your department and/or team before you send it to customers. Sure, they're not customers, but they can spot things you can't, even if it's just grammar, spelling, simple things like that. On one occasion, my team made it clear that one of my questions, which I thought was great, was actually a bungled mess they didn't know how to answer. Boom, that's why you send it to other people. My ego was blinding me to the problem with the question.

 
 
 


