What do college exams have to do with client surveys?

Signature recently completed our 2015 annual client survey. Getting our survey results is much like it was for me back in college, when I learned an exam had been graded. I was excited. I was nervous. How did I do? And based on the results, what did I need to do next?

If you have read one of my previous blogs about our survey process at Signature Worldwide, you know a little bit about how this works for us. For those who are not familiar, we annually survey our clients and ask them two questions: “How likely are you to recommend Signature Worldwide to a friend or colleague?” and “Why did you give us the rating you did?”

Much like in college, the first thing I wanted to know from the survey was “how did we do?” The thing I noticed right away was that the number of survey respondents was up 48%, which made me happy. If you are reading this and took our survey, thank you!

In planning for this year’s survey, we strategized on how we could increase participation. One way we worked to drive participation was by embedding our initial survey question in the survey email itself. I recommend this approach if you have the ability to do so with your customer surveys, because it makes responding easier for the people you want to hear from. For 2016 we will continue to consider ways to increase our respondent count to ensure we are getting the most representative sample we can.
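
To make the mechanics concrete, here is a minimal sketch of what that embedded first question can look like behind the scenes. The survey URL, parameters, and styling below are hypothetical placeholders rather than our actual system; the point is simply that each rating on the scale becomes a one-click link inside the email.

```python
# Minimal sketch: embed the first survey question directly in the email body.
# The survey URL, campaign ID, and parameter names are hypothetical; each
# rating (0-10) becomes a one-click link, so answering the first question
# takes a single tap instead of a trip to a separate survey page.

SURVEY_BASE_URL = "https://example.com/survey"  # hypothetical endpoint


def build_rating_links(client_id: str, campaign: str = "annual-2015") -> str:
    """Return an HTML fragment with one clickable link per rating on a 0-10 scale."""
    links = []
    for rating in range(0, 11):
        url = f"{SURVEY_BASE_URL}?client={client_id}&campaign={campaign}&rating={rating}"
        links.append(f'<a href="{url}" style="padding:6px 10px;">{rating}</a>')
    return " ".join(links)


def build_email_body(client_id: str) -> str:
    """Assemble the email body with the question and the embedded rating scale."""
    return (
        "<p>How likely are you to recommend Signature Worldwide "
        "to a friend or colleague?</p>"
        "<p>0 = not at all likely, 10 = extremely likely</p>"
        f"<p>{build_rating_links(client_id)}</p>"
    )


if __name__ == "__main__":
    print(build_email_body("client-123"))
```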

Next, I did our high-level calculation of the survey results to compare against the numeric goal we had set. Unfortunately, the calculation showed that we fell short of the promoter score goal set for 2015. While I was disappointed with that result, it was time to dial it back and ask, “What does it mean?” Being a number cruncher by nature, I sent my assistant off to get last year’s results so we could compare the data to this year’s. What do people say? “There is always a story behind the numbers.” I am definitely one of those people.
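
For readers who want the arithmetic behind that high-level calculation, here is a minimal sketch of the standard promoter-score math: the percentage of promoters (ratings of 9 or 10) minus the percentage of detractors (ratings of 0 through 6). The sample ratings below are made up for illustration and are not our survey data.

```python
# Minimal sketch of the standard Net Promoter Score arithmetic:
# promoters rate 9-10, detractors rate 0-6, and the score is the percentage
# of promoters minus the percentage of detractors. Sample ratings are made up.

def promoter_score(ratings: list[int]) -> float:
    """Return the promoter score (-100 to +100) for a list of 0-10 ratings."""
    if not ratings:
        raise ValueError("No survey responses to score.")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)


if __name__ == "__main__":
    this_year = [10, 9, 9, 8, 7, 10, 6, 9, 10, 5]  # hypothetical responses
    last_year = [10, 9, 8, 8, 9, 4, 10, 9]         # hypothetical responses
    print(f"This year's score: {promoter_score(this_year):.1f}")
    print(f"Last year's score: {promoter_score(last_year):.1f}")
```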

The first layer of the onion we peeled back showed us that the ratio of customers who were highly likely to recommend us stayed about the same as last year. And since this group was a majority of our customers, that was not necessarily a bad thing. We will always strive to grow that group until we reach 100%.

Next, we looked at the respondents who were not as likely to recommend us. At this point, it was less about the actual rating, which was not ideal, and more about why they rated us this way. It’s a good thing we asked that follow-up question in our survey.

Although the calculation provided us with our “exam score,” so to speak, it was now time to do some heavy lifting by reading through the comments explaining why the survey respondents rated us the way they did. This was similar to reviewing my exam and learning from the graded answers. To do this, I printed out the “mega” spreadsheet of responses, grabbed a fresh cup of coffee, and put on my “readers.” Readers are reading glasses, for those of you who have perfect eyesight or don’t hang out with people in their late 40s and 50s trying to read menus at restaurants.

Being a stable analytic, I went straight to the constructive feedback from the survey respondents who did not rate us as highly likely to recommend, thinking, “What needs to be fixed?” As I read through the comments, what I found were challenges similar to ones we have heard before about delivering mystery shopping. The great thing about the feedback was that in many instances people took the time to be very detailed about how we could improve. And while I don’t like hearing that we fell short of our customers’ expectations, I knew we had some actionable data to help us get better.

Now it was time to enjoy the comments from those highly likely to recommend. At this point I was on my third cup of coffee, readers still on, and the more I read, the bigger my smile got. You see, in my day-to-day, and maybe yours as well, we are continually jumping from one thing to the next and rarely take time to stop and reflect on the great things going on in our companies. As I read through the comments, I noticed ongoing themes about how our program drives high levels of customer service for our customers, and about the value their management places on working with Signature Worldwide to help them manage their business. But probably the coolest comments were those where respondents took a little extra time to single out one of our employees by name and say how much they appreciate working with them. That was tremendous, and it was passed along to those employees.

In the end, receiving the voice of the customer through survey ratings is partly about achieving the goal we set, our exam score, so to speak. But it is more about what we learn from reviewing the results in detail and what we do with that information going forward. It’s a chance to improve on the things our customers want to be better, and a chance to enjoy our accomplishments where we hit the mark.
