We Got Actionable Data…What Do You Get?

Be careful what you ask for in your customer surveys, because you may get some actionable data. Of course, this is exactly what we were hoping for when we sent ours, and our customers did not disappoint.

You see, we are deliberate in how we survey our customers. Back in 2014, we began surveying them about the specific services we provided, believing this would give us more actionable data. Some organizations call this the “transactional” approach.

Like many companies these days, we group our survey responses, each tied to a specific customer, into three buckets:

  1. Those very likely to promote Signature Worldwide (promoters)
  2. Those on the fence (passives)
  3. Those who would not promote Signature (detractors)

The percentage of customers in each category is important to us. A high percentage in the promoter category keeps us in business, but what helps us continue to improve are the reasons behind the numbers: why customers are, or are not, willing to promote Signature to someone they know, based on a specific service we delivered to them.
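As a simple illustration (a hypothetical sketch, not our actual reporting tooling), here is how responses could be bucketed and counted this way, assuming the common 0–10 “likelihood to recommend” scale where 9–10 count as promoters, 7–8 as passives, and 0–6 as detractors:

```python
from collections import Counter

# Hypothetical 0-10 "likelihood to recommend" scores, one per customer
# response for a single service (e.g., a training class).
scores = [10, 9, 7, 10, 6, 8, 9, 3, 10, 9]

def bucket(score: int) -> str:
    """Map a 0-10 score to the usual promoter/passive/detractor buckets."""
    if score >= 9:
        return "promoter"
    if score >= 7:
        return "passive"
    return "detractor"

counts = Counter(bucket(s) for s in scores)
total = len(scores)

# Report the share of responses in each category for this service.
for group in ("promoter", "passive", "detractor"):
    pct = 100 * counts[group] / total
    print(f"{group}: {counts[group]} of {total} ({pct:.0f}%)")
```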

For those customers who are very likely to promote us or recommend our services to others, we look at the reason as something we can try to replicate elsewhere in our business. And for those customers who said they were not likely to recommend us, based upon the service they received, it is important to understand why, rectify the specific situation, and, of course, fix anything in our processes that may be broken. Here is some of what we discovered from our experience-specific surveys.

Our customers love our training account managers and their delivery. It was not a surprise to me that people love our training delivery and, therefore, are very likely to recommend Signature Worldwide based upon their experience in the training class. Some of the comments customers made focused on the trainer’s ability to really connect with the participants and both motivate them and help them perform their jobs better. Internally, we spend a lot of time encouraging our trainers to collaborate and share ideas and best practices with each other so their training delivery continues to be relevant. Based on the survey comments, we know this is a practice we need to continue and probably even expand.

Similarly, a large percentage of customers would recommend us based on our Coaching-on-Demand service alone. The reasons ranged from describing “coaching” as an invaluable tool to the importance of getting direct feedback during their interactions with our coaches. On the opposite side of the ledger, those who were less likely to recommend us commented that, as employees of a customer, they did not like being made to call into our coaching team. While this reason came from those less likely to recommend us, I don’t consider it a bad thing. At Signature Worldwide, we like it when supervisors and managers have their employees call in. It represents an opportunity to help those employees practice and make our training stick even more. However, we do see this as an opportunity for our training account managers to work with program drivers: instead of “making” their employees call into Coaching-on-Demand, positioning coaching as a job aid or help for their employees turns a negative into a positive.

The third service area (i.e., transaction) we survey our customers on is mystery phone shopping. While the percentage of customers highly likely to recommend Signature Worldwide for this service has not been as high as for training delivery or Coaching-on-Demand, it is the area where we have made the most improvement. Over the last year or so, we have been getting great feedback from those customers who wanted more from our mystery phone shopping. As a result of getting actionable data from our customers’ survey responses, we have been able to improve this service, making double-digit percentage increases in those highly likely to recommend us.

The two main areas we needed to focus on, as identified by our customer surveys, were consistency in scoring and shoppers being recognized. Making improvements in these areas was not an overnight change. Our call center team has made both a regular topic of discussion in its bi-weekly huddles and instituted more frequent scoring calibration sessions. Over the last six to nine months, we have reduced scoring errors to a fraction of what they were, improved our shopping scenarios to better camouflage our shoppers, and modified our shop call distribution to include times when we previously did not shop.

The feedback has really helped us improve our processes and reinforce the activities our customers find beneficial. It is great knowing we have engaged customers willing to share their feedback. And because we center each survey question on one of our three primary service areas, that feedback becomes actionable, helping us enhance our services.

Is the feedback from your voice-of-the-customer surveys organized in a way that makes it actionable?
