I’ve been involved in the Voice of the Customer and customer experience industry for close to a decade now, and to be honest, I can’t quite believe I’m writing this blog in 2016. Because I thought we KNEW this. I thought we were DONE with this. By “this” I mean the interminably long post-transaction surveys which leave the customer rolling their eyes and probably swearing a bit.
But we’re not. So here we go.
Last week I stayed at a hotel. It’s part of a large chain of hotels in the UK which I shall not name. The stay was brief and it was fine. I was lucky enough to be going out for dinner somewhere rather fancy, so the hotel was primarily there so we didn’t have to drive home. It was well-located, efficient and entirely fit for purpose.
The same cannot be said for the survey I just completed.
We talk a lot about best practices for Voice of the Customer programs. Depending on the complexity of your business, number of touchpoints, inclusion of employee feedback, action management strategies and more – there is a lot to consider. But there are also a couple of really basic rules that this survey seemed designed to break at every turn.
- Don’t ask questions if you’re not going to actively use the responses
- Don’t ask questions to which you already know the answer
- Just don’t ask too many questions!
I didn’t count the number of questions in the survey, but it was around 50, and I very much doubt that anyone is now running around planning to take action on them. I dutifully provided scores regarding, among other things, the cleanliness of the room, the bathroom (and all the key facilities located within it individually), check-in procedure, wifi, pillow softness and much, much more.
The level of detail was excruciating, and most of my responses were barely considered (and often wrong because the poor mobile rendering meant that selecting the intended, tiny, radio button was too difficult).
And that brings me to the other sin committed here. They asked me to provide details on the time I checked in, whether I’d used their hotels before, whether I used the wifi, which type of room I stayed in and my room number. ALL of which must be information available to them in their booking or CRM systems.
I truly appreciate that businesses want to get the big picture from their customers, but this isn’t the way to go about it. Most of the data will simply be lumped in with (equally un-thought-out) responses from other guests in order to build charts about the overall percentage of people who thought the pillows were too hard. Or a geographical chart showing that people staying in Wales enjoy a softer bed than those in North Yorkshire.
The survey was issued by a third-party Market Research agency, which makes the situation both more frustrating and more understandable. Frustrating, because surely professional Market Researchers should know better; more understandable, because it becomes clear that fundamentally this was an MR exercise, not a VoC one. To be clear, there is nothing wrong with a survey being an MR activity – there is still very much a need for wider studies like this – but as a new customer, was this really the right time for this particular survey? There remains a huge opportunity for MR businesses to refine and apply their expertise to provide VoC guidance to their clients.
I can’t claim that I’m less likely to stay with this hotel chain again, but I certainly won’t complete another survey. Ultimately, this exercise felt like it was entirely about the hotel chain. What they could get from me to help them fill in some charts for their year-end report. If I’m wrong, and they are currently using every single answer to make changes to their hotels, then I apologize and will happily eat one of their (perfectly serviceable) duvets.