Lest you confuse this with the multitude of advice on ‘how to increase your survey response rate’, let me be clear: this guide shows you how to boost the quality and quantity of responses from your customer feedback survey. A good survey yields high-quality data, which gives you actionable insight into your customer satisfaction levels. Anyone can write a basic questionnaire and get a deluge of irrelevant responses. The hard part is getting meaningful responses from the clients who need to be heard.
Chapter 1 Make sure your branding is prominent
You’ve all witnessed some shirtless member of the public waving a squeegee at a stop light and feared you’re about to be car-jacked. Yet when you see a Pizza Hut man wobbling his sign back and forth, you actually consider buying a pizza. Why?
Because the majority of us know, recognise and have a positive emotional affiliation with the Pizza Hut brand. When you’ve already got a recognisable image and reputation, you should use it to back up the credibility of your survey page. The benefits of branding far outweigh the effort required to publish a survey using your colours, logos, and quality graphics and visuals.
A study by Nielsen (1997) revealed that a website’s credibility is pivotal to whether users will interact with it, especially since there are so many web pages whose origins can’t be trusted. Think about the last time a telemarketer from an obscure company called you for an interview. If you didn’t instantly hang up, congratulations, you are a very lovely person. If you completed the survey, how much effort did you put into considering your responses?
If your customers know you, they’ll recognise your brand, and you can bank on that emotional connection to give you more meaningful feedback. Because of your existing relationship, your clientele will take the time to craft more meticulous responses, yielding more accurate data.
Chapter 2 Make your survey accessible
Realistically, today’s generation is unwilling to spend over a minute searching for anything online, let alone a customer feedback survey. Making your survey accessible means a higher volume of responses, and since you need a representative sample in order to get accurate results, you want as many of your clients participating as possible.
Remember science? A ‘representative sample’ is one that accurately reflects the members of an entire population (Fink, 2010), or in this case, your current client base. The more quality responses you have, the more reliable your data will be. Don’t forget that making your survey easy to find actually supplements your customer service. Your participants will be relaxed, at ease, and willing to put more effort into their responses.
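Fink’s point about representative samples can be made concrete with a rough calculation. The sketch below uses Cochran’s standard sample-size formula with a finite population correction; the 95% confidence level and 5% margin of error are illustrative assumptions, not a statistical recommendation for your business.

```python
import math

def sample_size(population, margin=0.05, z=1.96):
    """Roughly how many responses make a sample 'representative'.

    Cochran's formula with a finite population correction, assuming
    worst-case variability (p = 0.5) and 95% confidence (z = 1.96).
    """
    p = 0.5
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)        # finite population correction
    return math.ceil(n)

# Even a modest client base needs far more than a handful of responses:
print(sample_size(500))    # 218
print(sample_size(5000))   # 357
```

The takeaway: the smaller your client base, the larger the *proportion* of it you need to hear from, which is exactly why accessibility matters.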
Data from comScore Media show a growing share of your site visitors now arrive via mobile devices: the average percentage of mobile traffic (traffic from “non-PC” devices) to local sites grew to 27 percent in Q4 2012. Your customers are on their mobiles, so make sure your customer feedback surveys are accessible on mobile devices.
Wouldn't it be great if your survey 'responded' to the device it was being accessed from, and presented itself in the most usable and readable way? In web development this is called 'responsive design', and it's fast becoming the norm for websites, surveys and even emails.
"From our own research we have found that 10.2% of Client Heartbeat survey respondents were using a mobile phone to submit their survey. If your survey is not optimised for these users, you could be missing out on a higher response rate."Saxon Fletcher, Lead Designer - Client Heartbeat
Chapter 3 Keep your survey short
If your customer feedback survey is a three-page interrogation, customers will avoid it like a bad smell. Keep your survey length short, and you will keep them focussed. According to Nielsen, only 16% of today’s web users do more than pick out individual key words and sentences when reading information online. So unless you only want 16 meaningful responses out of every 100, you need to limit the length of your survey.
My advice is to treat your customer the way you would a very young child. You never bombard a child with questions, because it leads to less ‘focussed attention’ on each question. Likewise, a wordy questionnaire will bore and confuse your customers into focussing on some questions more carefully than others, leading to a lapse in the overall quality of their responses. For example, a study by Deutskens, Wetzels & Oosterveld (2004) showed that the longer the survey, the higher the number of ‘don’t know’ responses received, suggesting people stop focussing on quality responses when a survey drags on for too long.
As a good rule of thumb, Ross recommends sticking to fewer than ten questions, and limiting the number of text areas the customer has to complete. While it’s true open-ended questions are useful when the intricacies of an issue are unknown, or when you want unanticipated answers, they’re also difficult to compare and interpret (Fink, 2010). Since your clients are as time-poor as you are, their written responses are often plagued with typos. Not only do these complicate your analysis; text areas also make your survey look time consuming.
How your client perceives the survey is crucial, so you need to be mindful of how your survey will appear on a computer monitor. One of the few disadvantages of online surveys is that what could normally be displayed on one sheet of A4 paper may require a decent amount of scrolling online. Presently 85% of people use screen resolutions of 1024 x 768 or higher, which normally allows for less than one A4 page, depending on settings. Keeping your survey to less than two pages means you’ll keep your client engaged enough to draw meaningful feedback for each question.
Everyone falls asleep in front of their computer at some point in their life. Don’t let it be your clientele while they’re reading your customer feedback survey. Restrict your survey length and you’ll get a higher response rate and higher quality of responses, and your customers will be thanking you for it.
Chapter 4 Make your objective explicit
In a perfect world, the title ‘Customer Service Feedback’ is enough to explain the purpose of your customer satisfaction survey to your participants. But those of us who’ve worked at Hungry Jack’s and had to tell customers, ‘No, only McDonald’s has Big Macs,’ know better. You need to make your objective explicit.
Consumers don’t realise satisfaction surveys are disseminated to initiate change within your company. Devoting one or two sentences to your purpose, such as ‘We want your feedback to improve our service,’ gives them an incentive to give you a meaningful response. Some businesses rack up unnecessary costs with voucher and lottery incentives, but making your objective explicit is motivation enough for most. Once customers realize they’ve been given a platform to express their needs and desires to you, they give more thought to their responses. When the responsibility of enacting change lies in their hands, the end result is a higher quality of response, if not quantity as well.
Doubly important is being aware of your own reasons for creating the customer feedback survey. Your objective lets you select certain questions (Fink, 2010) and organise them towards a specific outcome. Many responses to questions can be analyzed independently, or in conjunction with others, and if you don’t understand your own objective, this unnecessarily complicates your analysis (Gunther and Wyatt, 2002). If your customers become wise to the fact you don’t know what you’re focussing on, you can guarantee the loss of their concentration too.
Simply put, your consumers are real people just like you. Most people need to understand why they are doing something before they do it well. The objective explains ‘why’ your survey is worthwhile, and promises more meaningful results for your business.
Chapter 5 Personalise your survey
If you are trying to produce high-quality, actionable data, stay away from anonymity. Ignore the copious amounts of unsubstantiated information you find online saying anonymous surveys lead to a higher response rate and more frank, honest responses. Those sources have not done their research.
The majority of literature comparing anonymous and non-anonymous surveys on non-sensitive subjects shows no significant difference in the response rates they produce. A study by Campbell and Waters (1990) found a 2% gap in response rate favouring anonymous surveys, while another study testing the same hypothesis found a 1% difference in the opposite direction (Latima, O’Brien, Vasquez, Medina-Mora, Rios-Bedoya & Floyd, 2008).
Many online survey experts will claim anonymous surveys lead to more honest responses. While I can’t dispel this completely, I would argue honest feedback from your customers is a lot more beneficial than getting anonymous feedback from who knows whom (it might be your furious competitor, a four-year-old child or a well-meaning grandma). In addition, Lelkes, Krosnick, Marx, Judd and Park (2011) found complete anonymity consistently compromised measurement accuracy, because it decreased accountability and motivation to answer thoughtfully. If you personalize your surveys, you’ll find you receive more valuable feedback, because your clients are accountable for their opinions.
Don’t get me wrong, I understand the temptation to use anonymous volunteer and ‘blanket-email’ surveys. They are so much easier to implement. But volunteer surveys only sample a certain type of customer and invite the possibility of hoax responses, while blanket-email surveys resemble spam and are usually blocked from your clients’ inboxes before they are even seen (Evans and Mathur, 2005). Both place the quantity and quality of your data in peril.
The whole point of a customer feedback survey is to accurately measure and identify the key metrics where you need to improve. Personalizing surveys means you know who you’ve sampled, and how relevant their responses are to your business. So what’s the moral of the story? Remove anonymity, and maintain the integrity of your data.
Chapter 6 Measure and track your feedback over time
Amassing quality data without knowing when it was collected is about as useful as knowing your daughter has a birthday without knowing the date. Knowledge by itself is great, but it’s only useful when you can act on it.
Let me explain. Your customer’s sentiments towards your business are constantly changing. Whether it’s due to the business climate, new employees or support requirements, you need to be tracking these changes over time, or your data won’t be actionable.
Tracking changes allows you to identify the point in time their attitudes shifted, so you’re able to rectify perceived failures, maintain what you’re doing well and avoid past mistakes. A great way to do this is to release quarterly, monthly or yearly surveys, closing submissions before the next period begins. Once you’ve got a historical context, you can monitor trends in your data and make decisions about refreshing or replacing certain services. For example, if a customer is exhibiting the same trend of behaviour you’ve seen in customers you’ve lost in the past, you can act pre-emptively to avoid a repeat performance.
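To make the idea of acting pre-emptively concrete, here is a minimal Python sketch (with invented customer names and ratings) that flags customers whose average rating drops sharply between consecutive survey periods. The threshold and data are assumptions for illustration only.

```python
def flag_drops(history, threshold=1.0):
    """Return customers whose rating fell by more than `threshold`
    between any two consecutive survey periods."""
    flagged = []
    for customer, ratings in history.items():
        for prev, curr in zip(ratings, ratings[1:]):
            if prev - curr > threshold:
                flagged.append(customer)
                break
    return flagged

# Quarterly ratings out of 10 (invented for illustration):
history = {
    "Acme Pty Ltd": [8.5, 8.0, 8.2],   # stable: no action needed
    "Smith & Co":   [9.0, 7.5, 6.0],   # sharp drop: worth a phone call
}
print(flag_drops(history))  # ['Smith & Co']
```

A spreadsheet can do the same job; the point is that period-by-period data makes the drop visible before the customer walks.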
This may sound like a lot of work, but your business will continue to benefit exponentially, for as long as you continue to issue customer feedback surveys. Tracking your data gives you the power to see where you were, where you are, and where you’d like to be at all times.
Chapter 7 Five Customer Feedback Surveys Reviewed
Red Wind Casino
Firstly, let’s discuss what the casino did well. The branding makes it undeniable this survey belongs to the Red Wind Casino.
With the positives out of the way, I can explain why you will never use this as a template for your customer feedback survey. Firstly, this survey is too long. If I have to scroll down four pages to complete a voluntary survey, that’s three pages too many. Few customers are loyal enough to complete a thirty question survey, loaded with open-ended questions and extensive categorical options. For example, there are four questions with text boxes, three questions requiring categorical responses, and 23 questions requiring a rating. The question I have to ask here is, ‘Do you have a professional anthropologist working for you?’ because if you don’t, you won’t be able to quantify this data.
Secondly, this survey lacks an explicit objective. The title says, ‘Customer Survey,’ but doesn’t explain why responding is rewarding for either business or customer. The questions aren’t tailored towards a specific purpose, suggesting Red Wind Casino is fishing in the dark. The scope of the questions is too general, addressing advertising, game preferences and customer satisfaction. Obviously the casino is trying to gather as much information as is humanly possible, without realising how this affects the overall sample.
Because the casino’s survey is so long, they’ll lose crucial feedback from a group known as ‘conditional advocates’. These are customers who like you, but won’t go out of their way for you unless there’s an incentive (Walters, 2013). Unfortunately, these are the customers the casino needs to hear from the most. Since they’re only getting responses from their best clientele, they’re missing out on opportunities to gain more customers and keep the ones they have.
Recommendations for Red Wind Casino
- It’s not all doom and gloom. While the Red Wind Casino’s survey has an excessive number of questions, a cloudy objective and no personalisation, these problems are by no means irresolvable. I’ll show you how the casino can optimize their customer feedback survey and thereby accurately measure their customer satisfaction.
- Streamline the number and type of questions. Remove unnecessary questions and text fields, and collapse repeated questions. Some of the casino’s questions are repeated so many times that I would also recommend splitting them up around an objective.
- Identify a specific objective, and organize questions around it. For example, the topic, ‘Friendliness of Staff’ appears seven times in the survey. Create a seven question survey around this focus.
- Use a program like Client Heartbeat to define the time period your feedback corresponds to. Closing submissions after that period means you can reopen the customer feedback survey at a later time to identify any changes in customer sentiment.
Hilton Hotel
My first reaction to this survey was to shut my browser. I was looking for a legitimate customer satisfaction survey to review, not a potentially dangerous virus to download. Closer inspection revealed this was indeed the Hilton Hotel’s customer satisfaction survey, albeit a foreign branch.
So why the confusion? Some of you may be familiar with Survey Monkey. While this program is a useful online survey tool, their free service doesn’t allow customised branding. See for yourself the lack of logo, colours and images. Why the Hilton Hotel wouldn’t shamelessly exploit their reputation is beyond me. Without their identifying signature, the website has no credibility, and even the most loyal customer wouldn’t stick around to complete this customer feedback survey.
It’s a shame the survey blunders in this regard, because it’s an appropriate length for maximising response rate and response quality. Another problem I have with their survey is that there is no explicit objective. Explaining your survey’s purpose gets customers on your side. Artlessly put, if your customers don’t know why they are giving feedback, why should they care? Combine the lack of branding with a seemingly pointless list of questions and you’ll have potential participants disappearing like a mirage.
Missed customer responses and fake responses both taint the quality of your data. It’s an annoying reality that all surveys are accompanied by a loss of information due to nonresponses (Fink, 2010). Lamentably, while the Hotel uses an anonymous format, it can’t know who it has missed, or how many hoax responses it’s getting. Indeed, I was able to respond to the survey, despite never having had the pleasure of staying at this particular Hilton Hotel. While the Hilton does ask for demographic details, this tells them nothing about nonresponse in their final analysis.
To the hotel’s credit, question three implies survey responses will be analyzed in the context of their previous experience. Although how they’ll efficiently accomplish this is hard to envision. It’s a long-winded process, which involves analyzing individual text areas and slotting them into time periods for monitoring. I can tell you that takes a long time.
Recommendations for Hilton Hotel
- Presently, the lack of branding, objective and potential for nonresponse and hoax responses leads me to believe the Hilton Hotel’s customer feedback survey won’t yield much high quality, actionable insight into their customer satisfaction. However, I’ll explain how they can improve these areas in the next couple of points.
- The Hilton needs to take advantage of their stellar reputation. The company received four corporate awards last year and has branded over 530 hotels across the globe. They need to brand the survey, using the logo and colours from the official website to generate credibility and encourage participants. My long-range advice for the Hilton Hotel is to avoid free solutions from online survey tools like Survey Monkey, and move on to sophisticated programs like Client Heartbeat, which brands the survey for you.
- Integrate the customer feedback survey into the new client process. When you sign up new customers, tell them your focus is excellent customer service, and they’ll need to answer a mandatory survey after their stay. Gordon Tan writes more about this in his 9 tips to increasing survey response rates.
- Display the objective so clients understand the purpose for their action. Here is an example by Client Heartbeat:
- Since the Hilton undoubtedly has client records, it’s a simple step to personalize their customer feedback surveys and issue them directly to their clientele’s inboxes. Manually personalizing surveys takes time and effort, so I’d recommend using Client Heartbeat. This intelligent online survey software eliminates the likelihood of human error, and saves you hiring staff to complete the lengthy process for you.
- I’d also recommend using Client Heartbeat to track and monitor changes in customer feedback over time. Not only does the program automatically issue surveys every given period, it also flags changes in customer sentiment requiring your attention.
Telstra
Straight away, I knew who I was dealing with and why. Telstra’s logo and colours are all over the survey page, and their objective is summed up by the title, ‘Share your feedback, Help us improve’. The ‘Monty Hamilton’ at the end of the objective adds credibility to the site, especially since a Google search reveals he is a real person.
Telstra has an exceptionally short four question customer feedback survey. It’s a great length to maximise the quantity and quality of responses. Of its four questions, one requires optional text input, and ensures you are in and out of this survey in thirty seconds.
A minor criticism of the customer satisfaction survey is the superfluous text before the survey begins. There is significant disparity between the number of times I read the bulk of text and the amount of information I actually retained. Nonetheless, the fact that consumers might not actually read the paragraphs prefacing the survey would not detract from the overall quality of responses this survey would yield.
Unfortunately, preserving the anonymity of participants means the data may be less valid. While this won’t render Telstra’s survey results useless, it does increase the risk of hoax responses and diminish the quality of their results (Gunther and Wyatt, 2002). For example, I am a Vodafone customer, and I was able to respond to the Telstra survey, despite never having used their service. Who knows who else has responded?
What differentiates Telstra from the other surveys I’ve reviewed is the fact they use the word, ‘today.’ This suggests Telstra is staying on top of ‘when’ the customer service was experienced, and their results will be relevant.
Recommendations for Telstra
- Telstra’s only real misfire is the lack of personalisation. While the survey will still attract a high volume of quantifiable responses, I question how valid the conclusions drawn from it will be. With exceptional branding, length and a solid objective, Telstra’s survey meets my standards for readability. However, I recommend incorporating a personal element to the survey to ensure a higher quality of responses.
- Assign an Issue ID to clients when they bring up problems, so you know what the feedback relates to and when.
- Create compulsory text fields for personal details like name and email.
- The easiest and most efficient method is to use an online survey tool that automates the process and sends the surveys directly to your customer’s inboxes.
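The last recommendation above can be sketched in a few lines. This is a hedged illustration, not Client Heartbeat’s or Telstra’s actual implementation: the URL, secret and customer details are all placeholders I’ve invented. The idea is simply that a per-customer token makes every response attributable.

```python
import hashlib

BASE_URL = "https://example.com/survey"  # placeholder address, not a real tool

def survey_invite(name, email, secret="change-me"):
    """Build a personalised invitation with a per-customer survey link.

    The token hashes the customer's email with a private secret, so each
    response maps back to a known client and links are hard to forge.
    """
    token = hashlib.sha256((email + secret).encode()).hexdigest()[:12]
    return f"Hi {name}, we'd love your feedback on our service: {BASE_URL}?token={token}"

print(survey_invite("John Smith", "john@example.com"))
```

In practice a survey tool does this for you, but the same principle applies: the link itself identifies the respondent, so no anonymous or hoax responses slip through.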
Shopify
Looking at Shopify’s survey brings me back to that time I was standing in the grocery store trying to decide between Kellogg’s Cornflakes and Woolworth’s Home Brand. I wanted to trust the non-branded, cheap-looking cereal box, but there was no guarantee I wouldn’t eat breakfast the next day and hate myself.
Shopify’s brand has enormous pulling power. Yet you look at their survey, and it’s nearly completely black and white, has no branding, logo or images and as a by-product has very little credibility as an independent site.
If that’s not enough to turn their customers away, this survey has way too many questions and answering options. There are 34 questions in total, of which four require text input, and an unnecessary change in parameters from question four to question five. Too many shifts in answering options get confusing. Studies have shown participants get frustrated and exit surveys when answering instructions are too complex (Ray and Tabor, 2003). A good survey question asks what it needs in an unambiguous way (Fink, 2010). Complicated questions lead to reduced attention and comprehension, which in turn lead to less meaningful responses.
Add on the lack of objective and Shopify has successfully sealed itself in a coffin, from which it may never hear from its clientele again. There is no objective listed on the survey, and therefore customers are given little motivation to commit to such a long-winded feedback form.
To be fair, most of the issues with Shopify’s customer feedback survey stem from their choice of survey tool, Polldaddy. Like Survey Monkey, the free plan allows businesses to upload a non-branded survey to get a snapshot of their customer satisfaction. Using Polldaddy means Shopify is missing out on making their customer feedback survey a brandable and personalized experience. These are two of the seven common customer feedback survey mistakes companies should avoid.
Recommendations for Shopify
- No branding or objective equals unmotivated survey participants providing inaccurate data. In addition, a lack of personalization means their results may be compromised by hoax responses. But there is a light at the end of the tunnel. In this next section, I’ll discuss how Shopify can rejig their customer feedback survey to get the information they need.
- Brand, brand, brand. See the difference a little magic in paint can do:
- Make the customer satisfaction survey accessible through the membership page.
- Change the response options in questions four and five to a rating scale. For example:
- Avoid fake responses by asking for a receipt number when customers complete the survey. A less complex option is to use an online survey tool that sends surveys directly to clients’ inboxes. Shopify already has a membership system, which means all contacts can be emailed personalised surveys with minimal effort.
- Sending out surveys once a month, quarter or year will give Shopify a realistic reflection of how their customer satisfaction is tracking.
R & G Technologies
R & G Technologies is using Client Heartbeat as their customer feedback survey tool. At first glance, I could immediately tell the survey was from R & G.
Accessibility isn’t even an issue, because according to Operations Manager at R & G, Mimi Tan, Client Heartbeat only samples from within a business’s customer base. The program embeds the survey in a personalized email to R & G’s clients, along with two follow-up emails across 14 days. A study by Campbell and Waters (1990) found reminders increased participation from 52% to 72%. R & G boasts a survey response rate of 71%, as opposed to the 10-15% rate you can expect from traditional online methods (Tenopir, King, Edwards & Wu, 2009).
The objective is also a brief two sentences, keeping the survey short and sweet. Headlining the objective is a greeting that uses the customer’s name: ‘Hi John Smith’. I can’t count the number of times I’ve opened a dodgy-looking email in my account just because it called me by my first name. Client Heartbeat exploits the power of using someone’s name to get their customers’ attention, and you should too. A study by Sinclair, O’Toole, Malawaraarachchi and Leder (2012) found personalized surveys more effective than generic surveys as a way of recruiting participants. Since R & G’s participants are only selected from their client base, they’ve made their customers accountable for their responses and allowed for follow-up emails to be sent.
The benefit of using Client Heartbeat is it automatically posts your surveys as often as you desire, saving you the effort of manually sending them out every month, quarter or year. It also saves on labor and time costs, because you don’t have to pay staff to manually import all your data from a spreadsheet. The feedback survey software also analyzes your data and alerts you when something requires your attention, e.g. a customer’s feedback ratings have dropped.
Recommendations for R & G Technologies
Since this is a guide produced by Client Heartbeat, I have no recommendations for R & G Technologies (I don’t want to get fired).
Boise, I. (2012). Casino Customer Satisfaction Surveys are Waste of Time and Money! New White Paper Reveals Why. Retrieved from: http://www.prweb.com/releases/casinocustomerservice/casinotraining/prweb9353447.htm
Campbell, J., & Waters, W. (1990). Does anonymity increase response rate in postal questionnaire surveys about sensitive subjects? A randomised trial. Journal of Epidemiology & Community Health. Retrieved from: http://jech.bmj.com/content/44/1/75.short
Deutskens, E., Ko, d. R., Wetzels, M., & Oosterveld, P. (2004). Response rate and response quality of internet-based surveys: An experimental study. Marketing Letters, 15 (1). Retrieved from http://search.proquest.com/docview/204485383?accountid=13380
Eppler, J., & Muenzenmayer, P. (2002). Measuring Information Quality in the Web Context: A Survey of the State of the Art Instruments and an Application Methodology. Retrieved from: http://mitiq.mit.edu/ICIQ/Documents/IQ%20Conference%202002/Papers/MeasureInfoQualityinTheWebContext.pdf
Evans, J. R., & Mathur, A. (2005). The value of online surveys. Internet Research, 15 (2). Retrieved from http://search.proquest.com/docview/219855644?accountid=13380
Fink, A. (2010). Survey Research Methods, International Encyclopaedia of Education (Third Edition), pp152-160 http://www.sciencedirect.com.ezp01.library.qut.edu.au/science/article/pii/B9780080448947002967
Gunther, E., Wyatt, J., & McKenzie, B. (2002). Using the Internet for Surveys and Health Research. Journal of Medical Internet Research, 4(2). Retrieved from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1761932/
Lake, L. (2013). What is Branding and How Important is it to Your Marketing Strategy? Retrieved from: http://marketing.about.com/cs/brandmktg/a/whatisbranding.htm
Lelkes, Y., Krosnick, J., Marx, D., Judd, C., & Park, B. (2011). Complete Anonymity Compromises the Accuracy of Self-Reports. Retrieved from: http://www.stanford.edu/dept/communication/faculty/krosnick/docs/2012/Anonymity%20JESP%20FINAL%20June%202012.pdf
Latima, W., O’Brien, M., Vasquez, M., Medina-Mora, M., Rios-Bedoya, C., & Floyd, L. (2008). Adolescent Substance Abuse in Mexico, Puerto Rico and the United States: Effect of Anonymous versus Confidential Survey Formats. Journal of Child & Adolescent Substance Abuse, 16(1). Retrieved from: http://www.tandfonline.com/doi/abs/10.1300/J029v16n01_06#.UcPjjOew350
Nielsen, J. (1997). How Users Read on the Web. Retrieved from: http://www.nngroup.com/articles/how-users-read-on-the-web/
Ray, N. M., & Tabor, S. W. (2003). Cyber surveys come of age. Marketing Research, Spring, pp. 32-7. Retrieved from: http://www.websm.org/db/12/1110/rec/
Sinclair, M., O'Toole, J., Malawaraarachchi, M., & Leder, K. (2012). Comparison of response rates and cost-effectiveness for a community-based survey:
Postal, internet and telephone modes with generic or personalised recruitment approaches. BMC Medical Research Methodology, 12(1). http://www.biomedcentral.com/1471-2288/12/132
Sterling, G. (2013). Report: Mobile Traffic To Local Sites Growing Faster Than To Total Internet, Now At 27 Percent. Retrieved from http://searchengineland.com/report-mobile-traffic-to-local-sites-growing-faster-than-total-internet-now-at-27-percent-158139
Tenopir, C., King, D., Edwards, S., & Wu, L. (2009) Electronic journals and changes in scholarly article seeking and reading patterns. Aslib Proceedings, 61 (1). Retrieved from: http://www.emeraldinsight.com/journals.htm?articleid=1766871&show=abstract
Tybout, A. M., Sternthal, B., Malaviya, P., Bakamitsos, G. A., & Park, S. (2005). Information accessibility as a moderator of judgments: The role of content versus retrieval ease. Journal of Consumer Research, 32 (1). Retrieved from http://search.proquest.com/docview/215033944?accountid=13380
Walters, J. (2013). How Do You Define Customer Advocacy? Retrieved from: http://360connext.com/how-do-you-define-customer-advocacy/