I must add that even though it's now easy to throw together a survey, not everyone should attempt one. Research done incorrectly produces unreliable results. That's not the topic of today's column, but it's a danger that must always be kept in mind.
One of the most frequently asked questions on any survey is some form of, "How can we improve our product?" Survey respondents enter free-form comments. Some write paragraphs, others just a few words. These verbatim comments are often the most overlooked part of the results. They're tedious to read. The spelling and grammar are atrocious. Some of the respondents are clearly mentally unbalanced.
But the comments can be the most interesting, and revealing, part of a survey's results. In fact, they're so important I devote a great deal of time to them. Why? Because they help me see my customers as individuals rather than en masse.
Obviously the point of most surveys is the opposite: Tally the individuals so they can be organized into measurable groups that share something in common. Comments are a way to do the reverse: Get in touch with the members of these groups as individuals.
Verbatim comments help you uncover all sorts of information you might not find otherwise. Potential usability issues, reasons a particular product or site feature is underused, new features only a regular user would dream up, and more. How often do you have access to a customer's specific thoughts, suggestions, and complaints?
Reading the comments is not enough. It's a mind-numbing process, so I know this is hard: you must also focus while you read, considering the context or motivation a person had for writing a certain comment. If you dismiss complaints as grumbling, you're making a mistake. Why are they complaining, and what are they complaining about? Is there anything you can do about it? And if so, should you?
Try this: Pick one comment and read it, then stop and ponder what it means. Nine times out of ten, you'll draw deeper connections than if you'd simply read the comment and moved on. For example, when someone writes, "Your pages take forever on a dial-up connection," my first thought is, "Welcome to the Internet. Hope broadband comes to your area soon."
But think what it means to your business. What sales might you be losing because your pages load too slowly? Could some minor tweaks to the graphics cut out a lot of the wait? Now you may see that person sitting, frustrated, trying to purchase something and take the complaint more seriously. I warned you this would be hard!
But don't stop there. Now comes the really tedious part. On customer satisfaction surveys, it's helpful to list the respondent's overall satisfaction and average monthly spending next to his comments. (Note: The spending component is only available for surveys in which you know the customer's identity -- which is easy to achieve in the online world.)
Next, categorize the primary topics of each comment. The result might look something like this, for a group of people who all commented on a similar issue:
| Satisfaction | Average Revenue ($) | Comment Topic | Comment |
| --- | --- | --- | --- |
| 10 | 5,000 | Slow page load | Your pages load too slowly. |
| 8 | 1,000 | Slow page load | Slow; too many graphics on the page. |
| 2 | 10,000 | Slow page load | Takes forever on a dial-up connection. |
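If you'd rather work in a script than a spreadsheet, the layout above can be sketched in a few lines of plain Python. This is only a sketch: the field names are my own, and the figures come straight from the sample table.

```python
# Per-respondent rows: satisfaction score, average monthly revenue,
# comment topic, and the verbatim comment (field names are illustrative).
rows = [
    {"satisfaction": 10, "revenue": 5000,
     "topic": "Slow page load", "comment": "Your pages load too slowly."},
    {"satisfaction": 8, "revenue": 1000,
     "topic": "Slow page load", "comment": "Slow; too many graphics on the page."},
    {"satisfaction": 2, "revenue": 10000,
     "topic": "Slow page load", "comment": "Takes forever on a dial-up connection."},
]

# Pull out everyone who commented on the same issue, then count them
# and total their revenue.
slow = [r for r in rows if r["topic"] == "Slow page load"]
print(len(slow), sum(r["revenue"] for r in slow))  # 3 16000
```

Keeping the data in this shape makes the later grouping step trivial, whether you do it in code or with a spreadsheet pivot table.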
Looking at the customer's satisfaction level alongside the comments isn't meant to imply the two are directly related. We can't assume a comment always vocalizes the reason for satisfaction or dissatisfaction. Include the satisfaction score so you know whether the respondent is writing from a feeling of overall satisfaction. Feel free to add other demographic information: whatever you have that will help you see the whole picture.
Include the customer's spending for several reasons. It catches the eye and makes sure you pay special attention to your biggest customers. It allows you to quantify the total dollar value, as well as the number, of customers who comment about similar issues. And it's an excellent way to get executives to take the time to read at least a few comments.
This exercise takes a lot of time and brainpower, and the results aren't necessarily reliable from a statistical point of view. But marketers make decisions all the time with far less information. For me, this exercise is an invaluable source of information.
If you are on a limited budget, with no experience in statistical software, but nonetheless want to understand who your customers are and what they think, read on. Here are the steps to take, if you're ready to take the plunge. After your next online survey results are in:
- Pull the verbatim comments and overall satisfaction rating for each respondent into a spreadsheet. If you have access, import the customer name/number and some meaningful spending figure, such as average monthly revenue. Add any other demographic information you think would be helpful.
- Create a column for the primary comment topic. As you read the comments and digest the meaning, assign each a category, such as "improve response time" or "pricing complaint: too high."
- Group your respondents by the primary topic. Sum (or average) the revenue, and count the number of customers in each group. It might look like this:
| Comment Topic | Number of Customers | Revenue ($) |
| --- | --- | --- |
| Slow page load | 50 | 1,000 |
| Good job | 100 | 5,000 |
| Prices too high | 20 | 10,000 |
- Select a few representative comments (including a healthy selection from your best customers) for a brief report. Add the summary information by comment topic, and give this information to your busy colleagues. They'll be able to get the gist of it without expending much time or energy. You, on the other hand, will be bleary-eyed and perhaps stuttering by this time.
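The grouping step above can be sketched with nothing but the standard library. This is a minimal example, assuming your spreadsheet export is a CSV with `topic` and `revenue` columns; the topics and dollar amounts below are made up for illustration.

```python
import csv
import io
from collections import defaultdict

# Stand-in for your exported survey file; in practice you'd use
# open("survey_export.csv") instead of io.StringIO.
data = """topic,revenue
Slow page load,400
Slow page load,600
Good job,5000
Prices too high,10000
"""

# Count the customers and sum the revenue for each comment topic.
counts = defaultdict(int)
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(data)):
    counts[row["topic"]] += 1
    totals[row["topic"]] += float(row["revenue"])

# Print the summary table for the report.
for topic in counts:
    print(f"{topic}: {counts[topic]} customers, ${totals[topic]:,.0f}")
```

Swap `sum` for an average (divide `totals` by `counts`) if that's a more meaningful figure for your business, and sort the topics by total revenue before printing so your biggest-dollar issues land at the top of the report.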
Ryan Massie, business analyst in the marketing department at CareerBuilder.com, had the perhaps unfortunate experience of serving as my research analyst years ago (at another company). He says:
I knew whenever the survey results were in that my weekend was over. I'd read every single comment and categorize the responses, flagging any that I thought should be shared. But we were able to perform more detailed analysis by comparing comments to satisfaction and other customer demographic information. In the end it was worth the effort, but it did mean some long hours.
I suspect some will disagree with me about value versus effort. But marketers with little to no budget must get creative when it comes to gathering insight and information about their customers.