Thanks For Not Asking

In the unending quest to “provide better service” or “get to know the customer,” a similarly unending army of organizations is asking us (visitors, prospects, customers, former customers, and innocent bystanders) about our experiences, desires, preferences, and opinions.

In the good old days, surveys were limited by the expense of field interviewers, printing and postage, and most effectively by limits in processing and analytical capacity. That was in the age of Little Data.

Now, thanks to the likes of SurveyMonkey, Zoomerang, ForeSee, Qualtrics, and so many more, anyone with a mouse and keyboard has a license to impose and irritate by survey. Without warning, they can and do open fire by text, email, robocall, and pop-up.

This ability to inquire is not inherently bad. In practice, though, it is often the lazy marketer’s way to bother and alienate while learning nothing useful in the bargain. Aside from the common errors of badly worded questions in badly designed formats, the offenses of the survey-happy fall into familiar categories.

They ask:

Too early – Prospects or new customers are asked about experiences they haven’t yet had. Visit some organizations’ sites and, after a few clicks, a pop-up window will offer to survey you about your “experience” on the site. This comes before you’ve found what you came for, bought what you came to buy, and discovered whether you were satisfied or frustrated.

Too often – Asking more frequently doesn’t yield better data. It may, however, be a measure of a flailing marketer’s attempt to address customer irritation and brand erosion.

Too much – As long as they’re at it, why not ask another question or two or twelve? Wouldn’t it be nice to know… I’m reminded of a nonprofit on whose website I was making a donation. Their pop-up survey had 37 questions across multiple pages, and all had to be answered before getting to the next page. The only alternative was to abandon their survey and their site.

Too vague – Diffuse or inappropriate questions that cannot reasonably be understood, let alone answered.

All of these violations come before any analysis of, or action on, the data. When response rates come in low for all of the reasons above, there is a tendency to abandon the study and then compound the problem by trying yet again, perpetuating the whole silly cycle.

Marketers, unfortunately, lack the equivalent of a Hippocratic oath, viz., “first do no harm.”

A number of organizations do get this right. Consider a recent message to clients of the Vanguard Group. Rather than barging ahead, it included a button asking if they would answer two questions. Those who clicked were presented with two, and only two, multiple-choice questions.

Skewed Sentiment in the Twittersphere

Twitter, the popular micro-blogging and messaging service, has become a big deal – and not just in social media circles. By its sixth birthday in 2012, it had an estimated 500 million “active” users worldwide, 140 million of whom are in the US. Based on its latest round of financing, its implied market capitalization would be $9.9 billion.

Activity on Twitter has become a leading indicator, and Twitter sentiment is often reported in newscasts. The very volume of Twitter activity has become newsworthy. For example, the Academy Awards telecast generated 8.9 million Tweets and the Super Bowl garnered 24.1 million. Of course, many minor events are tweeted about too – it seems that every conference, meeting, or presentation I attend begins by announcing its hashtag.

A number of services such as Twitturly, Twitscoop, and Monitter allow you to track Twitter activity about your brand, organization, or cause. This can be interesting, but how relevant is it? Twitter measurement is relatively cheap and easy. Still, should you judge a marketing campaign or prospects for your latest product by reception on Twitter?

The folks at Twitter seem to think so and want to sell you a “promoted account,” which solicits more followers for you. Presumably, more followers will lead to more mentions.  If that won’t prime the pump, you can also pay for Promoted Tweets, which you can send to audiences who don’t follow you.

The question still remains: is Twitter sentiment a good proxy for what the universe thinks about your organization or product? An interesting study just released by the Pew Research Center points to significant selection bias in reactions on Twitter compared with responses in statistically representative surveys. For example, the 2013 State of the Union Address was rated substantially more positively in polls (42%) than in Tweets (21%). According to the Pew researchers, “Twitter users are not representative of the public.”

This may not be shocking, but it should give one pause. How many of your users not only have Twitter accounts but are actively engaged? Unless you’ve restricted your market to the Twitterati, you may be wise to do some of the heavy lifting involved in market research.

Please Tweet me your thoughts @4ourth.

Your Opinion is Very Important to Us (really)

Like many businesses and households, I’ve been shopping a lot online, and not just on Cyber Monday. After the checkout, or in the confirming email, I’ve also noticed a proliferation of “customer satisfaction” surveys.

The Stones still have this one nailed:

Neither you nor your customers will get satisfaction if you imitate the many ham-handed requests for feedback that increasingly follow online ordering.

It is now very easy, and often free, to create a web, mobile, or email survey or feedback form with tools from Bizrate, SurveyMonkey, SurveyTracker, Google Docs, and many, many others. The ease of creation is perhaps part of the problem.

In principle, gauging the quality of your customers’ experience is desirable – but not at the expense of creating a poor experience. The effort to add an extra question, or an extra page of nice-to-know questions, is negligible. Since you have their attention, you might be tempted to ask them about other suppliers or products they didn’t buy, or to try to sell them something else. Don’t.

Incredibly, firms whose websites enable a customer to find and buy a product in two or three clicks ask that same customer to spend ten or more extra minutes completing a satisfaction survey.

Four Ways To Irritate Your Customers:

I’ve seen multiple examples of all of these in the last two weeks:

  1. The Long String-Along. Unless you have a deeper relationship with your customer and are compensating them in some way, keep it short and avoid piling on “nice to know” questions. Sending respondents to a survey with a Continue button at the bottom of the page, and the following page, and so on, can lead to the customer abandoning the survey or speeding through it and giving unconsidered responses.
  2. Require answers to all questions, regardless of whether they pertain to a given customer or your offerings. For example, a seller of camping equipment included questions about categories it does not sell and never has.
  3. Require identifying information, such as an email address. If you want to identify a respondent, offer to send them some sort of tsatske; you’ll have to mail it to them and hence will need contact information. Of course, this does not give you the right to pester them.
  4. Over-Quantification. You are an expert on your product and believe you can distinguish nuances such that a 10-point rating scale makes sense. This pseudo-precision not only makes your surveys difficult to read, it also reduces the reliability of your results.

Instead, keep your interaction to what fits on a postcard. Or shorter. Ask less up front and you will learn more, as well as keep the goodwill of your customers.

I recently had to call the tech support line of my ISP. I was asked if I would stay on the line to rate the service I had just received. The survey consisted of a single automated question – rating the service on a five-point scale with a single press on the phone keypad.

Now I got satisfaction.