Analytics vs. User Surveys

Making analytics-led changes brought us to a local maximum. Running a user survey helped me take the business to the next level.

A while back, just before I left Bikesoup, we conducted our first user survey. The company was at something of a crossroads, and even though we’d spent some time digging into our website’s analytics, we hadn’t unearthed any real gems.

Studying the data identified a lot of low-hanging fruit, like pages that were slow to load and errors that only occurred in certain browsers: the sort of things you’d expect to see in this data. The problem was that responding to these findings only created slight incremental improvements as we approached a local maximum.

What we had been looking for was clear insight into how our users felt about the site and the service it offered. We wanted to know what wasn’t working and what we needed to do to take the business to the next level.

Whilst you can configure your analytics to track really well-defined user journeys, you’re still left guessing what your users’ motivations were when they were using your site. Doing your best Sherlock impression, you try to create theories from the data.

However, it’s likely that you’ll only end up finding data that supports the assumptions you’ve already made. You get the relationship between a user’s motives and the data their actions produce the wrong way round.

Drawing meaningful conclusions from website analytics is a definite skill. It’s one that I’ve slowly been learning, but there were so many facets to Bikesoup’s data that I found it hard to tease out anything that would really move the needle.

Conducting a user survey

Having read Nick Disabato’s excellent “A/B Testing Manual” I remembered that an informal survey might be just what was needed here. I put together a list of questions that would hopefully prompt Bikesoup’s users to tell us what they really thought. There was a big emphasis on being brutally honest and blunt — even if what they had to say would be painful.

After reviewing and agreeing the final list of questions with the rest of the team, I used TypeForm to put together a short survey. Taking advantage of the conditional flow allowed me to keep the questions relevant and dig deeper if a user’s response hinted that they had more to say. All in all, most people could complete the questionnaire in around 5-10 minutes, depending on how much detail they went into.

The first part of the survey was quite generic, as I tried to understand each user’s background. I asked what type of cyclist they were, how long they had been using Bikesoup, and what areas of the site they had used. These were the soft-ball questions that I knew wouldn’t answer our real questions, but they could add context to an individual’s responses.

The second part was where I asked the tough question:

“How have you found using Bikesoup?”

I gave them a choice of four answers:

  • Great
  • OK
  • Could be better
  • Rubbish!

By themselves, these responses wouldn’t really tell me much either, but they acted as a qualifier to three of the most important questions in the entire survey.

If the answer was positive — the first two options — I asked them what they liked in particular and what they’d like to see more of. Whilst a lot of people answered this question, the responses were varied and it was hard to find any commonalities.

If the answer was negative — the last two options — I asked them what went wrong. This was useful, as it pointed to highly specific issues that could be addressed. However, the killer question was the next one. I asked the wonderful question suggested by Claire Lew from KnowYourCompany:

“What advice do you have for us?”

As per her article, this one word — “advice”, not “feedback” — did indeed unlock the answers I was looking for. It showed exactly how Bikesoup’s users felt when they were using the site and running into problems.

One issue that came up repeatedly was that the site felt like a ghost town at times, with very few new bikes being listed and very few people enquiring about the bikes users had already listed.

This was something that I probably could have worked out from the data, but it’s very easy to work backwards and find data that supports these users’ feelings. Coming to that conclusion from the data alone would have been a stretch, and I would still have been left wondering if I’d interpreted the data correctly.

With the users’ feedback, I could clearly articulate what needed to improve and the metrics that could be used to track progress as changes were made. Metrics like “New bikes listed” and “Enquiries per day” were all easy to monitor, and highlighted what to emphasise when making UI changes.
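As an illustration of how simple those metrics are to compute, here’s a minimal sketch in Python. The record shapes and field names are hypothetical, not Bikesoup’s actual schema; in practice the same counts would come straight from the production database or a dashboard.

```python
# A minimal sketch of the two survey-driven metrics: "new bikes listed per day"
# and "enquiries per day". The records below are hypothetical sample data,
# not Bikesoup's actual schema.
from collections import Counter
from datetime import date

listings = [
    {"bike_id": 1, "listed_on": date(2016, 5, 1)},
    {"bike_id": 2, "listed_on": date(2016, 5, 1)},
    {"bike_id": 3, "listed_on": date(2016, 5, 2)},
]
enquiries = [
    {"bike_id": 1, "sent_on": date(2016, 5, 2)},
    {"bike_id": 3, "sent_on": date(2016, 5, 2)},
]

def per_day(rows, date_field):
    """Count how many rows fall on each calendar day."""
    return Counter(row[date_field] for row in rows)

new_bikes_per_day = per_day(listings, "listed_on")
enquiries_per_day = per_day(enquiries, "sent_on")

# Counter returns 0 for days with no activity, so the report never breaks.
for day in sorted(set(new_bikes_per_day) | set(enquiries_per_day)):
    print(f"{day}: {new_bikes_per_day[day]} new bikes, {enquiries_per_day[day]} enquiries")
```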

Communication with users

This survey was sent out to approximately 30,000 people, and we received around 140 responses. Whilst that seems very low, 140 responses was far more than was needed. After the first 20 or 30 were in, the required changes were already obvious.

I probably spent no more than a couple of hours putting this survey together, and we started to receive responses within an hour of announcing it. I can think of very few methods with as direct and short a feedback loop.

To put it in perspective, setting up the right metrics in something like Google Analytics would have taken longer to implement, and we would have had to wait for sufficient traffic before we could start analysing the data. With a user survey, I asked a question and a few hours later I had the answer.

Of course, you can’t send out a survey every week; they need to be used sparingly to avoid annoying your users. However, that doesn’t mean you can’t regularly get the same sort of insights in other ways.

You can do single question polls on Twitter. They take a few seconds for your followers to respond to, and you can get almost instant feedback on a decision you’re about to make. Instead of using social media to tell people something, use it to ask them something instead.

Or, how about singling out a user every now and then and sending them a personal email asking how they’re getting on? I know that sounds just as obnoxious as Intercom-style pop-ups, but people really do respond when a message has obviously been written for them personally.

Creating a dialogue between the people building a website and the people using it can yield incredible results. Personally, I think that many businesses completely undervalue this one-on-one communication with their customers. They might think that it won’t scale, or that it will provide a biased viewpoint because of the tiny sample size. Of course it won’t scale: that’s the point!

Writing personal emails and reading every response to your survey seems like a lot of work, which is why most businesses don’t do it.

It’s also the same reason that your users will likely respond to it: in a world where companies are large and faceless, and customers can feel like they don’t matter, it’s a novelty to be asked for your opinion.

Over time these interactions will create a connection between a business and its customers. That connection in turn develops into loyalty and trust, which is the bedrock of any stable and durable business.

Let’s see your analytics software top that!
