Content Research: Getting the Most Out of User Surveys


The next step in our journey to research website content is to get qualitative: let’s look at user surveys.

How do you survey people about content?

Good question. The first thing we did was pick a specific piece of content that we wanted to understand better: a training package.

I’d looked at the stats on the content and they were pretty inconclusive. Click tracking didn’t tell us much — if you’re a user on our site watching a video, there’s not much in the way of clicks for us to track. And besides, we had some specific questions about the nature of the content and its presentation that no amount of quantitative data was going to reveal. We had to get qualitative.

[Image: girl painted onto a clipboard. Photo: Dennis Hayes IV]

The other thing you need to do to run a survey is capture the people who actually use the content you’re studying. For example, you could make the call to action for your survey a popup, a modal, or a persistent on-page message.
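If you go the on-page route, the call to action can be as simple as a small banner injected onto the pages that hold the content. Here’s a minimal sketch in TypeScript; the survey URL, element IDs, and copy are placeholders, not anything from our actual site.

```typescript
// A minimal sketch of a persistent on-page survey invitation.
// The survey URL, element IDs, and copy are placeholders, not from the real site.
const SURVEY_URL = 'https://example.com/content-survey';

function showSurveyInvite(): void {
  // Don't show the banner twice, and respect an earlier dismissal.
  if (document.getElementById('survey-invite') || localStorage.getItem('surveyDismissed')) {
    return;
  }

  const banner = document.createElement('div');
  banner.id = 'survey-invite';
  banner.innerHTML =
    `Help us improve this training: <a href="${SURVEY_URL}">take our two-minute survey</a> ` +
    '<button type="button" id="survey-dismiss">No thanks</button>';

  document.body.appendChild(banner);

  document.getElementById('survey-dismiss')?.addEventListener('click', () => {
    localStorage.setItem('surveyDismissed', 'true');
    banner.remove();
  });
}

showSurveyInvite();
```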

We decided to study people who had and hadn’t used the content in question. Sound weird? Well, we were studying the value of the training package, so we wanted to be able to compare how those who hadn’t done the training were using the software against those who had.

We could do this because we had the email addresses of users, and were able to track who’d clicked through to the training material in question, and who hadn’t.
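In practice, segmenting like this can be as simple as matching your mailing list against click-tracking records before the invitations go out. The sketch below is illustrative only; the data shape and field names are assumptions, not our actual setup.

```typescript
// Illustrative only: split a mailing list into people who clicked through to
// the training and people who didn't, using click-tracking records.
// The field names here are assumptions, not our actual data model.
interface User {
  email: string;
  clickedTrainingLink: boolean;
}

function segmentUsers(users: User[]): { trained: string[]; untrained: string[] } {
  const trained: string[] = [];
  const untrained: string[] = [];

  for (const user of users) {
    (user.clickedTrainingLink ? trained : untrained).push(user.email);
  }

  return { trained, untrained };
}

// Each segment gets its own survey link, so the two groups' answers can be compared.
const { trained, untrained } = segmentUsers([
  { email: 'a@example.com', clickedTrainingLink: true },
  { email: 'b@example.com', clickedTrainingLink: false },
]);
console.log(trained, untrained);
```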

What do surveys give you?

Surveys give you information straight from the horse’s mouth. They’re a great way to get a mix of qualitative and quantitative data.

Do it right, and hopefully you’ll get quantitative data in volumes large enough to be meaningful (we got response rates of 8-10% for our surveys).
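For a sense of what “meaningful volumes” can look like: if you mailed, say, 5,000 users (a made-up figure, not ours) and 8-10% responded, you’d have 400-500 responses, which puts the margin of error on a simple yes/no question at roughly ±5% at 95% confidence, assuming a random sample. A quick sketch of that arithmetic:

```typescript
// Back-of-envelope margin of error for a proportion at 95% confidence,
// using the worst case p = 0.5 and ignoring finite-population correction
// and non-response bias. The 5,000-recipient mailing is a made-up figure.
function marginOfError(responses: number, z = 1.96, p = 0.5): number {
  return z * Math.sqrt((p * (1 - p)) / responses);
}

const recipients = 5000;
for (const responseRate of [0.08, 0.1]) {
  const responses = Math.round(recipients * responseRate);
  const pct = (marginOfError(responses) * 100).toFixed(1);
  console.log(`${responses} responses -> roughly ±${pct}%`);
}
// 400 responses -> roughly ±4.9%
// 500 responses -> roughly ±4.4%
```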

But if you include some open-ended questions, you can also use the survey to get a feel for the way users think about and respond to your content in their own words.

In our case, survey respondents mentioned industry-specific topics about which I knew little at the time. But since these were the problems my users were facing, it was important that I heard about them.

How do they work?

As I mentioned, I used an online survey, and invited users to complete it via an email.

A content survey isn’t really like a marketing survey, and the email doesn’t want to be too salesy either. We didn’t offer any reward to users who completed the survey: we just invited them to take part. The survey was quick, and the email sounded very personal, so many users clicked through and gave us their thoughts.

The survey contained a mix of carefully worded questions designed specifically to give us information on users’ experiences and feelings. So, for example, we deliberately avoided questions like, “How would you prefer to do training: online, face-to-face, or on video?” because we didn’t want users to have to theorise about how they’d learn.

We also worked hard to avoid biasing users’ responses to certain questions based on previous questions. We mixed up the order of the questions and answers, and used specific answer types for specific pieces of information.
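Most survey tools will randomise answer order for you, but the idea behind it is nothing more than an unbiased shuffle applied per respondent. Here’s a minimal sketch (the answer options are invented for illustration):

```typescript
// Fisher-Yates shuffle: show each respondent the answer options in a random
// order, so the option listed first doesn't pick up an unfair share of clicks.
function shuffle<T>(items: T[]): T[] {
  const result = [...items];
  for (let i = result.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [result[i], result[j]] = [result[j], result[i]];
  }
  return result;
}

// Invented answer options for a usage-frequency question.
console.log(shuffle(['Daily', 'Weekly', 'Monthly', 'Rarely']));
```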

We also worked hard to remove our own personal bias from the questions. For example, we had a sense that some of the training, which used humour, didn’t hit the mark for users. We also expected that that training wouldn’t be such a hit with non-English speakers or users from non-Anglo cultures. Working together to vet the questions, and choose the right question and answer format (spectrum sliders), we were able to get an objective gauge of user sentiment about these training modules without putting words in users’ mouths.

Survey wins

We had a lot of wins with our survey. We got great, useful, quantitative information from the survey questions that had fixed ranges of answers that were easily comparable. Where we had open-ended questions, we learned a lot.

But we experienced one other, unexpected bonus. This arose from the email that invited users to participate.

We wrote the email to sound as personal as possible, and it worked. I received numerous personal email responses that revealed the problems people were having with the training material. While we’d asked these kinds of questions in the survey, in some cases these respondents included information that our survey questions had failed to anticipate.

In addition to that, from the nature of the emailed responses, we were able to conclude that those who emailed had not completed the survey. So we got what were effectively “bonus answers” by wording our email carefully.

Oh, and we did all this very cheaply, too. Online surveys are a cheap research method.

Survey limitations

[Image: Darth Vader taking a survey. Photo: Dennis Hayes IV]

The main error I made with the survey was that I didn’t make every question mandatory. At the time I thought this might put users off responding at all. But what happened was that some users skipped questions altogether. And I still don’t know why.

Our data was still useful, but it would have been more valuable to get every respondent’s answer to every question.

Beyond that, surveys can be quite difficult to construct. Unless you know very specifically what information you’re after, it’s very easy to construct questions that bias the users’ answers or fail to deliver information you can use. Trouble is, you only realise that when you’re looking at the data after the fact.

So it can take a bit of learning to get your survey to deliver what you need.

Survey tips and tricks

There are a few pitfalls to doing surveys. Here’s how we avoided them.

  • Read up on how to construct good surveys, and follow the advice. Bad questions will only get you bad data. Don’t waste what is very precious time with users: ask them the right questions the first time.
  • Have someone test your surveys before you mail the link to customers. It goes without saying, but it matters. Get the link, and the survey, and the email right before you go out to your contacts.
  • Write a stellar email. Why settle for low response rates? The more people you convince to take your survey, the more value you’ll get from this exercise. And, as I said, you might get unexpected wins like personal responses from users. That’s good for your understanding, but it also shows that your brand has connected with your audience. See your survey as a chance to build connection, not just bleed people for information.

The goal of our survey was to find out how users perceived our training materials. We got that information — and a whole lot more. The survey, and the emailed responses, revealed the vast range of problems our users faced as they got started with our product, and gave us clear paths for improving that experience.

These included adding missing material to the training package, and promoting alternative content formats (many users cited having no speakers, sound card, or headphones as reasons for not doing the training even though we offered other formats on the site).

It also turned up some other gems. For example, we’d assumed that when they had a problem with the product, users would search for help, or call the helpdesk, or do both, or ask a friend—whatever it took to solve that problem and be on their way. But what the survey revealed was that a large portion of new users had questions that remained unanswered for a period of days or weeks.

This sparked our next piece of research, which I’ll talk about in the next article in this series: user interviews.

Survey software to try

SurveyMonkey

SurveyMonkey, of course. Those guys are all over surveys, and they have tonnes of resources to help you create better, fairer surveys that get more responses.

Of course, there are plenty of good alternatives out there — you’re probably already with one, so let’s not go through them.

Instead, let me give you one other idea.

I learned most of what I know about constructing surveys in market research courses as part of my degree. So dig up what you can on survey construction (lots of *ahem* research has been done in this field) and make sure your questions are presented and asked in a way that gets you the information you need to improve your content. I’ve seen plenty of surveys produce useless results because the questions were poorly constructed.

What’s next?

After using click tracking and surveys on our site, we decided that we needed greater insight, and a much deeper look at the contexts in which users consumed our content. We did this using interviews, which we’ll talk about next week.

In the meantime, if you’ve ever used surveys to understand how users use your content, let us know. Share your advice for good user surveys in the comments.

Georgina Laidlaw

Georgina has more than fifteen years' experience writing and editing for web, print and voice. With a background in marketing and a passion for words, the time Georgina spent with companies like Sausage Software and sitepoint.com cemented her lasting interest in the media, persuasion, and communications culture.
