The following is a short extract from our new book, Researching UX: User Research, written by James Lang and Emma Howell. It’s the ultimate guide to user research, a key part of effective UX design. SitePoint Premium members get access with their membership, or you can buy a copy in stores worldwide.
Once you’ve decided who you want to include in the research, and defined it in a recruitment brief, you’ll need a way to ensure that you’re actually getting those people. There are two ways to do this.
- Evaluating against the recruitment brief: Once you’ve identified a potential participant, you just size them up against the criteria in the recruitment brief, based on what you know about them or an informal conversation. This is the rough-and-ready method, and is most commonly used in DIY recruitment (see later in this chapter).
- Using a screener: Once a potential participant has been identified, they’re asked a series of questions that evaluate them against the criteria in the recruitment brief, and allocate them to a quota. This is the more robust, credible approach, as used by recruitment agencies.
There are pros and cons to both approaches. Evaluating against the recruitment brief can be inaccurate, risking misleading results and undermining the credibility of your project. On the other hand, a screener takes additional time to create and apply.
There are some workarounds. If you hire a recruitment agency, they will often write the screener on your behalf. If you’re conducting guerrilla research, the screener will be very short. We’ll say more about both of these scenarios later in the chapter.
Creating a Screener
A screener is a set of questions asked of potential participants, based on the sample criteria you defined in your recruitment brief. These questions are designed to figure out how suitable participants are for your project.
You can see an example screener here, showing the key questions you would want to ask to recruit for a project about outdoor gear. It establishes what activities the potential participants do without leading them to give certain answers. The other questions are written to probe more into their experiences and habits when buying outdoor equipment, without being leading. The final question is written to catch out anyone who is trying to trick their way into the research.
Screener questions contain several elements:
- A question number.
- The question itself. This should be phrased as it would be read, even though in practice you should expect a little flexibility in the way a recruiter reads out the text you’ve written.
- Instructions to the recruiter. These won’t be read out, but give guidance on how to ask the question and how to classify the response. Common instructions include ‘Read out all options’, ‘Don’t read out options’, ‘Tick one’, ‘Tick all that apply’.
- A response area: normally multiple choice options, or an empty box for writing in. For each, there should be an instruction on what to do if it’s ticked, e.g. ‘Thank and close’ (i.e. reject the candidate and end the screener), ‘Recruit to quota’, or just ‘Continue’.
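The elements above can be sketched as a small data model. This is a minimal illustration, not anything from the book: the class names, the example question, and the `apply_answer` helper are all invented here to show how each response option maps to a recruiter action.

```python
# A minimal sketch of the screener elements described above.
# All names here (Question, Action, apply_answer) are illustrative.
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    CONTINUE = "Continue"
    THANK_AND_CLOSE = "Thank and close"    # reject the candidate, end the screener
    RECRUIT_TO_QUOTA = "Recruit to quota"

@dataclass
class Question:
    number: int                    # the question number
    text: str                      # phrased as it would be read out
    instructions: str              # recruiter guidance, not read out
    responses: dict[str, Action]   # each response option maps to an action

q1 = Question(
    number=1,
    text="How many times have you been to the cinema in the past 30 days?",
    instructions="Don't read out options. Tick one.",
    responses={
        "None": Action.THANK_AND_CLOSE,
        "1-2": Action.CONTINUE,
        "3 or more": Action.RECRUIT_TO_QUOTA,
    },
)

def apply_answer(question: Question, answer: str) -> Action:
    """Look up what the recruiter should do for a given response."""
    return question.responses[answer]
```

The point of the structure is that every tickable option carries its own instruction, so the recruiter never has to improvise a decision mid-call.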
As you can see, screeners have a certain amount of jargon attached to them, but you shouldn’t worry about this. If you’re doing your own recruitment, the instructions just need to be clear to you; if you’re using a recruiter, they’ll be able to interpret your instructions once you’ve talked them through.
Some tips for creating a good screener:
- Use precise language. For example, rather than asking, “How many times have you been to the cinema recently?”, a better question would be, “How many times have you been to the cinema in the past 30 days?”
- Try to make your questions as factual as possible, so participants aren’t tempted to exaggerate or misrepresent their behaviour. For example, if you want to know how frequently someone exercises, don’t ask them, “How many times do you exercise a week?” People will tend to exaggerate because they don’t want to look bad! A better question would be: “Which of the following have you done in the past week: running, cycling, gym, cinema, reading, etc.”
- Try to make your questions easy to answer. In particular, it can be hard to remember events that took place a long time ago. You can prompt with options, as in the example above, but make sure you disguise the answer you’re interested in among other choices.
- Make sure your questions are essential to selecting the best participants. The more questions you add, the more time consuming it is for you and the people you are screening.
- The ideal order to sequence your questions is to cover screening criteria first, then primary quotas, then secondary quotas, and finally information capture for contact details.
- In most cases, it’s a good idea to conceal the purpose of the research and the organisation you’re working for. The participant will probably guess to some extent, based on the questions you’re asking, but try to keep the details hidden. Otherwise, they may be tempted to do some reading up before attending the research session.
- Finally, try to focus on and categorise people based on behaviour if you can. If not, ask about attitudes. Demographics are the least useful of all.
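The ‘recruit to quota’ mechanic mentioned in the tips above can be sketched as a simple counter. The quota names and sizes below are invented for illustration; in practice they would come from your recruitment brief.

```python
# A sketch of quota tracking during recruitment.
# Quota names and sizes are illustrative, not from the book.
quotas = {"frequent shoppers": 4, "occasional shoppers": 4}  # places remaining

def recruit_to_quota(name: str) -> bool:
    """Recruit a candidate into a quota if places remain.

    Returns False when the quota is full (or unknown), in which
    case the recruiter would 'thank and close' instead.
    """
    if quotas.get(name, 0) > 0:
        quotas[name] -= 1
        return True
    return False
```

Tracking quotas as you go is what lets you stop screening for a segment the moment it fills up, rather than over-recruiting.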
It’s also important that you give potential participants some briefing information about the session during the screening process. For example:
- Telling them they’ll be asked to use a website or app.
- Reminding them to bring their glasses, if they need to.
- Asking them to bring a form of photo identification.
- Confirming the incentive and how it will be paid.
- Making sure they’ve got a contact number in case of any problems finding the venue or last-minute cancellations.
Giving this information ensures that participants are making an informed decision about whether to participate, and there are no nasty surprises on the day.
Look Out for Professional Research Participants
If you are screening people yourself, keep in mind that some people will say whatever they think they need to say to get a place in your research. Incentives can be very alluring! To avoid this, include a question designed to catch out individuals who are not being entirely truthful.
Let’s take our example of a project for an outdoor equipment company. In the brief, we have asked for all of our participants to have visited at least two of the big brands. One of your screener questions could be:
Out of the following UK brands, which have you visited during the past six months?
- Go Outdoors
- Snow + Rock
- Ellis Brigham
- Hiking Gear R Us
Hiking Gear R Us is our invented brand, so you can be sure that anyone who picks this option should not be included in your fieldwork.
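The trick-question check from this example can be expressed as a couple of lines of logic. This is just a sketch of the rule described above; the function name is made up, and the pass condition assumes the brief’s requirement of visits to at least two of the real brands.

```python
# Sketch of the trick-question check from the outdoor-gear example.
# The function name is illustrative; brands are those listed above.
REAL_BRANDS = {"Go Outdoors", "Snow + Rock", "Ellis Brigham"}
FAKE_BRAND = "Hiking Gear R Us"  # the invented brand

def passes_brand_screener(visited: set[str]) -> bool:
    """Reject anyone claiming the invented brand; otherwise require
    visits to at least two real brands, per the recruitment brief."""
    if FAKE_BRAND in visited:
        return False
    return len(visited & REAL_BRANDS) >= 2
```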
Different Kinds of Screeners
Screeners vary in length and complexity. A typical screener for a depth interview or user test is around 10–30 questions long. This allows you to capture one or two primary variables and up to seven secondary variables, plus contact details. Any longer than this, and you’ll find that candidates lose interest.
If you’re conducting guerrilla research, your time is extremely limited. You’ll want to get screening over with as quickly as possible, which means five questions in total is plenty. Incidentally, that’s about as much as you can fit onto a single side of paper on a clipboard, so there are practical reasons for keeping it short, too.
Finally, you may choose to run your screener as an online survey, and have candidates complete it themselves rather than talk them through it. If you take this approach, it’s important to:
- Put even more thought into the phrasing of your questions.
- Keep it short – no more than two or three minutes.
- Pilot test your questionnaire with at least three people before you launch it.
On the plus side, this is a great way to save time, especially if you’re sourcing people online from websites like Gumtree or Facebook.
Emma Howell is a User Experience Consultant at cxpartners. She has been a research specialist for 10 years, beginning her career in academia before moving into UX. Emma loves designing products and services that are intuitive and enjoyable to use.