5 Pitfalls to Avoid When Reading Analytics


Recently we’ve been exploring the super-fun world of UX analytics (yes, it is fun!). First, we looked at what UX analytics actually is and why it matters. We then looked at 5 common myths surrounding data-driven design. So if you’re looking to get clued up on UX analytics, I recommend you read those articles first. But now it’s time to prepare ourselves for collecting our first set of data.

Learning about something and actually being mentally ready to do it are two very different things, and since analytics don’t always state the objective truth, we need to have the right tools and mindset if we’re to unravel the mysteries of our users. This article covers everything you need to know.

You can check out all the articles in this UX Analytics series here.

1. Don’t Invest in a Singular Idea

Given a number of ideas, we tend to lean towards the ones that are our own. This is known as the IKEA effect.

IKEA is known for selling furniture that you then assemble yourself. Given a ready-made item of furniture, and the exact same item of furniture that you assembled yourself, you’ll naturally see more value in the latter because of the time you invested in it. To beat this cognitive bias, you’ll need to let go of your ego and accept when the data speaks for itself.

We’re also more inclined to favor ideas we find visually appealing, simply because we become subconsciously invested in beautiful things. This is a problem, because ideas that look amazing on the surface aren’t necessarily intuitive, and the sad truth is that a fantastic user experience doesn’t have to correlate with a stunning visual aesthetic (e.g. Amazon).

A/B testing can determine which idea works better.
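
If you want a feel for what “determining which idea works better” looks like in practice, here’s a minimal sketch of a two-proportion z-test in Python. The conversion counts are invented purely for illustration, and in practice you’d lean on your A/B testing tool’s own significance reporting:

```python
from math import sqrt, erf

def ab_test(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a = conv_a / total_a
    p_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis of "no difference"
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Invented numbers: variant A converts 120/2400, variant B 156/2400
p_a, p_b, z, p = ab_test(120, 2400, 156, 2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference between variants is unlikely to be random noise — exactly the kind of evidence that makes it easier to let go of a pet idea.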

2. Prepare to Question Everything

Analytics are plagued with cognitive biases, simply because of who we are as human beings. These are flaws in our thinking, but flaws we can nonetheless overcome simply by being consciously aware of them. Let’s start with the belief bias: a tendency to accept conclusions because they seem plausible, rather than because the evidence supports them. Belief bias is extremely common.

Let’s take a ghost button as an example (that’s a button with a border, but no background or sense of depth/shadow). Numerous A/B tests have indicated that they convert poorly, most likely because ghost buttons go relatively unnoticed.

A badly converting CTA (call to action) on your website could be blamed on a ghost button, when in fact the culprit could be the placement of the button (or something else entirely). It could also be both. Don’t rush to make assumptions just because your logic seems believable. A/B test with and without the ghost button, and choose the version that converts best. After that, experiment with the positioning (for example, in the menu bar vs centralized in the header), because there’s rarely ever a single fix when it comes to conversions. Multivariate testing can help you test multiple variations at once; see the sketch below.
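
To sketch what multivariate testing might look like at the analysis stage, here’s a chi-square test across four hypothetical CTA variations (button style × placement) using scipy. The counts are invented, and real testing tools handle this math for you:

```python
from scipy.stats import chi2_contingency

# Invented results for four CTA variations (style x placement)
variants  = ["solid/header", "solid/menu", "ghost/header", "ghost/menu"]
converted = [130, 112, 95, 88]
visitors  = [2000, 2000, 2000, 2000]

# Contingency table: conversions vs non-conversions per variation
observed = [converted, [v - c for v, c in zip(visitors, converted)]]
chi2, p_value, dof, _ = chi2_contingency(observed)

for name, c, v in zip(variants, converted, visitors):
    print(f"{name:>13}: {c / v:.1%} conversion")
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```

A significant result only tells you that at least one variation differs from the others; you’d still follow up with pairwise comparisons before declaring a winner.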

If you’re keen to read more about A/B testing ghost buttons, here’s a fantastic example by our very own analytics expert, Luke Hay.

3. Observe the Bigger Picture

Our brains can reach different conclusions depending on how information is framed. To dive straight into an analytics example: you’ll be more likely to take a risk to improve bad metrics than to improve average or above-average ones, simply because we perceive a loss as more significant than an equivalent gain. We call this the framing effect, but in more common terms you could simply call it panic.

As humans, we make bad decisions when we become anxious; we’re motivated by fear. Politicians (bad ones, anyway) often exploit this cognitive flaw to win elections: their campaigns use fear to draw focus to supposedly negative, often badly framed metrics, hoping that this will drive us to vote for the ready-made solution they’re offering. In reality, we should keep our cool and use data to drive design decisions, rather than making snap decisions because of surprising or unexpected metrics.

4. Treat Analytics as a Team Effort

So many things affect conversions: performance (site speed, etc.), which developers are responsible for; sales gimmicks, which marketers are responsible for; and user experience, which of course you, the designer, are responsible for. While complex designs can affect site speed, and marketing requirements can affect UX, the right balance will result in all departments meeting the same goal of boosting conversions.

Working together towards a common goal will help to reduce any cognitive biases, since more eyes on the data will allow for collective insight and thus more objective conclusions.

5. Define Your Goals

Most teams go wrong with analytics by not having a clear vision of what they want to achieve. Without it, you can’t track the right metrics, and you might find yourself evaluating your UX based on the wrong ones. Teams that don’t define goals beforehand often stray towards vanity metrics such as number of visitors and bounce rate, which out of context can be very misleading.
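
As a toy illustration of why visitor counts alone mislead, here’s a hypothetical sketch (the session data is entirely made up) comparing traffic sources by volume versus by an actual goal, such as conversion:

```python
from collections import Counter

# Invented session records: (traffic_source, did_the_visit_convert)
sessions = ([("social", False)] * 5000 + [("social", True)] * 50
            + [("search", False)] * 900 + [("search", True)] * 100)

visits = Counter(src for src, _ in sessions)
conversions = Counter(src for src, converted in sessions if converted)

for src in visits:
    print(f"{src}: {visits[src]} visits, "
          f"{conversions[src] / visits[src]:.1%} conversion")

# social: 5050 visits, 1.0% conversion   <- wins on the vanity metric
# search: 1000 visits, 10.0% conversion  <- wins on the goal that matters
```

Judged by visitors alone, social looks like the channel to double down on; judged by a defined goal, it’s the weakest.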

Wrap Up

Reading analytics is a three-way battle between what you want to see, what it seems like, and what it actually is. Being clear in your mind about your business and UX aims will allow you to focus on the metrics that really matter, and see the data for what it really is. As we’ve discussed here, some metrics out of context (and subjected to our own biases) can lead us to make inaccurate assumptions about our users.

Daniel Schwarz

Previously, design blog editor at Toptal and SitePoint. Now Daniel advocates for better UX design alongside industry leaders such as Adobe, InVision, Marvel, Wix, Net Magazine, LogRocket, CSS-Tricks, and more.
