A Guide to Analyzing Google Analytics for Clever A/B Testing
Planning A/B testing campaigns can be somewhat of a guessing game if you don’t understand your audience well enough, which is why it’s important to do your research first – and the best place to start is Google Analytics.
Google Analytics lets you delve into the nitty gritty of who your visitors are – which countries they are from, their age and gender, the screen resolution and browser they are using on a particular device, how they came to arrive on your site, what language they speak, and even what their interests are.
With all this information available, there is much to learn about your audience that can help you plan A/B testing campaigns that are more effectively targeted at your users.
One of our recent posts, 11 Best Tools for A/B Testing in WordPress, went into detail about the what and why of A/B testing. In today’s post, we’ll explore some of the metrics you can use to your advantage when planning split testing campaigns.
Why Use Google Analytics?
The whole point of split testing is that you don’t know in advance what will work. Your job is to make educated guesses about the most effective content or design, but you can’t go beyond that without data. Split testing allows you to test your assumptions and make decisions based on real data, not just guesswork.
That said, you want to be as efficient as possible in your testing process, which means concentrating on the most promising candidates. Deciding what those candidates are is where Google Analytics can help. The various metrics it provides (technical capabilities, age, interests, user flow, and so on) can be used to narrow down your potential candidates to something manageable.
Keep in mind that your tests need to reach statistical significance to give you meaningful data. Unless you have millions of visitors a day, you can’t run that many split tests. I don’t want to get into the maths here, but ConversionXL has an epic article about split test significance and other A/B testing mistakes. If you plan on doing split tests, make sure to give it a read!
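To make significance a little more concrete, here is a minimal sketch of the standard two-proportion z-test often used to compare conversion rates between two variants. The function name and the sample numbers are illustrative, not from any real campaign:

```javascript
// Standard two-proportion z-test for comparing conversion rates of two
// variants. Inputs: conversions and visitors for variant A and variant B.
function zTestTwoProportions(convA, visitsA, convB, visitsB) {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  // Pooled conversion rate under the null hypothesis (no real difference).
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se; // z-score; |z| > 1.96 is significant at ~95%
}

// Hypothetical example: 120/2400 conversions vs 150/2400 conversions.
const z = zTestTwoProportions(120, 2400, 150, 2400);
console.log(z.toFixed(2)); // ~1.88, i.e. not quite significant at 95%
```

Note how a difference that looks meaningful (5% vs 6.25% conversion) still falls short of the 95% threshold at this traffic level, which is exactly why low-traffic sites can only run a handful of tests.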
Metrics We Can Use
First of all, you can use any Google Analytics metric – every piece of information you have can potentially narrow down your design or content decisions. I’ll go through some that I’ve used to make decisions in the past, but this is by no means an exhaustive list. Depending on your area of focus, you may find other metrics more useful. The thinking behind all this is what matters most; once you master that, nothing can stop you!
Age and Gender
Age and gender are extremely important factors in your design and content. You will need different calls to action to speak to young men than you might use to target middle-aged women.
These are usually the first two metrics I start looking at in Google Analytics. You’ll find the information in Audience > Demographics.
The above figures come from a site I manage, which I won’t name here for privacy reasons. It’s quite obvious that this site leans toward the younger end of the age spectrum. The 25-34-year-old segment is most prominent, but close behind is the younger 18 to 24 age group.
This kind of information would prompt me to advise the site’s designers that a fresher, younger design could be explored, but nothing too radically different from what people are used to nowadays.
The data likewise has an impact on copywriting. On this kind of website, we could get away with “Jump Right In” as a CTA button to encourage users to get started. If we were targeting a much older age range I would probably not deviate from the very clear “Register Now” text.
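If you do test CTA copy like this, you need a way to split visitors consistently between the variants. Here is a minimal sketch of deterministic variant assignment by hashing a visitor ID, so a returning visitor always sees the same button. All names are hypothetical:

```javascript
// Deterministically assign a visitor to a CTA variant by hashing their ID.
// The same visitor ID always maps to the same variant between page loads.
const CTA_VARIANTS = ['Jump Right In', 'Register Now'];

function hashString(s) {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // unsigned 32-bit rolling hash
  }
  return h;
}

function ctaForVisitor(visitorId) {
  return CTA_VARIANTS[hashString(visitorId) % CTA_VARIANTS.length];
}
```

In practice you would read the visitor ID from a first-party cookie and record the assigned variant as a custom dimension, so conversions can be segmented by variant later.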
Location and Language
Location and language are also decisive factors when making changes to a website. Most sites attract one primary language group, but you may encounter sites with two or more significant ones – Canada, with its two official languages, is a great example. The language and location data can be found under Audience > Geo and will look something like this.
Four of the top six countries are predominantly English-speaking, but 7.65% of visitors come from India. Perhaps by analyzing cultural differences you could come up with featured images that are more likely to grab the attention of our Indian visitors.
Behavior
The Audience > Behavior section offers information about sessions. You can see new versus returning visitors, sessions per visitor, and engagement.
Let’s look at the frequency view, which highlights a key point we can use in our split testing efforts.
This view shows that out of 240,657 total sessions, 171,886 were single sessions. To understand this, we need to know what a session is. Google Analytics has us covered with its definition: in a nutshell, a session is a group of interactions a user performs on your site within a given timeframe, and by default a session ends after 30 minutes of inactivity.
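The 30-minute inactivity rule is easy to see in code. This sketch groups a visitor’s hit timestamps into sessions the same way (the function is illustrative, not part of any Google Analytics API):

```javascript
// Group a visitor's hit timestamps (in milliseconds) into sessions using the
// Google Analytics default rule: a session ends after 30 minutes of inactivity.
const SESSION_TIMEOUT_MS = 30 * 60 * 1000;

function countSessions(timestamps) {
  if (timestamps.length === 0) return 0;
  const sorted = [...timestamps].sort((a, b) => a - b);
  let sessions = 1;
  for (let i = 1; i < sorted.length; i++) {
    // A gap longer than the timeout starts a new session.
    if (sorted[i] - sorted[i - 1] > SESSION_TIMEOUT_MS) sessions++;
  }
  return sessions;
}
```

For example, hits at 0 min, 10 min, and 2 hours produce two sessions: the first two hits fall within one 30-minute window, while the third starts a new session.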
So our visitors tend not to use the site more than once a month (the data I was looking at covered one month). This prompts me to explore ways to lure visitors back to the website. That could be as simple as A/B testing posting frequency, or as elaborate as split testing different designs and calls to action on our newsletter registration form.
Technology
The most important metric I use when planning A/B tests in the technology section of Google Analytics is screen resolution. You can find this information under Audience > Technology > Browser & OS. Above the data table, you’ll find a primary dimension selector you can use to switch to the screen resolution view or any other metric.
One conclusion I can draw from the data above is that we need to split test better visuals on small mobile devices. Devices with widths in the 300-400 pixel range account for 13% of the traffic yet spend 25 seconds less browsing on average. You could say this is an inevitable property of small devices, but I can point to something that suggests otherwise: visitors on 360x640px devices spend 1:06 minutes viewing content on average, while those on 375x667px devices spend just 41 seconds. This may be a fluke or it may be a property of the devices in question. More analysis is definitely needed here, but these are the kinds of things that can point you in the right (or wrong) direction for testing.
Either way, split testing some variations of presenting content on mobile devices could go a long way toward increasing this site’s revenue, by either attracting more mobile views or making the existing mobile viewers happier. If you drill down and take a look, the bounce rate for this resolution is 99.98%, something we can definitely improve on.
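A test like this could start with something as simple as routing small-screen visitors into a condensed content variant. The threshold and variant names below are assumptions for illustration, not values taken from the data above:

```javascript
// Sketch: pick a content variant for split testing based on screen width.
// The 400px cutoff roughly covers the 300-400px devices discussed above.
function contentVariant(screenWidth) {
  if (screenWidth <= 400) {
    // Small phones (e.g. 360x640, 375x667) get the condensed test variant.
    return 'mobile-condensed';
  }
  return 'default';
}
```

In a browser you would call this with `window.screen.width` (or a media query) and report the chosen variant back to your testing tool so engagement and bounce rate can be compared per variant.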
Acquisition
Acquisition shows you where your visitors come from. The categories are organic search, direct, referral, email, social, and other. No matter what your specific data looks like, you can make pretty good split testing decisions based on it.
Here’s what our example website looks like:
Over 86% of this website’s traffic comes from organic searches. We could leverage this by building in a feature that detects where visitors came from and offers related content, so the user continues browsing our site instead of returning to Google.
If you have considerable traffic from referrals you could display a friendly greeting like: “Hi WPMU DEV reader, if you like our content subscribe to our mailing list.”
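Both ideas boil down to inspecting the referrer. Here is a minimal sketch that classifies a referrer URL and returns a greeting to display; the hostnames are illustrative, and note that Google encrypts organic search queries, so you can usually detect the search engine but not the keywords themselves:

```javascript
// Sketch: classify a referrer URL (e.g. document.referrer in the browser)
// and return a tailored greeting, or null if no greeting applies.
function greetingForReferrer(referrer) {
  let host = '';
  try {
    host = new URL(referrer).hostname;
  } catch (e) {
    return null; // direct visit, empty, or malformed referrer
  }
  if (host.endsWith('google.com') || host.endsWith('bing.com')) {
    return 'Found us through search? Here are our most popular guides.';
  }
  if (host.includes('wpmudev')) {
    return 'Hi WPMU DEV reader, if you like our content subscribe to our mailing list.';
  }
  return null;
}
```

On the page itself you would feed this `document.referrer` on load and, if it returns a string, inject the greeting into a banner element – another variant worth split testing against no greeting at all.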
There is no such thing as useless data – you just have to figure out how it pertains to your website and how you can use it to get closer to your goals. While Google Analytics might not tell you exactly what needs to be done to improve your website, it offers solid statistics to help you decide which direction to take.
The next step in the process is split testing itself, which can tell you exactly which variations along those directions will be the most useful to you.
Don’t forget to take a look at our 11 Best Tools for A/B Split Testing in WordPress article, which I mentioned earlier if you would like to read more about A/B split testing.
Have you used Google Analytics to help plan your A/B split testing campaigns? What kind of split testing do you do on your site? Let us know in the comments below.