Your team feels hopelessly underwater trying to keep up with all of the content changes to your website… people are working long hours on short deadlines… and an increased budget is nowhere to be found. Despite this, a senior manager has decided that the “Corporate Responsibility” section of the website deserves a high-priority rewrite requiring content editors, developers, release management, marketing, and several other teams. Is this the wisest use of these scarce resources? Does this decision target the point of highest impact for visitors? Or is this project being pushed for internal, political reasons?
Obviously, this is the entire purpose of web analytics. We have all been presented with this type of situation, logged into our favourite web analytics package, and pulled numbers that show the relative importance of the content in question. Web analytics has grown largely because it provides an objective view of site utilization in exactly these situations. “Pet projects” will always come along, but these days it’s a lot easier to counter them with concrete evidence.
Believe it or not, web analytics isn’t enough. The art of interpreting web analytics numbers can still lead to subjective decisions… or to the wonderfully oxymoronic phrase “anecdotal analytics”. This shouldn’t be possible, should it? Yet how many analysts have looked at “average session length” and “average page views per visit” and prognosticated about the level of engagement with site content? Sometimes that interpretation is even true. But what if “average searches per visit” accounts for 50% of those page views per visit? Suddenly the visitors look less engaged and more lost, and the original interpretation is anything but “complete and objective.”
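To make the trap concrete, here is a minimal sketch of the two competing metrics. The visit records and field names are invented for illustration; they don’t come from any particular analytics package:

```python
# Hypothetical visit records: total page views per visit, and how many
# of those views were internal search-result pages. Illustrative data only.
visits = [
    {"page_views": 12, "searches": 7},
    {"page_views": 8, "searches": 1},
    {"page_views": 15, "searches": 9},
]

avg_page_views = sum(v["page_views"] for v in visits) / len(visits)
avg_searches = sum(v["searches"] for v in visits) / len(visits)
search_share = avg_searches / avg_page_views  # fraction of views that are searches

print(f"avg page views per visit: {avg_page_views:.2f}")
print(f"avg searches per visit:   {avg_searches:.2f}")
print(f"search share of page views: {search_share:.0%}")
```

A healthy-looking “average page views per visit” hides the fact that, in this made-up data, nearly half the views are search pages — visitors hunting for content rather than consuming it.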
How do we resolve this and avoid releasing invalid interpretations or taking unnecessary actions? We never expound on an interpretation we can’t prove. Enter the very “old-fashioned” concept of the scientific method. “Scientific” here means that the interpretation must be based on empirical, measurable observations. This approach has been around for, oh, 400 years or so. Why not use something with such a respected, long-standing track record?
Here’s a simplified approach to scientific method:
- Define the question or inquiry and collect data related to the question
- Form a hypothesis about how to interpret the collected data
- Construct an experiment that can test your interpretation and execute the test
- Collect and analyze the test results, then draw conclusions that can form a new or corollary hypothesis
So let’s take another look at average session length and its relation to average page views per visit. Let’s hypothesize that the relationship between these two numbers is driven by excessive use of search. Then let’s construct a test: take the ten most frequently used search results and put them on the home page. This doesn’t change the content at all… it simply reorganizes existing content and changes the user experience.
If the search volume per visit decreases substantially, we’re on much safer ground saying that content findability, not engagement, was really driving content consumption. And if the search volume per visit doesn’t change? It’s harder to draw a conclusion in this case, but we can always repeat the process, define a new hypothesis and test, and eventually learn the true relationship between these numbers as they pertain specifically to our website.
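“Decreases substantially” deserves a number rather than a gut feeling. One common way to check whether a before/after drop in search rate is real is a two-proportion z-test; the sketch below uses only the standard library, and the before/after counts are invented for illustration:

```python
import math

def two_proportion_z(searches_a, views_a, searches_b, views_b):
    """z-statistic for the difference between two search rates
    (searches as a proportion of total page views)."""
    p_a = searches_a / views_a
    p_b = searches_b / views_b
    p_pool = (searches_a + searches_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Illustrative counts, not real data: searches out of total page views,
# before vs. after putting the top search results on the home page.
z = two_proportion_z(5_000, 10_000, 3_600, 10_000)
print(f"z = {z:.2f}")
# |z| above roughly 1.96 means the drop is very unlikely to be noise
# at the conventional 95% confidence level.
```

With a z-score this far from zero, we could claim the findability hypothesis survived its first test; a z near zero would send us back to step two of the method.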
So how many times do we have to NOT learn the right answer? It’s hard to say, but a fable about Thomas Edison comes to mind:
“A young reporter boldly asked Mr. Edison if he felt like a failure and if he thought he should just give up inventing the light bulb. Perplexed, Edison replied, ‘Young man, why would I feel like a failure? And why would I ever give up? I now know definitively over 9,000 ways that an electric light bulb will not work.’”
Learning what is NOT at the root of your web analytics numbers is just as valuable as, if not MORE valuable than, luckily guessing the right answer, or borrowing an answer that someone else in the industry found to be true about their website.