This problem is not new to us at Xebidy, but a recent article in the Great New Zealand Travel Directory newsletter highlighted it, so I thought it high time we commented. The newsletter discussed the recent upgrade by many hosting companies from the analytics package Urchin 4 to Urchin 5 – which, apparently due to a change in measurement methods, is now displaying less site traffic than before for the same sites.
The article recommends Google Analytics, and we could not agree more. In fact, I would go so far as to say that the free server-side packages most hosting companies provide are bad news for the unwary. I have a prime case in point. One of our clients was using a package called Deep Matrix and was under the impression they were enjoying visitor numbers of 25–30,000 per month. In fact, they went as far as to commit to a large investment in redesigning their site on the strength of that belief.
The new site was launched and everyone was happy. That is, until the first month's stats were reviewed by the marketing department. Lo and behold, visitor numbers in the newly installed Google Analytics package were 50% of what they had previously been under the Deep Matrix server-log analysis. A quick review of Deep Matrix revealed the root of the problem – "This report includes visits from automated users such as search engine robots" – and Google Analytics doesn't. There is no easy way to explain that to the board!
The newsletter in fact says:
"if you load Google analytics to your site you will certainly see a drop in traffic compared to your other stats packages. However, it is widely assumed (but probably not proven) that Google analytics is not subject to the same stats server issues that hosting companies seem to have. It may therefore be a more reliable and independent method of measuring monthly trends".
I think what is being referred to here is server uptime and, most importantly, the way Google Analytics picks up incoming traffic and recognises it as either a search engine or some other automated bot (such as an email harvester). Google Analytics sits on the website, or user, side and registers traffic as it hits your site, via some code inserted into your pages by your developer. Server stats packages, on the other hand, take the raw data from your server log and interpret it into something meaningful. The difference lies in how these packages recognise and categorise this traffic – or fail to, as the case may be (Deep Matrix simply treats all traffic the same, whether it comes from a visitor, a search engine bot or some other automated service).
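To make the difference concrete, here is a minimal sketch of why a log-based counter and a filtered one disagree. The log lines and the bot-signature list are illustrative assumptions, not Deep Matrix's or Google's actual data or logic: a naive analyser counts every log entry as a "visit", while even a crude user-agent filter strips out the obvious robots.

```python
import re

# Hypothetical sample of raw server-log entries (Apache combined-log style).
# Real logs come from the hosting server; these four lines are made up.
LOG_LINES = [
    '1.2.3.4 - - [10/Oct/2007:13:55:36 +1200] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (Windows; U)"',
    '66.249.66.1 - - [10/Oct/2007:13:56:01 +1200] "GET / HTTP/1.1" 200 2326 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '1.2.3.5 - - [10/Oct/2007:13:57:12 +1200] "GET /about HTTP/1.1" 200 1024 "-" "Mozilla/4.0 (compatible; MSIE 6.0)"',
    '72.30.0.1 - - [10/Oct/2007:13:58:40 +1200] "GET / HTTP/1.1" 200 2326 "-" "Yahoo! Slurp"',
]

# A few well-known crawler signatures; a production list would be far longer.
BOT_PATTERN = re.compile(r"googlebot|slurp|msnbot|spider|crawler", re.IGNORECASE)

def user_agent(line: str) -> str:
    """Return the final quoted field (the user agent) of a combined-format log line."""
    return line.rsplit('"', 2)[-2]

def count_visits(lines):
    """Count raw hits the way a naive log analyser would, and again
    with obvious robots filtered out by their user-agent string."""
    raw = len(lines)
    human = sum(1 for line in lines if not BOT_PATTERN.search(user_agent(line)))
    return raw, human

raw, human = count_visits(LOG_LINES)
print(raw, human)  # here the naive count is exactly double the filtered one
```

With this toy sample, the naive count is 4 "visits" but only 2 are human – the same kind of gap, in miniature, that our client saw between Deep Matrix and Google Analytics.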
So how much of this non-qualified traffic is there? Lots! When we looked closely at the client I referred to earlier, they had experienced over 20,000 visits in one month from search engines and the like. Add this to the nearly 10,000 visitors reported by Google Analytics and they were back on track (albeit very disappointed) with their stats.
Is there a lesson here? Are stats packages useful at all, then, especially if they are not accurate? Absolutely! Analytics packages are essential for trend analysis. You should be continually analysing your site: what happens if I change this content or this heading, what happens after this round of optimisation? Google Analytics, moreover, allows you to set goals and track your users from their source through to your desired outcome – such as making a purchase. But be aware of the limitations of your stats package, get professional advice, and set up a programme of monitoring and reporting that is meaningful to your business. And finally, be aware that poorly analysed stats could lead to some very bad news in the future!