Comparing Web Analytics Packages

Stone Temple Consulting's 2007 Web Analytics Shootout takes a comprehensive look at seven top Web analytics tools.

What started as a great idea for linkbait turned into a comprehensive analysis of Web analytics packages rivaling research from the largest firms.

Stone Temple Consulting has published the 55-page 2007 Web Analytics Shootout, the results of a nine-month study of seven top Web analytics packages on four sites. The report looks at performance, accuracy, and capabilities of Clicktracks, Google Analytics, IndexTools, Omniture SiteCatalyst, Unica Affinium NetInsight, Visual Sciences’ HBX Analytics, and WebTrends.

The report is not intended to identify a “best” package, but rather to identify the strengths and weaknesses of the various applications so that potential buyers can understand how their own site structure might work with or against them, according to Eric Enge, president of Stone Temple Consulting. A major section of the report is a qualitative comparison of the various packages to help webmasters find the right fit for their situation.

“We’ve exposed a lot of things that show the scope of differences between the packages, and to start to show which application might be right for which circumstances,” Enge said. “The report comes from nine months of living with the packages, as well as extensive discussions with vendors.”

The project began with an idea Rand Fishkin wrote about on the SEOmoz blog. Enge answered the call for someone to compare analytics packages, and spent the next nine months immersed in the project. He published an interim report at the eMetrics Summit in May, previewed some of the final findings at an Issues in Analytics panel at Search Engine Strategies San Jose last month, and published the full report last week.

He has also posted several “10 cool things” lists gleaned from his conversations with the analytics vendors themselves. So far, he has lists covering HBX Analytics, NetInsight, Clicktracks, and IndexTools.

Enge started with the assumption that all of the packages would have some inaccuracies, and that their results would vary based on differences both in the packages themselves and in how they are implemented. The goal of the study was to find out how inaccurate they were, what was contributing to the inaccuracy, and how it could be minimized.

As it turns out, there is a great deal of variance between analytics packages measuring traffic on the same site, as much as 20 percent in some cases. Those differences can be caused by the counting method of each package, as well as the impact of many implementation decisions made by the user. For example, the definition of daily unique visitors may include all visitors in the past 24 hours, or only those in a given calendar day.
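
To make that counting difference concrete, here is a minimal Python sketch using made-up visitor data: a visitor who returns a couple of hours after a late-night visit counts as a new daily unique under calendar-day counting, but not under a rolling 24-hour window.

```python
from datetime import datetime, timedelta

# Hypothetical visit log: (visitor_id, timestamp) pairs. The same person
# visiting at 11 PM and again at 1 AM is one "daily unique" under a rolling
# 24-hour window, but two under calendar-day counting.
visits = [
    ("alice", datetime(2007, 9, 3, 23, 0)),
    ("alice", datetime(2007, 9, 4, 1, 0)),
    ("bob",   datetime(2007, 9, 4, 10, 0)),
]

def uniques_by_calendar_day(visits):
    """Count each visitor once per calendar day (one common definition)."""
    return len({(vid, ts.date()) for vid, ts in visits})

def uniques_by_rolling_window(visits, window=timedelta(hours=24)):
    """Count a visitor again only once 24+ hours have passed since their last counted visit."""
    last_counted = {}
    count = 0
    for vid, ts in sorted(visits, key=lambda v: v[1]):
        if vid not in last_counted or ts - last_counted[vid] >= window:
            count += 1
            last_counted[vid] = ts
    return count

print(uniques_by_calendar_day(visits))    # 3 (alice counted on both days)
print(uniques_by_rolling_window(visits))  # 2 (alice's 1 AM visit falls inside her window)
```

Neither definition is wrong; they simply measure different things, which is exactly why two packages on the same site can report different numbers.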

There are also differences in the way the packages handle anomalies, such as when a proxy server strips a referrer, or when a user clicks to a new page before the JavaScript runs, Enge said. The report details several areas where errors and variances might occur, most notably the placement of the JavaScript tracking code.

“If you’re pursuing high-value analytics, you need to understand what your sources of variance are,” he said. One way to help do that is to run multiple packages at the same time, at least initially, to see where the trouble spots may be. For webmasters, the key lesson is to spend a significant amount of time in planning an implementation beforehand, and to look for areas where issues may arise early on, so that they can be addressed, Enge said.
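
As an illustration of that parallel-run approach, the sketch below compares daily visit counts exported from two packages and flags the days where the gap widens. The figures and the 10 percent threshold are hypothetical; the point is to look for days where the variance jumps, not to crown one package correct.

```python
# Hypothetical daily visit counts exported from two packages running on the
# same site. Neither number is "wrong"; a sudden jump in the gap is the
# signal worth investigating (tag placement, filters, counting definitions).
package_a = {"2007-09-01": 1042, "2007-09-02": 998, "2007-09-03": 1210}
package_b = {"2007-09-01": 1015, "2007-09-02": 985, "2007-09-03": 1050}

for day in sorted(package_a):
    a, b = package_a[day], package_b[day]
    variance = abs(a - b) / max(a, b) * 100
    flag = "  <-- investigate" if variance > 10 else ""
    print(f"{day}: A={a} B={b} variance={variance:.1f}%{flag}")
```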

While there is variance between packages, that does not mean the results they deliver are not useful, Enge said. As long as you understand what is being measured and don't compare one package's results directly to another's, any analytics package can deliver useful results.

“As Jim Sterne is fond of saying, if your yardstick measures 39 inches instead of 36 inches, it’s still great to have a measurement tool,” Enge said.

The difficulty comes when someone using a 39-inch yardstick wants to compare results with someone using a 36-inch yardstick, such as when companies are partnering with or acquiring one another, or when an agency or vendor compares its results with a client's own. In those cases, the two need to calibrate their analytics so that they are talking about the same numbers, he said.
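
The yardstick analogy suggests what such a calibration might look like in practice. The sketch below uses hypothetical figures: derive a ratio from a period where both tools measured the same traffic, then apply it when comparing numbers going forward.

```python
# Sketch of calibrating two analytics "yardsticks": derive a ratio from a
# period where both sides measured the same traffic, then apply it when
# comparing future figures. All numbers here are hypothetical.
our_count, their_count = 36_000, 39_000  # visits both tools saw in the same month

calibration = our_count / their_count    # ~0.923: their tool reads ~8% high relative to ours

def in_our_units(their_number):
    """Convert a partner's reported figure onto our tool's scale."""
    return their_number * calibration

print(round(in_our_units(10_000)))  # ~9231 visits on our scale
```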

Enge plans to continue his research into the analytics packages. One planned study will dive deeper into measuring the effects of JavaScript placement on the accuracy of the analytics packages. Preliminary findings in this report show that there could be significant errors, especially on pages with bloated code where the JavaScript is placed at the bottom of the page. Stone Temple Consulting is also working with Jonah Stein of Alchemist Media to study latent conversion tracking, where a conversion happens only after repeat visits but is attributed to just one of those visits.
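
The sketch below illustrates the latent-conversion problem with a hypothetical visit history: last-touch reporting credits the conversion entirely to the final visit, while a simple even-credit alternative (one illustrative approach, not the study's method) keeps the earlier visits in view.

```python
# Hypothetical visit history for one visitor. Last-touch reporting credits
# the conversion entirely to the final visit; the two earlier visits that
# led to it disappear from the report -- the latent-conversion problem.
visit_history = [
    {"date": "2007-08-01", "source": "organic search", "converted": False},
    {"date": "2007-08-09", "source": "email",          "converted": False},
    {"date": "2007-08-15", "source": "direct",         "converted": True},
]

# Last-touch view: 100% of the credit goes to the converting visit's source.
last_touch = visit_history[-1]["source"]

# A simple alternative: spread credit evenly across every visit in the path.
share = 1 / len(visit_history)
even_credit = {v["source"]: round(share, 2) for v in visit_history}

print(last_touch)   # direct
print(even_credit)  # {'organic search': 0.33, 'email': 0.33, 'direct': 0.33}
```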

