I love it when someone takes the time to research and answer a question that many people have. In this case, Stone Temple decided to put several web analytics providers to the test by installing multiple solutions on a few sites to see the differences. Their results can be seen here. Be sure to read the whole report because some of the differences were due to implementation mistakes.
Here are what I found to be the key findings, which oddly enough reflect the tenets that companies like New Data follow:
- Be prepared for different numbers whenever switching analytics packages. None seem to count the data in the exact same way.
- The 3rd-party cookie deletion rate exceeds the 1st-party cookie deletion rate by about 13%. More proof that you shouldn’t rely on 3rd-party cookies.
- WebTrends, ClickTracks and Google Analytics may overcount uniques, while WebSideStory (HBX) and Unica may undercount them.
- ClickTracks may severely undercount page grouping data.
Potential flaws with the study:
- Just four sites were used, pre-screened for sites with large enough paid search spend.
- Of the four sites, none are high-traffic sites. I could only find two of them in ComScore, and the site with the most traffic only sees about 200k U.S. visitors a month. I’d love to see the same study on sites with more visitors, which would make the data much more reliable.
- In an effort to “protect” the participating sites from sharing their real traffic volume, the time period for the daily uniques comparison was not disclosed. On top of that, each analytics package probably has different rules on what constitutes a daily unique. For example, some may cut off a “visit” at midnight and count the activity that carries into the next day as another unique “visit,” while others may not. Another example is that packages may expire visits after different periods of inactivity (30 minutes, etc.). I would have liked to see a weekly or monthly uniques count comparison instead.
When I first heard of this study, I was excited that we might finally learn a lot about the different providers and which is the best solution, but I was a bit disappointed when the results were released. It sounds like we may learn more when the final results come out, though those may be more along the lines of implementation findings. I hope it inspires more people to run more tests.