Posted by KeithMacDonald
August 2, 2012
SiteCatalyst version 15 was officially released last summer and Adobe is in the process of migrating clients to the new platform. For those clients who are now on version 15, April’s version 15.3 update introduces a couple of enhancements and some restored functionality.
Spider and bot filtering rules are now managed directly in the Admin Console. In previous SiteCatalyst versions, this type of traffic filtering either didn’t happen at all or was handled by a custom data processing rule (VISTA). The benefit of client-managed rules is greater transparency into what traffic is filtered out.
Alongside the filtering rules are two new reports: Bots and Bot Pages. These reports break traffic out by individual bot, which provides insight into, for example, how often search engine indexing occurs.
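To illustrate the general idea behind these filtering rules, here is a minimal sketch of user-agent-based bot detection. This is not Adobe's implementation; the signature list and function names are illustrative assumptions only.

```python
# Conceptual sketch of bot filtering by user-agent signature,
# similar in spirit to the rules managed in the Admin Console.
# The signature list below is illustrative, not Adobe's actual list.
BOT_SIGNATURES = ["googlebot", "bingbot", "slurp", "baiduspider"]

def is_bot(user_agent):
    """Return True if the user-agent string matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def split_traffic(hits):
    """Partition hits into (human, bot) lists based on each hit's user-agent."""
    human, bots = [], []
    for hit in hits:
        (bots if is_bot(hit["user_agent"]) else human).append(hit)
    return human, bots
```

In practice, production-grade filtering also matches on IP ranges and maintained bot lists rather than a handful of substrings, which is exactly why having the rules visible and client-managed matters.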
Clients should be aware of this new feature: if you don’t already have bot filtering in place, enabling it is best practice.
Our rough estimates put bot and spider traffic at 5–10% of total traffic for most clients; however, it may be much higher for site types that are frequently indexed.
Unilytics consultants would be happy to assist you with implementation and impact analysis.
“Unique values” refers to the number of individual values that can be captured for a single variable (custom or pre-named) in SiteCatalyst. (In technical terms, this is the database limit for SiteCatalyst variables. In Webtrends terms, this is analogous to table limits.)
For any SiteCatalyst variable, Adobe used to impose a limit of 500,000 unique values per month, with anything beyond that grouped together in a single value called “Uniques Exceeded”. Values were captured on a first-come-first-served basis: any popular values captured late in the month were buried in Uniques Exceeded.
This aggregation meant you couldn’t report on late-captured individual values in SiteCatalyst (although you could in DataWarehouse). The real problem, though, was that because of this aggregation, Uniques Exceeded was often one of the top 10 values for the data point, which is not helpful.
Adobe has now created an algorithm that reports late-captured popular values individually, rather than burying them in Uniques Exceeded.
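The old first-come-first-served behaviour can be sketched as follows. This is a conceptual model only, with a tiny limit for illustration (the real monthly limit was 500,000), not Adobe's actual processing code.

```python
from collections import Counter

UNIQUES_EXCEEDED = "Uniques Exceeded"

def aggregate_fcfs(values, limit=3):
    """First-come-first-served bucketing: once `limit` distinct values
    have been seen in the period, every subsequent new value is rolled
    into the single 'Uniques Exceeded' line item.
    (Illustrative limit; SiteCatalyst's was 500,000 per month.)"""
    counts = Counter()
    seen = set()
    for v in values:
        if v in seen or len(seen) < limit:
            seen.add(v)
            counts[v] += 1
        else:
            counts[UNIQUES_EXCEEDED] += 1
    return counts
```

Under this scheme a value that only becomes popular late in the month is invisible in reporting, however large its count; the new algorithm addresses exactly that case.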
For most clients this is a non-issue, since “Uniques Exceeded” is usually of no impact to the business. (This normally impacts only seriously long-tail analysis, which is generally more effort to sort out than will return in revenue lift or cost savings.)
Clients who are affected should be aware that this change may impact reporting (new entries may appear in top 10 lists), enable new analysis (unique values that would otherwise have been buried), or disrupt ongoing analysis (values that were previously available may now be swept into Uniques Exceeded).
Traditionally SiteCatalyst variables were case-sensitive, meaning “home” is reported separately from “Home” and “HOME”. This is a huge problem for a lot of clients since it’s somewhat difficult to regulate the case of values passed into SiteCatalyst.
Adobe is now aggregating all values where the only difference is case: home, Home and HOME will all be reported as home. This applies only to new report suites and will be made an option for existing report suites at some point this year. This also applies only to traffic props – eVars already behave this way.
This change is targeted at smoothing out long-standing small annoyances with the product. Clients should be aware that month-over-month values may jump unexpectedly. Implementation teams should also note that it is no longer necessary to force values to lowercase before sending them.
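The effect of the new behaviour on a prop report can be sketched in a few lines. This is a conceptual illustration of case-insensitive aggregation, not Adobe's processing code.

```python
from collections import Counter

def case_insensitive_report(values):
    """Aggregate values ignoring case, as the 15.3 change does for
    traffic props: 'home', 'Home' and 'HOME' all roll up into a
    single 'home' line item instead of three separate ones."""
    return Counter(v.lower() for v in values)
```

Previously, the same report would have shown three separate rows whose counts had to be summed by hand, which is the annoyance this update removes.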
Version 15 was launched with some popular (and useful) reports missing. These have now been reinstated and include: Pathfinder, Full Paths, Path Length, Original Entry Page, Days Before First Purchase, All Search Page Ranking and Pages Not Found.
Clients who have completed migration to version 15 should now be able to do most, if not all, of their analysis and reporting in version 15 – no need to log back into version 14!
This is another change targeted at smoothing out small annoyances. Clients should spend some time validating data in the newly-restored reports to ensure the migration process has gone smoothly and there is no major data fluctuation caused by the platform.