Tagged: webtrends 

Peder 3:08 pm on Aug 16, 2012 Permalink | Reply
Tags: Omniture, web analytics tools, Webtrends

Since the London games just finished, you’ll have to excuse the sporting references. Who’s the fastest man in the world? Well, it depends. Over 100 and 200 meters it’s obviously Usain Bolt. But lengthen the track or throw in some hurdles and you’ll have a different winner. The same applies to web analytics.

Not all products made it to the Web Olympics final. In our “competition”, we opted to look at Adobe SiteCatalyst, Google Analytics and Webtrends (jump to comparison matrix below). Combined, they enjoy by far the largest market share, and Unilytics is certified in all three, so we are very familiar with them. Each of the three products has its own strengths and weaknesses. We have tried to make this analysis as unbiased and vendor-neutral as possible. In all cases, Unilytics consulting recommends whichever solution best fits our clients’ needs.


Eric 12:08 pm on Aug 29, 2011 Permalink | Reply
Tags: Visitor History Export, Webtrends   

Webtrends Analytics provides a feature known as Visitor History Export (VHE). This is an extremely useful feature that many organizations take advantage of. In essence, it provides per-visitor information that falls into the following categories:

  • Campaign History
  • Search Engine History
  • Visit History
  • Purchase History
  • Custom Visitor Segmentation
  • Content Group Unique Visitor Tracking
  • Page of Interest Unique Visitor Tracking

There are many ways to use this information, especially when viewed over time. For example, Unilytics provides a product named VHE Distiller that extracts and transforms this information into a regular relational database suitable for data mining or enterprise reporting. It should be noted that VHE is not a data warehouse per se, nor is it a complete replacement for a full data warehousing product based on visitor information. There are limitations built into the design of the VHE feature that should be kept in mind when using it. This post presents some of these “gotchas” when using the VHE feature and the resulting export files.
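To make the extract-and-transform idea concrete, here is a minimal sketch of splitting a denormalized per-visitor export row into rows for separate relational tables. The record layout, delimiters and field names below are hypothetical illustrations, not the actual VHE export format or VHE Distiller’s implementation.

```javascript
// Sketch only: the pipe/semicolon layout is invented for illustration;
// the real VHE export format differs. The point is the transform: one
// wide per-visitor row becomes rows for normalized relational tables.
function transformVisitorRecord(record) {
  const [visitorId, firstVisit, lastVisit, campaigns, purchases] =
    record.split('|');
  return {
    // one row for a visitor dimension table
    visitor: { visitorId, firstVisit, lastVisit },
    // one row per campaign touch, for a campaign_history table
    campaignHistory: campaigns
      ? campaigns.split(';').map(c => ({ visitorId, campaign: c }))
      : [],
    // one row per purchase, for a purchase_history table
    purchaseHistory: purchases
      ? purchases.split(';').map(p => ({ visitorId, orderId: p }))
      : []
  };
}

const row = 'v123|2011-01-05|2011-08-20|spring_promo;summer_sale|ORD-9';
const out = transformVisitorRecord(row);
console.log(out.campaignHistory.length); // 2 campaign touches
```

Once rows are shaped like this, loading them into relational tables makes time-series questions (“how many touches before this visitor’s first purchase?”) straightforward SQL.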

Peder 2:06 pm on Jun 20, 2011 Permalink | Reply
Tags: hosted service, privacy, Webtrends   

On May 26, 2011, minutes before a midnight deadline, President Barack Obama extended the USA PATRIOT Act (Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act) of 2001, which gives authorities post-9/11 powers to search records and conduct roving wiretaps in pursuit of terrorists.

Should we care?

In an effort to protect the American people, the Act allows for unprecedented access to personal information stored on computer systems and in other formats. Perhaps the most controversial section of the Act is Section 215, commonly referred to as the “library records” provision because of the wide range of personal material that can be investigated. It allows the US government to secretly request and obtain records for large numbers of individuals without any reason to believe they are involved in illegal activity. Items that can be searched include “books, records, papers, documents, and other items,” which includes dumps from private-sector computer databases.

This is likely a concern to many organizations wanting to protect the privacy of their customers and users. For companies using web analytics tools, the collection and storage of that data in the US gives government authorities unfettered access to it. Furthermore, the US government can access considerably more information than any web analytics tool is capable of collecting. For example, it can cross-reference the IP address assignment records for any given IP address at any given time, yielding a name, address, phone number, account numbers, and so on. Not even the best web analytics software has access to this depth of information. There may not be much US-based organizations can do to prevent such access, but those located in Canada and abroad can choose not to use services that store collected information in the US.

Web analytics tools such as Adobe SiteCatalyst, Google Analytics and Webtrends On Demand all store visitor information on US servers. As an alternative, Webtrends offers on-premise software to ensure collection occurs within the organization and within its own country’s borders. Since hosted services free staff from implementing hardware to support installed software, they continue to be popular options for many. For clients who don’t want their data stored in the US but also don’t want to deal with software installations, Unilytics offers managed services, whereby we install and manage the Webtrends implementation. This is a very attractive solution for organizations concerned about the location of collected data but also drawn to a hosted model.

The number of court approvals for business record access jumped from 21 in 2009 to 96 in 2010. While that remains a relatively low number, it poses an uncertainty that many organizations are unwilling to accept. It’s that uncertainty that causes the issue.

It may be that the greatest risk to organizations allowing data to be stored in the US is to their own reputation. The threat to privacy may be more perceived than actual, but that may be reason enough for organizations to not want their data stored in the US.

Web analytics should begin with a plan. Unfortunately, most don’t have one. And the plan should start by answering the most basic of questions: why do we have a web site?

All web sites can be categorized as having one of two objectives: either to make money or to save money. Sites that make money are not only those that present shopping carts. Any site that promotes products or services has, as its underlying purpose, increasing brand awareness and encouraging purchases on or offline.

Sites which attempt to make money include:

  • Lead generation
  • Brand awareness
  • eCommerce
  • Social networking
  • Entertainment

Conversely, sites that attempt to save money are those that encourage self-service. Corporate costs decrease when web visitors can successfully download a document, register for assistance or fill in an online form rather than calling a toll-free number and asking for the same thing to be done or mailed out.

Sites which attempt to save money include:

  • Customer education
  • Self-service
  • Customer service
  • Informational
  • Intranet

Understanding the objective of your site allows you to establish the key performance indicators (KPIs) you need to track. Studying raw web analytics reports on their own is meaningless.

Measures, which illustrate visitor behaviour and traffic volumes, are not indicators of web site success. Reports showing page views, most popular pages and visitor traffic do not indicate whether a site is performing well. No organization, with the possible exception of media firms, should fundamentally care whether page views and visitor numbers increase. To extract real meaning from increases in raw numbers like page views or visitors, they must be put into context. That’s where KPIs come in.
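A toy illustration of the point above (all figures invented): a KPI relates a raw measure to a site objective, and can move in the opposite direction from the raw numbers.

```javascript
// Illustrative only: figures are invented. A raw count (visits) says
// nothing by itself; a KPI relates it to the site's objective -- here,
// a self-service goal such as a successful form download.
function conversionRate(completedGoals, visits) {
  return visits === 0 ? 0 : completedGoals / visits;
}

// Month over month, raw visits rose, but the self-service KPI fell --
// which is the signal that actually matters.
const may = conversionRate(400, 10000);  // 4% of visits completed the goal
const june = conversionRate(380, 12000); // ~3.2% despite more traffic
console.log(may > june); // true: more traffic, worse performance
```

A traffic report alone would call June a success; the KPI shows the opposite.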

Peder 10:12 pm on Dec 1, 2009 Permalink | Reply
Tags: SDC, Webtrends   

More and more organizations are implementing an alternative technique to collect web site traffic information rather than relying on web server log files. This technique is called client-side data collection, or data tagging for short. Webtrends’ implementation is referred to as the SDC (SmartSource Data Collector). Data tagging solves many problems associated with web server log file analysis.

Implementing data tagging requires some development work to ensure that data tags are inserted and maintained on web pages.

With data tagging, web traffic data is more accurate because traffic normally hidden by cache or proxy servers is tracked. IT administration is eased because data collection is centralized in one location versus site data being dispersed among several log files from multiple web servers that may also be geographically dispersed. And web data can be collected from specialized applications, such as application servers and browser applications (e.g. Macromedia Flash).
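Conceptually, a data tag works by having the page itself report each hit to a central collection server, typically by requesting a tiny image whose query string carries the data. Here is a minimal sketch of that mechanism; the collector host, path and parameter names are illustrative assumptions, not Webtrends’ actual SDC tag code.

```javascript
// Sketch of client-side data collection: the page builds an image
// request to a central collector, encoding hit data in the query
// string. Host, path and parameter names here are illustrative, not
// the real SDC tag.
function buildBeaconUrl(collectorHost, page, title) {
  const params = new URLSearchParams({
    dcsuri: page,          // page path being viewed
    ti: title,             // page title
    ts: '1293840000000'    // fixed timestamp, for a deterministic example
  });
  return 'https://' + collectorHost + '/dcs.gif?' + params.toString();
}

// In a browser, the tag would then fire the hit with something like:
//   new Image().src = buildBeaconUrl('statse.example.com', ...);
const url = buildBeaconUrl('statse.example.com',
                           '/products/index.html', 'Products');
console.log(url.startsWith('https://statse.example.com/dcs.gif?')); // true
```

Because the browser itself makes this request, the hit reaches the collector even when a proxy or local cache would have kept the page request from ever touching the web server’s log.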

In short, the SDC has the following advantages:

  • Generates logs optimized for Webtrends that are as much as 90% smaller than traditional access logs.
  • Produces a single centralized SmartSource file rather than separate log files for each web server. This essentially eliminates the administrative headaches associated with gathering logs from multiple, geographically dispersed web servers. The SmartSource file can even contain hits from multiple domains (the domain name can also be passed as a query parameter), allowing visitor behavior to be analyzed across an organization’s sites or even partner sites provided they permit your tags to be included on their pages.
  • Provides information that is difficult or impossible to obtain with log files. For example, data tags linked to your SDC can be included in your banner ads placed on other sites. SmartSource tags can also be inserted into Flash applications, permitting a hit to be entered into the SmartSource file for each event fired in the program. This means visitor activity within Flash applications can be analyzed just like visitor interactions with HTML-based pages.
  • Web traffic data is more accurate because traffic normally hidden by cache or proxy servers is tracked. In many cases, web server log files do not accurately represent the actual interactions visitors have with a web site. Proxy servers are one of several examples of how analysis results can be distorted by web server log file data collection. Proxy servers deflect page views from web servers by caching the most frequently requested pages. Local caches have a similar effect, handling browser requests through locally cached pages rather than making repeated requests to the web server. In doing so, these page views are not recorded in the web server log files.
  • Creates a cookie for more accurate reporting. Cookies ensure visitors are tracked as they navigate and return (if using a persistent cookie) to your site. This enables the most sophisticated features of Webtrends such as Scenario Analysis, SmartView, Path Analysis, and Custom Reporting.
  • Acts as a filter: you only tag the pages you need reporting on.
  • Bots and spiders don’t need to be filtered out of or scrubbed from the logs, resulting in more accurate reporting and requiring less CPU processing power.
  • Enhances reporting capabilities through META tagging (e.g. tracking revenue, correlating paid search terms with conversions, …)
  • Reports on specific events within a page through the use of the dcsMultiTrack JavaScript function:
    • Tracking PDF downloads
    • Tracking dynamic / Web 2.0 events on pages (e.g. DHTML or any browser-supported event)
    • Tracking events within Flash movies
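For instance, a PDF download could be tracked with a call along these lines. dcsMultiTrack accepts alternating parameter name/value arguments; in the browser the real function is supplied by the Webtrends page tag, so the stub below merely stands in for it, and the exact parameter set an implementation passes will vary.

```javascript
// Stub standing in for the real dcsMultiTrack, which the Webtrends
// page tag supplies in the browser. It accepts alternating
// name/value arguments and records one hit per call.
const hits = [];
function dcsMultiTrack(...args) {
  const hit = {};
  for (let i = 0; i < args.length; i += 2) hit[args[i]] = args[i + 1];
  hits.push(hit);
}

// Typical click-handler call for tracking a PDF download. The file
// path and title here are made up; parameter choices vary by site.
dcsMultiTrack(
  'DCS.dcsuri', '/whitepapers/kpi-guide.pdf',
  'WT.ti', 'Download: KPI Guide'
);
console.log(hits[0]['DCS.dcsuri']); // "/whitepapers/kpi-guide.pdf"
```

Wired into a link’s click handler, this records the download as a hit even though no HTML page is served, which is exactly the case log files miss.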

Client-side data collection is quickly growing in popularity as the superior approach to collecting web visitor behavior information. It provides greater reporting accuracy and lower administrative overhead. Organizations should carefully weigh the costs and benefits of data tagging versus web server log file analysis and determine which method best meets their insight needs.
