Recent Updates

Eric 2:09 pm on Sep 12, 2016
Tags: data outliers, normalize data

The Challenge

A common request when analyzing large amounts of data is to evaluate the impact that exceptional data points have on the results. Statistics addresses this need by offering the “median” and the “average” as two ways to summarize large numbers of data points into a single “normal” value.

The median takes the data point at the exact center of all the data points, once they are sorted, as the “normal” value and, as a result, is unaffected by exceptionally high or low values.

The average, also known as the “mean”, on the other hand, sums all of the data points and divides by the number of data points to determine the “normal” value. The average is affected by exceptionally high or low values.

For example, suppose you have 100 data points where 97 of them are 100 and the remaining three are 1,000, 10,000, and 100,000 (a “right-skewed” distribution). Look at the difference:

● Median value = 100
● Average/Mean value = 1,207 (More…)
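
You can verify these numbers with a few lines of Python (a quick sketch, not part of the original post):

```python
from statistics import mean, median

# The example dataset: 97 points at 100 plus three extreme values (right-skewed).
data = [100] * 97 + [1_000, 10_000, 100_000]

print("median:", median(data))   # 100  -- the outliers do not move it
print("average:", mean(data))    # 1207 -- pulled upward by the outliers
```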


Peder 10:08 am on Aug 9, 2016
Tags: Dashboard changes, Tableau updates

Client dashboards are rarely “done”; they evolve over time from one version to the next. When dashboard functionality or information changes, the descriptive contents of the dashboard have to be updated manually. Also, Tableau dashboards often link to other dashboards, and when they are moved between Tableau sites or Tableau Servers, those links need to be changed. (More…)


Huey Coelho 7:07 am on Jul 8, 2016

7 Reasons Why You Don't Get BI Tools

As a sales professional, I have the opportunity to connect and speak with a variety of people across different business sectors – from solopreneurs to large corporations and government agencies. And while they all have different needs, there are many common reasons why and when they get BI tools. Or don’t. Here are my anecdotal top 7 reasons why people don’t get BI tools, and tips on when (and how) they should: (More…)


Eric 5:04 pm on Apr 29, 2016

This is the first in a series of challenges for those who believe they have mastered the more complicated aspects of Tableau. The series will call on your understanding of the intricacies of how Tableau functionality works.

For example, did you know that you can perform merchandising affinity analysis using dynamic sets? Did you know that forecasting in Tableau uses the exponential smoothing technique, and do you know how that affects the results of the forecast? Did you know that when using non-additive computations or LOD expressions that reference a secondary data source, you may need to have the linking field from the primary data source in the view?

If these questions are news to you, then you may be interested in trying out our Tableau Master Challenge series. We will explore some new facet of advanced Tableau functionality with each post. Even if you can’t solve the challenge, we will walk you through the solution and the logic of how and why the solution works. (More…)
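
As a taste of the forecasting question above: Tableau’s forecasting is based on exponential smoothing, and the sketch below shows the core weighting idea behind simple exponential smoothing in Python. Tableau’s actual models also handle trend and seasonality, so this is only an illustration of the principle, not Tableau’s implementation; the sample data is made up.

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing; alpha in (0, 1] weights the newest observation."""
    smoothed = [series[0]]                      # seed with the first observation
    for value in series[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

sales = [100, 120, 90, 130, 125, 160]           # made-up sample data
print(exponential_smoothing(sales, alpha=0.2))  # low alpha: history dominates
print(exponential_smoothing(sales, alpha=0.8))  # high alpha: recent points dominate
```

A larger alpha makes the smoothed series (and therefore the forecast) react more strongly to recent values, which is one reason two forecasts of the same data can look quite different.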


Ray Rashid 3:03 pm on Mar 18, 2016
Tags: data blending, data preparation, ETL

Have you embarked on a journey to create the most insightful dashboard of all time, only to realize you have disparate data? Whether you are a new analyst entering the workforce or a seasoned business intelligence specialist, chances are you have encountered, or will encounter, the need to combine data from different sources to properly answer your business questions. The reality is that visualization software offers limited data blending capabilities. (More…)
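
As a minimal illustration of the kind of cross-source combination described above, here is a sketch that joins two hypothetical extracts with pandas (the tables, columns, and figures are invented for the example):

```python
import pandas as pd

# Two hypothetical sources: sales from one extract, targets from another system.
sales = pd.DataFrame({"region": ["East", "West"], "sales": [120_000, 95_000]})
targets = pd.DataFrame({"region": ["East", "West"], "target": [110_000, 100_000]})

# Blend them on the shared "region" field before visualizing.
combined = sales.merge(targets, on="region", how="left")
combined["pct_of_target"] = combined["sales"] / combined["target"]
print(combined)
```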
