In a previous post about efficiency I shared my experience optimizing a dashboard that was running slowly.
However, I didn't measure the performance improvement, and the comparison, although plainly noticeable, was never quantified. It didn't matter then, but in many other cases you need to know more precisely. As the saying goes, what isn't measured can't be managed.
Measuring efficiency
Tableau's help guide includes an article on Recording and Analyzing Workbook Performance that walks you through the process of evaluating a workbook's performance step by step. I ran this analysis on the Original Ecobici file and then on the Optimized version.
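Tableau's performance recorder does the timing for you, but the same idea is easy to reproduce by hand when you want to benchmark a query outside the tool. Below is a minimal sketch in Python: it times a heatmap-style aggregation over several runs and averages the wall time. The file name ecobici_trips.csv and its columns are assumptions made for illustration, not the workbook's actual data source.

```python
import time

import pandas as pd

# Hypothetical Ecobici trips file; the name and columns are assumptions
# for illustration, not the post's actual data source.
trips = pd.read_csv("ecobici_trips.csv", parse_dates=["start_time"])
trips["weekday"] = trips["start_time"].dt.day_name()
trips["hour"] = trips["start_time"].dt.hour

def average_query_time(runs: int = 5) -> float:
    """Run the heatmap aggregation several times and return the mean wall time."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        trips.groupby(["weekday", "hour"]).size()  # trips per weekday/hour cell
        samples.append(time.perf_counter() - start)
    return sum(samples) / len(samples)

print(f"Average query time: {average_query_time():.2f} s")
```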
When filtering the heatmap table by a specific time and day of the week, the original file took an average of 22 seconds to run:
Meanwhile, the optimized file executed the queries involved in only 0.35 seconds.
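The speedup claim in the next paragraph follows directly from those two averages; a quick sanity check:

```python
original, optimized = 22.0, 0.35  # average run times in seconds, from the recordings above
print(f"Speedup: {original / optimized:.0f}x")  # -> 63x, i.e. roughly 60 times faster
```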
This means the same query ran roughly 60 times faster in the latest workbook, but it also came with a tradeoff against the simplicity the tool offers, since reshaping the data was somewhat laborious. In this example it was definitely worth investing the effort to organize the information so Tableau could be swifter; on other occasions it may not be.
The pretentious name for this dilemma is Ousterhout's dichotomy:
Easy for humans to program vs Easy for machines to run