Dashboard performance is critical to delivering solutions your stakeholders will actually use. A dashboard that takes 7+ seconds to respond to an interaction is not good enough. The rule of thumb I use is that if a dashboard does not render within 3-5 seconds, I need to do something to make it faster. A long load time can make or break your customer's confidence in the usefulness of your dashboard. Here are some basic to intermediate tips for making sure your dashboard runs as fast as possible.
First off, Infrastructure
Before jumping into some of the practical things you can do to increase dashboard performance, I want to touch on how we pull data into a Tableau workbook. Pulling a predefined table into Tableau should theoretically be faster than writing Custom SQL to pull in that same table via a Select *. If you pull in the table as-is, you're essentially doing Select * From Table A, but if you do it via Custom SQL you're saying Select * From (Select * From Table A) A. That extra layer of nesting is why a straight table pull from Tableau is faster.
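To make the nesting concrete, here is a small sketch using Python's sqlite3 as a stand-in for the database behind Tableau (the table and column names are hypothetical). Both queries return the same rows; the Custom SQL version just adds a subquery layer for the planner to unwrap:

```python
import sqlite3

# Hypothetical in-memory database standing in for your data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, profit REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("West", 250.0)])

# What Tableau issues for a straight table pull:
direct = conn.execute("SELECT * FROM sales").fetchall()

# What a pass-through Custom SQL pull effectively becomes --
# the same query wrapped in an extra subquery layer:
wrapped = conn.execute("SELECT * FROM (SELECT * FROM sales) a").fetchall()

# Identical results; the wrapper is pure overhead here.
print(direct == wrapped)
```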
The issue is that many times we want to pull in only a subset of the data instead of an entire table. This is where Custom SQL can be a much more efficient way to pull your data. Not only are you bringing in fewer columns, you can also set your filtering criteria before querying the database. If you bring in a predefined table, you can only apply your criteria after it's pulled in, meaning you'll be hitting your database several more times.
Whether you use a predefined table in Tableau Desktop or a Custom SQL query, optimize that query by bringing in only the columns you need and limiting the rows with the proper criteria. If your data set is a million rows, one unneeded column forces you to bring in a million more data elements than necessary.
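A quick sketch of the difference, again with sqlite3 and hypothetical names. The trimmed query names only the columns the dashboard uses and filters rows at the source, so the wide `notes` column and the off-year rows never leave the database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER, year INTEGER, profit REAL, notes TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)",
                 [(1, 2016, 50.0, "x"), (2, 2017, 75.0, "y"),
                  (3, 2017, 20.0, "z")])

# Unoptimized: every column, every row.
everything = conn.execute("SELECT * FROM orders").fetchall()

# Optimized: only the columns the dashboard needs, filtered at the source.
trimmed = conn.execute(
    "SELECT year, profit FROM orders WHERE year = 2017").fetchall()

print(len(everything), trimmed)
```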
Move all of your row-level calculations into fields in the back end. Performing a row-level calculation in Tableau can cause serious performance delays versus bringing that precomputed data in from your original query.
The same logic applies to any renaming of fields. If you have any calculations involving text, like IF [Field A] = "Spencer" THEN [Field B], they will slow down your render times as well. Any and all mapping, text calculations, or other row-level calculations should be engineered in the back end if possible.
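As an illustration (table and column names are made up), the Tableau calculated field above maps onto a SQL CASE expression that runs once in the source query instead of on every render:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reps (name TEXT, region TEXT)")
conn.executemany("INSERT INTO reps VALUES (?, ?)",
                 [("Spencer", "East"), ("Dana", "West")])

# Instead of IF [name] = "Spencer" THEN [region] as a Tableau
# calculated field, materialize the result once in the source query:
rows = conn.execute("""
    SELECT name,
           CASE WHEN name = 'Spencer' THEN region END AS spencer_region
    FROM reps
""").fetchall()

print(rows)
```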
Any tables that are being joined should be joined in a back-end view when possible. The biggest impact is not the query execution once the join is created, but the recreation of the extract. Think about it this way: when you recreate your extract using tables joined in Desktop, Tableau has to delete and recreate the extract using the previously established processes (extract filters, row-level calculations), but when you do this in Custom SQL, you are recreating an extract from only one data set, the one that is the product of your join query.
Additionally, when joining in Tableau, your query defaults to joining the two tables in their entirety, not just the columns needed. Doing this in Custom SQL lets you optimize your data set creation time, a savings that compounds during the creation of your extract.
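Here is a minimal sketch of a pruned join in the source query, with hypothetical tables. Only the three columns the dashboard needs come back; the wide `address` column never crosses the wire:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, profit REAL)")
conn.execute(
    "CREATE TABLE customers (cust_id INTEGER, name TEXT, address TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 10, 99.0)")
conn.execute("INSERT INTO customers VALUES (10, 'Acme', '123 Main St')")

# Join in the source query, keeping only the columns the dashboard needs,
# rather than letting Tableau join both tables in their entirety:
rows = conn.execute("""
    SELECT o.order_id, c.name, o.profit
    FROM orders o
    JOIN customers c ON c.cust_id = o.cust_id
""").fetchall()

print(rows)
```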
One of the best ways to optimize performance is to use data extracts. Although this itself is a simple concept, it's worth knowing why it works. Data extracts take your large data set and compress it into a more efficiently structured package, allowing for denser data storage and faster querying. This is why interacting with an extract in Desktop is so much quicker than querying live. Your server admin will also thank you, as you're not firing a query at your database with every mouse click.
Creating a Tableau Published Data Source is another way to increase performance. Instead of every workbook querying the database and taking up bandwidth, publishing data sources to your Tableau Server takes load off your database and makes your queries faster, as you're not running redundant queries against your server.
Lastly, when creating a data extract, make sure to click the Hide All Unused Fields option, which creates an extract of only the fields actually used in your workbook. As with one of my earlier points, this eliminates the creation of a data set with unneeded fields, which speeds up query time and extract refresh time. Note that if you need a field in the future, you will have to recreate your extract to include the data from that newly added field, a potential downside.
Using parameters instead of quick filters can improve your final render times as well. This cuts down on querying your data source: parameters force calculations to run instead of queries to be fired off. Running a calculation on an existing data set is usually much faster than running a query to create a new data set and then applying an aggregation.
One example is a [Year] filter. Using a standard quick filter, you would have a field, Sum([Profit]), and you would click on year 2017. Mechanically, Tableau Desktop fires off a query to your data source saying Select * From Table A Where [Year] = 2017. That data set is then returned and your calculation applied. However, if you structure your calculation as SUM(IF [Year Parameter] = [Year] THEN [Profit] END), then your data set is already created; all Desktop does is run the row-level calculation based on your parameter selection.
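The two mechanics above can be sketched in plain Python over a small made-up data set. The filter-style function builds a fresh filtered data set (a new query per click), while the parameter-style function does one conditional pass over the data already in hand, mirroring SUM(IF [Year Parameter] = [Year] THEN [Profit] END):

```python
# Hypothetical rows already sitting in an extract.
rows = [
    {"year": 2016, "profit": 40.0},
    {"year": 2017, "profit": 100.0},
    {"year": 2017, "profit": 25.0},
]

def profit_via_filter(data, year):
    """Quick-filter style: build a new, filtered data set, then aggregate."""
    filtered = [r for r in data if r["year"] == year]  # fresh data set per click
    return sum(r["profit"] for r in filtered)

def profit_via_parameter(data, year_parameter):
    """Parameter style: one conditional pass over the existing data set."""
    return sum(r["profit"] for r in data if r["year"] == year_parameter)

print(profit_via_filter(rows, 2017), profit_via_parameter(rows, 2017))
```

Both return the same total; the difference in Tableau is whether a new query is fired at the source or a calculation runs over data already retrieved.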
These are not the only ways to improve dashboard performance, just a few that I find helpful. Thanks for reading! If you have any comments or other ways to improve dashboard performance, please send them to me at firstname.lastname@example.org and I will update this post with them.