
How to Improve Dashboard Performance When Using Tableau, Power BI, and Looker

August 27, 2018

Helping solve your run time issues

As dashboards and reports become more complex, slow run times can present major roadblocks. Here's a collection of top tips for improving dashboard performance and cutting run times in Tableau, Power BI, and Looker.

Universal Tips

Before getting into the specific tips for the three tools, here are a few universal principles to keep in mind that will lead to improved performance in almost any case.

Limit logic in the tool itself: If you're generating multiple calculated tables/views, performing complex joins, or adding numerous calculations in the BI tool itself, it's better for both performance and governance to execute those steps in the database or a separate business layer. The more data manipulation your BI tool performs, the more queries and functions it has to execute before it can render visualizations.
Note: This is not an issue for Looker, as Looker offloads all of its computing onto the database via SQL.
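To make the pushdown principle concrete, here is a minimal Python sketch (the schema and numbers are invented for illustration) in which a join and an aggregation both run inside the database via SQL, so the reporting layer only receives the small final result set instead of raw rows to join and calculate over:

```python
import sqlite3

# Illustrative only: a toy schema standing in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0);
    INSERT INTO customers VALUES (1, 'East'), (2, 'West');
""")

# The join and the revenue calculation both run in the database;
# the BI tool only receives the final, small result set.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('East', 150.0), ('West', 75.0)]
```

The same idea applies whether the heavy lifting lives in warehouse views, a business layer, or (in Looker's case) generated SQL: the fewer rows and calculations that reach the visualization layer, the faster the dashboard.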

Have the physical data available in the needed format: When the physical data in the source matches the granularity and level of aggregation in the dashboard, the BI tool doesn’t need to execute a function to aggregate it. Developing this in the data mart/warehouse can be a lot of work, but can save a lot of time and pain during dashboard development.
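One common way to achieve this is a summary table materialized at exactly the dashboard's grain. A small SQLite sketch (table and column names are illustrative, not from any particular warehouse):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (day TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('2018-08-01', 'A', 10.0), ('2018-08-01', 'A', 5.0),
        ('2018-08-01', 'B', 2.0),  ('2018-08-02', 'A', 7.0);

    -- Materialize a summary table at exactly the dashboard's grain
    -- (one row per day per product), so the BI tool can read it
    -- as-is with no further aggregation.
    CREATE TABLE daily_product_sales AS
    SELECT day, product, SUM(amount) AS total_amount
    FROM sales
    GROUP BY day, product;
""")

rows = conn.execute(
    "SELECT * FROM daily_product_sales ORDER BY day, product"
).fetchall()
print(rows)
```

In a real warehouse this would typically be a scheduled rollup in the data mart rather than a one-off CREATE TABLE, but the principle is the same: match the stored grain to the reported grain.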

Keep your interface clean and dashboards focused: Consolidate or delete unused report pages, data sources, and fields. Limiting the number of visualizations on each dashboard also helps cut dashboard refresh time.

Strings hinder performance: In general, processing systems handle strings much more slowly than ints or booleans. Where possible, convert fields like IDs to ints and avoid complex string calculations.
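As a rough illustration of the cost difference, this Python snippet (the data is made up, and the exact gain varies by database and BI engine) converts zero-padded string IDs to ints once, upstream, and then times a linear scan over each representation:

```python
import timeit

# Hypothetical data: order IDs stored as zero-padded strings.
string_ids = [f"{i:012d}" for i in range(100_000)]

# Convert once, upstream, so every later comparison uses ints.
int_ids = [int(s) for s in string_ids]

# Time a worst-case membership scan over each representation.
target_str, target_int = string_ids[-1], int_ids[-1]
t_str = timeit.timeit(lambda: target_str in string_ids, number=20)
t_int = timeit.timeit(lambda: target_int in int_ids, number=20)
print(f"string scan: {t_str:.3f}s, int scan: {t_int:.3f}s")
```

On most engines, integer comparisons and joins are cheaper than string comparisons, and doing the conversion once in the warehouse is far cheaper than doing it on every dashboard refresh.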

Tableau


Built-in performance tracking: Tableau has a native performance recorder that analyzes problem areas for you. It tells you which worksheets, queries, and dashboards are slow, and even shows you the query text.

Extracts over Live: Tableau performs much faster when executing queries on extracts than on live connections. Use extracts whenever possible, and keep them trimmed down to limit query execution time. If you need to stream data or work with a constantly refreshing dataset, however, extracts aren't an option.

Again, limit logic: Tableau isn't built to handle heavy relational modeling or data manipulation — too many complex joins or calculations really slow down its processing. Offload as many of these steps as possible onto the database or a semantic layer.

Limit marks and filters: Every mark on a visualization means more rendering work for Tableau, and too many filters bog down the system. Instead, split complex worksheets and visualizations into multiple smaller views and connect them with action filters to explore those relationships more quickly.

Sources: Tableau's website has a succinct and very informative blog post that covers most of these suggestions, and other specific recommendations, in more detail. You can find it here.

Power BI

Implications of DirectQuery: Similar in concept to Tableau's extract vs. live connection options, the import and DirectQuery options for connecting to data sources have different impacts on performance. It is important to remember that with DirectQuery, the time required to refresh visuals depends on how long the source system takes to execute Power BI's query. So if your database server is flooded with users or operating slowly for some other reason, you will see slow execution times in Power BI, and the query may even time out (see more important considerations when using DirectQuery here).

Utilize Drillthrough: Drillthrough pages are very useful for data exploration and decluttering reports, and they have the added benefit of keeping your visuals and dashboards from becoming overly complex. They cut down query execution time and improve runtime while still allowing for in-depth exploration.

Be Careful with Row-Level Security: Row-level security has powerful and common use cases, but unfortunately, its implementation tends to bog down system performance. When RLS is in place, Power BI has to query the backend and build a separate cache for each user role. Create only as many roles as absolutely necessary, and test each role so you know its performance implications.

Sources: Microsoft's Power BI documentation has a page dedicated to improving performance that details these options and more. Check it out here.

Looker

Utilize Dashboard Links: Looker has a wonderful feature that allows for easy URL linking in its drill menus. If you're experiencing long refresh times, a nifty remedy is to split your dashboard into multiple smaller ones and link between them in the drill menus.

Validation Speed: LookML validation checks the entire project — all model, view, and LookML dashboard files. Increased complexity and crossover between logic in your files leads to longer validation times. If large files and complex relationships make validation lag problematic, it can be a good idea to break your project into smaller pieces where possible, and to keep complex SQL on the database side, where it can be optimized.

Caching: Caching is another important consideration for Looker performance. Be intentional about how you set up caching and the conditions for invalidating and refreshing a cache, as these greatly affect dashboard runtime. See Looker's documentation for more information on caching.
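As a sketch of what an intentional caching policy can look like, here is a minimal LookML fragment using a datagroup (all table, field, and datagroup names here are illustrative assumptions, not from any real project):

```lookml
# Illustrative names only. The cache for any explore persisted with
# this datagroup is invalidated when the trigger value changes
# (e.g. after a nightly ETL run), or after 24 hours at the latest.
datagroup: daily_etl {
  sql_trigger: SELECT MAX(completed_at) FROM etl_log ;;
  max_cache_age: "24 hours"
}

explore: orders {
  persist_with: daily_etl
}
```

Tying cache invalidation to when your data actually changes, rather than to an arbitrary timer, keeps dashboards fast without serving stale results.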

Persistent Derived Tables (PDTs) and Derived Tables (DTs): Caching considerations also come into play when deciding between PDTs and DTs. A general rule of thumb: with constantly refreshing data, DTs are the better choice; if you query the database once and then develop heavily off that query, PDTs can greatly improve performance. If your PDTs themselves are causing performance issues, check out this Looker forum post for a few remedies.

Further Sources: Looker’s forums are rich with development tips. These two forum pages are particularly helpful for improving performance:
Factors Impacting Query Performance
Considerations when Building Performant Looker Dashboards

If you have any more questions or comments, please don’t hesitate to contact us to discuss further.

Related Articles and Case Studies on Analytics Dashboards:

A CTO’s Guide to Selecting an Analytics & BI Platform
5 Important Principles for Dashboard Development
Google Analytics & Tableau Case Study
Tableau Visualization Case Study