Following on from my previous article about how Snowflake and Qlik work in conjunction, I started to look at what options are available to Snowflake customers for reporting on their usage through the visualisations of Qlik.
Snowflake is a powerful piece of tech, allowing you to have a cloud-based data lake and only pay for compute resources when you need them. But how do you monitor that usage, make sure you are not over-utilising those compute resources, and check that developers are not using the wrong size of warehouse and incurring unnecessary charges?
As part of each Snowflake setup, there is an incoming data share called “Snowflake” which contains tables on usage. These tables can be queried like any other table in a data lake. With Snowflake, inbound and outbound shares to supplementary data sources can be set up to enrich data, adding context and insight. Outbound shares can also be used to pass your data securely to other organisations, either free of charge or as a paid service.
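As a quick illustration, the usage tables can be queried with ordinary SQL. The sketch below assumes the share is exposed through the standard SNOWFLAKE.ACCOUNT_USAGE schema and its WAREHOUSE_METERING_HISTORY view; exact view and column names may vary with your account setup:

```sql
-- Sketch: credits consumed per warehouse over the last 30 days,
-- read from the SNOWFLAKE.ACCOUNT_USAGE share like any other table.
SELECT warehouse_name,
       SUM(credits_used) AS total_credits
FROM snowflake.account_usage.warehouse_metering_history
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY total_credits DESC;
```

A result set like this is exactly the kind of table you can load straight into Qlik Sense for visualisation.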
With the Snowflake usage data share, you receive a share containing details on usage, users and a whole load more. You can then extract the data into Qlik Sense and look at details such as:
- Cost and Usage Analysis – the number of credits used and the trend of this over time
- User Auditing and Security – who is logging in, from where and when
- Performance and Optimisation – who is running queries and what this is costing the organisation
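Each of these areas maps onto a view in the shared database. As a sketch of the user auditing bullet, something like the following would list recent logins, assuming the standard ACCOUNT_USAGE.LOGIN_HISTORY view is available in your account:

```sql
-- Sketch: who is logging in, from where and when,
-- over the last 7 days.
SELECT event_timestamp,
       user_name,
       client_ip,
       reported_client_type,
       is_success
FROM snowflake.account_usage.login_history
WHERE event_timestamp >= DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```

The IS_SUCCESS flag makes it easy to chart failed login attempts separately, which is a useful starting point for a security dashboard.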
Anyone could build their own dashboard on this data, but luckily a plug-and-play application for Qlik Sense already exists, written by the fantastic Dave Freriks of Qlik. The dashboard is available to download from GitHub with full instructions on how to set it up. The link for this is below:
Using this, we can now report on exactly how much our Snowflake instance is costing, as well as manage the performance, security and quality of what it delivers.
For more details on Snowflake or Qlik Sense, drop me an email at firstname.lastname@example.org