Monitoring can be expensive...really expensive. And while there are open source options available (shameless self promotion here for a favourite of mine in the open source space), some people and organisations prefer commercial solutions, whereas others prefer complete customisation or may not have a budget available.
There is nothing wrong with either approach, both have their pros and cons.
In the Azure space, there have always been several options for monitoring your SQL workloads, including
Azure Monitor
Query Performance Insight
And not forgetting, at each database level, you always have the magnificent Query Store.
Very fast recap time....
Azure Monitor
Found under the Monitoring blade of your Azure SQL DB, you can see Metrics and Alerts (at the server level, you may only see Logs).
This allows the collection of basic metrics, such as percentages of CPU, Data/Log IO and Data space. Other metrics, such as tempdb space, session counts and login errors, are also available.
So not particularly detailed, but it allows a high-level view, and it's included with your Azure SQL DB.
From here, it's quick and easy to create dashboards by just pinning the panels to an Azure Dashboard, or.....even add these visuals into a Grafana workspace if you have one...which is pretty cool, as if you follow this blog you may have guessed that I love Grafana.
This is just showing the Azure SQL DB metrics; however, Azure Monitor also provides similar metric visualisations for other Azure resources, allowing you to build a dashboard that caters for a wide spread of deployed resources. Easy life, and free (ish).
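As a quick aside, if you'd rather pull roughly the same numbers from inside the database itself, sys.dm_db_resource_stats gives you the recent CPU, Data IO and Log IO percentages (sampled around every 15 seconds and kept for roughly an hour). A minimal sketch, assuming you're connected to the Azure SQL DB in question:

-- Roughly the same CPU / Data IO / Log IO percentages the Metrics blade shows.
SELECT TOP (20)
       end_time,
       avg_cpu_percent,
       avg_data_io_percent,
       avg_log_write_percent,
       avg_memory_usage_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;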
Query Performance Insight
Let's dig down a step further: under Intelligent performance we have Query Performance Insight.
..which essentially allows a breakdown of the top queries by the main metrics, i.e. CPU, IO, duration and execution count, and quickly aggregates them...all for the time period you're looking at. Pretty darn useful if you ask me; it's like having a pre-wrapped sys.dm_exec_query_stats in your pocket, ready to go.
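If you fancy rolling that aggregation yourself with plain T-SQL, a rough sketch of the same idea over sys.dm_exec_query_stats (top queries by CPU since their plans were cached) looks something like this:

-- Top 10 queries by total CPU since their plans were cached,
-- a DIY flavour of what Query Performance Insight wraps up for you.
SELECT TOP (10)
       qs.total_worker_time / 1000  AS total_cpu_ms,
       qs.execution_count,
       qs.total_elapsed_time / 1000 AS total_duration_ms,
       qs.total_logical_reads,
       SUBSTRING(st.text,
                 (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;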
Query Store is required to be enabled on the DB, so ... think of it as the top resource consuming queries report available at a high level, but with reduced detail. I'm not going to cover Query Store here, there are lots of awesome resources out there for it; needless to say, if you can enable it, then I'd recommend doing so.
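For reference, turning it on (and checking its state) is a quick bit of T-SQL; on newer Azure SQL DBs it should already be enabled by default, but it doesn't hurt to check:

-- Enable Query Store on the current database (harmless if it's already on).
ALTER DATABASE CURRENT SET QUERY_STORE = ON (OPERATION_MODE = READ_WRITE);

-- Confirm its state.
SELECT actual_state_desc, desired_state_desc
FROM sys.database_query_store_options;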
Diagnostics
Although not monitoring in the same way as the other options, diagnostic settings let you send a whole load of really useful data to a target for later analysis. The downside is that the data isn't immediately visualised in the same way as the other options, and we all like our pretty graphs. But still, some great metrics in there.
All this is pretty good if you ask me, so what does Database Watcher, the new kid on the block, give us? Let's have a look.
Database Watcher
First off, at the time of writing, this is still in Preview, and is only for Azure SQL PaaS offerings, namely Azure SQL DB and SQL Managed Instance, so you're out of luck if you're using SQL on a VM. Expect this to be added at some point in the future; it's number 2 on the published roadmap.
Preview or GA...the long and short of it is that it allows collection of performance AND config data into a central datastore for ALL of your SQL MI and Azure SQL DB estate. With all of the data in one place, dashboards connect to that datastore for easier estate-wide visualisations.
Shamelessly grabbing the Microsoft diagram shows the architecture nicely.
Components
The entire system is made up of four main areas.
Targets
Your SQL DBs/servers that you want to monitor...not much point in any monitoring solution without something to monitor!
Watcher
This is effectively your collection engine, which drives the extraction of the metrics you're interested in.
DataStore
Ah, the repository for your collected metrics. Watcher can make use of Azure Data Explorer or Fabric Real-Time Analytics, both of which are explorable with T-SQL or KQL.
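Once data is flowing, you can poke at the datastore with either language. As a rough example using the T-SQL endpoint (the table name below is just a placeholder; browse your Data Explorer database to see the actual dataset tables the watcher creates):

-- dataset-table-name-placeholder - replace with one of the watcher's dataset tables.
SELECT TOP 10 *
FROM [dataset-table-name-placeholder];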
Dashboards
How you choose to display your pretty data is up to you; options include Power BI, Grafana (yay!) or Azure Data Explorer Dashboards via Azure Workbooks.
Costings
Watcher
There is no specific cost per monitored resource or per user accessing Watcher, meaning you can safely monitor as many resources as you like and have as many users as you like looking at the data. There is a limit on targets per watcher, but more on that later.
Data Explorer
For the purposes of this blog, I'm going to use Azure Data Explorer. One word of note: I'm using a Free Tier instance here for Data Explorer, which avoids all costs. If you are using the Development Tier, that also avoids an additional markup price... so take care with costing if you're POCing in the Free/Dev tiers and then planning to move to Production later.
Regions
Watcher can traverse regions, and it is possible to house your SQL targets, Watcher and datastore all in different regions...however, this may incur additional charges, so it's recommended to house as much as possible in a single region. That's Microsoft saying that, not just me.
Limitations
There are: it's 1,000 targets per subscription, made up of a maximum of 50 targets per watcher and a maximum of 20 watchers per subscription. (Currently, that is; it's still in preview, remember, so this will probably change.)
Enough waffle - Let's get this set up
Implementation
Start off by searching for Database Watchers in the main search bar; you'll find it under Monitor / Monitoring Tools / Database watchers.
That's the one, go for a +Create.
Select your Resource group, name and region as normal in Azureland.
Identity
Completely up to you; I'm just leaving this as a system identity for simplicity.
Data Store
Here, you can pick the desired datastore for holding the watcher-collected metrics. As mentioned above, I'm going to Create and Use an Azure Data Explorer cluster.
Be careful creating a non-Free Tier Azure Data Explorer from here whilst still in Preview; the deployment may fail if the default size isn't available in the region selected (UK South, for example...).
So, I'm selecting a data store instead....
Now the option is revealed to use a Free Tier Azure Data Explorer; add a name and location. Note, it's recommended to place the data store and watcher in the same region; I'm just adding a free store in the closest region I can currently.
SQL Targets
At this point, you can add monitoring targets directly, if you should so desire. I'll add others later as well, in exactly the same way.
Note, slightly lower down the page...
Isn't that nice, MS even gives us links dynamically generating the T-SQL required to grant minimum-privilege access to the SQL targets. Thank you very much, easy life.
Note, don't try to assign this yet if you haven't created the DB Watcher at this point; otherwise, you will get:
Msg 33130, Level 16, State 1, Line 1
Principal 'DBWatcherRod' could not be found or this principal type is not supported.
Now - Review and Create. If using a Free Tier Data Explorer, it shouldn't take too long to create, but grab a cuppa anyway.
SQL Permissions
Once deployed, you can grant the permissions on the SQL target.
For an Azure SQL DB target, this is:
CREATE LOGIN [DBWatcherRod] FROM EXTERNAL PROVIDER;
ALTER SERVER ROLE ##MS_ServerPerformanceStateReader## ADD MEMBER [DBWatcherRod];
ALTER SERVER ROLE ##MS_DefinitionReader## ADD MEMBER [DBWatcherRod];
ALTER SERVER ROLE ##MS_DatabaseConnector## ADD MEMBER [DBWatcherRod];
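If you want to double check the grants took, a quick sanity check (run in the logical master database, assuming the same login name as above) is:

-- List the server-level roles held by the watcher login.
SELECT r.name AS role_name,
       m.name AS member_name
FROM sys.server_role_members AS srm
JOIN sys.server_principals AS r ON r.principal_id = srm.role_principal_id
JOIN sys.server_principals AS m ON m.principal_id = srm.member_principal_id
WHERE m.name = 'DBWatcherRod';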
Done. Nearly.
If you aren't using public access for your SQL servers, and let's face it, you probably aren't, you need to create a private endpoint to access the SQL resource.
Even if you already have private endpoints for your targets, you can't reuse any existing endpoints created on the same target; these are watcher-specific, unfortunately.
+Add, and select the relevant target details from the dropdown. This will take a couple of mins to create.
That's it, right?
Nope... upon creation, the private link goes into a pending state... it has to be approved. Off to the Private Link Center with you... Just search for "Private link", then select Pending connections.
Tick and approve. Cool, connection sorted.
Done yet? No.
You now need to grant access to the Azure Data Explorer database.
If you've got a paid-for cluster, you can just add the principal of the Watcher as an Admin under the Security + Networking blade on the Data Explorer page.
With the Free Tier Data Explorer, you have to use Kusto, which is what I have...because I'm tight.
Open up a KQL explorer at https://dataexplorer.azure.com/freecluster and connect to the database you've created, then run the below with your relevant details.
.add database [adx-database-name-placeholder] admins ('aadapp=identity-principal-id-placeholder;tenant-primary-domain-placeholder');
adx-database-name-placeholder - The name of your Azure Data Explorer DB
identity-principal-id-placeholder - The object principal ID of the watcher's identity. I don't mind sharing this, as it will be deleted before I release this blog.
tenant-primary-domain-placeholder - Your tenant's primary domain. Required for Free Tier; Paid Tier Azure Data Explorer datastores don't need this.
Are we there yet Dad?
Yes! Now that you have a Database Watcher, you have to start it.
And if you've got everything correct.... HURRAH!
If something is not correct, these will show as 0,0,0. I haven't yet found decent logs to see where the problem lies either....so make sure you follow all steps.
To add more targets, just pop along to the Configuration blade, then SQL Targets.
I have found that newly added targets may not appear until the Watcher is restarted. Expect this to change going forward, though.
Dashboards
All this work was to get to this point. Our pretty pictures.
Click on the type of DB you want to look at; I've only got one here so far...
If you click on the Azure SQL Databases link, you'll go to the overall estate view for that SQL type. If you click anywhere else in the panel, you'll get a dropdown of the database targets currently being monitored, which you can click on to get the detailed view (see below).
Estate View
Here you can get a high-level view of the pain points in the estate right now...namely:
Database Heatmaps
Elastic Pool Heatmaps
Heatmaps quickly show the pain points, colour-coded by the metric you're interested in: CPU, Data IO, Waits etc. Lovely stuff! One glance gives us a quick and easy view of what is bad, based on the colour: red rubbish, green great... yellow...err... you get the idea.
Clicking on a heat tile, then the link, allows you to zoom in to the detailed view of the collection.
An example of a heatmap. In a non-sandpit world, this could look something like this.... now, which one is the bad one?
What else do we have?
Databases - This allows you to see all DBs; clicking one takes you to the detailed view (see below).
Elastic Pools
Top Queries - Yes, the heaviest queries across the estate. Love this.
Detailed View
By clicking on the DB name link, you get to the detailed metrics....and a lot of these will look familiar.
Oh, hello!
Performance
Waits! I love waits.
Common performance Counters
And other sections containing the good stuff like Memory breakdowns. And this is only in the first tab!
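Since waits got a shout-out above, if you want to compare what the dashboard shows against the raw numbers, the database-scoped DMV is the usual starting point (a minimal sketch, run against the target DB):

-- Top waits for this database since the stats were last cleared.
SELECT TOP (15)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM sys.dm_db_wait_stats
ORDER BY wait_time_ms DESC;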
You have all these other tabs to play with!
I particularly like the Tables and Indexes tabs; an easy view of what is being hit at specific times.
Performance - See above
Activity - what's going on
Sessions - Who, and what app, is causing you pain. And much, much more!
Top Queries - as above, but filtered to the DB selected.
Storage - What are my IOPs / Latency and Throughput metrics like?
Tables - What tables are being hit? (thinking a heatmap on this would be nice)
Indexes - By what indexes?
Connectivity - Successful and failed connections
Geo-Replicas - Exactly what it says
Backup - When have backups occurred?
Properties - Various useful config elements for the Database.
That's a lot of information...
Summary
Honestly, as a Preview, I think this is looking very promising...
Most of the SQL metrics that we usually want...
System level resource/performance metrics and
The SQL running at those times
Collected to a single datastore
Visible in a single monitoring pane out of the box
New targets easily added
Cheap(ish)
I'll be keeping an eye on this one as it matures. Yes, some functionality definitely isn't available yet, the elephants in the room being alerts and monitoring SQL on VMs...but they'll be coming. If you don't want to custom-build everything, I think this one is going to be a great option!
Thanks for reading as always.
StingRod