Choosing Between a vCore Pricing Model and DTU-based Service Tiers
When choosing between vCore and DTU-based pricing tiers, consider the following.
Licensing
The vCore pricing model supports Azure Hybrid Benefit, which provides up to 30% cost savings by letting you apply existing on-premises SQL Server Standard or Enterprise licenses with Software Assurance. Therefore, if you are migrating an existing on-premises SQL Server infrastructure, consider opting for a vCore pricing model.
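In the vCore model, this discount is applied by setting the database's license type. The following is a minimal sketch using the Az.Sql PowerShell module; the resource group, server, and database names are placeholders:

```powershell
# Minimal sketch: apply Azure Hybrid Benefit (bring your own license) to a
# vCore-based database. Resource group, server, and database names are placeholders.
Connect-AzAccount

Set-AzSqlDatabase `
    -ResourceGroupName 'rg-sql-migration' `
    -ServerName 'targetsqlserver' `
    -DatabaseName 'SalesDb' `
    -LicenseType 'BasePrice'   # 'BasePrice' = bring your own license; 'LicenseIncluded' = pay-as-you-go
```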
Flexibility
A DTU-based model bundles compute, IOPS, and storage into a single DTU measure and provides a pre-configured range of DTU sizes for different types of workloads. It's therefore best suited to scenarios where you need a simple, pre-configured option.
A vCore model lets you select compute and storage independently and is therefore best when you want more transparency and control over the compute and storage options.
Consider a scenario where you have a database with high compute requirements and low storage requirements; say, 125 DTUs with a database size of 200 GB. You'll have to opt for the Premium service tier and pay for the unused storage (300 GB):
Figure 2.17: Azure portal pricing feature
The preceding screenshot, taken from the Azure portal, shows the pricing options for a Premium DTU-based tier. Observe that the pricing is calculated per DTU and that the storage cost is included in the DTU price. Therefore, in this instance, you pay for all 500 GB of storage, even if it's not used.
In a vCore model, the compute and storage costs are calculated independently. Therefore, you pay only for the storage you use (200 GB) plus the vCores you provision.
Note
The Premium service tier includes 500 GB of storage. An additional cost of approximately $0.16 per GB applies to extra storage (beyond 500 GB), up to 1 TB.
Figure 2.18: General Purpose vCore pricing model
The preceding screenshot is from the Azure portal and shows the pricing options for the General Purpose vCore pricing model. Observe that the pricing is calculated per vCore and per GB of storage used. Therefore, you pay for the storage you use. You can, however, scale the storage up or down at any time, as per your requirements.
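To make the difference concrete, the following sketch compares the two billing structures for the 125 DTU / 200 GB scenario. All of the rates and the vCore count are placeholder assumptions for illustration only; they are not actual Azure prices, so check the Azure pricing calculator for current figures in your region:

```powershell
# Illustrative comparison only: all rates below are assumed placeholders,
# not actual Azure prices.
$dtuRatePerDtuPerMonth    = 3.72    # assumed $/DTU/month for the Premium tier
$vCoreRatePerCorePerMonth = 150.00  # assumed $/vCore/month for General Purpose
$storageRatePerGbPerMonth = 0.12    # assumed $/GB/month for vCore storage
$assumedVCores            = 2       # assumed vCore count for this workload

# DTU model: storage is bundled, so you pay for 125 DTUs whether you use
# 200 GB or the full 500 GB included with the Premium tier.
$dtuCost = 125 * $dtuRatePerDtuPerMonth

# vCore model: compute and storage are billed separately, so you pay only
# for the 200 GB you actually need.
$vCoreCost = ($assumedVCores * $vCoreRatePerCorePerMonth) + (200 * $storageRatePerGbPerMonth)

'DTU model   (P1, 125 DTUs):     {0:C} per month' -f $dtuCost
'vCore model (2 vCores, 200 GB): {0:C} per month' -f $vCoreCost
```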
Consider another scenario, where a team is just starting out with a product and is looking for a SQL Database pricing tier. A Standard S2 or S3 tier, with 50 to 100 DTUs and a maximum of 250 GB of storage, would be a good option. As the product matures and the scalability requirements become clear, the team can scale up accordingly.
You can scale between DTU-based and vCore-based service tiers. When scaling from a DTU-based to a vCore-based pricing tier, consider the following rule of thumb for choosing the correct compute size (a scripted sketch follows the list):
- 100 Standard tier DTU = 1 vCore General Purpose tier
- 125 Premium tier DTU = 1 vCore Business Critical tier
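As a rough illustration, the following sketch uses the Az.Sql PowerShell module to scale an existing database within the DTU model and then across to the vCore model. The resource group, server, and database names are placeholders; verify the parameters against the current Az.Sql documentation before running them:

```powershell
# Sketch: scale a database between DTU-based and vCore-based tiers with
# Set-AzSqlDatabase. All names are placeholders.

# Start the product on a DTU-based Standard S3 tier (100 DTUs)...
Set-AzSqlDatabase `
    -ResourceGroupName 'rg-sql-migration' `
    -ServerName 'targetsqlserver' `
    -DatabaseName 'SalesDb' `
    -Edition 'Standard' `
    -RequestedServiceObjectiveName 'S3'

# ...and later move it to the vCore-based General Purpose tier. Per the rule
# of thumb, 100 Standard DTUs map to roughly 1 vCore; 2 vCores are used here
# because that is the smallest General Purpose Gen5 size.
Set-AzSqlDatabase `
    -ResourceGroupName 'rg-sql-migration' `
    -ServerName 'targetsqlserver' `
    -DatabaseName 'SalesDb' `
    -Edition 'GeneralPurpose' `
    -VCore 2 `
    -ComputeGeneration 'Gen5'
```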
Note
Once you move to the Hyperscale service tier, you can't move to any other service tier.
Determining an Appropriate DTU-Based Service Tier
As a SQL Server DBA migrating to Azure SQL Database, you will need an initial estimate of DTUs in order to assign an appropriate service tier to the database. An appropriate service tier ensures that you meet most of your application's performance goals; estimating a lower or higher service tier than needed results in decreased performance or increased cost, respectively.
This lesson teaches you how to use the DTU Calculator to make an appropriate initial estimate of the service tier. You can change the service tier at any time by monitoring the Azure SQL Database's performance once it's up and running.
Azure SQL Database DTU Calculator
Developed by Justin Henriksen, an Azure Solution Architect at Microsoft, the DTU Calculator can be used to find out the initial service tier for an Azure SQL Database. The calculator is available at https://dtucalculator.azurewebsites.net.
DTU Calculator Work Flow
The DTU Calculator works as shown in the following diagram:
Figure 2.19: DTU Calculator workflow
First, you have to set up a trace to record the following counters for at least an hour:
- Processor: % Processor Time
- Logical Disk: Disk Reads/sec
- Logical Disk: Disk Writes/sec
- Database: Log Bytes Flushed/sec
You can run a trace by using either the command-line utility or the PowerShell script provided on the DTU Calculator website.
Capture the counters on a workload similar to that of the production environment. The trace generates a CSV report.
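If you'd rather capture the counters yourself than use the provided utilities, the same four counters can be collected with PowerShell's built-in Get-Counter cmdlet, as sketched below. The counter paths assume a default SQL Server instance, and the resulting CSV's column layout may need adjusting to match what the DTU Calculator expects:

```powershell
# Sketch: capture the four counters used by the DTU Calculator for one hour
# (3,600 one-second samples) and export them to a CSV file. The SQLServer:
# prefix assumes a default instance; use MSSQL$<InstanceName>: for a named
# instance. (Export-Counter is available in Windows PowerShell.)
$counters = @(
    '\Processor(_Total)\% Processor Time',
    '\LogicalDisk(_Total)\Disk Reads/sec',
    '\LogicalDisk(_Total)\Disk Writes/sec',
    '\SQLServer:Databases(_Total)\Log Bytes Flushed/sec'
)

Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    Export-Counter -Path 'C:\PerfLogs\sql-perfmon-log.csv' -FileFormat CSV
```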
Note
These utilities capture the counters at the server level and not at the database level. To capture at the database level, use SQL Server Dynamic Management Views (DMVs).
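For example, the transaction log counter is exposed per database through the sys.dm_os_performance_counters DMV. The following is a minimal sketch using the SqlServer PowerShell module, with placeholder server and database names:

```powershell
# Sketch: read the per-database 'Log Bytes Flushed/sec' counter from the
# sys.dm_os_performance_counters DMV. This is a rate counter, so a real
# collection script samples it periodically and works with the deltas.
$query = @"
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Log Bytes Flushed/sec'
  AND instance_name = 'SalesDb';   -- placeholder database name
"@

Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'master' -Query $query
```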
The DTU Calculator analyzes the CSV report and suggests an initial service tier.
Finding an Initial Service Tier for the Database to be Migrated Using the DTU Calculator
Let's get back to Mike. He is unsure which service tier to select when migrating to Azure SQL Database, so he wants to use the DTU Calculator to choose one. The following steps describe how he can use the DTU Calculator to determine the initial service tier for his database:
- Open https://dtucalculator.azurewebsites.net and download the command-line utility to capture the performance counters.
Note
You can also use the copy saved at C:\Code\Lesson02\DTUCalculator\sql-perfmon-cl.
The sql-perfmon-cl folder has two files, SqlDtuPerfmon.exe and SqlDtuPerfmon.exe.config. The first is an executable that, when run, captures the counters into a CSV file; the second is a configuration file that specifies the counters to be captured.
- Open SqlDtuPerfmon.exe.config in Notepad and make changes, as suggested in the following points.
Change the SQL instance name to be monitored:
Figure 2.20: Change in the config file
Under the SQL COUNTER comment, modify the value of SqlCategory to match the instance name of your SQL Server.
If you are running on a default instance, then replace MSSQL$SQL2016:Databases with SQLServer:Databases.
If you are running on a named instance, say, Packtpub, then replace MSSQL$SQL2016:Databases with MSSQL$Packtpub:Databases.
Set the output file path:
Figure 2.21: Output file path
If you wish to change the output file, modify the value for the CsvPath key shown in the preceding screenshot.
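If you prefer to script these edits instead of using Notepad, the following is a minimal sketch. It assumes the configuration file stores these settings as standard .NET appSettings key/value entries, which may not match the actual file layout exactly:

```powershell
# Sketch: update the SqlCategory and CsvPath keys in SqlDtuPerfmon.exe.config.
# Assumes the settings live in a standard .NET <appSettings> section.
$configPath = 'C:\Code\Lesson02\DTUCalculator\sql-perfmon-cl\SqlDtuPerfmon.exe.config'
[xml]$config = Get-Content -Path $configPath -Raw

($config.configuration.appSettings.add |
    Where-Object { $_.key -eq 'SqlCategory' }).value = 'SQLServer:Databases'
($config.configuration.appSettings.add |
    Where-Object { $_.key -eq 'CsvPath' }).value = 'C:\Code\Lesson02\DTUCalculator\sql-perfmon-log.csv'

$config.Save($configPath)
```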
You are now ready to capture the performance trace.
- Double-click the C:\Code\Lesson02\DTUCalculator\sql-perfmon-cl\SqlDtuPerfmon.exe file to run the performance trace.
A new command prompt will appear and will display the counters as they are being monitored and saved into the CSV file.
You should get a command-line window similar to the one shown in the following screenshot:
Figure 2.22: Command Prompt window with counters
- The next step is to upload the output file to the DTU Calculator website and analyze the results. To do this, open https://dtucalculator.azurewebsites.net/ and scroll down to the Upload the CSV file and Calculate section.
In the Cores text box, enter the number of cores of the machine on which you ran SqlDtuPerfmon.exe to capture the performance counters.
Click Choose file and, in the file open dialog box, select the C:\Code\Lesson02\DTUCalculator\sql-perfmon-log.csv file.
Click the Calculate button to upload and analyze the performance trace:
Figure 2.23: Panel to upload the CSV file
- The DTU Calculator will analyze the trace and then suggest the service tier you should migrate your database to. It gives an overall recommendation and also breaks the recommendation down by CPU, IOPS, and log utilization individually.
Overall Recommendation
The DTU Calculator suggests migrating to the Premium – P2 tier, as it will cover approximately 100% of the workload.
You can hover your mouse over different areas of the chart to check what percentage of the workload each tier covers:
Figure 2.24: Service tier recommendation window
Recommendation Based on CPU Utilization
The calculator recommends the Premium – P2 tier based on CPU utilization:
Figure 2.25: Recommendation based on CPU utilization
Recommendation Based on IOPS Utilization
Based solely on IOPS utilization, the DTU Calculator suggests going for a Standard – S2 tier, which covers approximately 90% of the workload. However, it also mentions that approximately 10% of the workload will require a higher service tier. Therefore, the decision lies with you. The Premium – P2 service tier is costlier, but it covers 100% of the workload. If you are okay with 10% of the workload performing slowly, you can choose the Standard – S2 service tier.
Remember, this is an estimation of the service tier and you can at any time scale up to a higher service tier for improved performance:
Figure 2.26: Recommendation based on IOPs utilization
Recommendation Based on Log Utilization
Based solely on log utilization, the DTU Calculator recommends the Basic service tier. However, it also mentions that approximately 11% of the workload requires a higher service tier.
At this point, you may decide to start with the Basic service tier and then scale up as and when required. This will certainly save money. However, you should consider that the Basic service tier allows only 30 concurrent workers (requests) and has a maximum database size limit of 2 GB:
Figure 2.27: Recommendation based on log utilization
Determining Compatibility Issues
Once you have finalized the service tier, the next step is to migrate both the schema and the data from the on-premises SQL Server database to Azure SQL Database. As we learned in Lesson 1, Microsoft Azure SQL Database Primer, not all on-premises SQL Server features are supported on Azure SQL Database. Therefore, you will first have to run a compatibility assessment to find and fix any compatibility issues.
The following tools are available to detect compatibility issues. Although these tools can also be used to migrate the database, in this section we'll specifically talk about using them to assess compatibility.
Data Migration Assistant
Data Migration Assistant (DMA) is a standalone tool for detecting compatibility issues and migrating on-premises SQL Server databases to Azure SQL Database. It provides an easy-to-use, wizard-driven graphical user interface for compatibility assessment and migration.
It detects and highlights compatibility issues. Once all compatibility issues are identified and fixed, you can migrate the database.
SQL Server Data Tools (SSDT) for Visual Studio
SSDT is the best tool for finding and fixing compatibility issues for Azure SQL Database (V12), as it has the most recent compatibility rules. Compatibility issues can be fixed from SSDT itself, after which the database can be migrated.
SQL Server Management Studio (SSMS)
SSMS provides two options to detect compatibility issues and migrate:
- Export Data-tier Application: This exports the schema and data to a BACPAC file and, while doing so, lists any incompatibilities found.
- Deploy Database to Microsoft Azure SQL Database: This deploys the database to Azure SQL Database by first exporting the database to a BACPAC file and then importing the BACPAC file into an Azure SQL Database. It lists incompatibilities when generating the BACPAC file.
SQLPackage.exe
This is a command-line tool that helps to automate database development tasks such as importing, exporting, and extracting BACPAC or DACPAC files. Its primary purpose is to help automate database lifecycle management; however, it can also be used to detect and report the incompatibilities found.
It is included with SSDT. You can download the latest version of SSDT from here: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017.
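As a rough illustration, an export run might look like the following; the SqlPackage.exe path, server name, database name, and output file are all placeholders. Objects that aren't supported by Azure SQL Database are reported as errors during the export, giving you a list of issues to fix:

```powershell
# Sketch: export a database to a BACPAC file with SqlPackage.exe.
# The installation path and all names are placeholders.
& 'C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe' `
    /Action:Export `
    "/SourceServerName:localhost" `
    "/SourceDatabaseName:SalesDb" `
    "/TargetFile:C:\Temp\SalesDb.bacpac"
```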
SQL Azure Migration Wizard
This is a community-supported tool that provides a wizard-driven graphical user interface to detect compatibility issues and migrate the database. The tool hasn't been fully updated to work with the later Azure SQL Database V12 version.
You can download it from here: https://github.com/adragoset/SQLAzureMigration.