
Setting Up Your Cost-Effective Azure Database Environment (3)

Chapter 3

Migrating Your On-Premises SQL Server to Azure SQL Database

Migration to the cloud is a topic that will only become more important, as many companies nowadays want to move their infrastructure to the cloud.


Some of the advantages to using a cloud-based solution include:

  • Cost Savings

  • Security

  • Flexibility

  • Mobility

  • Insight

  • Increased Collaboration

  • Quality Control

  • Disaster Recovery

  • Loss Prevention

  • Automatic Software Updates

  • Competitive Edge

  • Sustainability

In this article, we’ll mainly cover database migration. Many companies have the misconception that migrating to the cloud is really expensive. However, those weighing the advantages and disadvantages of the cloud need to consider more than just the initial price; they need to consider ROI. If you’re worried about paying for features you don’t need or want, most cloud-computing services are pay-as-you-go, so ultimately you only pay for what you use.


When it comes to database migration, there are many ways to move your database to the cloud. The two methods I’ll be covering today are:

  • Creating a .bacpac file, which is similar to a .bak file in that it encapsulates both schema and data. We’ll deploy the .bacpac file to an empty database straight from SSMS.

  • Using the Data Migration Assistant/Data Migration Service. The tool performs assessments to identify feature-parity and database-compatibility issues when moving your SQL Server database to any of the available deployment options. It produces a report of all the compatibility issues your database has and suggests possible fixes. The tool can also be used for migrating data.

An important note: for this project we’ll be migrating a non-live database (aka data at rest), which means lifting and shifting the database will be quite easy. In a real-world scenario, you would also need to consider replacing all of your ETL processes with equivalent ones in the cloud.


I recommend doing a pre-migration checklist and a post-migration checklist.


The pre-migration checklist might include questions such as the following (a few of them can be answered with the T-SQL sketch after the list):

  • What is the size of your database?

  • How many schemas and tables do you have?

  • How many really big tables do you have?

  • Do you have engine-specific data types that won’t be migrated by your migration tool?

  • How hot (busy) is your source database?

  • What kind of users, roles, and permissions do you have on the source database?

  • How can your database be accessed (firewalls, tunnels, VPNs)?

  • Do you have enough bandwidth to move all your data?

  • Can you afford downtime? How much?

  • Do you need the source database to stay alive after the migration? For how long?

  • What are your high availability (HA) requirements?

  • Does all the data need to move?

  • Does it need to move to the same place?

  • What happens to your database after migration?

  • What is your contingency plan if things go wrong?
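
Several of these questions can be answered directly from the source instance. Below is a minimal T-SQL sketch, assuming the AdventureWorksDW2017 database from this project (swap in your own database name), that reports database size, schema and table counts, and the largest tables:

USE AdventureWorksDW2017;

-- Database size in MB (data and log files)
SELECT name AS file_name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.master_files
WHERE database_id = DB_ID('AdventureWorksDW2017');

-- How many schemas and tables do we have?
SELECT COUNT(DISTINCT schema_id) AS schema_count, COUNT(*) AS table_count
FROM sys.tables;

-- The ten biggest tables by row count
SELECT TOP (10) s.name AS schema_name, t.name AS table_name, SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.partitions AS p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY s.name, t.name
ORDER BY row_count DESC;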

The post-migration checklist will be more focused on the types of testing and monitoring you can do after the migration is complete:

  • Test the throughput of the migration process (number of records per unit of time). This testing verifies that the planned downtime is sufficient; for planning purposes, also budget time to confirm the migration process completed successfully.

  • Compare Migrated Records to Records Generated by the Destination System – Ensure that migrated records are complete and usable in the appropriate context.

  • Summary Verification – Several techniques provide summary information, including record counts and checksums (see the sketch after this list). Here, the number of records in the destination system is compiled and then compared to the number of records in the source. This approach provides only summary information, and if an issue exists, it rarely provides insight into the issue’s root cause.

  • Compare Migrated Records to Sources – Tests should verify that field values are migrated as per the migration specification. In short, source values and the field-level mappings are used to calculate the expected results at the destination. This testing can be completed using sampling where appropriate; if the migration includes data that poses significant business or compliance risk, 100% of the migrated data can be verified using an automated testing tool.
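
As a concrete example of the summary-verification step, here is a minimal T-SQL sketch to run on both the source and the destination, comparing the two outputs (DimCustomer is just one AdventureWorksDW2017 table, used for illustration):

-- Row count per table; run on source and destination, then diff the output.
SELECT s.name AS schema_name, t.name AS table_name, SUM(p.rows) AS row_count
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.partitions AS p ON p.object_id = t.object_id AND p.index_id IN (0, 1)
GROUP BY s.name, t.name
ORDER BY schema_name, table_name;

-- Order-independent checksum of one table's contents.
-- CHECKSUM can collide, so treat a match as strong evidence, not proof.
SELECT CHECKSUM_AGG(CHECKSUM(*)) AS table_checksum
FROM dbo.DimCustomer;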

There are other stages of database testing, which I’ll cover in a separate article unrelated to this project, so stay tuned for more information on database migration testing.


Migration Method 1: .bacpac

To begin with, we’ll need to open SSMS and connect to our source and destination servers. For this project, I’ll be using a localhost database instance (as my on-premises solution).



The database we’ll be migrating today is AdventureWorksDW2017. Right-click the database and go to Tasks -> Deploy Database to Microsoft Azure SQL Database.




You’ll see the wizard window pop up, which will help you deploy your database quickly. Click Next to move to the next window. Specify the target connection and connect to your cloud database server. For this project, I’ll be using the Basic tier; assess which pricing tier works best for your organization depending on your requirements.



Click Next to see the summary, then finish deploying the database. Your final results should look like this.



To ensure our database has been successfully migrated, refresh the Databases folder in SSMS and AdventureWorksDW2017 should appear.
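
If you prefer a query to the Object Explorer refresh, a quick sanity check like the following, run against the Azure server, confirms the database exists and reports the tier it landed on (DATABASEPROPERTYEX exposes the edition and service objective on Azure SQL Database):

-- Confirm the database arrived (run against the Azure server's master database):
SELECT name, create_date
FROM sys.databases
WHERE name = 'AdventureWorksDW2017';

-- Confirm the pricing tier (this project should report Basic):
SELECT DATABASEPROPERTYEX('AdventureWorksDW2017', 'Edition') AS edition,
       DATABASEPROPERTYEX('AdventureWorksDW2017', 'ServiceObjective') AS service_objective;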


You’re all set and your data at rest has been migrated successfully.


Migration Method 2: Data Migration Assistant

Database Compatibility Assessment

Method 1 is quite easy to grasp. Unfortunately, in a real-world scenario your on-premises data warehouse might not be compatible with Azure, for example if you’re using an older version of SQL Server.


To identify and resolve compatibility issues, you can use the Data Migration Assistant, which will help you assess your database.


You can download the DMA (Data Migration Assistant) from the following link: https://www.microsoft.com/en-us/download/details.aspx?id=53595


Once you have it installed, open it and select Project Type -> Assessment. Give the project any name you like.


The next step is to connect to your source, the on-premises SQL Server instance. Tick Trust Server Certificate as well.




Once authenticated, you’ll be asked to select the database you want assessed. In our case this is AdventureWorksDW2017. Select it and click the Add button.


Once it’s added, click the Start Assessment button in the bottom-right corner. Compatibility level 140 is fairly recent, so I’m expecting almost no issues in this assessment.


The alignment of SQL Server versions to default compatibility levels is as follows (a quick query to check and change your own database’s level follows the list):

  • 100: in SQL Server 2008 and Azure SQL Database

  • 110: in SQL Server 2012 and Azure SQL Database

  • 120: in SQL Server 2014 and Azure SQL Database

  • 130: in SQL Server 2016 and Azure SQL Database

  • 140: in SQL Server 2017 and Azure SQL Database

  • 150: in SQL Server 2019 and Azure SQL Database
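
Checking your database’s level only takes a line of T-SQL, and you can raise it ahead of the assessment. The ALTER below is only a sketch; test your workload at the new level first:

-- Current compatibility level of the source database
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'AdventureWorksDW2017';

-- Optionally align it with SQL Server 2017 before assessing
ALTER DATABASE AdventureWorksDW2017 SET COMPATIBILITY_LEVEL = 140;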


The results are:



Neither of these will cause any problems when migrating, so we’re all set for the next step: migration.


Database Migration with DMA

Once assessed, we can proceed to migrate the database to the cloud. Create a new project, this time selecting Project Type -> Migration. Give the project any name and leave everything else at its defaults.



The first step is selecting our source server. Select it and make sure you tick Trust Server Certificate. Also, select the database you want to migrate.


Then select your destination database (this is your Azure SQL Database).


Then select the objects you want to migrate. In our case, it would be everything. However, you can use this tool to migrate individual objects as well.

Once you’ve chosen your objects, the DMA tool will generate the SQL scripts for you, deploy the schema, and migrate the data.
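
If you want a quick inventory of what the tool will script out before you deploy, something like the following query, a rough approximation rather than DMA’s internal logic, lists the user objects on the source:

SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
FROM sys.objects AS o
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
WHERE o.is_ms_shipped = 0
  AND o.type IN ('U', 'V', 'P', 'FN', 'IF', 'TF')  -- tables, views, procedures, functions
ORDER BY o.type_desc, schema_name, object_name;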


After deployment, your results should look like this.


Select the tables you want migrated. In our case, since I already migrated all of the data using the wizard, the DMA tool has grayed out my tables so I can’t migrate the data twice.


Click Start data migration and wait for the data to migrate.


You’re all set and your schema and data are migrated to the cloud. :)
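
As a final sanity check, one simple habit is to run the query below on both the source and the destination and compare the outputs; together with the row counts from the post-migration checklist, matching object counts give reasonable confidence that both schema and data made it across:

SELECT type_desc, COUNT(*) AS object_count
FROM sys.objects
WHERE is_ms_shipped = 0
GROUP BY type_desc
ORDER BY type_desc;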



In the next chapter we'll be covering automation in the cloud, so stay tuned. I'll be comparing the Azure Automation account and the Azure Data Factory we set up in Chapter 2 to see which option is more efficient and cost-effective.
