Challenges of moving SQL Server data to Microsoft Azure SQL Server PaaS

This post covers one of the challenges we faced when moving our infrastructure from on-premises to Microsoft Azure: moving SQL Server data from on-premises to Microsoft SQL Server PaaS.

Moving our data was a challenge for the following reasons:

  • Our company has tight controls and rules about what software/connectivity is allowed on our network.
  • The database that we needed to migrate was over 100 GB in size, even after compression.
  • We had no information about how long the various migration approaches would take, and we needed to ensure our migration was complete over the migration weekend, before the following Monday.

For most people migrating their data, the best option is the Azure Database Migration Service, especially for a database of the size we were transferring. We couldn't use the Azure Database Migration Service for the following reasons:

  • Azure Database Migration Service requires software to be installed on a server within our network.
  • Connections to both the source and target databases were needed.

Neither of these was acceptable to our company's security teams.

Another approach Azure offers for transferring data is the Azure Data Box. There are various Azure Data Box options available, including the Azure Data Box Disk. There were a few reasons we had to discount this approach:

  • The lead time for the Azure Data Box Disk exceeded our migration window.
  • There are extra costs for using this approach.
  • Our use case didn't precisely fit the intended use cases for Azure Data Boxes.

Azure's final method for data transfer, and the one we used, is SqlPackage (DacFx), which creates a BACPAC file for exporting and importing. Our risks with SqlPackage were as follows (a sketch of the export command appears after this list):

  • Can it handle a database of over 100 GB in size (our largest table has 400 million rows of data)?
  • How long will it take to package, transfer, and import?
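
To illustrate, a BACPAC export with SqlPackage looks roughly like the sketch below. The server name, database name, and file path are placeholders, not the exact values from our environment:

    REM Export the on-premises database to a BACPAC file.
    REM Server, database, and path are illustrative placeholders.
    SqlPackage /Action:Export ^
        /SourceServerName:"ONPREM-SQL01" ^
        /SourceDatabaseName:"OurDatabase" ^
        /TargetFile:"D:\Export\OurDatabase.bacpac"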

We found that SqlPackage could handle our database size once we resolved the following packaging errors:

  • Cannot access a closed stream.
  • Not enough disk space to complete operation, even though the final location of the BACPAC file had enough free disk space.

To resolve the "not enough disk space" error, we found that we needed to move the Windows temp folder (the TMP and TEMP environment variables) to a disk with at least as much free space as the size of the database.

For the "cannot access a closed stream" error, we noticed that the location set by the LOCALAPPDATA environment variable needed to be on a disk with more than 80 GB of free space.
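
A minimal sketch of both fixes, assuming D: is the disk with enough free space (the paths are illustrative), is to redirect all three variables just for the session that runs SqlPackage:

    REM Point the temp and local app data folders at a disk with enough
    REM free space for this session only, then run the export.
    set TMP=D:\SqlPackageTemp
    set TEMP=D:\SqlPackageTemp
    set LOCALAPPDATA=D:\SqlPackageTemp
    SqlPackage /Action:Export /SourceServerName:"ONPREM-SQL01" /SourceDatabaseName:"OurDatabase" /TargetFile:"D:\Export\OurDatabase.bacpac"

Newer SqlPackage builds also expose a /p:TempDirectoryForTableData=<path> property that redirects the table-data buffer files directly, which may be a tidier alternative.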

We were able to complete the process of packaging, copying, and importing into Azure within our required timeframe. Here is our breakdown of each stage (a sketch of the copy and import commands follows the list):

  • Exporting to a BACPAC completed in about 9 hours.
  • Copying from our on-premises server to Azure took just over an hour using AzCopy.
  • Importing into our SQL Server PaaS database took about 9 hours to complete.
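
For reference, the copy and import steps look roughly like the sketch below. The storage account, container, SAS token, server, and credentials are all placeholders, and it assumes the import is run from a machine that can read the copied BACPAC:

    REM Copy the BACPAC to Azure Blob Storage (AzCopy v10 syntax);
    REM storage account, container, and SAS token are placeholders.
    azcopy copy "D:\Export\OurDatabase.bacpac" "https://ourstorage.blob.core.windows.net/migration/OurDatabase.bacpac?<SAS-token>"

    REM Import the BACPAC into the Azure SQL PaaS database;
    REM server, database, and login are placeholders.
    SqlPackage /Action:Import ^
        /SourceFile:"D:\Export\OurDatabase.bacpac" ^
        /TargetServerName:"ourserver.database.windows.net" ^
        /TargetDatabaseName:"OurDatabase" ^
        /TargetUser:"migration_admin" ^
        /TargetPassword:"<password>"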
