Planning a Migration from Azure SQL DB to Azure Managed Instance

Problem

Your organization has been using Azure SQL DB (Single Database) for a while now, and has realized that several missing features are hindering your development efforts and limiting your product offerings.

You need wider access to instance-level features on the one hand, but on the other hand you still want to retain managed-service features in the Azure cloud, such as the 99.99% uptime SLA, automatic updates, Advanced Data Security, and Azure SQL Log Analytics. Basically, you don't want to have to manage an IaaS virtual machine.

Therefore, you reach the conclusion that Azure SQL Managed Instance (MI) is the right solution for you at this time.

Example features in Azure SQL MI that are unavailable in Azure SQL DB:

  • Cross-database access and manageability (closest to that is External Data Source)
  • Using your database(s) as Publisher(s) in Replication
  • BACKUP/RESTORE database commands via T-SQL (to/from URL)
  • .NET SQL CLR Support
  • Server-scoped triggers or logon triggers
  • Change Data Capture (CDC) (closest to that is Temporal Tables)
  • Resource Governor
  • Service Broker
  • And more

» See the Microsoft Docs for the full feature comparison between Azure SQL DB and Azure SQL MI

» See the Microsoft Docs for a full features overview for Azure SQL Managed Instance

Examining Our Options

While planning your migration from Azure SQL DB to Managed Instance, you suddenly realize that your options are very much limited…

What we cannot do:

  • You cannot back up and restore from Azure SQL DB to a Managed Instance.
  • You cannot use Azure SQL DB as a source in the Azure Database Migration Service (DMS).
  • You cannot use Azure SQL DB as a source in the Data Migration Assistant (DMA).
  • You cannot use Azure SQL DB as a source in Database Mirroring, AlwaysOn, Transaction Log Shipping, or Replication.
  • Implementing a custom T-SQL, SSIS, or ADF (Azure Data Factory) solution will be too complicated, and will require an unacceptable development overhead.

What can we do?

There are only two possible options left (at this time, anyway):

  1. SSIS Export/Import Data Wizard in SSMS (dtswizard)
  2. Export/Import Data-Tier Application Wizard in SSMS (SqlPackage.exe)

Using these wizards in their simplest form, however, would mean full application downtime during the entire migration process.
This could be unacceptable in your scenario, either because you have a strict uptime SLA for your application, or because your database is simply too large to handle this way.

However, in both of these solutions you can also specify subsets of tables to copy.

Depending on how our application behaves, this may help us perform the migration gradually, where in every “iteration” we would copy a specified subset of tables.

This, of course, would NOT help us if our application has a need to access all of the tables, all of the time.

But, if there is some kind of logical separation in our database, for example based on schema, then we could implement the migration schema-by-schema.

Or, if there is some kind of separation between “hot” and “cold” data in different tables, then we could start with migrating “cold” data first (which could be done without downtime), and finish with the “hot” data in the last iteration (which could be done with a relatively short downtime).

See the sections below for more info.

SSIS Export/Import Data Wizard in SSMS (dtswizard)

This wizard, available in SSMS, generates an “ad-hoc” SSIS package based on the configuration you choose. It's quite straightforward and functions as an almost direct “pipeline” of data between the two servers.

To start the wizard, right-click on a database and select Tasks > Export Data…
Follow the wizard steps.
Use the tables and views page to select your subsets of tables to copy.

Possible Issues

  • This method copies table data only; it does not copy any other objects such as constraints, triggers, indexes, procedures, or functions.
  • You cannot automate this process. While you can use the wizard to generate an SSIS package for later use, you cannot run dtswizard with command-line arguments. The dtswizard command is, in fact, the wizard itself, and it cannot receive parameters from outside.

Possible Solutions

One way you could solve these issues (albeit a painstaking one) is to use the wizard to generate, in advance, one SSIS package per subset of tables. But this will have to be done manually.

Only then, when all of the SSIS packages are ready, you can use the DTEXEC command line utility to execute them as part of an automation process.

The automation process, by the way, can also be executed using the Azure-SSIS Integration Runtime in Azure Data Factory (ADF).

Export/Import Data-Tier Application Wizard in SSMS (SqlPackage.exe)

The biggest advantage of the Data-Tier application wizard over the SSIS Data Wizard is that it uses the SqlPackage executable behind the scenes, which is a fully functioning command-line utility that can accept arguments.

You can use SqlPackage to export a BACPAC file from Azure SQL DB and import it into an Azure Managed Instance, and you can fully automate the whole process. There's no need to painstakingly specify the table subsets manually: just write something like a PowerShell script that generates these table subsets automagically within a loop (or a T-SQL script that prints out the SqlPackage.exe commands for later use).
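As a sketch of the T-SQL option: the server name, database name, credentials, and file path below are placeholders, and the query assumes you want one BACPAC per schema (adjust the grouping to match your own subsets):

```sql
-- Sketch: print one SqlPackage.exe Export command per schema, with that schema's
-- tables listed as /p:TableData parameters.
-- Server, database, credentials and file path are placeholders -- replace them with your own.
-- (STRING_AGG requires Azure SQL DB / SQL Server 2017+.)
SELECT N'sqlpackage.exe /a:Export'
     + N' /SourceServerName:myAzureSQLDBServer.database.windows.net'
     + N' /SourceDatabaseName:mySourceDatabase'
     + N' /SourceUser:myUserName /SourcePassword:myPassword'
     + N' /TargetFile:D:\myPath\' + s.name + N'.bacpac '
     + STRING_AGG(CAST(N'/p:TableData=' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) AS nvarchar(MAX)), N' ')
       AS sqlpackage_command
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
GROUP BY s.name;
```

Copy the resulting commands into a batch file or a PowerShell script for later execution.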

If we look at the parameters for the SqlPackage utility (specifically, for the Export action), we find a property called: /p:TableData:

Property: /p:TableData
Value: (STRING)
Description: Indicates the table from which data will be extracted. Specify the table name with or without the brackets surrounding the name parts in the following format: schema_name.table_identifier. This option may be specified multiple times.

(Excerpt from the Microsoft Docs page)

This parameter will let us use SqlPackage to export a specific subset of tables. This ability is also available as part of the Import/Export Data-Tier Application Wizard, when you use the “Advanced” tab in the “Export Settings” page:

This is the Export Data-Tier Application wizard in SSMS.
However, using the SqlPackage command-line would be preferable if you want to automate this.

Possible Issues

  • Every subset of tables that you specify must also include all of their Foreign Key related tables. The Export action of SqlPackage, when used with the /p:TableData parameter, includes the full database schema definition in the BACPAC, which means that you cannot export “orphaned” data.
  • The target database specified for the Import action MUST BE EMPTY OF ANY USER OBJECTS. This is a limitation of SqlPackage, which means we cannot incrementally import data into the same target database.
  • Exporting a BACPAC file is very similar to extracting a DACPAC file, and as part of this process a “schema validation” step is run. This step will FAIL if you have VIEWS or FUNCTIONS in your database that reference invalid objects. These “invalid objects” may include:
    • Objects that no longer exist
    • Objects in another database or server
    • System objects

Possible Solutions

All tables that you specify must include any Foreign Key related tables

If you’re trying to migrate a table which is involved in a foreign key relationship, then you must migrate all of its “child” tables as well as “parent” tables. In some schemas, this could feel like pulling a huge yarn ball thread.

If this is what your schema ERD looks like, you’d have to import one whole “ball” at a time.
But if ALL of your tables are connected via FK to each other…
Then you’d have to either import all of them, or consider dropping the foreign keys.
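If you do consider dropping the foreign keys, a sketch like the following can generate the DROP CONSTRAINT commands for you. Be sure to script out the existing key definitions first (e.g. using Generate Scripts in SSMS), so you can re-create them after the migration:

```sql
-- Sketch: generate a DROP CONSTRAINT command for every foreign key in the database.
-- Script out the existing key definitions BEFORE running the generated commands,
-- so the keys can be re-created on the target after the data is migrated.
SELECT N'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
     + N' DROP CONSTRAINT ' + QUOTENAME(fk.name) + N';' AS drop_command
FROM sys.foreign_keys AS fk
JOIN sys.tables AS t ON t.[object_id] = fk.parent_object_id;
```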

You can use something like the following script to easily detect all of your table subsets (i.e. “yarn balls”):
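Here is one way such a script could look. It's a sketch that treats the foreign keys in sys.foreign_keys as an undirected graph, and groups tables into its connected components via simple label propagation:

```sql
-- Sketch: group tables into connected components of the foreign-key graph.
-- Start with every table in its own group.
SELECT t.[object_id], t.[object_id] AS subset_group_id
INTO #groups
FROM sys.tables AS t;

-- Repeatedly give each table the smallest group id among its FK neighbors,
-- until nothing changes anymore.
WHILE 1 = 1
BEGIN
    UPDATE g
       SET subset_group_id = n.min_group_id
      FROM #groups AS g
      JOIN (
            SELECT e.[object_id], MIN(g2.subset_group_id) AS min_group_id
              FROM (
                    SELECT fk.parent_object_id AS [object_id], fk.referenced_object_id AS neighbor_id
                      FROM sys.foreign_keys AS fk
                     UNION ALL
                    SELECT fk.referenced_object_id, fk.parent_object_id
                      FROM sys.foreign_keys AS fk
                   ) AS e
              JOIN #groups AS g2 ON g2.[object_id] = e.neighbor_id
             GROUP BY e.[object_id]
           ) AS n
        ON n.[object_id] = g.[object_id]
     WHERE n.min_group_id < g.subset_group_id;

    IF @@ROWCOUNT = 0 BREAK;
END;

-- One row per table, with its "yarn ball" id.
SELECT SCHEMA_NAME(t.schema_id) AS [schema_name],
       t.name AS table_name,
       DENSE_RANK() OVER (ORDER BY g.subset_group_id) AS subset_group_id
FROM #groups AS g
JOIN sys.tables AS t ON t.[object_id] = g.[object_id]
ORDER BY subset_group_id, [schema_name], table_name;

DROP TABLE #groups;
```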

Each group of tables with the same subset_group_id would represent a “yarn ball”

The target database specified for the Import action MUST BE EMPTY OF ANY USER OBJECTS.

Otherwise, you can get the following error:

*** Error importing database: Data cannot be imported into target because it contains one or more user objects. Import should be performed against a new, empty database.

But wait, this is a Managed Instance we’re importing into, right? Which means we have cross-database access. So, instead of incrementally loading directly into the target database, we can load every iteration into a “staging” database, and then use T-SQL, PowerShell, DbaTools, or something like the SSIS Import/Export Data Wizard to load the data directly from the staging database into the target database.
It may require some development overhead from us, but it should be significantly easier than implementing a full end-to-end migration.
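As a sketch of that staging-to-target copy (the database names are placeholders), you could generate one cross-database INSERT…SELECT per table in the staging database:

```sql
-- Sketch: generate one cross-database INSERT...SELECT per table in the staging
-- database. Database names are placeholders; identity columns and insert order
-- (parents before children) still need to be handled per your own schema.
USE myDB_Staging;

SELECT N'INSERT INTO myTargetDB.' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name)
     + N' SELECT * FROM ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + N'.' + QUOTENAME(t.name) + N';'
       AS copy_command
FROM sys.tables AS t;
```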

The Export operation will FAIL, if you have VIEWS or FUNCTIONS in your database which reference invalid objects. These “invalid objects” may include:

  • Objects that no longer exist
  • Objects in another database or server
  • System objects

This means that before we can start our migration process, we’ll have to get rid of all these reference issues.
The last two are particularly problematic, and may require breaking changes to your database schema.

If this is a serious issue for you, you may have to perform the migration (mostly manually) using the SSIS Import/Export Data Wizard instead.

Specifically in Azure SQL DB, it’s also likely that you have two “system” schemas in your database, called “jobs” and “jobs_internal”, which are used by some automated processes. You will have to drop these schemas and everything in them if you want to use SqlPackage for the migration.

You can use the following script to make it easier, but use at your own risk!
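A sketch of such a script could look like the following. It generates the DROP commands rather than executing them, so you can review the output first:

```sql
-- Sketch (USE AT YOUR OWN RISK): generate DROP commands for everything in the
-- "jobs" and "jobs_internal" schemas, then for the schemas themselves.
-- Review the printed output before executing it; FK dependencies may require
-- re-ordering the generated DROP TABLE commands.
DECLARE @sql nvarchar(MAX) = N'';

SELECT @sql += N'DROP '
     + CASE WHEN o.[type] = 'U' THEN N'TABLE '
            WHEN o.[type] = 'V' THEN N'VIEW '
            WHEN o.[type] = 'P' THEN N'PROCEDURE '
            WHEN o.[type] IN ('FN','IF','TF') THEN N'FUNCTION '
       END
     + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name) + N';' + NCHAR(13) + NCHAR(10)
FROM sys.objects AS o
WHERE SCHEMA_NAME(o.schema_id) IN (N'jobs', N'jobs_internal')
  AND o.[type] IN ('U','V','P','FN','IF','TF');

SELECT @sql += N'DROP SCHEMA ' + QUOTENAME(s.name) + N';' + NCHAR(13) + NCHAR(10)
FROM sys.schemas AS s
WHERE s.name IN (N'jobs', N'jobs_internal');

PRINT @sql;                      -- review first
-- EXEC sys.sp_executesql @sql;  -- then execute when you're confident
```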

Dropping these schemas may cause some Azure automated processes to stop working

What Else We Need

It should go without saying that, for best performance, it would be ideal to have our Azure SQL Managed Instance in the same region as its source Azure SQL DB.

However, we also need some kind of “middle-man” that is capable of running SqlPackage and saving the BACPAC files to an intermediary file storage, or if you’re using the SSIS Import/Export Data Wizard, then this “middle-man” needs to be able to run SSIS packages.

Running SqlPackage

It would’ve been preferable to use Azure Powershell or Azure CLI via the Portal, and use a storage account to save the BACPAC file… But unfortunately, this is not supported for Managed Instances, as noted by Microsoft documentation:

A managed instance does not currently support migrating a database into an instance database from a BACPAC file using Azure PowerShell. To import into a managed instance, use SQL Server Management Studio or SQLPackage.

Source: Microsoft Docs

The best option, therefore, is to set up a Virtual Machine in the same Azure Region as the Azure SQL DB and Managed Instance. We’ll be able to install everything we need on this “middle-man” VM, and it will be used for executing SqlPackage and/or DTEXEC.

Of course, if you already have such a VM in the same region which you can use for this purpose – even better!

Running SSIS Packages

Running SSIS packages can be done either using the DTEXEC command-line utility on a VM, or using the Azure-SSIS Integration Runtime in Azure Data Factory.

The latter may be preferable if you want to avoid having to set up a VM.

Constructing Our Plan

So, based on all the information we’ve collected so far, we come up with something like the following outline of a migration plan:

Prerequisites

  • An Azure SQL Managed Instance deployed in the same region as the source Azure SQL DB.
  • A “middle-man” VM in the same region, with SqlPackage and/or DTEXEC installed (or, for SSIS packages, the Azure-SSIS Integration Runtime in ADF).
  • The subsets of tables you’ve chosen to migrate in each iteration.

Migration Process

Using DTEXEC as our main tool

If we want to use DTEXEC as our main migration tool, then we could do the following steps:

  1. Create your destination database, either by creating a DACPAC, using Generate Scripts in SSMS, or using SMO in PowerShell.
  2. Use the SSIS Import/Export Data Wizard to prepare SSIS packages, as per the table subsets you’ve chosen.
  3. The migration process itself would involve running DTEXEC for each SSIS package (using something like a command-line batch file, PowerShell, or Azure Data Factory).

Using SqlPackage as our main tool

If we want to use SqlPackage as our main migration tool, then for each subset of tables, we could do the following steps:

  1. Export the subset of tables into a BACPAC using SqlPackage.
  2. Import the BACPAC using SqlPackage into a staging database.
  3. Copy the data from the staging database to the destination database.
  4. Drop the staging database.
  5. Rinse and repeat.

On the “middle-man” VM, run SqlPackage with /a:Export and /p:TableData parameters. Set your Azure SQL DB as the source, and save to disk as the target. Example:

sqlpackage.exe /a:Export /SourceServerName:myAzureSQLDBServer.database.windows.net /SourceDatabaseName:mySourceDatabase /SourceUser:myUserName /SourcePassword:myPassword /TargetFile:D:\myPath\partial_1.bacpac /p:TableData=[dbo].[MyTable1] /p:TableData=[dbo].[MyTable2] /p:TableData=[dbo].[MyTable3]

On the “middle-man” VM, run SqlPackage with the /a:Import parameter, and set the Managed Instance as the target. If this is your first iteration, you could set your destination database as the target. Otherwise, specify an empty or non-existing staging database name. Example:

sqlpackage.exe /a:Import /TargetServerName:MyManagedInstance.appname.database.windows.net /TargetDatabaseName:myDB_Staging /SourceFile:D:\myPath\partial_1.bacpac

Conclusion

I’ve outlined the considerations and steps involved in preparing a migration plan from Azure SQL DB to Azure SQL Managed Instance.

The process is not simple, and in some cases may not even be possible without significant downtime and/or significant schema changes.

This makes sense, actually, from Microsoft’s point of view, considering that, from the start, their Managed Instance solution was offered as a “stepping stone” towards migration from IaaS/on-prem to a fully-managed Azure SQL DB. So, it would make sense that they wouldn’t provide solutions for migrations in the “opposite direction”. 🤷‍♂️

Hopefully, this article has given you a better understanding of the task that stands before you, and of possible methods to accomplish it. Or maybe it has even enlightened you to the gravity of such a migration, and perhaps made you reconsider the whole idea. 😬
