Azure SQL Managed Instance Cross Subscription Database Restore using Azure Data Factory

by info.odysseyx@gmail.com, October 15, 2024

Azure Data Factory (ADF) can automate scheduled, continuous, or on-demand restores of Azure SQL Managed Instance (MI) databases between two separate subscriptions. You must turn off TDE before starting the database restore process. If you need to enable TDE, check out the ABC blog for instructions.

Prerequisites

- Azure SQL Managed Instances in two separate subscriptions (source and target).
- An Azure Blob Storage account in the same subscription as each SQL Managed Instance.
- An Azure Data Factory (ADF) instance.

Required permissions

- To perform backup and restore operations, the SQL Managed Instance managed identity must have the "Contributor" and "Storage Blob Data Contributor" roles on the Blob storage.
- To transfer backup files between the two storage locations, the ADF managed identity needs the "Storage Blob Data Contributor" role on the Blob storage.
- To perform backup and restore operations, the ADF managed identity requires 'sysadmin' permissions on SQL Managed Instance (a T-SQL sketch of this grant appears after Step 4).

Note: Managed identities were used for authorization. If you use a different identity, make sure it has the same permissions assigned to it.

Step 1: Create a server-level credential. Credentials are records that contain the authentication information needed to connect to resources external to SQL Server.

USE master
GO
CREATE CREDENTIAL [https://.blob.core.windows.net/]
WITH IDENTITY = 'Managed Identity'
GO

Verify that the credential was created successfully (a query sketch appears after Step 4).

Step 2: Create ADF linked service connections to both the SQL Managed Instances and your storage accounts, then create ADF datasets that use the SQL Managed Instance and storage account linked services.

Step 3: If you use private endpoints, set up the ADF integration runtime and create an Azure Data Factory managed private link.

Step 4: Create an ADF pipeline to take a database backup on the source. Split the backup into multiple files for a faster backup. Use the script below to perform a copy-only database backup, running it from a Script activity that uses the source SQL MI linked service.

USE master
GO
BACKUP DATABASE [@{pipeline().parameters.source_database_name}]
TO URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_01.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_02.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_03.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_04.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_05.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_06.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_07.bak',
   URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_08.bak'
WITH COPY_ONLY, MAXTRANSFERSIZE = 4194304, COMPRESSION, STATS = 10

Allow one minute for the backup to be transferred to Blob storage, and adjust the period to suit your specific needs.
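Instead of relying only on a fixed wait after the Step 4 backup, you could poll the source instance for backup progress from an additional Script or Lookup activity. This is a minimal sketch, not part of the original pipeline; it uses the standard sys.dm_exec_requests DMV and returns no rows once the backup has finished.

-- Sketch: check whether a BACKUP is still running on the source SQL MI and how far along it is
SELECT
    r.session_id,
    r.command,
    r.percent_complete,
    r.estimated_completion_time / 1000 / 60 AS estimated_minutes_remaining
FROM sys.dm_exec_requests AS r
WHERE r.command LIKE 'BACKUP%';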
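For the verification mentioned in Step 1, one quick way to confirm the server-level credential exists is to query sys.credentials. A simple sketch, run on the managed instance where the credential was created:

-- Sketch: list server-level credentials and confirm the blob credential from Step 1 exists
SELECT name, credential_identity, create_date
FROM sys.credentials;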
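Relating to the permissions section above: one way to give the ADF managed identity 'sysadmin' rights on each SQL Managed Instance is to create a login from the identity and add it to the sysadmin server role. This is a sketch under an assumption, not from the original post; [adf-instance-name] is a hypothetical placeholder for your data factory's name, which is how its system-assigned managed identity appears in Microsoft Entra ID.

-- Sketch (assumption): create a login for the ADF system-assigned managed identity
-- and grant it sysadmin. Replace [adf-instance-name] with your data factory's name.
USE master
GO
CREATE LOGIN [adf-instance-name] FROM EXTERNAL PROVIDER
GO
ALTER SERVER ROLE sysadmin ADD MEMBER [adf-instance-name]
GO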
Step 5: Create an ADF pipeline to copy the database backup files from the source storage account to the target storage account, using a Copy activity. Allow one minute for the backup files to be transferred to Blob storage, and adjust the period to suit your specific needs.

Step 6: Create an ADF pipeline that restores the database from the target storage account to the target SQL Managed Instance. Use the script below to restore the database, running it from a Script activity that uses the target SQL MI linked service (a post-restore check is sketched at the end of this post).

USE [master]
RESTORE DATABASE [@{pipeline().parameters.target_database_name}]
FROM URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_01.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_02.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_03.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_04.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_05.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_06.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_07.bak',
     URL = N'https://.blob.core.windows.net/databaserefresh/@{pipeline().parameters.source_database_name}_08.bak'

Step 7: Use the appropriate activities to remove orphaned database users (a detection sketch appears at the end of this post), grant user access, or set up additional pipelines to perform any other tasks required after the restore.

Step 8: Create an ADF orchestration pipeline that executes Steps 4 > 5 > 6 > 7, in that order. To run the pipeline dynamically against different databases, define parameters for source_database_name and target_database_name.
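After the restore in Step 6, one simple check you could add as another Script activity against the target SQL MI is to confirm the restored database is online. A minimal sketch, reusing the target_database_name pipeline parameter from the scripts above:

-- Sketch: confirm the restored database exists and is ONLINE on the target SQL MI
SELECT name, state_desc, recovery_model_desc
FROM sys.databases
WHERE name = '@{pipeline().parameters.target_database_name}';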
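For the orphaned-user cleanup mentioned in Step 7, the sketch below uses the standard catalog views to list SQL-authenticated database users whose SIDs no longer match a server login on the target instance. The ALTER USER line is a hypothetical example of remapping one of them; the names there are placeholders, not from the original post.

-- Sketch: run in the context of the restored database to detect users
-- with no matching server login after the restore
SELECT dp.name AS orphaned_user, dp.type_desc
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
    ON dp.sid = sp.sid
WHERE sp.sid IS NULL
  AND dp.authentication_type_desc = 'INSTANCE';

-- Example remap (hypothetical names): relink an orphaned user to an existing login
-- ALTER USER [app_user] WITH LOGIN = [app_user];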