
Sunday, October 20, 2019

Migrating Your SQL Server Databases to Azure Managed Instance in Offline Mode Using the Native Backup/Restore Method

Hello Readers !

This is a short blog on how to migrate SQL Server databases to an Azure Managed Instance using native SQL backups taken from our on-premises SQL Servers. This is an offline method: you need downtime for the entire activity, until the connection is re-established to the new database in the cloud, so it is not suitable for very large databases where the business cannot tolerate a long downtime window. It can still be a convenient option for small databases.

Managed Instance databases do not support restoring WITH NORECOVERY, which means we cannot use a restore plan that applies the latest full backup followed by differential and subsequent log backups.

Let's get to it.

Step 1: Creating a Blob container in Azure storage account

Go to your storage account in the Azure portal and click on Containers (basically blob folders). Here we can create a container.



You can refer to my previous detailed blog series on storage accounts to learn more: Part 1 2 3

Step 2: Creating a SQL credential with the SAS key of the storage account

The concept of "credentials" has been around in SQL Server for a long time. A credential is used by SQL Server to connect to resources outside the instance; in our case, the Azure storage account where we created the blob container.

The name of the credential we create should be the complete URL of the blob container that will hold the backup files.
You can easily find it by navigating to your storage account and clicking Properties in the left pane.

Please check the top portion of my blog here to learn how to generate the SAS token, which contains the authentication information to connect to the storage account.

Below is a sample T-SQL script to create the SQL credential using the SAS token:


USE master
GO
CREATE CREDENTIAL [https://<Yourstorageaccount>.blob.core.windows.net/<Containername>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2028-09-06T02:52:55Z&st=2018-09-04T18:52:55Z&spr=https&sig=WOTifdfdfdfdfdf%2BwndxW2CQ7%2B5fHd7Is%3D'


Please make sure to remove the '?' sign at the front of the SAS token; that is a common mistake people make when creating the credential. We need the SQL credential created both in the source SQL Server instance (if you are taking backups directly to the Azure storage account/blob container) and in the managed instance.
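Once created, you can sanity-check the credential from either instance. The catalog view below is standard SQL Server; the name returned should exactly match your container URL:

```sql
-- List existing credentials; the credential name should be the full
-- container URL and the identity should read 'SHARED ACCESS SIGNATURE'.
USE master
GO
SELECT name, credential_identity, create_date
FROM sys.credentials;
```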





Step 3: Backing up your databases to the Azure blob container from the source DB instance

Use the regular BACKUP T-SQL command, with the only difference being "TO URL", to write your backup set to the Azure blob container. Authentication for this backup command to write the backup onto Azure storage is taken care of by the credential we created above with the container URL and the SAS token.

T-SQL command



USE master
GO
BACKUP DATABASE <DBName>
TO URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname.bak'
WITH COMPRESSION


You may also add COPY_ONLY if you are doing a test migration and don't want to affect the backup chain in your DB instance. Backup to URL also supports striping large database backups across multiple files for ease of management. All you need to do is add as many "URL = '<backup folder name + file name>'" entries in the backup command as needed, like below.


USE master
GO
BACKUP DATABASE <DBName> TO
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname1.bak',
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname2.bak',
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname3.bak'
WITH COMPRESSION

Step 4: Restoring the backup from the blob container into the managed instance
T-SQL command
USE master
GO
RESTORE DATABASE <DBname>
FROM URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/WideWorldImporters-Standard.bak'

If you created a striped backup in Step 3, make sure to specify the URLs of all your backup files in the restore command.

USE master
GO
RESTORE DATABASE <DBname> FROM
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname1.bak',
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname2.bak',
URL = 'https://<YourStorageaccount>.blob.core.windows.net/<Yourblobcontainer>/Dbname3.bak'



You can view the restored database in SSMS or on your managed instance page in the Azure portal.

You can also see the progress of your restoration using DMVs like below.



USE master
GO
SELECT percent_complete
FROM sys.dm_exec_requests
WHERE session_id = <SPID of your restore session>


You can find your restore session's SPID using sp_who or sp_who2 'active'.
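If hunting down the SPID manually is a pain, a variation of the same DMV query (an illustrative alternative, not part of the original steps) can locate restore sessions by command type:

```sql
-- Find any in-flight restore without knowing its SPID up front.
-- estimated_completion_time is reported in milliseconds.
USE master
GO
SELECT session_id, command, percent_complete, estimated_completion_time
FROM sys.dm_exec_requests
WHERE command LIKE 'RESTORE%';
```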

For online migration to minimize downtime, we have methods like SQL replication and Azure Database Migration Service (DMS), a premium service listed in the Azure Marketplace.

That brings us to the end of this blog. I hope you find it useful and simple. I will write on online DB migration in the future, along with all the prerequisites.

Sunday, October 6, 2019

Migrating SQL Server Instance Audits to a Managed Instance

Hi Readers!


This blog is intended to help those who are migrating SQL Server to an Azure Managed Instance and already have SQL Audit configured on their on-premises SQL Server.

As you might know, SQL Server audit events can be written to event logs and audit files for both server-level (instance-level) and database-level audit specifications. I will limit the scope of this blog to the migration alone and will not get into the details of the types of auditing the feature supports; there are plenty in both database- and server-level specifications.

We do not have the luxury of accessing the VM behind the SQL Managed Instance in Azure, so we cannot write audit events to the event logs. We can, however, have the audit write its events to an Azure Storage blob container, and the events can be retrieved and viewed in SSMS just as on-premises.

For database audit specifications, we don't need to worry much about migration, as they will already be there in the migrated database in the managed instance. We just need to map them to the audit we create to make them work. For the server/instance level, I will use the UI to add a few events; you can create the ones you already have in your on-premises environment.

Let's start with the steps.

1. Creating a blob container in the Azure storage account for maintaining the audit files (Azure portal)

Choose Blobs on your storage account's Overview page and click the '+ Container' option at the top. Give a meaningful name in all lowercase and click Create to create the blob container.



2. Creating a credential in the managed instance for authentication

We need to create a SQL credential in the managed instance where we will configure the audit, to authenticate the connection between the instance and the Azure storage account and let the audit write its events to the blob container.

You will need to take note of the Shared Access Signature key from the storage account, which is required to create the credential.



Click on Shared access signature in the left pane of your storage account and fill in the details: expiry date for the SAS key, allowed permissions, protocols, and so on, like below. You could choose a longer expiry date for the SAS key so that the credential stays valid longer. Click the button below the form to generate the SAS key.





Below the button, you should now see the SAS token listed. Click the copy button on the right to copy the SAS token text. When you paste it into a notepad/SSMS window, make sure to remove the '?' sign at the beginning, as it is not needed for credential creation. With the '?' symbol, the credential would not work.

Go back to SSMS, open a new query window against your managed instance, and paste the below query for credential creation.


CREATE CREDENTIAL [https://xxxxxxstorageac.blob.core.windows.net/dbinstanceaudit]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=bfqt&srt=sco&sp=sdsfsfsfp&se=2019-10-03T22:37:04Z&st=2019-09-06T14:37:04Z&spr=https,http&sig=fdfddgdgdgdgdgdgd2IJZOZro7MNVyVLlzI%3D';

dbinstanceaudit - the blob container we created in the storage account.
Value for SECRET - the SAS key we copied above.

xxxxxxstorageac is my storage account name with the prefix changed for security reasons; you should replace it with your own storage account name.

3. Creating the SQL audit

Navigate to Security (instance level), right-click "Audits", and select "New Audit". You should see the window below.

Clicking Browse will open a window to sign in to your Azure subscription and select the storage account and the blob container.



I will leave the name of the audit, the queue delay, and the retention period for you to decide, or to set as per the existing configuration on-premises. Queue delay is the parameter that decides how long the audit can wait (in milliseconds) before writing an event to the audit file.
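If you prefer scripting this step over the UI, Managed Instance also supports creating the audit with T-SQL; the audit name, container URL, and option values below are placeholders for illustration:

```sql
-- Create a server audit that writes to a blob container (Managed Instance
-- "TO URL" syntax). QUEUE_DELAY is in milliseconds; RETENTION_DAYS = 0
-- keeps the audit files until you delete them yourself.
USE master
GO
CREATE SERVER AUDIT [MI_Audit]
TO URL (PATH = 'https://<storageaccount>.blob.core.windows.net/dbinstanceaudit',
        RETENTION_DAYS = 0)
WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
```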



4. Mapping server and database audit specifications to the audit

Once we have the audit created, move on to creating the server audit specification.
Right-click "Server Audit Specifications" under Security and add the specifications just as you have them on-premises. Also select the audit we created in Step 3.


If the database you migrated from on-premises already has database-level audit specifications configured, you should see them under Database Audit Specifications, like below.


Click on it and map the audit we created.


Now the last step is enabling the audit, the server audit specification, and the database audit specifications by right-clicking each and selecting 'Enable'.
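Equivalently in T-SQL (again with placeholder object and database names), enabling each object is an ALTER ... WITH (STATE = ON):

```sql
-- Enable the audit and its server-level specification.
USE master
GO
ALTER SERVER AUDIT [MI_Audit] WITH (STATE = ON);
ALTER SERVER AUDIT SPECIFICATION [MI_AuditSpec] WITH (STATE = ON);
GO
-- Enable the database-level specification inside the migrated database.
USE <YourDatabase>
GO
ALTER DATABASE AUDIT SPECIFICATION [YourDbAuditSpec] WITH (STATE = ON);
```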

You may verify the audit logs being captured by right-clicking the audit and choosing 'View Audit Logs'.
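You can also read the captured events directly with T-SQL; sys.fn_get_audit_file accepts the blob container path on a managed instance (the URL below is a placeholder):

```sql
-- Query audit events straight from the audit files in the container.
SELECT event_time, action_id, succeeded, server_principal_name, statement
FROM sys.fn_get_audit_file(
    'https://<storageaccount>.blob.core.windows.net/dbinstanceaudit/',
    default, default);
```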

That brings us to the end of this blog. I will blog on another topic soon. Please keep visiting.
Please let me know if you want me to write on a topic.