Thursday, December 26, 2019

Searching for a word/Substring in object definitions

Hello Readers!

Below is how we can search for a word or a substring in every object definition across the databases in an instance. The script uses a cursor to iterate over the databases and the OBJECT_DEFINITION function to scan each definition for the word or substring.

To log the result set from each database, I have created a table in the DBAdmin database and made sure all object types are covered. "Create" is the substring we are searching for here; it is expected to return a lot of results, as most object definitions start with "Create ". Replace "Create" with whatever you are looking for.

One of a few other applications of this script is finding whether a cross-database query is used to reference objects in another database. For example, if you have objects such as stored procedures or views written with cross-database references (DB.Schema.ObjectName) to a master-like database named 'masterdb', and you wish to confirm whether that database is referred to from every database in your instance, you can search for "%masterdb.%" or "%masterdb].%". This lists all objects in your instance whose scripts refer to the database masterdb as masterdb.dbo.object or [masterdb].dbo.object.


Below is the script. Hope you find it useful!

Use DbAdmin
go
IF OBJECT_ID('dbo.substringref', 'U') IS NOT NULL
 DROP TABLE dbo.substringref;
Create table dbo.substringref (ObjectName Nvarchar(100),ObjectId int,ObjectDefinition nvarchar(max),DatabaseName Nvarchar(30),object_type Nvarchar(30))
go
Use master
go
DECLARE @tsql nvarchar(max)
DECLARE @dbname varchar(500)

DECLARE MyCur CURSOR STATIC FORWARD_ONLY FOR
    SELECT [name] 
    FROM sys.databases 
    WHERE [name] NOT IN ('tempdb','master','Msdb','model','distribution')

OPEN MyCur 

WHILE (1=1)   
BEGIN   
    FETCH NEXT FROM MyCur INTO @dbname

    IF @@FETCH_STATUS <> 0 
        BREAK 

SET @tsql = 'USE ' + QUOTENAME(@dbname) + '
SELECT [name] as ObjectName,[id] as ObjectID,Object_definition(id) as [Definition],Db_Name() as DatabaseName,
CASE obj.type
WHEN ''AF'' THEN ''Aggregate function (CLR)''
WHEN ''C'' THEN ''CHECK constraint''
WHEN ''D'' THEN ''Default or DEFAULT constraint''
WHEN ''F'' THEN ''FOREIGN KEY constraint''
WHEN ''FN'' THEN ''Scalar function''
WHEN ''FS'' THEN ''Assembly (CLR) scalar-function''
WHEN ''FT'' THEN ''Assembly (CLR) table-valued function''
WHEN ''IF'' THEN ''Inline table-valued function''
WHEN ''IT'' THEN ''Internal table''
WHEN ''K'' THEN ''PRIMARY KEY or UNIQUE constraint''
WHEN ''L'' THEN ''Log''
WHEN ''P'' THEN ''Stored procedure''
WHEN ''PC'' THEN ''Assembly (CLR) stored-procedure''
WHEN ''R'' THEN ''Rule''
WHEN ''RF'' THEN ''Replication filter stored procedure''
WHEN ''S'' THEN ''System table''
WHEN ''SN'' THEN ''Synonym''
WHEN ''SQ'' THEN ''Service queue''
WHEN ''TA'' THEN ''Assembly (CLR) DML trigger''
WHEN ''TF'' THEN ''Table function''
WHEN ''TR'' THEN ''SQL DML Trigger''
WHEN ''TT'' THEN ''Table type''
WHEN ''U'' THEN ''User table''
WHEN ''V'' THEN ''View''
WHEN ''X'' THEN ''Extended stored procedure''
ELSE ''Unspecified'' END AS [type]
FROM   sys.sysobjects obj
WHERE  Object_definition(id) LIKE ''%Create%'''

Insert into DBadmin.dbo.substringref
EXEC sp_executesql @tsql
END
CLOSE MyCur;   
DEALLOCATE MyCur;
go
use DBadmin
go
Select * from dbo.substringref
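
For the cross-database-reference use case mentioned earlier, the LIKE predicate inside @tsql can be swapped for patterns like these (a sketch; 'masterdb' is just the example database name, and remember that inside the dynamic SQL string every quote must be doubled):

```sql
-- Standalone form of the predicate; inside @tsql the quotes are doubled
WHERE Object_definition(id) LIKE '%masterdb.%'
   OR Object_definition(id) LIKE '%masterdb].%'
```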



Sunday, November 24, 2019

Monitoring SQL Sessions with necessary parameters

Hello Readers !

Below is a useful T-SQL query with the essential monitoring parameters for live monitoring of SQL Server sessions in SSMS. You can add more filters in the WHERE clause to make it more relevant to your environment.

It includes the isolation level used by each session along with commonly monitored columns such as host name, wait time, and program name.

use master
go
SELECT   s.session_id,
            r.status,
            r.blocking_session_id as BlockedBy,                                 
            r.wait_type,
            wait_resource,
DB_name(s.database_id) as DBName,
            s.host_name,
CASE r.transaction_isolation_level
WHEN 1 THEN 'ReadUncommitted'
WHEN 2 THEN 'ReadCommitted'
WHEN 3 THEN 'Repeatable'
WHEN 4 THEN 'Serializable'
WHEN 5 THEN 'Snapshot'
ELSE 'Unspecified' END AS transaction_isolation_level,
            r.wait_time / (1000.0) as [WaitSec],
r.total_elapsed_time / (1000.0) as [ElapSec],
            r.cpu_time,
            r.logical_reads,
            r.reads,
            r.writes,

            Substring(st.TEXT,(r.statement_start_offset / 2) + 1,
                    ((CASE r.statement_end_offset
                        WHEN -1
                        THEN Datalength(st.TEXT)
                        ELSE r.statement_end_offset
                        END - r.statement_start_offset) / 2) + 1) AS SQL_text,
            Coalesce(Quotename(Db_name(st.dbid)) + N'.' + Quotename(Object_schema_name(st.objectid,st.dbid)) + N'.' + Quotename(Object_name(st.objectid,st.dbid)),
                    '') AS command_text,
            r.command,
            s.login_name,
            s.program_name, 
dateadd(HOUR, 3, S.last_request_end_time) as last_request_end_time, -- +3h UTC offset; adjust for your time zone
            dateadd(HOUR, 3,s.login_time) as Login_Time, -- +3h UTC offset; adjust for your time zone
            r.open_transaction_count
FROM     sys.dm_exec_sessions AS s
            JOIN sys.dm_exec_requests AS r
            ON r.session_id = s.session_id
            CROSS APPLY sys.Dm_exec_sql_text(r.sql_handle) AS st
WHERE    r.session_id != @@SPID 
ORDER BY r.cpu_time desc, r.status,
            r.blocking_session_id,
            s.session_id

Please let me know in the comments section if there are other columns that would be useful for amending and improving this.


Friday, November 22, 2019

How to copy files from Azure Fileshare to Azure Blob Containers

Hello Readers !

Below is a short post on how to copy files between Azure Blob containers and Azure file shares using AzCopy.

For newcomers, AzCopy is a command-line utility for working with Azure Storage, which you can download from here. It is generally faster than Azure Storage Explorer; you can run a comparison yourself to confirm that. At present, Azure Storage Explorer doesn't support copy-paste between two different services, say file shares and blobs.
Once you download it, copy it into a Windows folder and access it from CMD.

To run AzCopy commands, navigate in Command Prompt to the directory/folder that holds AzCopy.exe.

For example, I have kept it in C:\Azcopy.
To run AzCopy commands, I first get to that location in Command Prompt by running the below.

CD C:\Azcopy

Command to copy files from an Azure file share to an Azure blob container:


Azcopy copy "<Azure fileshare URL>?<SAS key of the storage account hosting the file share>" "<Azure blob container/folder URL>?<SAS key of the storage account hosting the blob container>" --recursive

To get the URLs of your file share and blob containers, navigate to them in Storage Explorer or the Azure portal, right-click, and open "Properties", which lists the URL.

A sample is given below.

Azcopy copy "https://mystorageac1.file.core.windows.net/backups/Prod?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-11-21T17:32:48Z&st=2019-11-21T09:32:48Z&spr=https,http&sig=5CPP5XJ4c%2jjbhbhbhbhbhcBjlZGeoNWFFtXNJMoI%3D" "https://mystorageac2.blob.core.windows.net/stagingbackups?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-11-21T17:32:48Z&st=2019-11-21T09:32:48Z&spr=https,http&sig=5CPP5XihbhbhbObb90ocBjlZGeoNWFFtXNJMoI%3D" --recursive 

You can see very high throughput (in my tests, over 6 GB/sec) while copying files between two Azure locations, as the data transfer happens within the Azure network itself; with AzCopy you are only triggering the transfer.

Needless to say, the same tool can be used to move files between your local folders and Azure file shares/blob containers. The only difference is that the local path takes no '?' followed by a key. Also make sure you grant sufficient permissions when generating the SAS keys used in your AzCopy commands.
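
As a sketch of that local-to-cloud case (the local folder and container URL below are hypothetical, and only the cloud side carries the '?<SAS key>' suffix):

```
Azcopy copy "C:\LocalBackups" "https://mystorageac2.blob.core.windows.net/stagingbackups?<SAS key>" --recursive
```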

Hope that helps. Please post any related questions below in the comments section; I will get back to you as soon as I can.



Saturday, October 26, 2019

Adding your Managed Instance Subnet in your storage Account's Network and Firewall settings

Hello Readers!

The below blog content highlights an Azure limitation and a workaround to use until Azure comes up with a direct solution. It's about enabling access for a managed instance in the network and firewall settings of a storage account when you choose the 'Public - Selected networks' configuration.

If you have noticed, when we add a particular VNet/subnet to these settings, Azure first enables service endpoints on the subnet being added. This is not possible for the dedicated subnet we create and maintain for Azure managed instances: by design, a managed instance subnet cannot have service endpoints enabled.

So how do we enable it?

There is a workaround for this limitation. 

We can find the managed endpoint IP address of the managed instance and whitelist that. It is accepted in the network and firewall settings of the storage account, just like adding a public IP. The important part is that the storage account and the managed instance must be in different regions.

Below is how you do it.


Step 1: Finding the managed endpoint IP
Connect to Cloud Shell (the PowerShell client embedded in the Azure portal) or your own PowerShell client.

If you are logging in from your own PowerShell client, make sure the Az module is installed and that you are logged in to your subscription, by running the commands below.

Install-Module -Name Az #Installing Az module

connect-Azaccount #Logging into Azure subscription

resolve-dnsname 'xxxxx-mi1.xxxxx.database.windows.net' | select -first 1  | %{ resolve-dnsname $_.NameHost.Replace(".vnet","")}  #Finding the managed endpoint IP of your managed instance.

Please replace "xxxxx-mi1.xxxxx.database.windows.net" in the above script with the host name of your managed instance, which you can get from "Connection strings" on your managed instance page in the Azure portal.

The managed endpoint address is common to all instances in your MI subnet.

Step 2: Adding the IP to the firewall settings of the storage account

This was covered at the bottom of my previous blog on storage accounts. Access it from here.
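
If you prefer scripting it instead of using the portal, the same IP can be whitelisted with the Az.Storage module (a sketch; the resource group, storage account name, and IP below are placeholders for your own values):

```powershell
# Placeholder names: replace with your resource group, storage account, and the IP found in Step 1
Add-AzStorageAccountNetworkRule -ResourceGroupName "MyRG" -Name "mystorageac1" -IPAddressOrRange "52.0.0.1"
```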

Maybe in the future Microsoft will come up with a better solution that supports enabling service endpoints in the MI subnet. Let us all keep checking the Azure updates page.

That's the end of this blog. Please follow if you find my blogs useful; if you do, you will receive notifications when I post something new here. Thanks for reading and connecting!

Friday, October 25, 2019

How to perform a Geo-Restore in your Azure SQL managed Instance(Using Azure portal)

Hello Readers!

The below blog content is on how to perform a geo-restore of databases backed up automatically from Azure managed instances.


Geo-restore simply means restoring a database backup taken from an instance in one region to an instance in another region. It is listed as one of the DR methods that can be used in the event of a disaster taking an entire region's data centers down.

Let's get to it.

Using Azure portal

Step 1:
Go to the target instance page in the Azure portal and click on 'New Database'.



Step 2: Choose the backup

In the popup, set a name for the new database that will be created by this restore operation.


Choose 'Backup' against 'Use existing data'.




After the above steps, you can select a database backup from the drop-down list below.



The backup list contains the instances in your subscription and the latest restore points created by the automated backups.

You can choose one and click on OK to initiate restore.

You can initiate many of these restore activities; they will be queued and processed according to your vCore settings.

Thanks for visiting!