Sunday, November 24, 2019

Monitoring SQL Sessions with necessary parameters

Hello Readers!

Below is a useful T-SQL query with the essential monitoring parameters for live monitoring of SQL Server sessions in SSMS. You can add more filters in the WHERE clause to make it more relevant to your environment (an example is sketched after the query).

It includes the isolation level used by each session along with commonly monitored columns such as host name, wait time, program name, etc.

use master
go
SELECT   s.session_id,
         r.status,
         r.blocking_session_id AS BlockedBy,
         r.wait_type,
         r.wait_resource,
         DB_NAME(s.database_id) AS DBName,
         s.host_name,
         CASE r.transaction_isolation_level
              WHEN 1 THEN 'ReadUncommitted'
              WHEN 2 THEN 'ReadCommitted'
              WHEN 3 THEN 'RepeatableRead'
              WHEN 4 THEN 'Serializable'
              WHEN 5 THEN 'Snapshot'
              ELSE 'Unspecified'
         END AS transaction_isolation_level,
         r.wait_time / 1000.0 AS [WaitSec],
         r.total_elapsed_time / 1000.0 AS [ElapSec],
         r.cpu_time,
         r.logical_reads,
         r.reads,
         r.writes,
         -- currently executing statement, extracted from the full batch text
         SUBSTRING(st.text, (r.statement_start_offset / 2) + 1,
                   ((CASE r.statement_end_offset
                          WHEN -1 THEN DATALENGTH(st.text)
                          ELSE r.statement_end_offset
                     END - r.statement_start_offset) / 2) + 1) AS SQL_text,
         -- fully qualified object name, when the request maps to a database object
         COALESCE(QUOTENAME(DB_NAME(st.dbid)) + N'.' + QUOTENAME(OBJECT_SCHEMA_NAME(st.objectid, st.dbid)) + N'.' + QUOTENAME(OBJECT_NAME(st.objectid, st.dbid)), '') AS command_text,
         r.command,
         s.login_name,
         s.program_name,
         -- shift the times by 3 hours to match my local time zone (adjust or remove for your environment)
         DATEADD(HOUR, 3, s.last_request_end_time) AS last_request_end_time,
         DATEADD(HOUR, 3, s.login_time) AS Login_Time,
         r.open_transaction_count
FROM     sys.dm_exec_sessions AS s
         JOIN sys.dm_exec_requests AS r
             ON r.session_id = s.session_id
         CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE    r.session_id != @@SPID
ORDER BY r.cpu_time DESC,
         r.status,
         r.blocking_session_id,
         s.session_id
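
As mentioned above, you can narrow the result set with additional filters. The lines below are only a sketch of predicates you might append to the WHERE clause, and 'MyAppDB' is a placeholder database name; they limit the output to sessions that are currently blocked and to sessions working in one particular database.

WHERE    r.session_id != @@SPID
         AND r.blocking_session_id <> 0            -- show only sessions that are currently blocked
         AND DB_NAME(s.database_id) = 'MyAppDB'    -- focus on one database ('MyAppDB' is a placeholder)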

Please let me know in the comments section if you find any other columns that would be useful for amending and improving this.


Friday, November 22, 2019

How to copy files from Azure Fileshare to Azure Blob Containers

Hello Readers!

Below is a short, useful post on how to copy files between Azure Blob containers and Azure file shares using AzCopy.

For those new to it, AzCopy is a command-line utility for working with Azure Storage, which you can download from here. It is supposed to work faster than Azure Storage Explorer; you may do a comparison check yourself to confirm that. At present, Azure Storage Explorer doesn't support copy-paste between two different services, say file shares and blobs.
Once you download it, you can copy it into a Windows folder and access it from CMD.

To run AzCopy commands, navigate in Command Prompt to the directory/folder that holds AzCopy.exe.

For example, I have kept it in C:\Azcopy.
To run AzCopy commands, I get to that location in Command Prompt by running the below.

CD C:\Azcopy
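
Once you are in that folder, you can quickly confirm the tool responds before building any copy commands. The line below is a sketch that assumes AzCopy v10, which supports the --version switch; it simply prints the installed version.

azcopy --version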

Command to copy files from an Azure file share to an Azure blob container:


Azcopy copy "<Azure fileshare URL>?<SAS key of the storage account which hosts this Azure fileshare>" "<Azure blob container/folder URL>?<SAS key of the storage account hosting this blob container>" --recursive

To get the URLs of your file share and blob containers, navigate to them in Storage Explorer or the Azure Portal, right-click, and open "Properties", which will show them.

A sample is given below.

Azcopy copy "https://mystorageac1.file.core.windows.net/backups/Prod?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-11-21T17:32:48Z&st=2019-11-21T09:32:48Z&spr=https,http&sig=5CPP5XJ4c%2jjbhbhbhbhbhcBjlZGeoNWFFtXNJMoI%3D" "https://mystorageac2.blob.core.windows.net/stagingbackups?sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-11-21T17:32:48Z&st=2019-11-21T09:32:48Z&spr=https,http&sig=5CPP5XihbhbhbObb90ocBjlZGeoNWFFtXNJMoI%3D" --recursive 

You can see impressive speeds, over 6 GB/sec, while copying files between two Azure locations, because the data transfer happens within the Azure network itself; with AzCopy you are only triggering the transfer.

Needless to say, the same tool can be used to move files between your local folders and Azure file shares/blob containers. The only difference is that for the local path you don't have to append a '?' followed by the key to get access. Also, make sure you grant enough permissions when generating the SAS keys used in your AzCopy commands.
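
As a sketch of such a local copy, the command below uploads a local folder to the blob container used in the earlier sample; C:\LocalBackups is a placeholder path, and note that only the destination URL carries the '?' and the SAS key.

Azcopy copy "C:\LocalBackups" "https://mystorageac2.blob.core.windows.net/stagingbackups?<SAS key of the storage account hosting this blob container>" --recursive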

Hope that helps. Please write your related questions below in the comments section and I will get back to you at the earliest.