Learning something new!


You all know the old saying, “Can’t teach an old dog new tricks.” Well, I am here to tell you, I AM old (at least in the IT world) and I can learn new tricks.

After studying, testing, studying, and testing some more, I finally passed enough exams to earn my Microsoft Certified Solutions Expert: Data Management and Analytics certification. (Yay me!) This is not to brag, and it was not required for my job; it was a personal endeavor for me to complete. Yes, I know some people put zero stock in MSFT exams and certifications, and I understand that, but again, this was a personal goal.

So what’s next?

I know that in the IT world, if you are not always learning something new, you will be left behind and become a dinosaur real quick. Since my current job role is primarily “to keep the engine running” and we have over 100 instances of SQL Server, I felt I needed to expand my toolbelt. The first logical option is PowerShell for SQL Server and dbatools (http://dbatools.io).

I know there are hundreds, if not thousands, of blog posts about SQL PowerShell, but I hope to document this learning (i.e., take notes) so I can hopefully retain some of this information, because you know I am getting old and such!
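To make these notes concrete from the start, here is a minimal sketch of getting dbatools up and running. The instance name “localhost” is just a placeholder for whatever server you point it at:

```powershell
# Install dbatools from the PowerShell Gallery (one time, current-user scope)
Install-Module dbatools -Scope CurrentUser

# Quick sanity check: can we reach the instance?
# "localhost" is a placeholder - substitute your own instance name
Test-DbaConnection -SqlInstance localhost

# First real command: list the databases on the instance
Get-DbaDatabase -SqlInstance localhost | Select-Object Name, Status, RecoveryModel
```

With over 100 instances to manage, the appeal is that `-SqlInstance` accepts a list of servers, so the same one-liner can sweep the whole environment.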


Missing in Action (not really)

I know it has been a while since my last post, but when one is working, I mean really working, a full-time job, plus kids, hobbies, and everything else, where does one find time to blog???

My work, though, is allowing me to learn new things, delve into different areas, and become more proficient with third-party SQL tools.

Right now I am working on a T-SQL security report: a simple, or not so simple, batch of T-SQL that will provide two options: 1) full or 2) deficiencies.

Option 1 will give you a list of every server principal in the system and what permissions it has, all the way down to the object level. Option 2 will only provide you with the things that need to be remedied based on security best practices.
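As a rough sketch of where the “full” option starts, the query below lists server-level principals and their explicit permissions; the real report drills further down to the database and object level:

```sql
-- Server-level principals and their explicitly granted/denied permissions
select p.name          as [Principal]
     , p.type_desc     as [Principal Type]
     , perm.permission_name
     , perm.state_desc as [Grant State]
from sys.server_principals p
left join sys.server_permissions perm
       on perm.grantee_principal_id = p.principal_id
where p.type <> 'C'   -- skip certificate-mapped logins
order by p.name;
```

The “deficiencies” option is essentially the same data with WHERE clauses that flag anything violating the best-practice checklist.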

The future is Power(ful)

I am finally getting around to reading up on SQL Server 2017 (vNext), which will hopefully be released sometime this year.

Of course we have all been spinning up RedHat boxes so we can play with SQL Server on Linux (btw, this does not excite me), but something else caught my eye that made me step back, scratch my head, and say “uh, excuse me.”

I found this statement to be very troublesome.

  • SQL Server Reporting Services is no longer available to install through SQL Server’s setup as of CTP 2.1.


SSRS is no longer available?  Say WHAT?

Reading on, I discovered that it is now “Power BI Report Server,” available in the cloud or as a downloadable on-premises version with a companion desktop application. (https://powerbi.microsoft.com/en-us/documentation/reportserver-get-started/)

Truth be told, I have not experimented with anything “Power” since MS PowerPivot and Excel. (Hey, I’ve been busy with Admin stuff. Yes, I know I need to get into the current decade.) This looks to be a promising direction MSFT is taking SQL reporting. From what I have seen of Power BI through blogs, SQL Saturdays, and other presentations, I have been somewhat impressed with its capabilities.

I am just concerned about converting all those 2008–2016 SSRS reports to Power BI! Well, at least it is job security!

Until next time…


Another change in jobs, and upcoming events

Recently I was forced to make a job change. 

I am now working for a reputable local bank as a DBA, and I am no longer a consultant. I enjoyed my time as a consultant, even though I never could get used to the idea of being “billable”. And putting the fate of your job in the hands of other individuals (the sales department) was nerve-wracking, to say the least. But I learned valuable skills and information from my time as a consultant.

I hope to continue participating in the SQL community.  In fact, I am presenting my newest presentation “You want me to Isolate What? An Introduction into Data Integrity and Isolation Levels of SQL Server” THIS Saturday at SQL Saturday Pensacola #617.  Of course I will upload my slide deck and demos after this weekend.

And you can catch me in two weeks at SQL Saturday Houston #651.

Hope to see you there!

Expanding a DBA’s Role

I have never really considered myself a System Administrator, but I am familiar enough with the Windows operating system to consider myself a little dangerous. However, with more and more of our clients utilizing the power of Microsoft Azure VMs, I see my role as a SQL Consultant starting to encompass System Administrator duties.

Recently a client found this article on “Best Practices for SQL Server in Azure Virtual Machines” and wanted to re-provision his volumes to adhere to them.

Now, my first thought was: wait, I’m a DBA, not a System Admin; that’s not my role! But thinking more about it, I realized the client views this as a SQL Server issue, I am the SQL Server Consultant, and it is my job to remedy this problem.

Not being 100% confident in Azure, I spun up a SQL Server VM and attempted to add some volumes.  To my surprise, this was way too easy.

After selecting the server I wanted to manage….

Step 1: Select Disks on the left menu and then click “+ Add Data Disk”.

Step 2: Configure Disk

  • You will need to provide a name, or you can use the pre-populated name
  • Select the Source Type; in this case I left it at the default, “New (empty disk)”
  • Select the Account Type; again, I left it at the default of “Premium (SSD)”
  • Choose the Size: up to 1023 GB, which is a terabyte of space
  • Select the Storage Container. This will open another tab; just pick the one you want or create a new one
  • Storage Blob name: once again, you can use the pre-populated name


Step 3: Select a Container

This is where the actual VHD file will reside. You can use an existing container or create a new one for these files.



That’s it!  That is all that is needed to mount a new volume to an existing SQL Server VM. It automatically appears in the OS Disk Management console, where you will need to create a “Simple Volume” (or however you would like to do it) and format the drive (remember, MSFT recommends a 64K allocation unit size for data and log drives).
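If you would rather script the OS side than click through Disk Management, a hedged PowerShell sketch follows; the disk number 2, drive letter F, and volume label are placeholders for your environment (check Get-Disk first to find the new disk):

```powershell
# Initialize the new disk, create a single partition, and format it with the
# 64K allocation unit size MSFT recommends for SQL Server data/log drives.
# Disk number 2 and drive letter F are placeholders - verify with Get-Disk.
Initialize-Disk -Number 2 -PartitionStyle GPT
New-Partition -DiskNumber 2 -DriveLetter F -UseMaximumSize
Format-Volume -DriveLetter F -FileSystem NTFS -AllocationUnitSize 65536 -NewFileSystemLabel "SQLData"
```

The key piece is `-AllocationUnitSize 65536`, which is the 64K setting; the GUI’s quick-format defaults to 4K if you are not paying attention.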

You are now ready to go.

And removing an existing drive is even easier.  At the top of the Disks panel is the word “Edit”; click there and you are given the option to delete. It will automatically remove the disk from your server, no reboot, no warning!  So be careful.

Heck, maybe being a System Administrator in Azure isn’t so tough after all! 

Isolation Levels 101: Concurrency Issues

The other day I had a client complaining about slowness, queries erroring out, blocking and processes stopping; you know the usual complaints. So, like any other DBA I pulled out my handy dandy toolbox of queries to determine exactly what was going on with the server.

Upon reviewing their current running sessions, I noticed that the Isolation Level for a simple UPDATE transaction was “Serializable” and all other transactions were “Read Committed”. And this Serializable transaction was blocking all other transactions.

Now, let me preface the rest of this blog series with this: in all my SQL experience, I have never yet had to change the isolation level at the transaction or database level. Which means I have never studied this topic in depth. So this got me thinking: what are the different isolation levels, and why would anyone ever change from the default, which is “Read Committed”?

Typically, when I attempt to learn something, I like to write it down. Because this is a lot of information, I am going to present my learning in 3 parts: Concurrency Issues and how SQL works, Isolation Levels and how they help resolve Concurrency issues, and some helpful scripts to test and figure all this out.

So, let’s start at the beginning. The purpose of an isolation level is to provide SQL Server with a mechanism to protect data against specific concurrency problems when multiple CRUD (create, read, update, delete) operations occur on the same data.

Concurrency Problems

Data is intended to be read, modified, created, or deleted over time, and most databases have multiple connections doing all these things at the same time. While managing all these concurrent connections, SQL Server must decide which statements get precedence. SQL Server typically operates on a “first-in, first-out” basis when handling queries, but what those queries do can determine the fate of the queries waiting in line to be executed. These decisions can cause data concurrency issues. The following concurrency “phenomena” are identified by the ANSI/ISO SQL-92 standard as possible results of concurrent transactions.

Lost Updates

It is possible for two or more transactions to modify the same row. While a subsequent transaction is reading the data, the first transaction commits its changes. Then the subsequent transaction makes its own changes and commits, possibly wiping out the changes committed by the first transaction. In this scenario, the last transaction always wins.
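A sketch of how a lost update plays out, using a made-up Accounts table with a starting balance of 100 (run the “sessions” in two separate connections):

```sql
-- Session 1: reads the balance into a variable (sees 100)
declare @bal int;
select @bal = Balance from dbo.Accounts where AccountId = 1;

-- Session 2: reads the same 100 and writes first (balance is now 150)
update dbo.Accounts set Balance = Balance + 50 where AccountId = 1;

-- Session 1: writes based on its stale read, overwriting Session 2's change
-- (balance becomes 70, and the +50 deposit is silently lost)
update dbo.Accounts set Balance = @bal - 30 where AccountId = 1;
```

Neither session did anything “wrong” individually; the last writer simply won.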

Dirty Read

A dirty read occurs when a SELECT statement reads from a table that is currently being modified, picking up changes that have not yet been committed to the database. By default, SQL Server protects against this at every isolation level except Read Uncommitted (or when you use the NOLOCK hint in your queries).
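To see a dirty read for yourself, here is a small two-session sketch; the table and values are made up for illustration:

```sql
-- Session 1: modify a row but leave the transaction open (uncommitted)
begin transaction;
update dbo.Accounts set Balance = 0 where AccountId = 1;

-- Session 2: a dirty read of the uncommitted change
set transaction isolation level read uncommitted;
select Balance from dbo.Accounts where AccountId = 1;  -- sees the uncommitted 0

-- Session 1: undo the change; Session 2 read data that never officially existed
rollback transaction;
```

Replacing the SET statement in Session 2 with `select Balance from dbo.Accounts with (nolock)` produces the same effect for that one query.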

Non-Repeatable Read

This form of concurrency problem can happen when a specific dataset (using the same WHERE clause) is accessed multiple times within the same transaction and some of the rows in the dataset are updated between the first and second access. This will produce two different result sets.

Phantom Reads

Very similar to a non-repeatable read, a phantom read occurs when two queries in the same transaction, against the same table and with the same WHERE clause, produce two different result sets because a third query has changed the data (typically by inserting or deleting rows) between the first and second SELECT statements.
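A sketch of the phantom read pattern, again with a made-up Orders table and two separate connections:

```sql
-- Session 1: the same query twice inside one transaction
set transaction isolation level repeatable read;
begin transaction;
select count(*) from dbo.Orders where Amount > 100;   -- first read

-- Session 2: sneaks in a new qualifying row and commits
insert into dbo.Orders (Amount) values (250);

-- Session 1: the second read returns a higher count - a "phantom" row
-- (Repeatable Read protects existing rows from updates, but not new inserts)
select count(*) from dbo.Orders where Amount > 100;
commit transaction;
```

This is exactly the gap that the Serializable level closes, at the cost of heavier range locking.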

How to combat these Concurrency Problems?

For most DBAs, the integrity of their data is, or should be, the most important aspect of their job. It quite literally can be a “job killer”. Fortunately for us, Microsoft has built SQL Server with several levels to isolate the data during CRUD operations and help eliminate these concurrency problems. These isolation levels are Read Uncommitted, Read Committed, Repeatable Read, Serializable, and Snapshot.

The following script when executed will display all current sessions and their Isolation Levels. It is always good to know what your connections are doing to SQL.

select s.session_id
	,db_name(s.database_id) [Database]
	,case when s.transaction_isolation_level = 1 then 'Read Uncommitted'
	     when s.transaction_isolation_level = 2 then 'Read Committed'
	     when s.transaction_isolation_level = 3 then 'Repeatable'
	     when s.transaction_isolation_level = 4 then 'Serializable'
	     when s.transaction_isolation_level = 5 then 'Snapshot'
	     else 'Unspecified'
	end [Transaction Isolation Level]
from sys.dm_exec_sessions s
where s.session_id > 50

In the case of my client, there was one UPDATE transaction using the Serializable isolation level, and it was this session that was BLOCKING all other transactions from completing. But why was the transaction Serializable, and why was it blocking for such a long period?

Not knowing the details behind the Serializable isolation level, it was time for some more learning. I will discuss the different isolation levels in my next post. Until then…