A New Mini-Adventure with Idera Software

Yesterday, Idera Software (@Idera_Software) announced their slate of 2017 Idera ACEs (Advisors and Community Educators).

I am honored to be included as an Idera ACE for 2017. Part of my responsibilities as an ACE is to represent Idera at local SQL Saturdays, which means I get to travel more, meet more people, and present more!

One of the reasons I enjoy working as a consultant with Sparkhound is the concept of “knowledge transfer”. We believe in not only helping clients with their SQL Server problems but also teaching them how to solve those problems for themselves.

Being an Idera ACE seems to fall right in line with that philosophy! Joining me in this endeavor are the following members of the SQL Family. I hope to meet each and every one of you over the course of the next year!

Rie Irish – Atlanta, GA

Mindy Curnutt – Dallas, TX

Maximo Trinidad – Port St. Lucie, FL

Lance Tidwell – Austin, TX

Brandon Preece – Salt Lake City, UT

Sebastian Meine – Philadelphia, PA

I will also do my best to dedicate more time to blogging and answering community forum questions, and I will see you at SQL Saturdays!

A Day in the Life of SQL Mirroring

This past week I have been working with a client to set up mirroring on a SQL 2014 instance. I realize mirroring is a deprecated feature, but many DBAs and companies still use it as a fail-safe line of defense. And given that mirroring has been a “deprecated feature” since SQL 2012 yet is still available in SQL 2016, I think it might be around a little longer.

I am writing this not to call out any client on the use of mirroring, nor to promote or condone it, but to remind myself of the things to look for when setting up any mirroring environment.

The Task at Hand

My client was upgrading from SQL 2012 to 2014.  The current SQL 2012 instance was mirrored to a second server.  I was to reproduce the mirroring on the 2014 instance in preparation for the rollout to production.

This particular setup was problematic for me because of Windows Firewall rules and having multiple instances (heck, multiple versions) of SQL Server installed on the same server.

Server Setup

The following versions of SQL Server were installed on the same server:

  • SQL 2008 R2 (default instance)
  • SQL 2012 (named instance), mirrored with a second server
  • SQL 2014 (named instance), the instance I needed to set up with mirroring to reproduce the SQL 2012 environment

In addition, Windows Firewall was configured to block everything unless a rule specified otherwise.

This list of things to check may not apply to all configurations and all mirroring setups. It is just the set of things that prevented this particular setup from succeeding.

  1. With multiple instances connecting via Named Pipes, it is very important to have the “Named Pipes” protocol enabled in SQL Server Configuration Manager.
  2. With multiple instances, SQL Browser becomes very important, especially if you do not want to use port numbers in your connection strings. Here is the important part: check the Windows Firewall rules to ensure UDP port 1434 has both an incoming and an outgoing rule.
  3. Mirroring requires its own port for communication as well, so incoming and outgoing Windows Firewall rules have to be created to ensure communication between the two servers. By default this is TCP port 5022. However, because the SQL 2012 instance was already using that port for mirroring, I had to configure SQL 2014 with port 5023. Again, another firewall rule. Actually, I just added the new port to the existing firewall rule.
  4. The last piece that was problematic was permissions. Because the Principal and the Mirror each had their own domain service account, each account needed to be added to the opposite server with the correct permissions (see the T-SQL sketch after this list).
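For reference, here is a minimal T-SQL sketch of the endpoint and permission pieces. The port matches this setup, but the endpoint, domain, and account names are placeholders, not the client's actual values:

-- Run on the SQL 2014 partner: create the mirroring endpoint on its
-- own port (5023 here, because 5022 was taken by the SQL 2012 instance).
CREATE ENDPOINT [Mirroring]
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5023)
    FOR DATABASE_MIRRORING (ROLE = PARTNER);
GO

-- Each partner's domain service account must be able to connect to
-- the opposite server's endpoint (placeholder account name).
CREATE LOGIN [DOMAIN\OtherPartnerSvcAcct] FROM WINDOWS;
GRANT CONNECT ON ENDPOINT::[Mirroring] TO [DOMAIN\OtherPartnerSvcAcct];
GO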

Once all of this was sorted out, the mirroring session configured and started up with no problems!

“Stretching” my SQL Skills

Philosophy has always intrigued me, and René Descartes is one of my personal favorite philosophers. Cogito ergo sum: “I think, therefore I am.” The moment we stop thinking is the moment we stop existing. So, today I decided to learn something new: I wanted to set up a Stretch Database in SQL Server 2016.

Just about everything in SQL Server can be accomplished by either the SSMS GUI or Transact-SQL commands. Any time I attempt something for the first time, I always use the GUI first, then learn the T-SQL commands.

I began by right-clicking on my [StretchTest] database and selecting TASKS >> STRETCH >> ENABLE.

This launches the “Enable Stretch” wizard. The first step in the wizard is the Introduction. The first thing I noticed was the statement: “Once the data is migrated to Azure, unique and primary keys will not be enforced on Azure.” Really? I wondered what else would be “not enforced” or not compatible with Azure. Time to consult my friendly neighborhood BOL: Limitations for Stretch Database. Wow, that's a good number of limitations; however, this is a simple test, so I am not worried.

Next we have to connect to an Azure subscription. This wizard step allows you to configure your Azure SQL database environment: select the Azure region (always pick the one closest to you) and choose between a preconfigured Azure server or allowing the wizard to create one. I chose to let SSMS create a new Azure server. Next you have to create a database master key (DMK) to protect your Azure data. And finally you need to provide your IP address (remember, your external IP, not your machine IP) so SSMS can set up firewall rules for you.
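For reference, the T-SQL equivalent of what the wizard configures looks roughly like this; a sketch based on the Stretch documentation, with placeholder passwords, credential name, and server name:

-- Instance-level prerequisite (the wizard turns this on for you)
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;
GO

USE [StretchTest];
GO

-- Database master key (DMK) protecting the credential secret
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword1!>';

-- Credential used to connect to the Azure server (placeholder names)
CREATE DATABASE SCOPED CREDENTIAL [StretchCred]
    WITH IDENTITY = 'StretchAdmin', SECRET = '<AzurePassword2!>';

-- Point the local database at the remote Azure server
ALTER DATABASE [StretchTest]
    SET REMOTE_DATA_ARCHIVE = ON
        (SERVER = 'myserver123.database.windows.net',
         CREDENTIAL = [StretchCred]);
GO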


Then you get the customary SQL Summary page before you execute the commands. Now that my database has been stretched, I added a table and data using this schema:

CREATE TABLE [dbo].[tblUsers](
	[ID] [int] IDENTITY(1,1) NOT NULL,
	[FirstName] [varchar](50) NULL,
	[LastName] [varchar](50) NULL,
	[Email] [varchar](100) NULL,
	[DateEntered] [smalldatetime] NULL,
 -- The primary key is here on purpose, to test the
 -- "keys are not enforced on Azure" limitation.
 CONSTRAINT [PK_tblUsers] PRIMARY KEY CLUSTERED 
(
	[ID] ASC
) ON [PRIMARY]
)
GO

I wanted to test not only the primary key limitation but also a filtered stretch on the DateEntered column. I populated the table with 11,000 records using the website http://mockaroo.com.

After the tblUsers table was populated, I launched the Stretch wizard for the table by right-clicking on it and selecting STRETCH >> ENABLE. After the Introduction screen, you select which tables you would like to stretch to Azure. The GUI shows all the tables, any “warnings” about stretching, the option to filter, the number of rows in each table, and its size.


The first time I executed this, I did not use a stretch filter and stretched the entire table. It worked perfectly. I could use SSMS to connect to my remote Azure SQL server, query the stretched table directly, and see that there were 11,000 rows.
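You can also watch the migration from the local instance. SQL 2016 exposes a DMV for this; a quick sketch, relying on the documented sys.dm_db_rda_migration_status view:

-- One row per migration batch: how many rows moved, when,
-- and whether the batch hit an error.
SELECT *
FROM sys.dm_db_rda_migration_status
ORDER BY start_time_utc DESC;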

Executing a simple SELECT query against the local database produced an interesting query execution plan.

As you can see, we now have a new path in the query plan with an operator called “Remote Query”. Basically, the local server runs the remote query against Azure, then uses the local primary key to concatenate those rows back together with the local ones to produce the desired result. So, can we update the data?

Nope, sure can't. The update fails with an error: once the data lives in Azure, the data is READ ONLY.
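For example, a simple update like this one (a hypothetical change to the test table) is rejected once the target row has migrated:

-- Fails: updates and deletes against rows that have been
-- migrated (or are eligible to migrate) are not allowed.
UPDATE [dbo].[tblUsers]
SET [Email] = 'updated@example.com'
WHERE [ID] = 1;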

Next I wanted to add the filter on the DateEntered column so I could still edit more recent data. I disabled Stretch and re-enabled it, using a filter function this time. I received error after error after error. According to BOL, my function was not a valid filtering query: the date conversion needs to include a style value, which makes the CONVERT deterministic instead of dependent on session language settings. At least that was the case with mine.

/*  INCORRECT  */
CREATE FUNCTION dbo.fn_example5(@column1 datetime)
RETURNS TABLE  
WITH SCHEMABINDING   
AS   
RETURN  SELECT 1 AS is_eligible  
        -- No style value: the conversion depends on session language
        -- settings, so it is non-deterministic and the filter is rejected.
        WHERE @column1 < CONVERT(datetime, '1/1/2014')  
GO  

/*  CORRECT  */
CREATE FUNCTION dbo.fn_example5(@column1 datetime)
RETURNS TABLE  
WITH SCHEMABINDING   
AS   
RETURN  SELECT 1 AS is_eligible  
        -- Style 101 (mm/dd/yyyy) makes the conversion deterministic.
        WHERE @column1 < CONVERT(datetime, '1/1/2014', 101)  
GO  
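Once the function is valid, the wizard attaches it to the table. The T-SQL equivalent looks roughly like this, a sketch of the documented ALTER TABLE syntax:

-- Rows for which fn_example5(DateEntered) returns a row are
-- eligible to migrate; OUTBOUND starts moving them to Azure.
ALTER TABLE [dbo].[tblUsers]
    SET ( REMOTE_DATA_ARCHIVE = ON (
        FILTER_PREDICATE = dbo.fn_example5([DateEntered]),
        MIGRATION_STATE = OUTBOUND ) );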

After initiating the filter, I queried the remote server directly and found 6,713 rows in Azure and 4,287 rows in the local database. Exactly what I expected. Again, the query execution plan has to use “Remote Query” and concatenate the results to produce the desired output.
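If you would rather not connect to the Azure server directly, there is also a documented table hint that scopes a query to just the local or just the remote rows; a sketch:

-- Only the rows still stored locally
SELECT COUNT(*) AS LocalRows
FROM [dbo].[tblUsers] WITH (REMOTE_DATA_ARCHIVE_OVERRIDE = LOCAL_ONLY);

-- Only the rows that have migrated to Azure
SELECT COUNT(*) AS RemoteRows
FROM [dbo].[tblUsers] WITH (REMOTE_DATA_ARCHIVE_OVERRIDE = REMOTE_ONLY);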

Final Thoughts on Stretch Database

The SSMS Stretch wizards are very effective and very easy to use. However, when you “Disable” Stretch on a database, it does not delete the table/database in Azure, which means you will still incur charges for storage. And if you re-enable Stretch on a database, it creates a second database in Azure.

With the READ-ONLY limitation on the Azure side of a Stretch database, you had better make 100% sure you don't need to edit anything in Azure. The only way to make any changes to the Azure side of the data is to disable Stretch and “bring the data back” to the local machine, which incurs charges on the Azure side. Then you make your changes to the data and re-enable it for Stretch, which, again, creates another database on your Azure server.

The pricing for Azure SQL Stretch Database seems a little steep: at the lowest performance level offered, it is ~$1,860 per month! Microsoft meters what it calls DSUs (Database Stretch Units), which “represent the power of the query and is quantified by workload objectives: how fast rows are written, read and computed against.” https://azure.microsoft.com/en-us/pricing/details/sql-server-stretch-database/

All in all, Stretch Database is a very cool feature for archiving older data that will not change, but only if you can afford it. I would think using filegroups and partitioning data onto separate drives could be a more cost-effective archiving solution than an Azure Stretch database. A very good learning experience.

My first time, lessons learned!

This past weekend, I had the privilege of presenting at SQL Saturday #514 in Houston, Texas. I had been to other SQL Saturdays before as both a participant and a “volunteer”, but I had never spoken before.

Speaking in front of people is usually no big deal to me. I have been designing, presenting, hosting, and administering all types of training for the Boy Scouts for many, many years. I have taught everything from showing Scouting newbies how to tie a knot to presenting on topics like “Project Planning and SMART Goals” (email me if you want to know what SMART goals are). I have even coordinated an all-day training event very similar to SQL Saturday for my Council, called University of Scouting.

But what was different about this experience was the topic. I have been active in the Boy Scouts since I was eleven. I know Scouting information, concepts, and topics backwards and forwards, and I have been facilitating trainings (presenting) since I was 15, so Scouting is second nature to me. But presenting on SQL Server topics was somewhat nerve-wracking. For me it was a test not of my presenting abilities but of my SQL DBA abilities. As an “Accidental DBA”, I have always questioned my abilities with SQL Server. Everything I have learned about computers I have learned myself, so I am always second-guessing myself.

As a first-timer, I thought it would be best to cover beginning-level topics. My thinking was: 1) I knew the material, and 2) reading Tim Ford's #EntryLevelChallenge motivated me to remember my fellow newbies.

I won't go into the topic of my presentation today, as this is more about the experience. The audience of 14 was very attentive, and I don't think I put anyone to sleep!

I do need to work on my demos. They all worked as expected, but switching between laptop and projector, throwing SSMS onto the projector screen, and changing screen settings from duplicate to extend was all too much. There has to be a better way to switch between PowerPoint in presentation mode and SSMS for demos, so I will practice that.

I thought I did best on time management. The time allotted was 60 minutes, and I said “Thank you for your time” at 59 minutes and 30 seconds.

Overall, the experience was well worth it. Judging by the responses I received on the speaker evaluations, I think I did an OK job. I had no major criticisms except to slow down, and I assure you that was nerves at the beginning.
