SQL Server and Other Topics


I got my start with SQL Server in 1997, with SQL Server 6.5 at BellSouth Cellular. From that point on I've had the opportunity to work on versions 6.0 through 2014. I specialize in performance tuning, high availability and SQL development. Contact me through this site or through my Twitter handle @SQLDiver.

This post is as much for me as it is for you just so I can remember where I saw it!

Chris Thompson, a PFE from Microsoft, wrote a blog article titled TCP Port Is Already In Use.

The article is about what can happen after a reboot if another application or service accidentally grabs the SQL Server port, because that service starts up before the SQL Server service and uses a dynamic port range that includes the SQL Server port. I've never run into this at any of the companies I've worked with over the past 20 years, but it makes sense to mitigate the problem from the beginning.

1) There had better be a really good reason for another application to be installed on "MY" database server... but say there is.

Chris suggests adding the following to your installation steps. OK, fair enough, something very simple to add to the scripts that run from my automated install process... let's do it:

This is the netsh command to be run from the Windows command line. I'll add it to my PowerShell install script. 

netsh int ipv4 add excludedportrange tcp startport=1433 numberofports=1 store=persistent
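In a PowerShell install script this might look like the following sketch; the follow-up show command is my addition for verification, and the port number assumes the default instance on 1433 (adjust startport for a named instance on another port):

```powershell
# Reserve TCP port 1433 so no other service can grab it before SQL Server starts.
# Must be run elevated; store=persistent keeps the exclusion across reboots.
netsh int ipv4 add excludedportrange tcp startport=1433 numberofports=1 store=persistent

# Verify the exclusion is in place before moving on.
netsh int ipv4 show excludedportrange protocol=tcp
```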

Thanks Chris!

Part 1 > Part 2 > Part 3

In the last post, part 2, I talked about the plan and project for implementing Availability Groups. Today I will discuss the creation of an unattended install.

First, what is an unattended install? Simply put, it is the ability to use a "template" configuration file to automatically supply all of the parameters required to install, in this case, SQL Server. It is critical to adopt a standard configuration setup (where to put the data and log files, etc.) so your unattended install will configure all servers the same.

The easiest way to create the configuration file, ConfigurationFile.ini, is to start an install, making sure you choose all of the parameters as you would like them saved and used for every install. Once you get to the end of the install, on the "Ready to Install" screen prior to clicking the Install button, you will see the path to the ConfigurationFile.ini. Copy the path and file name from the text box so we can get to it later. You can now cancel the install, as you've captured all of the parameters needed to install SQL Server.


That was easy!

Now let's make sure all of the configuration settings are correct. Paste the path into File Explorer to open the file in Notepad (or your favorite text editor). It should look something like the following; keep in mind that you should not copy my example as-is, since it is for another environment.

 Unattended Install ConfigurationFile.ini

Look at each parameter in the configuration file and change the ones you want to change. Test the file until it is exactly what you want, then you can put it on a network drive somewhere and run it quietly whenever you want to install SQL Server.

;SQL Server 2014 Configuration File
; Specifies a Setup work flow, like INSTALL, UNINSTALL, or UPGRADE. This is a required parameter.
; Accept Terms <- Saves you from having to accept the license terms
; Use the /ENU parameter to install the English version of SQL Server on your localized Windows operating system.
; Parameter that controls the user interface behavior. Valid values are Normal for the full UI, AutoAdvance for a simplified UI, and EnableUIOnServerCore for bypassing Server Core setup GUI block.
; Setup will not display any user interface. (when set to true)
; Setup will display progress only, without any user interaction. (when set to true)
; Specify whether SQL Server Setup should discover and include product updates. The valid values are True and False or 1 and 0. By default SQL Server Setup will include updates that are found.
; Specify if errors can be reported to Microsoft to improve future SQL Server releases. Specify 1 or True to enable and 0 or False to disable this feature.
; If this parameter is provided, then this computer will use Microsoft Update to check for updates.
; Specifies features to install, uninstall, or upgrade. The list of top-level features include SQL, AS, RS, IS, MDS, and Tools. The SQL feature will install the Database Engine, Replication, Full-Text, and Data Quality Services (DQS) server. The Tools feature will install Management Tools, Books Online components, SQL Server Data Tools, and other shared components.
; Specify the location where SQL Server Setup will obtain product updates. The valid values are "MU" to search Microsoft Update, a valid folder path, a relative path such as .\MyUpdates or a UNC share. By default SQL Server Setup will search Microsoft Update or a Windows Update service through the Windows Server Update Services.  <-- where are the SPs or CUs?
; Displays the command line parameters usage
; Specifies that the detailed Setup log should be piped to the console.  <- Good for when the install errors
; Specifies that Setup should install into WOW64. This command line argument is not supported on an IA64 or a 32-bit system.
; Specify the root installation directory for shared components. This directory remains unchanged after shared components are already installed.
INSTALLSHAREDDIR="C:\Program Files\Microsoft SQL Server"
; Specify the root installation directory for the WOW64 shared components. This directory remains unchanged after WOW64 shared components are already installed.
INSTALLSHAREDWOWDIR="C:\Program Files (x86)\Microsoft SQL Server"
; Specify a default or named instance. MSSQLSERVER is the default instance for non-Express editions and SQLExpress for Express editions. This parameter is required when installing the SQL Server Database Engine (SQL), Analysis Services (AS), or Reporting Services (RS).
; Specify that SQL Server feature usage data can be collected and sent to Microsoft. Specify 1 or True to enable and 0 or False to disable this feature.
; Specify the Instance ID for the SQL Server features you have specified. SQL Server directory structure, registry structure, and service names will incorporate the instance ID of the SQL Server instance.
; Specify the installation directory.
INSTANCEDIR="C:\Program Files\Microsoft SQL Server"
; Agent account name
; CM brick TCP communication port
; How matrix will use private networks
; How inter brick communication will be protected
; TCP port used by the CM brick
; Startup type for the SQL Server service.
; Level to enable FILESTREAM feature at (0, 1, 2 or 3).
; Set to "1" to enable RANU for SQL Server Express.
; Specifies a Windows collation or an SQL collation to use for the Database Engine.
; Account for SQL Server service: Domain\User or system account. (Remember your password will be clear text) <- these parameters can be passed in the command execution
; Windows account(s) to provision as SQL Server system administrators. (Remember your password will be clear text)
; The default is Windows Authentication. Use "SQL" for Mixed Mode Authentication.
; The Database Engine root data directory.
; Default directory for the Database Engine user databases.
; Default directory for the Database Engine user database logs.
; Directory for Database Engine TempDB files.
; Directory for the Database Engine TempDB log files.
; Provision current user as a Database Engine system administrator for %SQL_PRODUCT_SHORT_NAME% Express.
; Specify 0 to disable or 1 to enable the TCP/IP protocol.
; Specify 0 to disable or 1 to enable the Named Pipes protocol.
; Startup type for Browser Service.
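Most of the parameter lines themselves didn't survive the copy/paste above, only Setup's comments, so here is a minimal sketch of what a Database Engine-only ConfigurationFile.ini might contain. The parameter names are the standard SQL Server 2014 Setup parameters those comments describe; every value below is a placeholder, not my production setting:

```ini
; Minimal unattended Database Engine install - placeholder values, adjust per environment
[OPTIONS]
ACTION="Install"
IACCEPTSQLSERVERLICENSETERMS="True"
QUIETSIMPLE="True"
FEATURES=SQLENGINE
INSTANCENAME="MSSQLSERVER"
INSTANCEDIR="C:\Program Files\Microsoft SQL Server"
SQLSVCACCOUNT="YOURDOMAIN\SQLEngineSVC"
SQLSYSADMINACCOUNTS="YOURDOMAIN\DBAGroup"
SQLUSERDBDIR="D:\SQLServer\SQLData"
SQLUSERDBLOGDIR="D:\SQLServer\SQLLogs"
SQLTEMPDBDIR="G:\SQLServer\TempDb\SQLData"
SQLTEMPDBLOGDIR="G:\SQLServer\TempDb\SQLLogs"
TCPENABLED="1"
```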

Pre-Installation Script File

I usually create a script that runs prior to the install and one that runs after. This allows you to automate some prep work, for instance creating the folders for the data and log files on each drive.

I named my file PreInstallationScript.ps1 and this one contains the creation of the file folders for the data and log files. This script can be used to set many different configurations.

 New-Item D:\SQLServer\SQLData -type directory 
New-Item D:\SQLServer\SQLLogs -type directory 
New-Item F:\SQLServer\SQLData -type directory 
New-Item F:\SQLServer\SQLLogs -type directory 
New-Item G:\SQLServer\SQLData -type directory 
New-Item G:\SQLServer\SQLLogs -type directory 
New-Item G:\SQLServer\TempDb\SQLData -type directory 
New-Item G:\SQLServer\TempDb\SQLLogs -type directory 
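The same thing can be written as a loop; a sketch (the -Force switch, which suppresses the error when a folder already exists, is my addition):

```powershell
# Create the standard data/log folder layout on each drive.
$folders = @(
    'D:\SQLServer\SQLData', 'D:\SQLServer\SQLLogs',
    'F:\SQLServer\SQLData', 'F:\SQLServer\SQLLogs',
    'G:\SQLServer\SQLData', 'G:\SQLServer\SQLLogs',
    'G:\SQLServer\TempDb\SQLData', 'G:\SQLServer\TempDb\SQLLogs'
)
foreach ($folder in $folders) {
    # -Force makes the call idempotent when rerunning the install prep
    New-Item -Path $folder -ItemType Directory -Force | Out-Null
}
```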

Post-Installation Script File

(Uses a PoSh script by Eric Humphrey: http://www.erichumphrey.com/category/powershell/)

# Set the Windows Power Options to High Performance
# (the powercfg call was missing under the original comment; SCHEME_MIN is the built-in High Performance plan)
powercfg /setactive SCHEME_MIN
set-location $PSScriptRoot
# Invoke-Sqlcmd and Enable-SqlAlwaysOn below need the SQL Server PowerShell module loaded
Import-Module SQLPS -DisableNameChecking
#Set Trace Flags 1117, 1118, 3226, 4199
.\Add-SqlServerStartupParameter.ps1 '-T1117'
.\Add-SqlServerStartupParameter.ps1 '-T1118'
.\Add-SqlServerStartupParameter.ps1 '-T3226'
.\Add-SqlServerStartupParameter.ps1 '-T4199'
$compname = $env:computername
Write-Host $compname
Enable-SqlAlwaysOn -ServerInstance $compname -Force
Invoke-Sqlcmd -InputFile ".\AddTempDBFiles.sql"
Invoke-Sqlcmd -InputFile ".\Configs.sql"
Invoke-Sqlcmd -InputFile ".\GlenBerryCriticalAlerts.sql"
Invoke-Sqlcmd -InputFile ".\AlwaysOnNotifications.sql"
Invoke-Sqlcmd -InputFile ".\OlaMaintenanceSolution.sql"
read-host "Press enter to continue"

Then I wrap it Up in a Bow!

I create an UnattendedInstall.bat file to kick off the install. 

 net localgroup administrators /add yourdomainname\SQLEngineSVC 
cd "Z:\Apps\SQLServer2014" 
Setup.exe /ConfigurationFile=ConfigurationFile.INI 
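Service account passwords don't belong in the .ini file in clear text; Setup lets you supply them on the command line instead. A sketch of the batch file with passwords passed at run time (the password values are obviously placeholders):

```bat
net localgroup administrators /add yourdomainname\SQLEngineSVC
cd /d "Z:\Apps\SQLServer2014"
REM Passwords supplied here stay out of ConfigurationFile.ini
Setup.exe /ConfigurationFile=ConfigurationFile.INI ^
  /SQLSVCPASSWORD="placeholder" /AGTSVCPASSWORD="placeholder"
```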


The unattended install can be customized to your environment using the scripts. I have left out some of the scripts here because the method of calling them is more important than the code.

This concludes Part 3 of this post. Next time we'll look at the Availability Groups.

This script is very handy for making sure you have your replicas set to synchronous commit and an automatic failover partner where needed. Put this script in your arsenal!


-- The outer SELECT and the opening of the PIVOT were lost in the original formatting;
-- they are reconstructed here (MAX() is an assumption, but Mode is the only value column).
SELECT * FROM (
SELECT ar.replica_server_name, ag.name,
CASE WHEN hags.primary_replica = ar.replica_server_name THEN 'PRIMARY - ' + ar.failover_mode_desc ELSE
CASE WHEN ar.failover_mode_desc = 'AUTOMATIC' THEN ar.failover_mode_desc ELSE ar.availability_mode_desc END
END  as Mode
FROM sys.availability_replicas ar
INNER JOIN sys.availability_groups ag ON ag.group_id = ar.group_id
INNER JOIN sys.dm_hadr_availability_group_states hags ON ag.group_id = hags.group_id
) src
PIVOT (MAX(Mode)
--for name in ([agMISC],[agHK],[agAP],[agEU],[agUS],[agSA],[agPPTables],[agIPAY],[agCSP])
for replica_server_name in ([P-MSSQL01-EF],[P-MSSQL02-DE],[P-MSSQL02-EF],[P-MSSQL01-DE])) piv;

Part 1 > Part 2 > Part 3


In part 1 I started building the framework for moving from SQL Server Database Mirroring (mirroring) to SQL Server AlwaysOn Availability Groups (AGs).

Inventory what you have in all environments and what you are going to need to migrate to the new servers (remember, this is about moving to new, rebuilt servers). There are many blogs about how to move mail configuration, SQL jobs, users, alerts, operators, database and server configuration, etc. Don't forget anything. Get the team together to brainstorm all of the things that must be moved and configured.

Complete Inventory

Create a design document. This design document should cover all aspects of what the environment will "look" like after the migration. It is critical to think through the licensing, the availability replica modes, the failover policy, etc. It gives you a chance to think deeply about what needs to be done to configure all aspects of the migration. Know the roadblocks, the requirements for an Availability Group, and where AGs will not work. Here are a few:

  1. DTC - Don't do AGs if your application uses DTC.
  2. Cross Database Transactions - Don't do it! If you have a failover while your app is doing a cross database transaction, the primary could come back in suspect mode. This article will help confirm whether you're using cross database transactions.
  3. Domain - A domain is required for the WSFC (that changes in Windows Server vNext).
  4. Cluster - The AG replicas must all be in the same cluster.

There are more issues you will encounter with AGs... see here.

Determine what the business SLAs are for Recovery Time Objective (RTO) and Recovery Point Objective (RPO) and document them in the design document. You want to be able to tell management how much downtime to expect.

Use Visio to create diagrams, for your own benefit as much as everyone else's. These diagrams will help keep your head straight about what your design is. They will allow you to plan the number of licenses for management, and they will be your "evidence" after you're done when management says "you didn't tell us..." Oh yes I did; as a matter of fact, it is right here in the design document that you approved. OK, so there is another critical element: make sure management approves the design document (put a spot for signatures on the cover page; it will prompt them to sign it). Don't deliver the design document until you have tested the migration in a lower environment first. Testing will expose gotchas that you didn't think about (hopefully you tested prior to getting buy-in from management).

Approved Design

Now that we have the design approved by the client (management), it's time to work on the most important step... planning. Without a proper plan, not only will you not succeed, you will probably fail horribly. Let's put that down as a critical component.

Complete Plan

The plan should be a step-by-step plan with dependencies, the start time of each step (as a calculation based upon the previous step's duration), the estimated duration of each step, step status (Completed, In Progress, Cancelled, etc.), responsible party, day of week (calculated from the date/time column) and a step number for quick reference. I usually put my plans in Excel and have a conditional format for the row based upon the status.

This plan should include all steps, including server builds, network, stress tests, benchmark testing, etc. If you want to be a stickler, include the regression testing as well. Hopefully your company or client has an experienced project manager for the project. It will take a little longer, but if you have a good project manager your project will have a complete project plan, avoiding gotchas at a later date.


I know this is a lot of work, but it will save you in the end. Your job is to eliminate all possibility of accidental downtime and minimize the length of downtime as well as have a highly available environment that works for your company or client. Please take your time and do the work.

I'm at a client that uses SQL Server Database Mirroring for their high availability solution. The principal and mirror for the databases happen to have the data files on two different drive locations (i.e. they don't match). Why is this important when you create a file? Easy: the CREATE FILE command is a transaction that is sent to the mirror.

 Run the CREATE FILE T-SQL or add a file from SSMS and you'll have an ugly situation on your hands.

Suspended? What's this suspended status about? Oh no, you've just made a mess.

 Error log from the mirror server:

2015-04-08 12:29:18.94 spid30s     CREATE FILE encountered operating system error 3(The system cannot find the path specified.) while attempting to open or create the physical file 'E:\SQLData\StupidFileBrokeMe.ndf'.
2015-04-08 12:29:18.94 spid30s     Error: 5123, Severity: 16, State: 1.
2015-04-08 12:29:18.94 spid30s     CREATE FILE encountered operating system error 3(The system cannot find the path specified.) while attempting to open or create the physical file 'E:\SQLData\StupidFileBrokeMe.ndf'.
2015-04-08 12:29:18.94 spid30s     Error: 1454, Severity: 16, State: 1.

So, lesson one when adding a data file: know where your data files are on both principal and mirror. If you've gotten to this point, you're going to have to remove the mirror and re-add it from the beginning. Not only that, but it is going to require a SQL Server service restart. You can't drop the mirror, you can't bring it online, you can't put it into emergency mode... you're gonna have to trick the system. I actually had to shut down the service and delete the data files; when the service started up I was able to drop the database. It turned out to be a bigger pain than I thought it would be just to demo it for you. Thank goodness I didn't have this happen to me in production.

So, the client was smart enough to already know the data files were on different drive letters and paths. Bonus! Now we can really work some magic.

So, let's look at what really needs to happen and save ourselves the effort of dropping the mirror and re-adding it from scratch. I certainly don't want the production environment unprotected during the massive copy of the backup across the WAN to the DR site.

The process is as follows:

  1. Remove Mirror
  2. Add File on "Principal"
  3. Backup Transaction Log
  4. Restore Transaction Log to "Mirror" copy WITH MOVE to the new location of the data files.
  5. Add the mirror partner and witness (if it exists)
  6. Set safety off if asynchronous.

So easy, right? Well... the hard part is making sure every step is done correctly... time to look for that old tool... you know, the one you may never have used because you didn't quite understand it. Ever used SQLCMD mode in SSMS? Yes, this is the magic.

Like many of you DBAs out there, it took me years to see SQLCMD for the tool it really is. Getting others to find the magic has taken almost as long.

I recommended it as a tool to implement the data file addition, and the DBA got a look in his eyes, almost a fear that it would take too long to learn. 

I whipped out my cape and pulled out the old wooden toolbox, and started digging through the tools... and buried deep in the bottom was this bright shiny tool that was all smooth and balanced and comfortable in the hand. It was very simple to use.


For SQLCMD, start with the two easy basic commands:


:CONNECT servername
:SETVAR var "variable value"

There is enough to munch on for years in just those two SQLCMD commands. Look how simple the following script is, and how much easier it is than writing multiple scripts connected to several servers. We can change our connections back and forth between the principal and mirror servers (don't forget your GOs).



:SETVAR prisrv "SERVER01"
:SETVAR mirsrv "SERVER02"
-- Assumption: the db and filename SETVARs were lost in the original formatting;
-- the script references $(db) and $(filename), so they are reconstructed here.
:SETVAR db "TST_SWITCH"
:SETVAR filename "TST_SWITCH_NEW_FILE"
:SETVAR filegroup "JOB"
:SETVAR phyfilename "TST_SWITCH_NEW_FILE.ndf"
:SETVAR priloc "E:\SQLData"
:SETVAR mirloc "C:\SQLServer\Data"
:SETVAR backuploc "\\server02\Backups\TST_SWITCH.trn"
:SETVAR filesize "204800KB"
:SETVAR filegrowth "204800KB"
:SETVAR mir "TCP://SERVER01.domain.com:5022"
:SETVAR pri "TCP://SERVER02.domain.com:5022"
:SETVAR wit "TCP://SERVER03.domain.com:5022"
-- Remove Mirror, Add Data File, backup logfile
:CONNECT $(prisrv)
-- Step 1: remove the mirror (this statement did not survive the original formatting)
ALTER DATABASE [$(db)] SET PARTNER OFF
GO
USE [$(db)]
GO
ALTER DATABASE [$(db)] ADD FILEGROUP [$(filegroup)]
GO
ALTER DATABASE [$(db)] ADD FILE
( NAME = N'$(filename)',
FILENAME = N'$(priloc)\$(phyfilename)' ,
SIZE = $(filesize) ,
FILEGROWTH = $(filegrowth) )
TO FILEGROUP [$(filegroup)]
GO
BACKUP LOG [$(db)] TO DISK = '$(backuploc)' WITH INIT
GO
:CONNECT $(mirsrv)
RESTORE LOG [$(db)]
FROM DISK = '$(backuploc)'
WITH MOVE '$(filename)' TO '$(mirloc)\$(phyfilename)', NORECOVERY
GO
:CONNECT $(mirsrv)
ALTER DATABASE [$(db)] SET PARTNER = '$(mir)'
GO
:CONNECT $(prisrv)
ALTER DATABASE [$(db)] SET PARTNER = '$(pri)'
GO
ALTER DATABASE [$(db)] SET WITNESS = '$(wit)'
GO
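Outside SSMS's SQLCMD mode, the same script can be run from the command line with the sqlcmd utility; a sketch (the script file name is hypothetical):

```shell
# -E uses Windows authentication; the :CONNECT commands inside the
# script handle the hops between principal and mirror servers.
sqlcmd -E -S SERVER01 -i AddDataFileToMirror.sql
```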


The key to this script is the ability to test it in another environment to make sure it works. Be prepared to make mistakes; set up your test environment by creating a database and setting up mirroring. It doesn't have to have data in it. Make sure the data files are on different file paths.

If you tried to do this with the SSMS GUI it would be easy to forget a step, run a step out of order, or get confused about where you are. With SQLCMD, it's all about the variables; the T-SQL code doesn't change.

The best part of this script: after helping the DBA write a script with SQLCMD to get comfortable with it, I gave him my script with the SETVAR for each of the variables that would change. He was able to test it multiple times in a lower environment. When he was comfortable, he modified the parameters to match production... and at 5:30 AM (while I was sleeping), it ran flawlessly. It ran flawlessly for several reasons: 1) it was tested many times, 2) he was comfortable with the process and 3) he was very careful to use the correct variables.



I covered the simplicity of SQLCMD and why it is a great tool for processes that need precise coordination and timing across multiple servers. It is important to push your comfort level to make your life easier. Most DBAs have more work than they can handle (it's why I'm a consultant), so make your life easier and learn SQLCMD today.

I used the example of adding a new data file to a mirrored database where the principal and mirror used different drive paths for the data files. This script simplifies the implementation of adding a new data file to principal and mirror.



SQL Server Database Mirroring

Yes, mirroring is a deprecated feature since the addition of AlwaysOn Availability Groups. I needed to test my process of moving from mirroring to AGs over and over, and needed a quick way to implement Database Mirroring, tear it down, and start over.

The process actually involves setting up the Mirror, then creating log shipping from the mirror across a slow WAN. That article will come out some time in the near future.

It took me some research because this was my first real foray into PowerShell, so I found a blog by John P. Wood, "Using Powershell to Set Up SQL Server Mirroring", and bastardized it. I rewrote most of the script and kept some important steps.


* Warning - I'm not a PowerShell guru; this code can be written in different ways, some of which may be better. Let me know where improvements can be made.


The script pulls the needed variables from a csv file whose name (csvFileName) is passed into the script.
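A sketch of the top of the script, showing how the csv might be read (the parameter name comes from the post; Import-Csv produces one object per row with a property per column):

```powershell
Param(
    # Path to the csv describing the databases to mirror
    [Parameter(Mandatory = $true)][string]$csvFileName
)
# Each row becomes an object whose properties match the csv column names.
$ServerList = Import-Csv -Path $csvFileName
```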



The csv file has the following columns: database, SourceServer, SourcePath, DestServer, DestPath, DestDataFilePath, DestLogFilePath, Witness, Timeout, BreakMirror and FileName. (The sample values that survived the original formatting include \\Q-MSSQL01-LB\Backup\ for the source path, Q-MSSQL11-LB for the destination server, domain.com, the mode value "complete", and C:\SQLServer\Data and C:\SQLServer\Logs for the destination data and log paths.)
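A hypothetical one-row sample, reconstructed from the column names and the variables the loop reads (Mode and Domain are assumed columns, since the loop uses $mode and $Domain; every value is a placeholder):

```csv
database,SourceServer,SourcePath,DestServer,DestPath,DestDataFilePath,DestLogFilePath,Witness,Timeout,BreakMirror,FileName,Mode,Domain
TestDB,Q-MSSQL01-LB,\\Q-MSSQL01-LB\Backup\,Q-MSSQL11-LB,\\Q-MSSQL11-LB\Backup\,C:\SQLServer\Data,C:\SQLServer\Logs,Q-MSSQL21-LB,30000,True,,Complete,domain.com
```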


Function Get-FileName { 
    Param([string]$path) 
    $names = $path.Split('\\') 
    $names[$names.Count - 1] 
} 

I'm sure there is a better way to do this; I got it from someone else.


Function New-SMOconnection { 
    Param ([string]$server) 
    $conn = New-Object Microsoft.SqlServer.Management.Common.ServerConnection($server) 
    $conn.applicationName = "PowerShell SMO" 
    $conn.StatementTimeout = 0 
    $conn.Connect()   # connect explicitly so the IsOpen check below is meaningful 
    if ($conn.IsOpen -eq $false) { 
        Throw "Could not connect to server $($server) for database backup of $($dbname)." 
    } 
    $smo = New-Object Microsoft.SqlServer.Management.Smo.Server($conn) 
    $smo 
} 

Function Invoke-SqlBackup { 
    Write-Host "Performing Backup of"$database 
    $dbbk = New-Object ('Microsoft.SqlServer.Management.Smo.Backup') 
    $dbbk.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Database 
    $dbbk.BackupSetDescription = "Full backup of $database" 
    $dbbk.BackupSetName = $database + " Backup" 
    $dbbk.Database = $database 
    $dbbk.MediaDescription = "Disk" 
    Write-Host "$SourcePath\$bkpfile" 
    $device = "$SourcePath\$bkpfile" 
    $dbbk.Devices.AddDevice($device, 'File') 
    $smo = New-SMOconnection -server $SourceServer 
    Try { 
        # Assumption: the backup calls themselves were lost in the original formatting. 
        # SqlBackup() runs the full backup; the Action is then flipped to Log for the 
        # log backup a mirror restore requires. 
        $dbbk.SqlBackup($smo) 
        $dbbk.Action = [Microsoft.SqlServer.Management.Smo.BackupActionType]::Log 
        $dbbk.SqlBackup($smo) 
    } 
    Catch { 
        $ex = $_.Exception 
        Write-Output $ex.message 
        $ex = $ex.InnerException 
        while ($ex.InnerException) { 
            Write-Output $ex.InnerException.message 
            $ex = $ex.InnerException 
        } 
    } 
    Finally { 
        if ($smo.ConnectionContext.IsOpen -eq $true) { 
            $smo.ConnectionContext.Disconnect() 
        } 
    } 
} 


Function Invoke-SqlRestore { 
    # (Missing braces and the Devices.Add/SqlRestore calls are reconstructed below.) 
    Param([string]$filename, [string]$DestDataFilePath, [string]$DestLogFilePath) 
    # Get a new connection to the server 
    $smo = New-SMOconnection -server $DestServer 
    $backupDevice = New-Object("Microsoft.SqlServer.Management.Smo.BackupDeviceItem") ($filename, "File") 
    # Get local paths to the Database and Log file locations 
    # Set Data File Path 
    If ($DestDataFilePath.Length -gt 1) { 
        $DBPath = $DestDataFilePath 
    } ElseIf ($smo.Settings.DefaultFile.Length -eq 0) { 
        $DBPath = $smo.Information.MasterDBPath 
    } Else { 
        $DBPath = $smo.Settings.DefaultFile 
    } 
    # Set Log File Path 
    If ($DestLogFilePath.Length -gt 1) { 
        $DBLogPath = $DestLogFilePath 
    } ElseIf ($smo.Settings.DefaultLog.Length -eq 0) { 
        $DBLogPath = $smo.Information.MasterDBLogPath 
    } Else { 
        $DBLogPath = $smo.Settings.DefaultLog 
    } 
    # Load up the Restore object settings 
    Write-Host "Preparing Restore" 
    $Restore = New-Object Microsoft.SqlServer.Management.Smo.Restore 
    $Restore.Action = 'Database' 
    $Restore.Database = $database 
    $Restore.ReplaceDatabase = $true 
    $Restore.NoRecovery = $true 
    $Restore.Devices.Add($backupDevice) 
    # Get information from the backup file 
    $RestoreDetails = $Restore.ReadBackupHeader($smo) 
    $DataFiles = $Restore.ReadFileList($smo) 
    # Relocate each file in the backup to the destination paths 
    ForEach ($DataRow in $DataFiles) { 
        $LogicalName = $DataRow.LogicalName 
        $PhysicalName = Get-FileName -path $DataRow.PhysicalName 
        $RestoreData = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile") 
        $RestoreData.LogicalFileName = $LogicalName 
        if ($DataRow.Type -eq "D") { 
            # Restore Data file 
            Write-Host 'Restoring Data File' $DBPath"\"$PhysicalName 
            $RestoreData.PhysicalFileName = $DBPath + "\" + $PhysicalName 
        } Else { 
            # Restore Log file 
            Write-Host 'Restoring Log File' $DBLogPath"\"$PhysicalName 
            $RestoreData.PhysicalFileName = $DBLogPath + "\" + $PhysicalName 
        } 
        $Restore.RelocateFiles.Add($RestoreData) | Out-Null 
    } 
    Try { 
        $Restore.FileNumber = 1 
        $Restore.SqlRestore($smo) 
        # If there are two backup sets, assume the second is a Log backup 
        if ($RestoreDetails.Rows.Count -gt 1) { 
            $Restore.Action = [Microsoft.SqlServer.Management.Smo.RestoreActionType]::Log 
            $Restore.FileNumber = 2 
            $Restore.SqlRestore($smo) 
        } 
    } 
    Catch { 
        $ex = $_.Exception 
        Write-Output $ex.message 
        $ex = $ex.InnerException 
        while ($ex.InnerException) { 
            Write-Output $ex.InnerException.message 
            $ex = $ex.InnerException 
        } 
    } 
    Finally { 
        if ($smo.ConnectionContext.IsOpen -eq $true) { 
            $smo.ConnectionContext.Disconnect() 
        } 
    } 
} 


Function Set-Mirror { 
    # (Param blocks, try bodies and closing braces in these four functions are 
    # reconstructed; Out-Log is assumed to be defined elsewhere in the full script.) 
    Param([string]$server, [string]$database, [string]$partner) 
    $conn = "Server=$server; Integrated Security=SSPI; Database=Master" 
    $cn = New-Object "System.Data.SqlClient.SqlConnection" $conn 
    $cmd = New-Object "System.Data.SqlClient.SqlCommand" 
    $cmd.CommandType = [System.Data.CommandType]::Text 
    $cmd.CommandText = "ALTER DATABASE $database SET PARTNER = 'TCP://" + $partner + "." + $Domain + ":$Port'" 
    $cmd.Connection = $cn 
    try { 
        $cn.Open() 
        $cmd.ExecuteNonQuery() | Out-Null 
        $cn.Close() 
    } 
    catch [System.Net.WebException] { 
      # $_ is set to the ErrorRecord of the exception 
      if ($_.Exception.InnerException) { 
        Out-Log $_.Exception.InnerException.Message 
      } else { 
        Out-Log $_.Exception.Message 
      } 
    } 
} 

Function Set-Witness { 
    Param([string]$server, [string]$database, [string]$witness) 
    $conn = "Server=$server; Integrated Security=SSPI; Database=Master" 
    $cn = New-Object "System.Data.SqlClient.SqlConnection" $conn 
    $cmd = New-Object "System.Data.SqlClient.SqlCommand" 
    $cmd.CommandType = [System.Data.CommandType]::Text 
    $cmd.CommandText = "ALTER DATABASE $database SET WITNESS = '$Witness'" 
    Write-Host $cmd.CommandText 
    $cmd.Connection = $cn 
    try { 
        $cn.Open() 
        $cmd.ExecuteNonQuery() | Out-Null 
        $cn.Close() 
    } 
    catch [System.Net.WebException] { 
      # $_ is set to the ErrorRecord of the exception 
      if ($_.Exception.InnerException) { 
        Out-Log $_.Exception.InnerException.Message 
      } else { 
        Out-Log $_.Exception.Message 
      } 
    } 
} 

Function Set-Timeout { 
    Param([string]$server, [string]$database, [string]$Timeout) 
    $conn = "Server=$server; Integrated Security=SSPI; Database=Master" 
    $cn = New-Object "System.Data.SqlClient.SqlConnection" $conn 
    $cmd = New-Object "System.Data.SqlClient.SqlCommand" 
    $cmd.CommandType = [System.Data.CommandType]::Text 
    $cmd.CommandText = "ALTER DATABASE $database SET PARTNER TIMEOUT "+$Timeout 
    $cmd.Connection = $cn 
    try { 
        $cn.Open() 
        $cmd.ExecuteNonQuery() | Out-Null 
        $cn.Close() 
    } 
    catch [System.Net.WebException] { 
      # $_ is set to the ErrorRecord of the exception 
      if ($_.Exception.InnerException) { 
        Out-Log $_.Exception.InnerException.Message 
      } else { 
        Out-Log $_.Exception.Message 
      } 
    } 
} 

Function Break-Mirror { 
    Param([string]$server, [string]$database) 
    Write-Host "Breaking Existing Mirror" 
    $conn = "Server=$server; Integrated Security=SSPI; Database=Master" 
    $cn = New-Object "System.Data.SqlClient.SqlConnection" $conn 
    $cmd = New-Object "System.Data.SqlClient.SqlCommand" 
    $cmd.CommandType = [System.Data.CommandType]::Text 
    $cmd.CommandText = "ALTER DATABASE $database SET PARTNER OFF " 
    $cmd.Connection = $cn 
    try { 
        $cn.Open() 
        $cmd.ExecuteNonQuery() | Out-Null 
        $cn.Close() 
    } 
    catch [System.Net.WebException] { 
      # $_ is set to the ErrorRecord of the exception 
      if ($_.Exception.InnerException) { 
        Out-Log $_.Exception.InnerException.Message 
      } else { 
        Out-Log $_.Exception.Message 
      } 
    } 
} 


The Process - The loop

This is where the magic occurs. The variables are populated from the csv file and passed to the functions for each step that needs to be completed to instantiate the mirror.

 FOREACH ($Server in $ServerList) { 
    [string]$DestDataFilePath = $Server.DestDataFilePath 
    [string]$DestLogFilePath = $Server.DestLogFilePath


    if ($mode -eq 'Restore'){ 
    $bkpfile =$FileName 
    else { 
    $bkpfile = $($SourceServer.Replace("\", "$")) + "_" + $database + "_FULL_" + $(get-date -format yyyyMMdd-HHmmss) + ".bak" 
If a mirror exists on the source database remove the mirror if specified in the csv file. This is mostly for testing, so you can rerun the process over and over.  
    if ($BreakMirror -eq 'True'){ 
    Break-Mirror -server $SourceServer -database $database

This step creates the backup. If it is the Backup-only step, it tells you where to put the backup based upon the csv data. This is for a large file to be copied across the slow WAN.

    if ($mode -eq 'Backup') {
        Write-Host "Copy your backups from: $SourcePath"
        Write-Host "Put your backups here: $DestPath on Server: $DestServer"
        Write-Host "Put your log backups here: $DestPath\${database}_LOG on Server: $DestServer"
    }
    elseif ($mode -eq 'Complete') { Invoke-SqlBackup }

    if ($mode -ne 'Backup') {
        if ($mode -eq 'Complete') {
            if ($SourcePath -ne $DestPath) {
                Copy-Item $(Join-Path $srcUNC $bkpfile) -Destination $destUNC -Verbose
            }
        }

This step restores the backup to the new location.

        Invoke-SqlRestore -filename $bkpfile -DestDataFilePath $DestDataFilePath -DestLogFilePath $DestLogFilePath

This step sets the mirror on both the mirror and the principal.

        if ($mode -eq 'Complete') {
            # Establish mirroring from the mirrored database
            Set-Mirror -server $DestServer -database $database -partner $($SourceServer.Split('\\')[0])
            # Start the mirror
            Set-Mirror -server $SourceServer -database $database -partner $($DestServer.Split('\\')[0])

Add the witness if specified in the csv file.

            if ([string]::IsNullOrEmpty($Witness)) {
                Write-Host "Witness parameter not provided. Configure manually:"
                Write-Host "ALTER DATABASE $database SET WITNESS = [put witness name here]"
            }
            else {
                Write-Host 'Adding Witness'
                Set-Witness -server $SourceServer -database $database -witness $Witness
            }

For a mirroring configuration across a WAN you'll want to raise the Timeout value (default of 10 seconds) to something greater. I start at 30 seconds, based upon the network latency.

            if ([string]::IsNullOrEmpty($Timeout)) {
                Write-Host "Timeout parameter not provided. Default (10) set."
            }
            else {
                Set-Timeout -server $SourceServer -database $database -timeout $Timeout
            }
        }

        Write-Host $SourceServer
        Write-Host 'Backup Tail of Log and copy all log backups to destination server and restore.'
        Write-Host "-- After logs have been restored, run:"
        Write-Host "-- ALTER DATABASE [$database] SET PARTNER = 'TCP://$DestServTrim.${Domain}:$Port'"
        Write-Host "TCP://${Witness}.${Domain}:${Port}"
    }
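Under the hood, the Set-Witness and Set-Timeout helpers boil down to plain ALTER DATABASE statements. A minimal sketch of the equivalent T-SQL, run on the principal; the witness host name, domain, and port shown here are illustrative, not values from the script:

```sql
-- Add the witness (host/port are placeholders).
ALTER DATABASE [TST_SWITCH]
    SET WITNESS = 'TCP://MIRRORWITNESS.corp.local:5022';

-- Raise the mirroring partner timeout from the 10-second default to 30 seconds.
ALTER DATABASE [TST_SWITCH]
    SET PARTNER TIMEOUT 30;
```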



 Sample csv File

 PowerShell Mirroring Script


Part I | Part II

Part II of Move What? No downtime? What?

 Log Shipping Configuration

In this part of the article we will configure log shipping to the two instances that will become the principal and mirror. Because this step is for testing, it could be completed through the wizard, but I recommend doing it exactly as it will be done in production so you get the hang of the backup, copy, and restore process; across a slow WAN it could take many hours or possibly even days.


Configure Folders for Log Shipping Backups

In order to avoid as much confusion as possible, create a folder for each database specific to the log shipping process. If you don't have these folders set up, you're going to run into issues.

On each server create an empty folder for the log shipping and a folder for each database inside the log shipping folder.


These folders will be the home for the database log backups during the migration process. The current log backup jobs will need to be disabled during this process to ensure log backups are not missed. The log shipping restore jobs will restore the log backups from this location across the WAN.
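Disabling the existing log backup jobs is typically done through msdb's sp_update_job. A sketch, where the job name is hypothetical and should be replaced with your actual log backup job:

```sql
-- Disable the current log backup job for the duration of the migration.
-- 'LogBackup - TST_SWITCH' is a placeholder job name.
EXEC msdb.dbo.sp_update_job
    @job_name = N'LogBackup - TST_SWITCH',
    @enabled  = 0;
```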

For the log shipping full backup, create a folder for each database on both servers where the secondaries will reside. The backups will be copied to each location and restored. This is to simulate a large backup file backup, copy and restore across a slow WAN.
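The folder layout above can be scripted. A minimal PowerShell sketch, assuming a root path of E:\LogShipping and using the test database names from this series; adjust both to your standard:

```powershell
# Create a log shipping root plus one subfolder per database.
$lsRoot    = 'E:\LogShipping'                    # assumed root path
$databases = 'TST_SWITCH', 'TST_TRANSACTIONS'    # databases being migrated

New-Item -Path $lsRoot -ItemType Directory -Force | Out-Null
foreach ($db in $databases) {
    New-Item -Path (Join-Path $lsRoot $db) -ItemType Directory -Force | Out-Null
}
```

Run this on each server that will hold log shipping backups, so the backup, copy, and restore jobs all find the same structure.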





Naming Conventions of Logical File Name

So future scripts work as expected (based on a standard naming convention), make sure your logical file names follow the same naming convention. The following script will help with this step. Replace <<DatabaseName>> with the name of your database. It generates a script you can then run against the database. You'll have to be creative if you have more than one data file. Remember, this is just for testing.




SELECT 'ALTER DATABASE ' + DB_NAME(database_id) + '
MODIFY FILE (NAME = ' + name + ', NEWNAME = ' + DB_NAME(database_id) + ')'
FROM sys.master_files
WHERE database_id = DB_ID('<<DatabaseName>>')
  AND type = 0;

SELECT 'ALTER DATABASE ' + DB_NAME(database_id) + '
MODIFY FILE (NAME = ' + name + ', NEWNAME = ' + DB_NAME(database_id) + '_log)'
FROM sys.master_files
WHERE database_id = DB_ID('<<DatabaseName>>')
  AND type = 1;
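As an example of what the script generates: against a hypothetical database named Sales whose logical file names are Sales_Dat1 and Sales_OldLog, the output would be statements along these lines, which you then run against the database:

```sql
-- Generated output (names are illustrative).
ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_Dat1, NEWNAME = Sales)

ALTER DATABASE Sales
MODIFY FILE (NAME = Sales_OldLog, NEWNAME = Sales_log)
```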


This is a good spot to take a break. The next step will involve SQLCMD mode.


At a client, a credit card payment provider with a five-nines (99.999%) availability requirement, I've been asked to move a set of databases from one physical server to another with the least amount of downtime possible. In addition to this requirement, they have a Recovery Time Objective (RTO) of 15 seconds. Unfortunately that isn't possible with the environment they have. The good news is they know this and accept the downtime.

This is Phase One of moving to Microsoft SQL Server AlwaysOn Availability Groups.


Current State

  1. Two nodes of a Windows Server Failover Clustering (WSFC) cluster on Windows 2008 R2 Enterprise and SQL Server 2008 R2 Enterprise FCI in New York Data Center.
  2. Two nodes of a Windows Server Failover Clustering (WSFC) cluster on Windows 2008 R2 Enterprise and SQL Server 2008 R2 Enterprise FCI in Delaware Data Center.
  3. The databases on the WSFC cluster in NY are configured as a SQL Server Database Mirroring principal mirroring to the DE data center for high availability.

This is already a complex environment in that they have mirroring configured as synchronous over the WAN between NY and DE. The move is to a second set of servers already being used by a different set of application databases.

Future State

 One multi-subnet Windows 2012 R2 WSFC cluster across two data centers. The nodes in NY will serve as WSFC FCI active/passive failover partners and the nodes in DE will serve as WSFC FCI active/passive failover partners.

Mirroring will be configured with the principal homed on the active node in NY and the mirror on the active node in DE.



 Migrate the production databases to an upgraded set of servers with a downtime goal of zero.

 Strategy for Migration

In order to minimize downtime as well as ensure a minimal RTO for the duration of the migration, the following strategy will be used.
Log shipping will be implemented to the two instances that will act as principal and mirror. This will allow the database to maintain the RTO while the backups are copied to the DR site in DE.

During the cutover, a tail-log backup will be taken, leaving the database unrecoverable so that no additional transactions can occur. That backup will be applied to the two log shipping secondary databases, and the secondary that will act as principal will then be recovered (brought online).
A log backup will then be taken from the new principal and restored to the secondary that will act as the mirror, and mirroring will immediately be established between the new principal and mirror.
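The cutover sequence above can be sketched in T-SQL. This is a sketch only: the database name comes from the test environment in this series, while the share and file paths are placeholders:

```sql
-- 1) On the old principal: tail-log backup. WITH NORECOVERY leaves the
--    database in a restoring state so no further transactions can occur.
BACKUP LOG [TST_SWITCH]
    TO DISK = N'\\FILESHARE\LogShipping\TST_SWITCH\TST_SWITCH_tail.trn'
    WITH NORECOVERY;

-- 2) On each log shipping secondary: apply the tail backup.
RESTORE LOG [TST_SWITCH]
    FROM DISK = N'\\FILESHARE\LogShipping\TST_SWITCH\TST_SWITCH_tail.trn'
    WITH NORECOVERY;

-- 3) On the secondary that becomes the new principal: bring it online.
RESTORE DATABASE [TST_SWITCH] WITH RECOVERY;
```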


The process will be scripted to prevent missed steps and human error, and to keep downtime to a bare minimum.

 Testing Configuration

Because of the necessity to avoid downtime, this will have to be tested thoroughly. The risk for this process is very high, as even minutes of downtime would mean a significant loss of revenue. The servers and databases we will use for our testing are as follows:

  1. QA-MSSQL03-NY - SQL Server 2008 R2
  2. Q-MSSQL01-LB - SQL Server 2014
  3. Q-MSSQL11-LB - SQL Server 2014
  4. Q-MSSQL12-LB - SQL Server 2014
  5. Q-MSSQL13-LB - SQL Server 2014

Databases to migrate: TST_SWITCH and TST_TRANSACTIONS

Mirror Configuration

We want to set up mirroring on our test databases to simulate the production starting point. In production the databases are in pairs and will be moved two at a time. We'll just use the wizard to configure mirroring. This assumes you have configured mirroring previously and understand the steps. Prior to this step, ensure you have mirroring endpoints configured for each server being used.
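If an instance does not yet have a mirroring endpoint, one can be created with T-SQL. The endpoint name and port below are common conventions, not values mandated by this setup; run it on each instance involved:

```sql
-- Illustrative endpoint; name and port are assumptions.
CREATE ENDPOINT [Mirroring]
    STATE = STARTED
    AS TCP (LISTENER_PORT = 5022, LISTENER_IP = ALL)
    FOR DATABASE_MIRRORING (ROLE = ALL);
```

ROLE = ALL lets the same endpoint serve as principal, mirror, or witness, which is convenient in a test environment where roles will swap.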