Windows Azure PowerShell Samples on github

The Windows Azure platform provides many services that can be accessed and managed via the Windows Azure Management RESTful APIs. However, it is not that convenient to use the Management APIs directly; instead, we can use existing tools such as the old Silverlight portal or the new HTML5 portal. These two portals are user-friendly, but using them is a manual process. What if you are a script guy who likes Windows PowerShell? Here come the Windows Azure PowerShell cmdlets. Windows Azure PowerShell is a more efficient way, especially when you need to build automated processes, for instance deploying a cloud service automatically.
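To give a feel for what such an automated deployment looks like, here is a minimal sketch. All names and paths (the publish-settings file, subscription, service, package, and configuration) are placeholders of my own; this assumes you have already downloaded a publish-settings file for your subscription:

```powershell
# One-time setup: import subscription credentials
Import-AzurePublishSettingsFile "C:\mysubscription.publishsettings"
Select-AzureSubscription -SubscriptionName "MySubscription"

# Deploy a cloud service package to the production slot
New-AzureDeployment -ServiceName "MyCloudService" `
    -Package "C:\deploy\MyService.cspkg" `
    -Configuration "C:\deploy\ServiceConfiguration.cscfg" `
    -Slot Production
```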

In this blog post, I will demonstrate how you can use Windows Azure PowerShell to access and manage your subscriptions and services. Since all samples are written as PowerShell scripts, I have created a repository on github.com so they are easy to access, download, and revise. Currently, it includes the following samples:

Please note that these samples only show basic usage of Windows Azure PowerShell, and do not cover some other features such as Web Sites and Virtual Machines. I will update the samples in the github repo and update this blog post from time to time.

Enjoy reading and stay tuned!

Getting started with new Windows Azure Package Format

Starting with Windows Azure SDK 1.7 (June 2012), the Windows Azure team introduced a new package format based on the Open Packaging Conventions (OPC). Unlike the previous package format, which was undocumented and could not be modified after creation, the new OPC format is fully documented. The real beauty of the new OPC format is that a package can be modified and repackaged after its creation, because it is a standard zip file. This is extremely useful in some scenarios, such as updating configuration settings in the definition file. For more information about OPC, check the MSDN article Overview of the New Windows Azure Package Format.
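Because the new package is a standard zip file, any zip tooling can look inside it. As a quick illustration, here is a small PowerShell sketch (the helper name and example path are my own) that lists the entries of a package using the .NET 4.5 zip API:

```powershell
# An OPC package (.cspkg) is a plain zip archive, so the .NET zip API
# can open it directly and enumerate its contents.
Add-Type -AssemblyName System.IO.Compression.FileSystem

function Get-PackageEntry([string]$PackagePath)
{
    $zip = [System.IO.Compression.ZipFile]::OpenRead($PackagePath)
    try     { $zip.Entries | ForEach-Object { $_.FullName } }
    finally { $zip.Dispose() }
}

# Example (placeholder path):
# Get-PackageEntry "D:\deploy\PackageFormatDemo1.cspkg"
```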

So, how can we create a package in the new OPC format? The answer is to use the cspack.exe command-line utility with the /useCtpPackageFormat option. Please note that you need the SDK 1.7 (or later) version of cspack.exe; by default it is located at C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\<SDK Version>\bin\cspack.exe. The complete command line to create a package in OPC format is:

cspack <DirectoryName>\<ServiceDefinition>
   /role:<RoleName>;<RoleBinariesDirectory>
   /sites:<RoleName>;<VirtualPath>;<PhysicalPath>
   /rolePropertiesFile:<RoleName>;<RolePropertyFile>
   /out:<OutputFileName>
   /useCtpPackageFormat
More info: http://msdn.microsoft.com/en-us/library/windowsazure/jj151521.aspx

Also, we can use cspack.exe with the /convertToCtpPackage option to convert an existing package from the old format to the new OPC format. The syntax is:

cspack /convertToCtpPackage:<InputFileName>;<OutputFileName>

More info: http://msdn.microsoft.com/en-us/library/windowsazure/jj151527.aspx

Here are two examples:

# Create a package in OPC format
cspack.exe "D:\Workspace\Visual Studio\PackageFormatDemo1\PackageFormatDemo1\ServiceDefinition.csdef" /role:MvcWebRole1;"D:\Workspace\Visual Studio\PackageFormatDemo1\MvcWebRole1" /sites:MvcWebRole1;Web;"D:\Workspace\Visual Studio\PackageFormatDemo1\MvcWebRole1" /out:"D:\Workspace\Visual Studio\PackageFormatDemo1\package\PackageFormatDemo1.cspkg" /useCtpPackageFormat

# Convert an old format package to the OPC package format
cspack /convertToCtpPackage:"D:\Workspace\Visual Studio\PackageFormatDemo\package\PackageFormatDemo.cspkg";"D:\Workspace\Visual Studio\PackageFormatDemo\package\PackageFormatDemo_NewOPC.cspkg"

If you are a PowerShell guy, I would recommend checking a great article by Shay Yannay at Code Project: Windows Azure Application Package New Format.

Tips & Tricks

  • The attribute values passed to cspack.exe are case-sensitive; make sure all values match the exact same casing.
  • By default, ASP.NET Web/MVC applications are packaged to target the .NET Framework 2.0 runtime. If your web application targets the .NET Framework 4.0 runtime, don't forget to specify a rolePropertiesFile that includes the line: TargetFrameWorkVersion=v4.0.
  • The new package format does not support guest OS family 3.0, based on my tests. It might be supported in an upcoming guest OS family 3.0 update.

That’s it and let me know if you have any comments. Enjoy reading :-)

Generate .pfx certificate file from GoDaddy .crt and .p7b files

When you buy a commercial certificate from GoDaddy (or any other CA), you typically get two files, a .crt and a .p7b (the .p7b is used to build the certificate chain so that your certificate can be validated). However, in some cases you need a .pfx certificate file, for instance when you want to deploy a web application with HTTPS enabled in Windows Azure. So, how can we generate a .pfx file from the .crt and .p7b files? In this blog post, I will walk you through two ways to do that.

Option 1
The first option is using the Microsoft Management Console (MMC) together with IIS on Windows, and it can be done completely via the GUI. The official GoDaddy support site has the steps in great detail; please refer to Installing an SSL Certificate in Microsoft IIS 7.

Option 2
In case the first option does not work for you, the second option comes in handy. We use the CertUtil command-line tool, which ships with Windows. Here are the brief steps:

  1. Perform the same steps 1-19 of Option 1 to import the .crt certificate file into the local Windows certificate store (Local Computer\Personal).
  2. Run the following command line:
    certutil -privatekey -exportPFX "ThumbPrint of the certificate" "<name of pfx file>.pfx"

    You can get the thumbprint by checking the properties of the certificate imported in step 1. Please note that you need to remove all spaces from the thumbprint. You can either remove those spaces manually, or via a simple PowerShell command line like the one below:

    $Thumbprint = "bn fe 5f 0d 59 5c f0 f2 5f a9 fa e9 73 ec 16 e0 b8 75 2b d2"
    $Thumbprint = $Thumbprint.Replace(" ", "");
    $Thumbprint
    # Here is the result: bnfe5f0d595cf0f25fa9fae973ec16e0b8752bd2
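In my experience, a thumbprint copied from the certificate properties dialog can also carry invisible Unicode characters (such as a left-to-right mark) in addition to the spaces. A slightly more robust approach, sketched here with a helper function name of my own and a made-up sample thumbprint, is to keep only the hexadecimal characters:

```powershell
# Keep only hex digits: drops spaces and any invisible Unicode marks
# that sneak in when copying from the certificate properties dialog.
function Get-CleanThumbprint([string]$RawThumbprint)
{
    return ($RawThumbprint -replace '[^0-9a-fA-F]', '').ToLower()
}

# The sample thumbprint below is made up for illustration
Get-CleanThumbprint "bd fe 5f 0d 59 5c f0 f2 5f a9 fa e9 73 ec 16 e0 b8 75 2b d2"
# -> bdfe5f0d595cf0f25fa9fae973ec16e0b8752bd2
```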

Let me know if you have any feedback. Enjoy reading :-)

How to call a function in another script file using PowerShell?

If you have written many PowerShell script files and want to reuse them in your daily work, one simple way to do this is to turn them into functions with appropriate input parameters. By doing this, you no longer need to open an existing file, modify it, and then execute the file; instead, you call the function with the right input parameters to get the work done.

This sounds great. However, there is one more question: "How do I call a function in another script file?" That is the purpose of this blog post.

To make this work, we have two methods:

  1. Make a module file (.psm1)
  2. Dot source the script file (.ps1)

Demo:

1. Suppose we have a script file named MyScriptLib.ps1 located in E:\Workspace\PS. The script file looks like below:

Function Get-PerfMonCounter([String] $Counter)
{
	
	$CookedValue = (Get-Counter -Counter $Counter).CounterSamples | `
	Select -ExpandProperty Cookedvalue
	return ([math]::Round($CookedValue, 2))
}

2. To call the Get-PerfMonCounter function using the first method, make a copy of the script file and change the extension to .psm1, i.e. MyScriptLib.psm1 (a PowerShell module file), in the same directory as MyScriptLib.ps1. Then use the following code to call Get-PerfMonCounter:

PS C:\Windows\system32> Import-Module -Name "E:\Workspace\PS\MyScriptLib.psm1"
PS C:\Windows\system32> Get-PerfMonCounter -Counter "\Processor(_Total)\% Processor Time"
# Here is the execution result in the command line
2.9
PS C:\Windows\system32> 
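One side note on the module approach: by default, a script module exports all of the functions it defines. If you later add internal helper functions to MyScriptLib.psm1 that you do not want callers to see, you can restrict the exports inside the module file, for example:

```powershell
# At the bottom of MyScriptLib.psm1: only Get-PerfMonCounter is exported;
# any other functions in the file stay private to the module.
Export-ModuleMember -Function Get-PerfMonCounter
```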

3. To call the Get-PerfMonCounter function using the second method, we first need to dot-source the script file, which means the script file is run within the current execution context. The script looks like:

PS C:\Windows\system32> . E:\Workspace\PS\MyScriptLib.ps1 # Use . (dot) to dot-source a script file; don't forget the space after the dot
PS C:\Windows\system32> Get-PerfMonCounter -Counter "\Processor(_Total)\% Processor Time"
1.05
PS C:\Windows\system32> 
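If you want to try the dot-sourcing mechanics end to end without any performance counters involved, here is a tiny self-contained sketch; the file name and the Get-Square function are made up for illustration:

```powershell
# Write a throwaway script file, dot-source it, then call its function
$libPath = Join-Path ([System.IO.Path]::GetTempPath()) "DotSourceDemo.ps1"
'function Get-Square([int]$n) { return $n * $n }' | Set-Content -Path $libPath

. $libPath       # dot-sourcing runs the file in the current scope
Get-Square 7     # -> 49
```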

References:

Import-Module: http://technet.microsoft.com/en-us/library/dd819454.aspx
dot source: http://technet.microsoft.com/en-us/library/ee176949.aspx

Enjoy learning PowerShell :-)

SQL Server 2012 new feature: user-defined server role

This blog post will introduce a new feature in SQL Server 2012: user-defined server roles. I think this feature has been requested many times by DBAs, and it makes our security management easier. If you haven't installed SQL Server 2012 yet, you can download and install SQL Server 2012 RC0 from here. By the way, SQL Server 2012 is expected to RTM in the first half of 2012.

With user-defined server roles, we can easily manage scenarios including, but not limited to, the following:

  • You want to give users more than one fixed server role (but not sysadmin), or a specific server permission such as VIEW SERVER STATE. Prior to SQL Server 2012, we needed to grant each of those privileges to each user separately; and if more than one user required those privileges, we had to do the same job repeatedly. With a user-defined server role, we can create a custom server role, grant it the required privileges, and add the appropriate users to it as role members. By doing this, DBAs can easily add/remove privileges and users to or from this user-defined server role.
  • You want to give users more server-level privileges on Dev/QA servers, but fewer on the Production servers. With user-defined server roles, you only need to create a user-defined server role and grant it the additional server privileges on the Dev/QA servers; when moving (or converting Dev/QA) to Production servers, it is enough to remove all users from the role.

In SQL Server 2012, there are several new DDL statements to create/alter/drop user-defined server roles. To create a user-defined server role, use the CREATE SERVER ROLE T-SQL statement:

USE master;
GO
-- Create a user-defined server role, and grant the VIEW SERVER STATE privilege
CREATE SERVER ROLE [perf_monitoring] AUTHORIZATION [sa];
GO
GRANT VIEW SERVER STATE TO [perf_monitoring];
GO

After the user-defined server role is created, you can add server principals (other server roles, logins, etc.) to the role as role members using the ALTER SERVER ROLE T-SQL statement:

-- Add a role member; alex is a Windows login here
ALTER SERVER ROLE [perf_monitoring] ADD MEMBER [MSSQLGuy\alex];
GO

If you want to remove someone from the role, you can also use ALTER SERVER ROLE with the DROP MEMBER option. After all role members are removed, we can finally drop the role itself using the DROP SERVER ROLE T-SQL statement:

-- Remove a role member from a user-defined server role
ALTER SERVER ROLE [perf_monitoring] DROP MEMBER [MSSQLGuy\alex];
GO
-- If all members are removed from a user-defined server role,
-- we can drop the role as well
DROP SERVER ROLE [perf_monitoring];
GO
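To double-check what you have created, a query like the following (my own sketch against the catalog views; the is_fixed_role column is new in SQL Server 2012) lists user-defined server roles together with their members:

```sql
SELECT  r.name AS role_name,
        m.name AS member_name
FROM    sys.server_principals AS r
LEFT JOIN sys.server_role_members AS rm
        ON rm.role_principal_id = r.principal_id
LEFT JOIN sys.server_principals AS m
        ON m.principal_id = rm.member_principal_id
WHERE   r.type = 'R'            -- server roles
        AND r.is_fixed_role = 0 -- user-defined only
        AND r.name <> 'public';
```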

Enjoy learning SQL Server 2012 :-)

How to restore a database when you have a SQL Server backup file?

Oftentimes, when I get a database backup file, I perform a quick database restore without any hesitation. This usually works fine, but sometimes I get an annoying error message without any clue. Actually, SQL Server provides some commands that can do basic checks against the backup file before restoring. Here are the steps I usually follow; they help a lot, so I would like to share them with you all.

1. First of all, I check whether it is a valid backup by running the RESTORE VERIFYONLY command.

RESTORE VERIFYONLY
FROM DISK = N'D:\Workspace\Wordpress\2011-11-22\AdventureWorks2008R2_Full.bak';
GO

If the verification succeeds, it will show "The backup set on file 1 is valid.", which indicates the backup file is valid and we can perform the actual database restore later. If an error is generated, it may state that the file is not an actual SQL Server database backup or that it is corrupted, and we should not go any further.

2. Once the verify check passes and we know it is a valid backup file, I like to get the header of the backup file by running the RESTORE HEADERONLY T-SQL command. Don't take it for granted that a file with a .bak extension is a database backup and one with a .trn extension is a transaction-log backup. Actually, the file extension really does not matter; we can use any file extension for SQL Server backups, just as for any other file on Windows. Therefore, I highly recommend checking with RESTORE HEADERONLY:

RESTORE HEADERONLY
FROM DISK = N'D:\Workspace\Wordpress\2011-11-22\AdventureWorks2008R2_Full.bak';
GO

There are lots of useful backup/database properties in the result; here are some interesting columns:

  • BackupType: indicates what kind of backup is in the backup file. 1 = full database backup, 2 = transaction-log backup, 5 = differential backup.
  • Compressed: indicates whether the backup was compressed when it was taken. 1 = compressed, 0 = not compressed.
  • DatabaseVersion: indicates the SQL Server version where the original database existed when it was backed up. 661 = SQL Server 2008 R2, 655 = SQL Server 2008, 611/612 = SQL Server 2005, 539 = SQL Server 2000, 515 = SQL Server 7.0.
  • CompatibilityLevel: indicates the database compatibility level when the backup was performed. The compatibility level is not strictly tied to the database version; a SQL Server 2008 R2 database (version 661) can have 100, 90, or 80 as its compatibility level.
  • SoftwareVersionMajor.SoftwareVersionMinor.SoftwareVersionBuild: indicates the build number of the SQL Server instance where the original database existed. For example, 10.50.2500 is the build number of SQL Server 2008 R2 SP1. Please check a very detailed reference here.
  • IsCopyOnly: indicates whether the COPY_ONLY option was used during the backup. Using COPY_ONLY won't interrupt the existing backup/restore sequences. It is recommended to use this option if you want to get a test copy of the database.
  • RecoveryModel: indicates the recovery model of the original database when the backup was performed; it will be one of SIMPLE, BULK-LOGGED, or FULL.

3. If checks 1 and 2 above pass, I am confident that I can perform the actual database restore. The restore type depends on the backup type displayed in step 2. At this point, I like to get the list of files contained in the backup using the RESTORE FILELISTONLY command, and then perform the final restore.

RESTORE FILELISTONLY
FROM DISK = N'D:\Workspace\Wordpress\2011-11-22\AdventureWorks2008R2_Full.bak';
GO

In this test, I get the following:

(screenshot: RESTORE FILELISTONLY result)

So, I can see how many files are in the backup, and when I restore the database I can use an appropriate MOVE clause to move each logical file to an existing folder. Although the MOVE clause is optional for a full database restore (by default each file is restored to its original location, as displayed in the screenshot above), if that original location does not exist or the permissions are not properly configured, you may get errors. Therefore, I would recommend using the MOVE clause when doing this. For more information about restore locations, you may check another blog post, Where the database data and log files will be restored?.
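Putting it together, a restore that relocates the files explicitly looks roughly like the following. The logical names ('AdventureWorks2008R2_Data' and 'AdventureWorks2008R2_Log') and the target folder are assumptions on my part; substitute the values your own FILELISTONLY output reports:

```sql
RESTORE DATABASE AdventureWorks2008R2
FROM DISK = N'D:\Workspace\Wordpress\2011-11-22\AdventureWorks2008R2_Full.bak'
WITH MOVE 'AdventureWorks2008R2_Data' TO N'D:\Data\AdventureWorks2008R2.mdf',
     MOVE 'AdventureWorks2008R2_Log'  TO N'D:\Data\AdventureWorks2008R2_log.ldf',
     STATS = 10;
GO
```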

That’s it, hope it helps.

How to attach a database without log file (.ldf)?

Early this morning, I was asked how to attach a database without the log file (.ldf). This may happen if someone deletes the .ldf file incorrectly. For SQL Server 2005 or later, this can be easily done via either T-SQL or SQL Server Management Studio (SSMS).

T-SQL

USE master;
GO
-- Create a dummy database for testing
IF (DB_ID('TestDB')) IS NOT NULL
	DROP DATABASE TestDB;
GO
CREATE DATABASE TestDB ON PRIMARY
(
	NAME = 'TestDB',
	FILENAME = 'E:\Workspace\Data\TestDB.mdf'
)
LOG ON
(
	NAME = 'TestDB_Log',
	FILENAME = 'E:\Workspace\Data\TestDB_log.ldf'
);
GO

-- Create a dummy table and populate some test data
USE TestDB;
GO
SELECT * INTO dbo.dbs FROM master.sys.databases
-- Check the table data
SELECT * FROM dbo.dbs

-- Set the database offline and delete the ldf file manually from the OS
USE master;
ALTER DATABASE TestDB SET OFFLINE;
GO

-- In a real world, we may start from here (only mdf file left)
IF (DB_ID('TestDB')) IS NOT NULL
	DROP DATABASE TestDB;
GO
-- This is how it works - use CREATE DATABASE ... FOR ATTACH or ATTACH_REBUILD_LOG
-- For more information, check: http://msdn.microsoft.com/en-us/library/ms176061.aspx
CREATE DATABASE TestDB ON
(FILENAME = N'E:\Workspace\Data\TestDB.mdf')
FOR ATTACH
GO
-- OR
--CREATE DATABASE TestDB ON
--(FILENAME = N'E:\Workspace\Data\TestDB.mdf')
--FOR ATTACH_REBUILD_LOG
--GO

-- After attached, check the table data again
USE TestDB;
SELECT * FROM dbo.dbs;

-- Drop the dummy database
USE master;
DROP DATABASE TestDB;
GO

SSMS

(screenshot: attaching the database in the SSMS Attach Databases dialog)

Update: As a side note, if the database was not cleanly shut down and the ldf file is lost, the Database Engine may not be able to re-create the log file. In that case, there is an undocumented option that may work (ATTACH_FORCE_REBUILD_LOG). However, you may lose data after the database is created.
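For completeness, I assume the undocumented option follows the same CREATE DATABASE ... FOR ATTACH syntax as above; this is a sketch only, use it as a last resort and expect possible data loss:

```sql
-- Undocumented; may rebuild the log even after an unclean shutdown
CREATE DATABASE TestDB ON
(FILENAME = N'E:\Workspace\Data\TestDB.mdf')
FOR ATTACH_FORCE_REBUILD_LOG;
GO
```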

Hope it helps.

SQL Server database CHECKSUM page verify option

In SQL Server 2005, there is a new page verify option, CHECKSUM, which can detect page corruption that occurs between the time a database page is written to disk and the time it is subsequently read back by SQL Server. This option (CHECKSUM) is stricter than TORN_PAGE_DETECTION, and you can have only one of them enabled at a time. Therefore, it is recommended to use CHECKSUM as the page verify option. You may have some performance concerns about this option; however, Linchi Shea (Blog) blogged that there is almost no/minimal performance impact on SQL Server, based on his benchmarking tests (check here).

You can use the following script to find all the user databases that are not using the CHECKSUM page verify option, and then update them accordingly.

USE master;
GO

-- Check before update
SELECT	name,
		page_verify_option /* 0 = NONE, 1 = TORN_PAGE_DETECTION, 2 = CHECKSUM */
FROM	sys.databases
WHERE	database_id > 4 AND /* Exclude the system databases */
		name NOT IN ('ReportServer', 'ReportServerTempDB') AND /* Exclude the reporting services databases or any other*/
		page_verify_option <> 2;
		
		
-- Update the databases

SELECT	name,
		page_verify_option /* 0 = NONE, 1 = TORN_PAGE_DETECTION, 2 = CHECKSUM */
INTO	#temp
FROM	sys.databases
WHERE	database_id > 4 AND /* Exclude the system databases */
		name NOT IN ('ReportServer', 'ReportServerTempDB') AND /* Exclude the reporting services databases or any other*/
		page_verify_option <> 2;
		

DECLARE @Command NVARCHAR(MAX);
DECLARE @DatabaseName SYSNAME;

WHILE EXISTS(SELECT * FROM #temp WHERE page_verify_option <> 2)
BEGIN
	SELECT TOP(1) @DatabaseName = name FROM #temp WHERE page_verify_option <> 2;
	SET @Command = 'ALTER DATABASE ' + QUOTENAME(@DatabaseName) + ' SET PAGE_VERIFY CHECKSUM WITH NO_WAIT';
	
	EXECUTE(@Command);
	
	UPDATE #temp SET page_verify_option = 2 WHERE name = @DatabaseName;
END;

DROP TABLE #temp;
GO


-- Check after update
SELECT	name,
		page_verify_option /* 0 = NONE, 1 = TORN_PAGE_DETECTION, 2 = CHECKSUM */
FROM	sys.databases
WHERE	database_id > 4 AND /* Exclude the system databases */
		name NOT IN ('ReportServer', 'ReportServerTempDB') /* Exclude the reporting services databases or any other */ AND
		page_verify_option <> 2;
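For reference, the command that the loop above builds dynamically for each database boils down to a single statement like this (the database name is just an example):

```sql
ALTER DATABASE [AdventureWorks2008R2] SET PAGE_VERIFY CHECKSUM WITH NO_WAIT;
```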

Tips: Avoid single-use cached plans with ‘optimize for ad hoc workloads’ in SQL Server 2008 or later

In SQL Server 2008, a new server configuration option was introduced, called ‘optimize for ad hoc workloads’. This option can improve the efficiency of the plan cache for single-use ad hoc queries (also called dynamic SQL). When this option is on (set to 1; it is 0 by default), SQL Server stores only a small compiled plan stub in the plan cache instead of the full compiled plan. When the same query executes the next time, the database engine knows from the compiled plan stub that this ad hoc query has been compiled before; it then removes the stub from the plan cache and stores the full compiled plan this time, so that the plan can be reused in subsequent executions. For more information, refer to BOL: http://msdn.microsoft.com/en-us/library/cc645587.aspx.

Here is a script to check the ‘optimize for ad hoc workloads’ option, and update it to 1 if required.

-- Check the total server memory currently used by buffer pool only
SELECT cntr_value / 1024 AS [Total Server Memory (MB)]
FROM	sys.dm_os_performance_counters 
WHERE object_name = 'SQLServer:Memory Manager' 
AND counter_name = 'Total Server Memory (KB)'

-- Cached plan usage statistics (from Kimberly Tripp)
SELECT objtype AS [CacheType]
        , COUNT_BIG(*)AS [Total Plans]
        , SUM(CAST(size_in_bytes AS DECIMAL(18,2)))/1024/1024 AS [Total MBs]
        , AVG(usecounts) AS [AVG USE COUNT]
        , SUM(CAST((CASE WHEN usecounts = 1 THEN size_in_bytes ELSE 0 END) as decimal(18,2)))/1024/1024 AS [Total MBs - USE COUNT 1]
        , SUM(CASE WHEN usecounts = 1 THEN 1 ELSE 0 END) AS [Total Plans - USE COUNT 1]
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY [Total MBs - USE COUNT 1] DESC
GO 

-- Check the size of the total cached plans
SELECT SUM(CAST(size_in_bytes AS DECIMAL(18,2)))/1024/1024 AS [Total MBs]
FROM sys.dm_exec_cached_plans

-- Check the size of total cached plans that were used only once
SELECT SUM(CAST(size_in_bytes AS DECIMAL(18,2)))/1024/1024 AS [Total MBs of Single-use Plans]
FROM sys.dm_exec_cached_plans
WHERE usecounts = 1

-- You can clear the whole plan cache, since 'optimize for ad hoc workloads'
-- only affects queries compiled after the option is enabled
/*
DBCC FREEPROCCACHE;
GO
*/

-- Check the optimize for ad hoc workloads option, should be 1
SELECT	*
FROM sys.configurations
WHERE	name = 'optimize for ad hoc workloads';
GO

-- Update optimize for ad hoc workloads to 1 if it is 0
EXEC sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
EXEC sp_configure 'optimize for ad hoc workloads', 1;
GO
RECONFIGURE;
GO

/*
	References:

http://msdn.microsoft.com/en-us/library/cc645587.aspx


http://msdn.microsoft.com/en-us/library/ms187404.aspx


http://www.sqlskills.com/blogs/kimberly/post/procedure-cache-and-optimizing-for-adhoc-workloads.aspx


http://sqlblog.com/blogs/kalen_delaney/archive/2007/09/29/geek-city-clearing-a-single-plan-from-cache.aspx


http://sqlblog.com/blogs/kalen_delaney/archive/2007/11/04/did-you-know-sp2-does-not-limit-the-amount-of-plan-cache-you-can-have.aspx

*/

FAQs: Database backup and restore for a mirrored database

Recently, I was asked several questions regarding database backup and restore for a mirrored database; therefore, I thought I would write a blog post about them, which may help others who have similar doubts.

Q: Can I do database backups for a mirrored database?

A: Yes, database backups are allowed on the principal database. However, they are not allowed on the mirror server, since the mirror database is in the Restoring state.

Q: Can I take transaction-log or differential database backups on the mirror database after the mirroring session fails over?

A: Yes. Someone may think a full database backup can be performed on the new principal (original mirror) database, but not a transaction-log (t-log) or differential database backup. Actually, we can take t-log or differential backups as well, as long as one full database backup was already taken on either server and both databases have been synchronized since then.

Q: Can I restore the database with backup files which were taken on different servers?

A: Yes, but these backup files must form a valid recovery path. For more information, see: Working with Restore Sequences for SQL Server Databases.
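Before taking backups, it also helps to confirm which role each partner currently holds. A quick sketch against the sys.database_mirroring catalog view:

```sql
-- Show the current role and state of each mirrored database
SELECT  DB_NAME(database_id) AS database_name,
        mirroring_role_desc,
        mirroring_state_desc,
        mirroring_safety_level_desc
FROM    sys.database_mirroring
WHERE   mirroring_guid IS NOT NULL;  -- only mirrored databases
```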

Demo:

Make sure you have already set up a mirroring environment before following the demo steps below. If not, you can set one up by following a previous blog post: http://ifcsong.wordpress.com/2011/10/11/setup-database-mirroring-using-windows-authentication-t-sql/.

:SETVAR principal_server "PRINCIPAL"
:SETVAR mirror_server "MIRROR"
:SETVAR restore_test "TEST"

-- Step 1: Perform some changes on the principal server
:CONNECT $(principal_server)
USE DBMirroringTest;
GO
CREATE TABLE dbo.backuptest
(
	Id int not null identity(1,1) primary key,
	Operation nvarchar(100)
);
GO
INSERT INTO dbo.backuptest(Operation) VALUES ('1st row inserted on the principal server');
GO

-- Step 2: Perform the 1st full database backup on the principal server
:CONNECT $(principal_server)
USE master;
GO
BACKUP DATABASE DBMirroringTest
TO DISK = 'E:\Workspace\Backups\1stfullbackup.bak'
WITH INIT, FORMAT, STATS = 10;
GO

-- Step 3: Insert some more records in the table on the principal server
:CONNECT $(principal_server)
USE DBMirroringTest
INSERT INTO dbo.backuptest(Operation) VALUES ('2nd row inserted on the principal server after full backup');
GO

-- Step 4: Perform the 1st log backup on the principal server
:CONNECT $(principal_server)
USE master;
GO
BACKUP LOG DBMirroringTest
TO DISK = 'E:\Workspace\Backups\1stlogbackup.trn'
WITH INIT, FORMAT, STATS = 10;
GO

-- Step 5: Failover the database mirroring from principal server
:CONNECT $(principal_server)
USE master;
GO
ALTER DATABASE DBMirroringTest SET PARTNER FAILOVER;
GO

-- Step 6: Insert some more records in the table on the new principal (oringal mirror) server
:CONNECT $(mirror_server)
USE DBMirroringTest;
GO
INSERT INTO dbo.backuptest(Operation) VALUES ('3rd row inserted on the new principal (original mirror) server after 1st log backup');
GO

-- Step 7: Perform the 1st differential backup on the new principal server (original mirror)
:CONNECT $(mirror_server)
USE master;
GO
BACKUP DATABASE DBMirroringTest
TO DISK = 'E:\Workspace\Backups\1stdiffbackup.bak'
WITH DIFFERENTIAL, INIT, FORMAT, STATS = 10;
GO


-- Step 8: Perform the 2nd log backup on the new principal server (original mirror)
:CONNECT $(mirror_server)
USE master;
GO
BACKUP LOG DBMirroringTest
TO DISK = 'E:\Workspace\Backups\2stlogbackup.trn'
WITH INIT, FORMAT, STATS = 10;
GO


:CONNECT $(restore_test)
-- Test 1:
-- Restore the 1st full backup and 1st log backup from the principal server and 2nd log backup
-- from the new principal server (original mirror)
USE master;
GO
RESTORE DATABASE DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\1stfullbackup.bak'
WITH 
	MOVE 'DBMirroringTest'
	TO 'E:\Workspace\Databases\MONITOR\DBMirroringTest_data.mdf',
	MOVE 'DBMirroringTest_log'
	TO 'E:\Workspace\Databases\MONITOR\DBMirroringTest_log.ldf',
REPLACE, NORECOVERY;
GO
RESTORE LOG DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\1stlogbackup.trn'
WITH NORECOVERY, STATS = 10;
GO
RESTORE LOG DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\2stlogbackup.trn'
WITH NORECOVERY, STATS = 10;
GO
RESTORE DATABASE DBMirroringTest WITH RECOVERY;
GO

-- Check the result
USE DBMirroringTest;
GO 
SELECT * FROM dbo.backuptest;
GO

-- Drop the database
USE master;
GO
DROP DATABASE DBMirroringTest;
GO

-- Test 2:
-- Restore the 1st full backup, the 1st differential backup, and the 2nd log backup
-- from the new principal server (original mirror)
USE master;
GO
RESTORE DATABASE DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\1stfullbackup.bak'
WITH 
	MOVE 'DBMirroringTest'
	TO 'E:\Workspace\Databases\MONITOR\DBMirroringTest_data.mdf',
	MOVE 'DBMirroringTest_log'
	TO 'E:\Workspace\Databases\MONITOR\DBMirroringTest_log.ldf',
REPLACE, NORECOVERY;
GO
RESTORE DATABASE DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\1stdiffbackup.bak'
WITH NORECOVERY, STATS = 10;
GO
RESTORE LOG DBMirroringTest
FROM DISK = 'E:\Workspace\Backups\2stlogbackup.trn'
WITH NORECOVERY, STATS = 10;
GO
RESTORE DATABASE DBMirroringTest WITH RECOVERY;
GO

-- Check the result
USE DBMirroringTest;
GO 
SELECT * FROM dbo.backuptest;
GO

-- Drop the database
USE master;
GO
DROP DATABASE DBMirroringTest;
GO