SQL Server 2014 news – datafiles in Azure

When I first heard that you could create a datafile for your “local” database in Azure, I was a little confused and wondered why I would want to do that. It didn't sound like a good idea, since the latency would be too high and I might lose my internet connection.

But do I need the best performance for every table? Probably not; I might have an archive table that is not frequently used. What if I create a filegroup in Azure and put an archive table on that filegroup? How would that affect my performance? Well… I had to try.

I created a storage container in my Azure account, downloaded “Azure Storage Explorer” from Codeplex, and clicked the security button to create a shared access policy and a shared access signature key. I copied the key to the clipboard and created this script:

-- press CTRL + SHIFT + M to fill in the parameters
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password,sysname,>';
GO
CREATE CERTIFICATE <certificationname, sysname,SQLDemoCert> WITH SUBJECT = '<subject, sysname,>';
GO
BACKUP CERTIFICATE <certificationname, sysname,SQLDemoCert>
TO FILE = '<path,sysname,><certificationname, sysname,SQLDemoCert>.cer'
WITH PRIVATE KEY
(
    FILE = '<path,sysname,><certificationname, sysname,SQLDemoCert>.pvk',
    ENCRYPTION BY PASSWORD = '<password,sysname,>'
);
GO
 
 
--don't forget to create a shared access signature for your storage container in 'Azure Storage Explorer'
IF EXISTS(SELECT * FROM sys.credentials WHERE name ='https://<blobstorage, sysname,>.blob.core.windows.net/data/')
DROP CREDENTIAL [https://<blobstorage, sysname,>.blob.core.windows.net/data/];
 
CREATE CREDENTIAL [https://<blobstorage, sysname,>.blob.core.windows.net/data/]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<signature, sysname,>'; --the signature starts with sr=
 
 
 
CREATE DATABASE [<dbname,sysname,AzureDemo>];
 
USE [<dbname,sysname,AzureDemo>];
GO
ALTER DATABASE [<dbname,sysname,AzureDemo>] ADD FILEGROUP AzureStorage;
ALTER DATABASE [<dbname,sysname,AzureDemo>] ADD FILE ( NAME = N'<dbname,sysname,AzureDemo>_AzureDataFile', FILENAME = N'https://<blobstorage, sysname,>.blob.core.windows.net/data/<dbname,sysname,AzureDemo>_dataFile.ndf' , SIZE = 5120KB , FILEGROWTH = 1024KB ) TO FILEGROUP [AzureStorage]
GO
 
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 ENCRYPTION BY SERVER CERTIFICATE <certificationname, sysname,SQLDemoCert>;
GO
 
ALTER DATABASE [<dbname,sysname,AzureDemo>] SET ENCRYPTION ON;
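 
-- optional sanity check: encryption_state = 3 means the database is fully encrypted
SELECT DB_NAME(database_id) AS dbname, encryption_state
FROM sys.dm_database_encryption_keys;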
 
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES t WHERE t.TABLE_NAME = 'AzureTable' AND t.TABLE_SCHEMA = 'dbo')
DROP TABLE dbo.AzureTable;
 
CREATE TABLE dbo.AzureTable
(
ID INT NOT NULL IDENTITY(1,1) PRIMARY KEY CLUSTERED,
Namn VARCHAR(50) NOT NULL
) ON AzureStorage
 
SET STATISTICS TIME ON;

-- I had to repeat these steps a number of times to investigate the performance impact with more records
INSERT INTO dbo.AzureTable (Namn)
SELECT TOP 10000 c1.name FROM sys.columns c1
CROSS APPLY sys.columns c2
 
DBCC DROPCLEANBUFFERS -- clear the buffer pool so the next query has to read from the data file
 
SELECT DISTINCT Namn FROM dbo.AzureTable
WHERE ID = 9101 -- changed this value a number of times
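
To check that the new data file really lives in the blob container, and that the archive table is stored on the Azure filegroup, you can query the catalog views; this is just a quick sanity check, nothing more:

-- the Azure data file should show up with its https:// path as physical_name
SELECT name, physical_name, type_desc, size
FROM sys.database_files;

-- and the clustered index of the table should sit on the AzureStorage filegroup
SELECT t.name AS table_name, ds.name AS filegroup_name
FROM sys.tables t
JOIN sys.indexes i ON i.object_id = t.object_id AND i.index_id <= 1
JOIN sys.data_spaces ds ON ds.data_space_id = i.data_space_id
WHERE t.name = 'AzureTable';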

Before I ran the script I pressed CTRL + SHIFT + M to fill in the necessary parameters: certificate name, database name, path, blob storage name and shared access key.

I got a surprise when I executed the code: it was much faster than I expected, even though I was running on my laptop over a wireless connection. I increased the number of records multiple times to see if the performance would suffer with more records, but I found it to be acceptable.
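
If you do not want to re-run the insert batch by hand, SSMS can do the repetition for you: putting a number after GO re-executes the preceding batch that many times. A small sketch (the counts are just examples):

-- repeats the 10 000-row insert ten times, so roughly 100 000 rows in total
INSERT INTO dbo.AzureTable (Namn)
SELECT TOP 10000 c1.name FROM sys.columns c1
CROSS APPLY sys.columns c2
GO 10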

I also found the blob storage to be really cheap, especially since the storage is redundant. And if my server (in this case my laptop) should crash, I could attach the files to another server and be up and running in no time.
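
Attaching the database on another server should then only be a matter of recreating the credential for the storage container (the same CREATE CREDENTIAL statement as above) and pointing CREATE DATABASE ... FOR ATTACH at the existing files. A rough sketch, assuming all files were placed in the blob container (in my demo only the .ndf is, so the .mdf and .ldf names below are just placeholders), and remember that the TDE certificate has to be restored on the new server first:

-- on the new server: create the credential for the container, restore the certificate,
-- then attach the files straight from blob storage
CREATE DATABASE [AzureDemo]
ON (FILENAME = 'https://<blobstorage, sysname,>.blob.core.windows.net/data/AzureDemo.mdf'),
   (FILENAME = 'https://<blobstorage, sysname,>.blob.core.windows.net/data/AzureDemo_dataFile.ndf')
LOG ON (FILENAME = 'https://<blobstorage, sysname,>.blob.core.windows.net/data/AzureDemo_log.ldf')
FOR ATTACH;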