Azure – Database Migration Process


  • The migration process is split into several parts and requires the use of a staging SQL Server instance; this is currently FRAASAZUREU1.DE.DB.COM.
  • The first step is to take a backup of the database you want to migrate and restore it onto the staging server.
  • Next we run the DMA tool against it to see whether it is compatible for migration into an Azure SQL Managed Instance; if not, we look at what needs to be done to make it compatible.
  • (OPTIONAL) Before the database(s) are uploaded into the Azure cloud they all need to be encrypted using TDE, and below are the steps needed to do this:

If it has not already been done, complete Parts 1-3 of the following guide: https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/setup-steps-for-extensible-key-management-using-the-azure-key-vault?view=sql-server-2017. All of these steps should already be done, but if this is a new staging server we also need to download and install the SQL Server Connector for Azure Key Vault: https://www.microsoft.com/en-us/download/details.aspx?id=45344

Part 4 of the above guide may also have been done already; you can check by looking for the credential sysadmin_ekm_cred. If it is not present, perform the following steps:

    1. Configure SQL Server to use EKM. Execute the following Transact-SQL script to configure the Database Engine to use an EKM provider:
      -- Enable advanced options
      USE master;
      GO
      sp_configure 'show advanced options', 1;
      GO
      RECONFIGURE;
      GO
      -- Enable EKM provider
      sp_configure 'EKM provider enabled', 1;
      GO
      RECONFIGURE;
      GO
    2. Register (create) the SQL Server Connector as an EKM provider with SQL Server. Create a cryptographic provider using the SQL Server Connector, which is an EKM provider for the Azure Key Vault.
      This example uses the name AzureKeyVault_EKM_Prov.
      CREATE CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM_Prov
      FROM FILE = 'C:\Program Files\SQL Server Connector for Microsoft Azure Key Vault\Microsoft.AzureKeyVaultService.EKM.dll';
      GO
    3. Set up a SQL Server credential for a SQL Server login to use the key vault. A credential must be added to each login that will be performing encryption using a key from the Key Vault. This might include:
      • A SQL Server administrator login who will use key vault in order to setup and manage SQL Server encryption scenarios.
      • Other SQL Server logins who might enable Transparent Data Encryption (TDE), or other SQL Server encryption features.
      There is a one-to-one mapping between credentials and logins; that is, each login must have a unique credential. Modify the Transact-SQL script below in the following ways:
      • Edit the IDENTITY argument (ContosoDevKeyVault) to point to your Azure Key Vault.
        • If you’re using global Azure, replace the IDENTITY argument with the name of your Azure Key Vault from Part II.
        • If you're using a private Azure cloud (e.g. Azure Government, Azure China, or Azure Germany), replace the IDENTITY argument with the Vault URI that is returned in Part II, step 3. Do not include "https://" in the Vault URI.
      • Replace the first part of the SECRET argument with the Azure Active Directory Client ID from Part I. In this example, the Client ID is EF5C8E094D2A4A769998D93440D8115D. Important: you must remove the hyphens from the Client ID.
      • Complete the second part of the SECRET argument with the Client Secret from Part I. In this example the Client Secret from Part I is Replace-With-AAD-Client-Secret. The final string for the SECRET argument will be a long sequence of letters and numbers, with no hyphens.

      USE master;
      CREATE CREDENTIAL sysadmin_ekm_cred
      WITH IDENTITY = 'dev-kv-dbaas-db-0001',
      SECRET = 'EF5C8E094D2A4A769998D93440D8115DReplace-With-AAD-Client-Secret'
      FOR CRYPTOGRAPHIC PROVIDER AzureKeyVault_EKM_Prov;
      -- Add the credential to the SQL Server administrator's domain login
      ALTER LOGIN [dbg\olaftor-a] ADD CREDENTIAL sysadmin_ekm_cred;
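The SECRET concatenation above (the AAD Client ID with its hyphens stripped, immediately followed by the Client Secret) is easy to get wrong by hand. As a sanity check, it can be built with a few lines of Python; the IDs below are the placeholder values from the example, not real secrets:

```python
def build_ekm_secret(client_id: str, client_secret: str) -> str:
    """Build the SECRET value for CREATE CREDENTIAL: the AAD Client ID
    with all hyphens removed, immediately followed by the Client Secret."""
    return client_id.replace("-", "") + client_secret

# Placeholder values matching the example in the guide
client_id = "EF5C8E09-4D2A-4A76-9998-D93440D8115D"
client_secret = "Replace-With-AAD-Client-Secret"

secret = build_ekm_secret(client_id, client_secret)
print(secret)  # EF5C8E094D2A4A769998D93440D8115DReplace-With-AAD-Client-Secret
```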
    4. Open your Azure Key Vault key in SQL Server. If you imported an asymmetric key as described in Part II, open the key by providing your key name in the following Transact-SQL script.
      • Replace CONTOSO_KEY with the name you’d like the key to have in SQL Server.
      • Replace ContosoRSAKey0 with the name of your key in Azure Key Vault.

      CREATE ASYMMETRIC KEY CONTOSO_KEY
      FROM PROVIDER [AzureKeyVault_EKM_Prov]
      WITH PROVIDER_KEY_NAME = 'ContosoRSAKey0',
      CREATION_DISPOSITION = OPEN_EXISTING;
    5. Create a SQL Server login for the Database Engine for TDE. Create a SQL Server login and add the credential created above to it.
      USE master;
      -- Create a SQL Server login associated with the asymmetric key
      -- for the Database Engine to use when it loads a database
      -- encrypted by TDE.
      CREATE LOGIN TDE_Login FROM ASYMMETRIC KEY CONTOSO_KEY;
      GO
      -- Alter the TDE login to add the credential for use by the
      -- Database Engine to access the key vault
      ALTER LOGIN TDE_Login ADD CREDENTIAL Azure_EKM_TDE_cred;
      GO

NOTE: There is another guide, https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/enable-tde-on-sql-server-using-ekm?view=sql-server-2017, which might also be useful if the above doesn't work.

Create the Database Encryption Key (DEK)

The DEK will encrypt your data and log files in the database instance, and in turn be encrypted by the Azure Key Vault asymmetric key. The DEK can be created using any SQL Server supported algorithm or key length.

  1. USE ContosoDatabase;
     GO
     CREATE DATABASE ENCRYPTION KEY
     WITH ALGORITHM = AES_256
     ENCRYPTION BY SERVER ASYMMETRIC KEY CONTOSO_KEY;
     GO
  2. Turn on TDE:
     -- Alter the database to enable transparent data encryption.
     ALTER DATABASE ContosoDatabase SET ENCRYPTION ON;
     GO
  • Once the databases are encrypted they can be backed up, uploaded and restored; the same key in Azure should then be able to decrypt them.

Alternate method using a master key and certificate (note: this is not supported by the bank):

The first thing to do is to create a master key and certificate if they don't already exist; this can be tested with:

select * from sys.symmetric_keys where name like '%DatabaseMasterKey%';

select * from sys.certificates where pvt_key_encryption_type = 'MK';

If either of these returns nothing, proceed with creating the missing master key and/or certificate. If they are already there then just use the existing ones.

Create Master Key

We must first create the master key. It must be created in the master database, so as a precautionary measure I like to begin this statement with the USE MASTER command.

 
USE Master;
GO
CREATE MASTER KEY ENCRYPTION
BY PASSWORD='InsertStrongPasswordHere';
GO
 

Create Certificate protected by master key

Once the master key is created along with the strong password (that you should remember or save in a secure location), we will go ahead and create the actual certificate.

 
CREATE CERTIFICATE TDE_Cert
WITH
SUBJECT='Database_Encryption';
GO
 

The certificate’s name is “TDE_Cert” and I gave it a generic subject. Some Database Administrators like to put the name of the actual database that they are going to encrypt in there. It is totally up to you.

Create Database Encryption Key

Now, we must utilize our USE command to switch to the database that we wish to encrypt. Then we create a connection or association between the certificate that we just created and the actual database. Then we indicate the type of encryption algorithm we are going to use. In this case it will be AES_256 encryption.

 
USE <DB>
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
GO
 

Enable Encryption

Finally, we can enable encryption on our database by using the ALTER DATABASE command.

 
ALTER DATABASE <DB>
SET ENCRYPTION ON;
GO
 

Once the encryption is turned on, depending on the size of the database, it may take some time to complete. You can monitor the status by querying the sys.dm_database_encryption_keys DMV.

Backup Certificate

It's important to back up the certificate you created and store it in a secure location. If the server ever goes down and you need to restore it elsewhere, you will have to import the certificate to the new server. In certain environments the DR servers are already stood up on warm/hot standby, so it's a good idea to preemptively import the saved certificate to these servers.

 
BACKUP CERTIFICATE TDE_Cert
TO FILE = 'C:\temp\TDE_Cert'
WITH PRIVATE KEY (FILE='C:\temp\TDE_CertKey.pvk',
ENCRYPTION BY PASSWORD='InsertStrongPasswordHere')
 

Unless the certificate is already in place, we now need to upload it to the Azure SQL Managed Instance with the following steps:

First we convert the previous certificate backup to a Personal Information Exchange (.pfx) file using the Pvk2Pfx tool in PowerShell:

.\pvk2pfx -pvk c:/full_path/TDE_CertKey.pvk -pi "<SomeStrongPassword>" -spc c:/full_path/TDE_Cert -pfx c:/full_path/TDE_Cert.pfx

If pvk2pfx is not present on the staging server then it needs to be installed along with the relevant SDK; see https://docs.microsoft.com/de-de/azure/sql-database/sql-database-managed-instance-migrate-tde-certificate
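The Pvk2Pfx flags are easy to mistype, so one option is to assemble the command programmatically. A minimal sketch, where the paths and password are the placeholders from the command above (the helper itself is hypothetical, not part of the SDK):

```python
def pvk2pfx_args(pvk_path: str, password: str, spc_path: str, pfx_path: str) -> list:
    """Assemble the Pvk2Pfx argument list used to convert a certificate
    backup (.pvk plus certificate file) into a .pfx for upload to Azure."""
    return [
        ".\\pvk2pfx",
        "-pvk", pvk_path,   # private key file from BACKUP CERTIFICATE
        "-pi", password,    # password that protected the private key
        "-spc", spc_path,   # the certificate file itself
        "-pfx", pfx_path,   # output .pfx file
    ]

args = pvk2pfx_args("c:/full_path/TDE_CertKey.pvk", "<SomeStrongPassword>",
                    "c:/full_path/TDE_Cert", "c:/full_path/TDE_Cert.pfx")
print(" ".join(args))
```

The list form can be passed straight to subprocess.run on the staging server instead of joining it into a string.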

Now take the new pfx file and upload it to a blob container.

Azure – High Level Migration Steps

  • First you need to back up the database and copy it to a place you can upload to Azure from, e.g. fraazurreu1.de.db.com
  • You then need to log into the Azure portal and go to the storage blob container, e.g devsadbaasdb0002 – sqlbackups, and select upload.
  • You now need to log into the Azure Managed Instance and make sure that you have credentials set up in order to access the blob container above. CREATE CREDENTIAL [https://devsadbaasdb0002.blob.core.windows.net/sqlbackups]
    WITH IDENTITY='SHARED ACCESS SIGNATURE',
    SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-02-07T22:35:35Z&st=2019-02-07T14:35:35Z&spr=https&sig=9lEWSK%2FxFIkMQOjr2qQ38aHOD9LCtuQDqLUN2PX3kOg%3D'
    GO
  • Next restore the database from URL:
    restore database HDB_v5 from url='https://devsadbaasdb0002.blob.core.windows.net/sqlbackups/HDB_v5.bak'
    go
  • Final part would be to sync up any logins.
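A frequent cause of a failed RESTORE ... FROM URL is an expired or under-privileged SAS token in the credential's SECRET. The token's fields can be inspected with the standard library before use; the token below is a non-functional placeholder in the same shape as the secret above:

```python
from urllib.parse import parse_qs

def sas_fields(sas_token: str) -> dict:
    """Split a Shared Access Signature query string into its fields,
    e.g. 'se' (expiry time), 'st' (start time), 'sp' (permissions)."""
    return {k: v[0] for k, v in parse_qs(sas_token).items()}

# Placeholder token in the same shape as the CREATE CREDENTIAL secret
token = "sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-02-07T22:35:35Z&st=2019-02-07T14:35:35Z&spr=https"
fields = sas_fields(token)
print(fields["se"])  # expiry: 2019-02-07T22:35:35Z
print(fields["sp"])  # permissions: rwdlac
```

If "se" is in the past, or "sp" lacks read/write permissions, regenerate the SAS token in the portal before creating the credential.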

Creating the Oracle Linked Server

  • Install the Oracle client software by running D:\Install-Staging\Oracle-Client-sw-12102\000\DB_106896.msi
  • Copy the tnsnames.ora from D:\Install-Staging\Oracle-Client-sw-12102 to C:\APPS\ORACLE\PRODUCT\12.1.0\CLIENT_64\network\admin
  • Restart the SQL Server
  • Under Server Objects > Linked Servers, expand Providers, select Properties for OraOLEDB.Oracle and tick "Allow inprocess".
  • In SQL Server run the following:
    sp_addlinkedserver @server = 'DBIBU', @srvproduct = 'Oracle', @provider = 'OraOLEDB.Oracle', @datasrc = 'DBIBU'
    go
    EXEC sp_addlinkedsrvlogin 'DBIBU', 'FALSE', NULL, 'DBIB_094_USER', 'fgrekjt#186'
    go
  • That should be it; you can verify by selecting Test Connection, or by expanding the catalogs, viewing the tables and doing a select from them.

Azure – Migration Main steps

/*

Please perform the following tasks:

Assessment

  1. Restore database(s) to the staging server from backup file
  2. Delete the backup file(s) from the staging server
  3. Perform compatibility assessment with the DMA Tool (Target = "Azure SQL Database Managed Instance")
  4. Store output of assessment (csv) in the respective folder(s) for each source database server
  5. Export list of users from the database(s).

Restore to Azure

  1. Create encrypted backup(s) of database(s) on staging server with SQL Server certificate
  2. Upload encrypted backup(s) to Azure Storage Account
  3. Restore database(s) to target Azure SQL Managed Instance successfully
  4. Drop database(s) from staging server
  5. Delete database backup file from Storage Account
  6. Confirm everything is ok

*/

-- To get the filenames for the restore below
RESTORE FILELISTONLY from disk='D:\Dumps\DB1.bak'
go
RESTORE HEADERONLY from disk='D:\Dumps\DB1.bak'
go

restore database [DB1] from disk='D:\Dumps\DB1.bak'
with
move 'DB1' to 'd:\MSSQL\DATA\DB1.mdf',
move 'DB1_Log' to 'e:\MSSQL\DATA\DB1.ldf',
stats=1
go

-- Encrypt the database
USE [DB1]
GO
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
GO
ALTER DATABASE [DB1] SET ENCRYPTION ON;
GO

-- Now get the list of users and roles in the database
-- select name, islogin, issqlrole, isapprole, isntname, isntgroup, isntuser from sysusers
-- Also run the DMA assessment and save the csv output file

-- Verify encryption
USE master
go
SELECT
database_name = d.name,
dek.encryptor_type,
cert_name = c.name,
dek.percent_complete,
dek.encryption_state -- 1 = not encrypted, 2 = encryption in progress, 3 = fully encrypted
from sys.dm_database_encryption_keys dek left join sys.certificates c
on dek.encryptor_thumbprint = c.thumbprint
inner join sys.databases d
on dek.database_id = d.database_id
go

-- Once the database is fully encrypted, back up the database ready for upload
backup database [DB1] to disk='D:\Dumps\DB1_mig_enc_full.bak'
with compression,
stats=1

-- Upload the backup file using Azure Storage Explorer: https://Storage1.blob.core.windows.net/migration-backups

-- In the Azure Managed Instance, restore the database from blob storage
restore database [DB1] from url='https://Storage1.blob.core.windows.net/migration-backups/DB1_mig_enc_full.bak'
go

-- Final step is to restore any users, jobs, etc.

Azure – AAD Logins and Connection Strings

AAD Logins

Azure does not have the concept of Windows logins, but you can choose between having a SQL login and an Azure AAD login. The Azure AAD accounts map to the Windows AD, but instead of using usernames like "dba\User1" they use email addresses like "User1@dba.com".

To create an Azure AAD login issue the following:

CREATE LOGIN [User1@dba.com] FROM EXTERNAL PROVIDER
GO

That email address must belong to a Windows group synced with AAD.

To then log in with your regular Windows account you need to choose "Active Directory - Integrated Authentication" as the authentication method, or you can use "Active Directory - Password Authentication", where you need to specify the email address and corresponding Windows password.

One thing to note is that logins and user accounts will both need to be recreated in the Azure Managed Instance. For example, if you have a user account dba\User1 in a database, it will not automatically map to a new login User1@dba.com; it will need to be manually recreated. You can, however, log in to the Managed Instance from SSMS run as dba\User1 (which does map to the Azure AAD login User1@dba.com).
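The rename from the Windows form to the AAD form is mechanical, so when recreating many accounts it can be scripted. A hypothetical helper (the dba.com UPN suffix is an assumption taken from the example above; substitute your tenant's real suffix):

```python
def windows_login_to_upn(login: str, domain_suffix: str = "dba.com") -> str:
    """Map a Windows login such as 'dba\\User1' to the AAD UPN form
    'User1@dba.com' used by CREATE LOGIN ... FROM EXTERNAL PROVIDER.
    The default suffix is illustrative, not authoritative."""
    user = login.split("\\", 1)[1] if "\\" in login else login
    return f"{user}@{domain_suffix}"

upn = windows_login_to_upn("dba\\User1")
print(f"CREATE USER [{upn}] FOR LOGIN [{upn}]")
# CREATE USER [User1@dba.com] FOR LOGIN [User1@dba.com]
```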

To add a user account and role:

USE [RSCRES]
GO
CREATE USER [User1@dba.com] FOR LOGIN [User1@dba.com]
GO
ALTER ROLE [db_owner] ADD MEMBER [User1@dba.com]
GO

Azure Connection Strings

SQL Login:
Server=tcp:dev-we-dbaavgbhs-0004.a73ex6770af9b0f.database.windows.net,1433;Persist Security Info=False;User ID={your_username};Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;

Azure AD (Email)
Server=tcp:dev-we-dbaavgbhs-0004.a73ex6770af9b0f.database.windows.net,1433;Initial Catalog=BCSDatapublic;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Authentication=Active Directory Integrated;
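When debugging connection issues it helps to see the individual keywords in these strings. A minimal parser, assuming simple "Key=Value;" pairs with no quoted semicolons (true of both examples above):

```python
def parse_conn_string(conn: str) -> dict:
    """Split an ADO.NET-style connection string into keyword/value pairs.
    Assumes no quoted values containing ';'."""
    pairs = (p for p in conn.split(";") if p.strip())
    return {k.strip(): v.strip() for k, v in (p.split("=", 1) for p in pairs)}

conn = ("Server=tcp:dev-we-dbaavgbhs-0004.a73ex6770af9b0f.database.windows.net,1433;"
        "Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;")
cfg = parse_conn_string(conn)
print(cfg["Server"])   # the tcp endpoint including the port
print(cfg["Encrypt"])  # True
```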

-- SQL to extract AAD logins in an Azure database
use [shredder]
go
select
'CREATE LOGIN [' + name + '] FROM EXTERNAL PROVIDER WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[us_english]' as val
from sysusers
where isntuser=0
and issqluser=0
and issqlrole=0

Azure Keyvaults

You can extract the dbbadmin passwords from the key vaults after logging into the Azure portal via CyberArk. Just go to the relevant key vault, click on Secrets, and follow the link to the secret value.

Azure – What is DBaaS

Want to know more about the technical background?

Azure Basics

DBaaS builds on the concept of PaaS (Platform-as-a-Service) and allows the customer to operate databases without having to manage the underlying infrastructure. The CSP (Cloud Service Provider – Microsoft) of a PaaS offers the possibility to use and develop an existing run-time environment or to configure a productive environment.

For DBaaS you are using solely PaaS services offered by Microsoft via the Azure platform. You can find detailed information about Azure here: https://docs.microsoft.com/en-us/azure/

All services you use are billed against so-called subscriptions. They can be thought of as credit cards, but they also include additional capabilities. Subscriptions can also be used to separate environments, activities and access. We use these capabilities in DBaaS to separate the services for our customers from our engineering capabilities.

Subscriptions also act as Azure's logical resource containers, and every resource is allocated to a subscription. Microsoft also allows you to manage multiple items within each subscription via resource groups. We use them to separate different purposes; e.g. the automation and self-service pieces are in a different group than the part which stores the key data for encrypting your databases.

Below you can see an overview of all the environments we have. For you as a customer, only the "Delivery" environments are important. This is where your data is hosted and where we ensure operational capabilities for the platform.

Azure SQL Managed Instances

Azure SQL Managed Instance is a PaaS product offered by Microsoft. The bank required a highly secure managed database service which complies with all regulatory controls. If you are reading through Microsoft's documentation, make sure to understand the difference between a Single Database, a Managed Instance and Elastic Pools, which are all offered as Azure SQL Database and differ significantly.

The Azure SQL Managed Instance offers the capability to host a managed SQL Server Instance with no access to underlying Infrastructure and Operating System. Operation processes like patch management are handled by Microsoft. Some regulatory processes are also outsourced, e.g. Microsoft handles Disaster Recovery tests.

The DBaaS team came up with a standard design for Azure SQL Managed Instance, which includes the following features:

  • Full data-at-rest and data-in-motion encryption
  • Health monitoring with automatic incident ticket creation if problems occur with your database
  • Security configuration monitoring, which actively monitors the security configuration each day, without exceptions
  • Integration of DB AD identities in SQL databases
  • Brokered access capabilities for privileged access
  • Access monitoring to prevent unintended access
  • Automatic backups for a minimum of 7 days – backups can be restored from any specific point-in-time
  • Regular clean-up activities to keep the SQL instance running smoothly

What is DBaaS?

DBaaS builds on the concept of Platform-as-a-Service and allows app teams to consume databases without having to manage the underlying infrastructure. The Cloud Service Provider (CSP) takes over a large part of the operational effort and is paid for it on a pay-per-use basis.

DBaaS is based on the native Azure SQL Managed Instance Service. With the DBaaS service, new SQL Server databases can now be provisioned within a few minutes and scaled-up/down dynamically based on capacity/performance requirements.

Our Service for you

With DBaaS databases, you don’t need to worry about infrastructure and database management tasks such as server provisioning, patching, setup, configuration, backups, or recovery. You can be compliant with policies on user access, privileged access, configuration management etc.

The DBaaS cost model is based on application-level consumption, which is used to drive dbAptio.

Why do we use a cloud service and not implement a solution on premise?

By using the Azure cloud, we see cost and stability advantages. The CSP delivers a standard service globally, enabling additional economies of scale to kick in.

The effort and knowledge that Microsoft can invest in its platform engineering is many times greater than the bank is able to invest. Combined with Microsoft owning all components in the stack, from hardware, hypervisor and operating system to database software, this means they are also best placed to solve any bugs/issues that may arise in the end-to-end stack.

In addition, with the PaaS model the CSP ensures all software in the stack is patched and maintained to the latest versions, and all hardware is refreshed regularly, ensuring an evergreen process that takes significant operational workload and cost away from the company. This also benefits application teams from an audit perspective.

Will I get a SQL Server instance or a SQL Server database?

Application teams do not have access to the SQL Server instance itself but rather a database on the instance as the server needs to be managed and maintained to strict regulatory and security requirements.

Furthermore, many SQL Servers within a typical database estate are not efficiently utilized: CPU is underutilized and the available storage is often over-sized. By consolidating and actively managing databases on Azure instances, significantly better utilization and a smaller license footprint can be achieved, resulting in significant cost savings for the bank. This requires careful management at the SQL instance level.

Why should you use DBaaS?

See the Confluence page "Why should you use DBaaS?" for more details.

What is the cost of a database?

DBaaS is composed of several Azure services, the main one being Azure SQL Managed Instance. Additional services from Microsoft are used to ensure the security and functionality of the environment such as Azure Storage Accounts to store logs and manual backups, Azure Key Vault to securely store keys for database encryption etc.

Each of these services has a defined price and unit which is metered and billed on a monthly basis. These costs have been aggregated and structured as part of DBaaS product pricing into a simple flat cost per Database and a variable storage cost per GB of data consumed that is published in the product store with billing handled via an interface to dbApptio.

Does my database performance not suffer from the move into cloud?

When migrating databases to Azure instances, great care must be taken to ensure that the instances are not overloaded. To avoid these overloads, DBaaS implements both preventive and reactive measures.

Reactive measures include extending the number of cores of an instance or increasing the number of instances. Both of these actions can be quickly executed in the cloud, but with cost implications for the company. In contrast, preventive measures, while taking longer to implement, result in better cost outcomes for the company. DBaaS will be introducing application-specific preventive recommendations that enable further performance improvements and/or cost reductions.

Multiple nodes of both compute and storage are maintained behind the scenes by Microsoft for each managed instance to ensure high availability. The respective SLAs will be governed and checked by the DBaaS team.

Are the databases safe?

DBaaS places great emphasis on security. The MSSQL service is based on isolated Azure SQL Managed Instances which are dedicated to the customer. All stored data is encrypted using Microsoft TDE (data-at-rest encryption). This ensures that all SQL Server log, data and backup files are stored securely. The encryption keys are exclusively managed and controlled by the customer. All connections to SQL Managed Instances and the respective enabling services are encrypted and secured using TLS (data-in-motion encryption).

In addition, DBaaS ensures only persons on-boarded using the company's Active Directory (AD) identity are able to access SQL Managed Instances. The passwords of users stay within the company and are not transferred to the cloud. Furthermore, access to databases is not possible from the Internet; they are only accessible from the company's network.

Microsoft is required to keep the SQL Server instance, operating system, hypervisor, VM layers, and all other abstracted layers up to date, so that the latest security patches are always applied. Microsoft has a monthly patch cycle, or even more frequent for security patches.

When should DBaaS not be used?


High storage needs

  • If your application is using more than 8 TB storage you should continue using on premise solutions.

Regulatory limitations

  • You have not yet obtained regulatory approval or provided regulatory notification (if required) before implementing a cloud strategy. DBaaS Governance, BISO, Compliance, and the Data Protection team can assist you in determining the Cloud Regulatory requirements for your application.
  • Your data has been identified by the Compliance and Risk Function as not eligible for migration into the cloud.

Data classification

  • When your database is handling strictly confidential data, you are not allowed to migrate to the cloud at this point in time. Additional features are being explored currently which would allow strictly confidential data to be moved to the cloud.

Application IS critical

  • When your database is classified as IS critical, you need to wait until the on-boarding of DBaaS to CSO dbDAM (database activity monitoring) has been completed. We envisage this being resolved by the end of 2020 and are working to speed this up.

Operational boundaries

  • MS SQL Server
    • No linked servers other than MS SQL or ORACLE
    • No services from the underlying OS, like MSDTC, etc.
    • No Database Mail using more than 1 profile. No attachments using the @file_attachments parameter.
    • No access to fileshares or Windows folders of the underlying OS
    • No credentials other than SHARED ACCESS SIGNATURE (to access Azure BLOB storage) or Azure Key Vault
    • No SQL users. No Windows Accounts. Only Azure AD Accounts or groups.
    • No Database export/import other than BACPAC or bulk import from Azure BLOB storage
    • Table types not supported:
      • FILESTREAM
      • FILETABLE
      • EXTERNAL TABLE
      • MEMORY_OPTIMIZED (supported in Business Critical only)
    • No distributed transactions
    • No external libraries (R, Python)
    • No SSIS (this is Azure Data Factory at a later stage)
    • No SSAS

Azure – Backup and Auditing to Blob Storage

-- Configure backup BLOB storage
-- First create the blob container on a storage account (best to use a new storage account for this)
-- https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-auditing
/*
CREATE CREDENTIAL []
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = ''
GO
*/

CREATE CREDENTIAL [https://devsadbcvfdaasdb0003.blob.core.windows.net/auditing]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:26:07Z&st=2019-02-21T10:26:07Z&spr=https&sig=Uyqk2%2FnyxeI6RdsfsdfswibblezdIDCWGk8vnsQvPTlelsROxJG5lb90%3D'
GO

CREATE CREDENTIAL [https://devsadbcvfdaasdb0003.blob.core.windows.net/sqlbackups]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:26:07Z&st=2019-02-21T10:26:07Z&spr=https&sig=Uyqk2%2FnyxeI6RzdfsdfswibbledfdIDCWGk8vnsQvPTlelsROxJG5lb90%3D'
GO

backup database [model] to url='https://devsawibble0003.blob.core.windows.net/sqlbackups/model2.bak'
with copy_only
go

restore database [model_copy] from url='https://devsawibble0003.blob.core.windows.net/sqlbackups/model2.bak'
go

-- Configure auditing to BLOB storage
/*
CREATE CREDENTIAL [https://devsawibble0003.blob.core.windows.net/auditing]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:10:56Z&st=2019-02-21T10:10:56Z&spr=https&sig=yYeFX4Bwqs1kO4xfdsfdswibblefrx900Xl7B7t0E1MtkFkU3jlDCkvM%3D'
GO

CREATE SERVER AUDIT []
TO URL ( PATH ='' [, RETENTION_DAYS = integer ])
GO
*/
use master
go
CREATE SERVER AUDIT [SecAudit]
TO URL ( PATH =’https://devsawibble0003.blob.core.windows.net/auditing’ )
GO

CREATE SERVER AUDIT SPECIFICATION [SecAudit_spec]
FOR SERVER AUDIT [SecAudit]
ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP),
ADD (AUDIT_CHANGE_GROUP),
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PRINCIPAL_CHANGE_GROUP),
ADD (APPLICATION_ROLE_CHANGE_PASSWORD_GROUP),
ADD (LOGIN_CHANGE_PASSWORD_GROUP),
ADD (TRACE_CHANGE_GROUP)
WITH (STATE = OFF)
GO

-- You can also create auditing to Log Analytics by following the steps in https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-auditing to turn diagnostics on and select
-- SQLSecurityAuditEvents with the destination being Log Analytics.

CREATE SERVER AUDIT SQL_Audit_Log_Analytics TO EXTERNAL_MONITOR;
GO

CREATE SERVER AUDIT SPECIFICATION [SQL_Audit_Log_Analytics_spec]
FOR SERVER AUDIT [SQL_Audit_Log_Analytics]
ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP),
ADD (AUDIT_CHANGE_GROUP),
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PRINCIPAL_CHANGE_GROUP),
ADD (APPLICATION_ROLE_CHANGE_PASSWORD_GROUP),
ADD (LOGIN_CHANGE_PASSWORD_GROUP),
ADD (TRACE_CHANGE_GROUP)
WITH (STATE = OFF)
GO

-- If audit logs are written to Log Analytics, they are available in the Log Analytics workspace, where you can run advanced searches on the audit data.
-- As a starting point, navigate to Log Analytics and, under the General section, click Logs and enter a simple query, such as: search "SQLSecurityAuditEvents" to view the audit logs.

-- Log Analytics gives you real-time operational insights using integrated search and custom dashboards to
-- readily analyze millions of records across all your workloads and servers. For additional useful information
-- about the Log Analytics search language and commands, see the Log Analytics search reference.