Memory consumption SQL Server

Script for identifying which databases and objects are consuming memory on a SQL Server:

--find out how big the buffer pool is and determine the percentage used by each database

DECLARE @total_buffer INT;

SELECT @total_buffer = cntr_value
FROM sys.dm_os_performance_counters
WHERE RTRIM([object_name]) LIKE '%Buffer Manager'
AND counter_name = 'Total Pages';

;WITH src AS
(
    SELECT database_id, db_buffer_pages = COUNT_BIG(*)
    FROM sys.dm_os_buffer_descriptors
    --WHERE database_id BETWEEN 5 AND 32766
    GROUP BY database_id
)
SELECT
    [db_name] = CASE [database_id] WHEN 32767 THEN 'Resource DB' ELSE DB_NAME([database_id]) END,
    db_buffer_pages,
    db_buffer_MB = db_buffer_pages / 128,
    db_buffer_percent = CONVERT(DECIMAL(6,3), db_buffer_pages * 100.0 / @total_buffer)
FROM src
ORDER BY db_buffer_MB DESC;

--then drill down into the memory used by objects in the database of your choice

USE <database_name>;   -- substitute the database you want to drill into

WITH src AS
(
    SELECT
        [Object] = o.name,
        [Type] = o.type_desc,
        [Index] = COALESCE(i.name, ''),
        [Index_Type] = i.type_desc,
        p.[object_id],
        p.index_id,
        au.allocation_unit_id
    FROM sys.partitions AS p
    INNER JOIN sys.allocation_units AS au ON p.hobt_id = au.container_id
    INNER JOIN sys.objects AS o ON p.[object_id] = o.[object_id]
    INNER JOIN sys.indexes AS i ON o.[object_id] = i.[object_id] AND p.index_id = i.index_id
    WHERE au.[type] IN (1,2,3)
    AND o.is_ms_shipped = 0
)
SELECT
    src.[Object],
    src.[Type],
    src.[Index],
    src.Index_Type,
    buffer_pages = COUNT_BIG(b.page_id),
    buffer_mb = COUNT_BIG(b.page_id) / 128
FROM src
INNER JOIN sys.dm_os_buffer_descriptors AS b
    ON src.allocation_unit_id = b.allocation_unit_id
WHERE b.database_id = DB_ID()
GROUP BY src.[Object], src.[Type], src.[Index], src.Index_Type
ORDER BY buffer_pages DESC;

Database Roles – Find permissions granted

-- SQL to determine database role permissions granted

SELECT
    USER_NAME(grantee_principal_id) AS [User]
    , state_desc AS [Permission]
    , permission_name AS [Action]
    , CASE class
        WHEN 0 THEN 'Database::' + DB_NAME()
        WHEN 1 THEN OBJECT_NAME(major_id)
        WHEN 3 THEN 'Schema::' + SCHEMA_NAME(major_id) END AS [Securable]
FROM sys.database_permissions dp
WHERE class IN (0, 1, 3)
AND minor_id = 0
ORDER BY [User];

-- The Database securable refers to the whole database, so a SELECT granted there gives SELECT permission on all objects in the database.
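For illustration, a minimal sketch of how each securable class above can arise (the role and table names here are placeholders, not real objects):

-- class 0: grant at the database level - SELECT on every object in the database
GRANT SELECT TO [SomeRole];
-- class 3: grant at the schema level - SELECT on every object in the dbo schema
GRANT SELECT ON SCHEMA::dbo TO [SomeRole];
-- class 1: grant on a single object
GRANT SELECT ON OBJECT::dbo.SomeTable TO [SomeRole];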

Database last access time Query

-- Note: sys.dm_db_index_usage_stats is cleared when the SQL Server service restarts,
-- so a NULL access time may simply mean "not accessed since the last restart".
select d.name, last_access =
    (select max(bb.xx)
     from (
        select xx = max(last_user_seek)
        where max(last_user_seek) is not null
        union all
        select xx = max(last_user_scan)
        where max(last_user_scan) is not null
        union all
        select xx = max(last_user_lookup)
        where max(last_user_lookup) is not null
        union all
        select xx = max(last_user_update)
        where max(last_user_update) is not null
     ) bb)
FROM master.dbo.sysdatabases d
left outer join sys.dm_db_index_usage_stats s
    on d.dbid = s.database_id
group by d.name;

Installing certificate Keys for SQL Server


Prerequisites

An SSL certificate is needed to enable connection encryption.

The following prerequisites must be fulfilled to set up connection encryption:

  • The app team requests the certificate on dbPKI with the FQDN of the server as the Subject, without any suffix or prefix, e.g. a subject like fradbhdimu1.de.db.com:65953-1 (as in fradbhdimu1.de.db.com:65953-1.p12) would not work.
  • The certificate must include the public and private key (*.P12 file) and be protected with a password.
  • The password must be provided to the implementer.

To set up the certificate it must first be imported into the Windows certificate store, and permissions must be granted to the SQL Server service account.

  1. Open up Management Console (mmc.exe) and add the Certificates Snap-In to the Computer Account – Local computer (use File/Add or Remove Snap-ins):

  2. Import the extracted certificates (both .crt and .p12; you’ll need a password for the .p12 one) into the Personal folder (click on the Personal/Certificates folder, then right-click in the middle window – All Tasks – Import):

  3. Open “Manage Private Keys” on the .p12 certificate (right-click the certificate – All Tasks – Manage Private Keys) to grant permission on the certificate to the SQL Server service account:

  4. Add the SQL Server Service Account (normally NT Service\MSSQLSERVER) and grant read permission:

  5. Start the SQL Server Configuration Manager and open the Properties of the Network Configuration:

  6. Enable Force Encryption:

  7. Select the Certificate from the list:
  8. Restart the SQL Server Service

A quick test to see if it’s working is to connect with SSMS with “Encrypt Connection” enabled in the connection options.
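You can also confirm from T-SQL that the session is actually encrypted; a quick check against the sys.dm_exec_connections DMV:

-- encrypt_option = TRUE means the current connection is encrypted with TLS
SELECT session_id, encrypt_option, auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;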

Troubleshooting

If you have a server with a misconfigured certificate that won’t start up, you can follow these steps to try to fix it and bring it back up:

1) in SQL Configuration Manager turn off “Force Encryption”

2) In Regedit go to key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLServer\SuperSocketNetLib and remove the value for Certificate

3) You can now try to restart the SQL Server. If it works, great; if not, one final step to try is to restore the master database from a recent working backup to another instance of the same version (as oldmasterdb), detach it, and replace the existing master.mdf and mastlog.ldf files with the restored files.

If you have a situation where you have followed all the steps but can’t see the certificate in the drop-down list (step 7), then you can try these steps:

1) Get the value for the certificate by opening the certificate and looking at Details – Thumbprint

2) Convert this thumbprint value to uppercase and remove all spaces

3) Open up Regedit and navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLServer\SuperSocketNetLib and enter the converted value from above in the Certificate field

4) Final step is to set Force Encryption to Yes and then restart the SQL Server

5) To see if it has worked, check the SQL Server error log for a line like: "The certificate [Cert Hash(sha1) "ED806573B335D5DAB8C6037278F5CF6AC4C06804"] was successfully loaded for encryption."
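You can also search the current error log for that message from T-SQL; a sketch using the (undocumented but widely used) xp_readerrorlog procedure, where the search string is just an example:

-- parameters: log number (0 = current), log type (1 = SQL Server error log), text to search for
EXEC master.sys.xp_readerrorlog 0, 1, N'encryption';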

MSSQL – Creating an Oracle Linked Server

  • Install the Oracle client software by running D:\Install-Staging\Oracle-Client-sw-12102\000\DB_106896.msi
  • Copy the tnsnames.ora from D:\Install-Staging\Oracle-Client-sw-12102 to C:\APPS\ORACLE\PRODUCT\12.1.0\CLIENT_64\network\admin
  • Restart the SQL Server
  • Under Server Objects – Linked Servers, expand Providers, select Properties for OraOLEDB.Oracle and tick “Allow inprocess”.
  • In SQL Server run the following:
    EXEC sp_addlinkedserver @server = 'DBIBUxx', @srvproduct = 'Oracle', @provider = 'OraOLEDB.Oracle', @datasrc = 'DBIBUxx'
    go
    EXEC sp_addlinkedsrvlogin 'DBIBUxx', 'FALSE', NULL, 'DBIBxx_026_USER', 'passxxxx'
    go
  • That should be it. You can verify by selecting Test Connection, by expanding Catalogs and viewing the tables, or by running a select against them – see the example below.
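A quick connectivity check against the new linked server (the pass-through statement is just an illustrative Oracle query; OPENQUERY runs it on the Oracle side and returns the result to SQL Server):

SELECT * FROM OPENQUERY(DBIBUxx, 'SELECT sysdate FROM dual');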

Azure – Migration Main steps

/*

Please perform the following tasks:

Assessment

  1. Restore database(s) to the staging server from backup file
  2. Delete the backup file(s) from the staging server
  3. Perform compatibility assessment with the DMA Tool (Target = "Azure SQL Database Managed Instance")
  4. Store output of assessment (csv) in the respective folder(s) for each source database server
  5. Export list of users from the database(s).

Restore to Azure

  1. Create encrypted backup(s) of database(s) on staging server with SQL Server certificate
  2. Upload encrypted backup(s) to Azure Storage Account
  3. Restore database(s) to target Azure SQL Managed Instance successfully
  4. Drop database(s) from staging server
  5. Delete database backup file from Storage Account
  6. Confirm everything is ok

*/

-- To get the filenames for the restore below
RESTORE FILELISTONLY from disk='D:\Dumps\DB1.bak'
go
RESTORE HEADERONLY from disk='D:\Dumps\DB1.bak'
go

restore database [DB1] from disk='D:\Dumps\DB1.bak'
with
move 'DB1' to 'd:\MSSQL\DATA\DB1.mdf',
move 'DB1_Log' to 'e:\MSSQL\DATA\DB1.ldf',
stats=1
go

-- Encrypt the database
USE [DB1]
GO
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 ENCRYPTION BY SERVER CERTIFICATE TDE_Cert;
GO
ALTER DATABASE [DB1] SET ENCRYPTION ON;
GO
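Note that the statement above assumes a server certificate called TDE_Cert already exists in master; a minimal sketch to create one if it does not (the password and subject here are placeholders):

USE master
GO
-- a database master key in master is required before the certificate can be created
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword>';
GO
CREATE CERTIFICATE TDE_Cert WITH SUBJECT = 'TDE certificate for migration backups';
GO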

-- Now get the list of users and roles in the database
-- select name, islogin, issqlrole, isapprole, isntname, isntgroup, isntuser from sysusers
-- Also run the DMA assessment and save the csv output file

--Verify encryption
USE master
go
SELECT
database_name = d.name,
dek.encryptor_type,
cert_name = c.name,
dek.percent_complete,
dek.encryption_state -- 1 means no encryption, 2 means it is in the process of encrypting and 3 means fully encrypted
from sys.dm_database_encryption_keys dek left join sys.certificates c
on dek.encryptor_thumbprint = c.thumbprint
inner join sys.databases d
on dek.database_id = d.database_id
go

--Once the database is fully encrypted, back up the database ready for upload
backup database [DB1] to disk='D:\Dumps\DB1_mig_enc_full.bak'
with compression,
stats=1

-- Upload the backup file using Azure Storage Explorer – [https://Storage1.blob.core.windows.net/migration-backups]

--In the Azure Managed Instance restore the database from Blob storage
restore database [DB1] from url='https://Storage1.blob.core.windows.net/migration-backups/DB1_mig_enc_full.bak'
go

-- Final step is to restore any users, jobs, etc.
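As a rough sketch (not an exhaustive method), the CREATE USER statements for the migrated database can be generated from sys.database_principals before dropping the staging copy; Azure AD logins themselves are covered in the next section:

USE [DB1]
GO
-- generates one CREATE USER statement per user-defined principal;
-- users without a matching login are created WITHOUT LOGIN
SELECT 'CREATE USER ' + QUOTENAME(name) +
       CASE WHEN authentication_type = 0 THEN ' WITHOUT LOGIN'
            ELSE ' FOR LOGIN ' + QUOTENAME(name) END + ';'
FROM sys.database_principals
WHERE type IN ('S','U','G')      -- SQL users, Windows users and groups
  AND principal_id > 4;          -- skip dbo, guest, sys, INFORMATION_SCHEMA
GO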

Azure – AAD Logins and Connection Strings

AAD Logins

Azure does not have the concept of Windows logins, but you can choose between having a SQL login and an Azure AAD login. The Azure AAD accounts map to Windows AD, but instead of using usernames like “dba\User1” they use email addresses like “User1@dba.com”.

To create an Azure AAD login issue the following:

CREATE LOGIN [User1@dba.com] FROM EXTERNAL PROVIDER
GO

That email address must belong to a Windows group synced with AAD.

To then log in with your regular Windows account you need to choose “Active Directory – Integrated Authentication” as the authentication method, or you can use “Active Directory – Password Authentication”, where you need to specify the email address and the corresponding Windows password.

One thing to note is that logins and user accounts will both need to be recreated in the Azure Managed Instance. For example, if you have a user account dba\User1 in a database, it will not automatically relate/map to a new login User1@dba.com; it will need to be manually recreated. You can, however, log in to the Managed Instance from SSMS run as dba\User1 (which does map to the Azure AAD login User1@dba.com).

To add user account and role:

USE [RSCRES]
GO
CREATE USER [User1@dba.com] FOR LOGIN [User1@dba.com]
GO
ALTER ROLE [db_owner] ADD MEMBER [User1@dba.com]
GO

Azure Connection Strings

SQL Login:
Server=tcp:dev-we-dbaavgbhs-0004.a73ex6770af9b0f.database.windows.net,1433;Persist Security Info=False;User ID={your_username};Password={your_password};MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Connection Timeout=30;

Azure AD (Email)
Server=tcp:dev-we-dbaavgbhs-0004.a73ex6770af9b0f.database.windows.net,1433;Initial Catalog=BCSDatapublic;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=True;Authentication=Active Directory Integrated;

Azure – What is DBaaS

Want to know more about the technical background?

Azure Basics

DBaaS builds on the concept of PaaS (Platform-as-a-Service) and allows the customer to operate databases without having to manage the underlying infrastructure. The CSP (Cloud Service Provider – Microsoft) of a PaaS offering provides an existing run-time environment that can be used for development or configured as a production environment.

For DBaaS you are using solely PaaS services offered by Microsoft via the Azure platform. You can find detailed information about Azure here: https://docs.microsoft.com/en-us/azure/

All services you use are billed against so-called subscriptions. They can be thought of as credit cards, but they also provide additional capabilities. Subscriptions can also be used to separate environments, activities and access. We use these capabilities in DBaaS to separate the services for our customers from our engineering capabilities.

Subscriptions also act as Azure’s logical resource containers, and every resource is allocated to a subscription. Microsoft also allows multiple items within each subscription to be managed via resource groups. We use them to separate different purposes, e.g. automation and self-service pieces are in a different group than the part which stores the key data for encrypting your databases.

Below you can see an overview of all the environments we have. For you as a customer, only the “Delivery” environments are important. This is where your data is hosted and where we ensure operational capability for the platform.

Azure SQL Managed Instances

Azure SQL Managed Instance is a PaaS product offered by Microsoft. The company required a highly secure managed database service that complies with all regulatory controls. If you are reading through Microsoft’s documentation, make sure to understand the difference between a Single Database, a Managed Instance and Elastic Pools, which are all offered under Azure SQL Database and differ significantly.

The Azure SQL Managed Instance offers the capability to host a managed SQL Server Instance with no access to underlying Infrastructure and Operating System. Operation processes like patch management are handled by Microsoft. Some regulatory processes are also outsourced, e.g. Microsoft handles Disaster Recovery tests.

The DBaaS team came up with a standard design for Azure SQL Managed Instance, which includes the following features:

  • Full data-at-rest and data-in-motion encryption
  • Health monitoring with automatic incident ticket creation if problems occur with your database
  • Security configuration monitoring, which actively checks the security configuration every day, without exception
  • Integration of DB AD identities in SQL databases
  • Brokered access capabilities for privileged access
  • Access monitoring to prevent unintended access
  • Automatic backups for a minimum of 7 days – backups can be restored to any specific point in time
  • Regular clean-up activities to keep the SQL instance running smoothly

What is DBaaS?

DBaaS builds on the concept of Platform-as-a-Service and allows app teams to consume databases without having to manage the underlying infrastructure. The Cloud Service Provider (CSP) takes over a large part of the operational effort and is paid for it on a pay-per-use basis.

DBaaS is based on the native Azure SQL Managed Instance Service. With the DBaaS service, new SQL Server databases can now be provisioned within a few minutes and scaled-up/down dynamically based on capacity/performance requirements.

Our Service for you

With DBaaS databases, you don’t need to worry about infrastructure and database management tasks such as server provisioning, patching, setup, configuration, backups, or recovery. You can be compliant with policies on user access, privileged access, configuration management etc.

The DBaaS cost model is based on application-level consumption, which is used to drive dbApptio.

Why do we use a cloud service and not implement a solution on premise?

By using the Azure cloud, we see cost and stability advantages. The CSP delivers a standard service globally, enabling additional economies of scale to kick in.

The effort and knowledge that Microsoft can invest in its platform engineering is many times greater than the bank is able to invest. This, combined with Microsoft owning all components in the stack – hardware, hypervisor, operating system and database software – means they are also best placed to solve any bugs/issues that may arise in the end-to-end stack.

In addition, with the PaaS model the CSP ensures all software in the stack is patched and maintained to the latest versions and all hardware is refreshed regularly, providing an evergreen process that takes significant operational workload and cost away from the company. This also benefits application teams from an audit perspective.

Will I get a SQL Server instance or a SQL Server database?

Application teams do not get access to the SQL Server instance itself, but rather to a database on the instance, as the server needs to be managed and maintained to strict regulatory and security requirements.

Furthermore, many SQL Servers within a typical database estate are not efficiently utilized: CPU is underutilized and the available storage is often over-sized. By consolidating and actively managing databases on Azure instances, significantly better utilization and a smaller license footprint can be achieved, resulting in significant cost savings for the bank. This requires careful management at the SQL instance level.

Why should you use DBaaS?


What is the cost of a database?

DBaaS is composed of several Azure services, the main one being Azure SQL Managed Instance. Additional services from Microsoft are used to ensure the security and functionality of the environment, such as Azure Storage Accounts to store logs and manual backups, Azure Key Vault to securely store keys for database encryption, etc.

Each of these services has a defined price and unit which is metered and billed on a monthly basis. These costs have been aggregated and structured, as part of DBaaS product pricing, into a simple flat cost per database plus a variable storage cost per GB of data consumed. This pricing is published in the product store, with billing handled via an interface to dbApptio.

Does my database performance not suffer from the move into cloud?

When migrating databases to Azure instances, great care must be taken to ensure that the instances are not overloaded. To avoid these overloads, DBaaS implements both preventive and reactive measures.

Reactive measures include extending the number of cores of an instance or increasing the number of instances. Both of these actions can be executed quickly in the cloud, but with cost implications for the company. In contrast, preventive measures, while taking longer to implement, result in better cost outcomes for the company. DBaaS will be introducing application-specific preventive recommendations that enable further performance improvements and/or cost reductions.

Multiple nodes of both compute and storage are maintained behind the scenes by Microsoft for each managed instance to ensure high availability. The respective SLAs will be governed and checked by the DBaaS team.

Are the databases safe?

DBaaS places great emphasis on security. The MSSQL service is based on isolated Azure SQL Managed Instances which are dedicated to the customer. All data stored is encrypted using Microsoft TDE (data-at-rest encryption). This ensures that all SQL Server log, data and backup files are stored securely. The encryption keys are exclusively managed and controlled by the customer. All connections to SQL Managed Instances and the respective enabling services are encrypted and secured using TLS (data-in-motion encryption).

In addition, DBaaS ensures that only persons on-boarded using a company Active Directory (AD) identity are able to access SQL Managed Instances. Users’ passwords stay within the company and are not transferred to the cloud. Furthermore, access to databases is not possible from the Internet; they are only accessible from the company’s network.

Microsoft is required to keep the SQL Server instance, operating system, hypervisor, VM layers, and all other abstracted layers up to date, so that the latest security patches are always available. Microsoft has a monthly patch cycle, with security patches applied even more frequently.

When should DBaaS not be used?


High storage needs

  • If your application is using more than 8 TB storage you should continue using on premise solutions.

Regulatory limitations

  • You have not yet obtained regulatory approval or provided regulatory notification (if required) before implementing a cloud strategy. DBaaS Governance, BISO, Compliance, and the Data Protection team can assist you in determining the Cloud Regulatory requirements for your application.
  • Your data has been identified by the Compliance and Risk Function as not eligible for migration into the cloud.

Data classification

  • When your database is handling strictly confidential data, you are not allowed to migrate to the cloud at this point in time. Additional features are being explored currently which would allow strictly confidential data to be moved to the cloud.

Application IS critical

  • When your database is classified as IS critical you need to wait until the on-boarding of DBaaS to CSO dbDAM (database activity monitoring) has been completed. We envisage this being resolved by the end of 2020 and are working to speed this up.

Operational boundaries

  • MS SQL Server
    • No linked servers other than MS SQL or ORACLE
    • No services from the underlying OS, like MSDTC, etc.
    • No Database Mail using more than 1 profile. No attachments using the @file_attachments parameter.
    • No access to fileshares or Windows folders of the underlying OS
    • No credentials other than SHARED ACCESS SIGNATURE (to access Azure BLOB storage) or Azure Key Vault
    • No SQL users. No Windows Accounts. Only Azure AD Accounts or groups.
    • No Database export/import other than BACPAC or bulk import from Azure BLOB storage
    • Table types not supported:
      • FILESTREAM
      • FILETABLE
      • EXTERNAL TABLE
      • MEMORY_OPTIMIZED – supported in the Business Critical tier only
    • No distributed transactions
    • No external libraries (R, Python)
    • No SSIS – this is Azure Data Factory at a later stage
    • No SSAS

Azure – Backup and Auditing to Blob Storage

--Configure Backup BLOB Storage
-- First create the Blob container on a storage account (best to use a new storage account for this)
-- https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-auditing
/*
CREATE CREDENTIAL []
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = ''
GO
*/

CREATE CREDENTIAL [https://devsadbcvfdaasdb0003.blob.core.windows.net/auditing]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:26:07Z&st=2019-02-21T10:26:07Z&spr=https&sig=Uyqk2%2FnyxeI6RdsfsdfswibblezdIDCWGk8vnsQvPTlelsROxJG5lb90%3D'
GO

CREATE CREDENTIAL [https://devsadbcvfdaasdb0003.blob.core.windows.net/sqlbackups]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:26:07Z&st=2019-02-21T10:26:07Z&spr=https&sig=Uyqk2%2FnyxeI6RzdfsdfswibbledfdIDCWGk8vnsQvPTlelsROxJG5lb90%3D'
GO

backup database [model] to url='https://devsawibble0003.blob.core.windows.net/sqlbackups/model2.bak'
with copy_only
go

restore database [model_copy] from url='https://devsawibble0003.blob.core.windows.net/sqlbackups/model2.bak'
go

--Configure Auditing to BLOB Storage
/*
CREATE CREDENTIAL [https://devsawibble0003.blob.core.windows.net/auditing]
WITH IDENTITY='SHARED ACCESS SIGNATURE',
SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2026-02-21T18:10:56Z&st=2019-02-21T10:10:56Z&spr=https&sig=yYeFX4Bwqs1kO4xfdsfdswibblefrx900Xl7B7t0E1MtkFkU3jlDCkvM%3D'
GO

CREATE SERVER AUDIT []
TO URL ( PATH = '' [, RETENTION_DAYS = integer ])
GO
*/
use master
go
CREATE SERVER AUDIT [SecAudit]
TO URL ( PATH ='https://devsawibble0003.blob.core.windows.net/auditing' )
GO

CREATE SERVER AUDIT SPECIFICATION [SecAudit_spec]
FOR SERVER AUDIT [SecAudit]
ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP),
ADD (AUDIT_CHANGE_GROUP),
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PRINCIPAL_CHANGE_GROUP),
ADD (APPLICATION_ROLE_CHANGE_PASSWORD_GROUP),
ADD (LOGIN_CHANGE_PASSWORD_GROUP),
ADD (TRACE_CHANGE_GROUP)
WITH (STATE = OFF)
GO
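The audit and its specification above are created with STATE = OFF; a minimal sketch to switch them on once you are happy with the configuration:

ALTER SERVER AUDIT SPECIFICATION [SecAudit_spec] WITH (STATE = ON);
GO
ALTER SERVER AUDIT [SecAudit] WITH (STATE = ON);
GO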

-- You can also create auditing to Log Analytics by following the steps in https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-auditing to turn diagnostics on and select SQLSecurityAuditEvents with the destination being Log Analytics.

CREATE SERVER AUDIT SQL_Audit_Log_Analytics TO EXTERNAL_MONITOR;
GO

CREATE SERVER AUDIT SPECIFICATION [SQL_Audit_Log_Analytics_spec]
FOR SERVER AUDIT [SQL_Audit_Log_Analytics]
ADD (DATABASE_ROLE_MEMBER_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP),
ADD (AUDIT_CHANGE_GROUP),
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PRINCIPAL_CHANGE_GROUP),
ADD (APPLICATION_ROLE_CHANGE_PASSWORD_GROUP),
ADD (LOGIN_CHANGE_PASSWORD_GROUP),
ADD (TRACE_CHANGE_GROUP)
WITH (STATE = OFF)
GO

-- If audit logs are written to Log Analytics, they are available in the Log Analytics workspace, where you can run advanced searches on the audit data.
-- As a starting point, navigate to Log Analytics and under the General section click Logs and enter a simple query, such as: search "SQLSecurityAuditEvents" to view the audit logs.

-- Log Analytics gives you real-time operational insights using integrated search and custom dashboards to readily analyze millions of records across all your workloads and servers. For additional useful information about the Log Analytics search language and commands, see the Log Analytics search reference.