Installing Certificate Keys for SQL Server


Prerequisites

An SSL certificate is needed to enable connection encryption.

The following prerequisites must be fulfilled to set up connection encryption:

  • The app team requests the certificate on dbPKI with the FQDN of the server as the Subject, without any suffix or prefix; a name such as fradbhdimu1.de.db.com:65953-1.p12 would not work.
  • The certificate must include the public and private key (*.p12 file) and be protected with a password.
  • The password must be provided to the implementer.

To set up the certificate, it must first be imported into the Windows certificate store and permissions must then be granted to the SQL Server service account.

  1. Open Management Console (mmc.exe) and add the Certificates snap-in for the Computer account – Local computer (use File > Add/Remove Snap-in):

  2. Import the extracted certificates (both .crt and .p12; you’ll need the password for the .p12 one) into the Personal folder (click the Personal or Certificates folder, then right-click in the middle pane – All Tasks – Import):
     
  3. Open “Manage Private Keys” on the .p12 certificate (see below) to grant permission on the certificate to the SQL Server service account:

  4. Add the SQL Server service account (normally NT Service\MSSQLSERVER) and grant Read permission:

  5. Start the SQL Server Configuration Manager and open the Properties of the Network Configuration:

  6. Enable Force Encryption:

  7. Select the Certificate from the list:
  8. Restart the SQL Server Service

A quick test to see if it is working is to connect with SSMS and enable “Encrypt Connection” in the connection options.
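
You can also confirm from T-SQL whether a given session is actually encrypted. A minimal check, using the sys.dm_exec_connections DMV for the current session:

  -- encrypt_option shows TRUE when the current connection is encrypted
  SELECT session_id, encrypt_option, auth_scheme
  FROM sys.dm_exec_connections
  WHERE session_id = @@SPID;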

Troubleshooting

If you have a server with a misconfigured certificate which won’t start up, you can follow these steps to try to fix it and bring it back up:

1) In SQL Server Configuration Manager turn off “Force Encryption”

2) In Regedit go to key HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLServer\SuperSocketNetLib and remove the value for Certificate

3) You can now try to restart the SQL Server. If it starts, great; if not, one final step to try is to restore the master database from a recent working backup to another instance of the same version (as oldmasterdb), then detach it and replace the existing master.mdf and mastlog.ldf files with the restored files.

If you have followed all the steps but can’t see the certificate in the drop-down list (step 7 above), then you can try these steps:

1) Get the value for the certificate by opening the certificate and looking at Details – Thumbprint

2) Convert this thumbprint value to uppercase and remove all spaces

3) Open up Regedit and navigate to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL.x\MSSQLServer\SuperSocketNetLib and enter the converted value from above in the Certificate field

4) The final step is to set Force Encryption to Yes and then restart the SQL Server

5) To see if it has worked, check the SQL Server error log for the following string: “The certificate [Cert Hash(sha1) "ED806573B335D5DAB8C6037278F5CF6AC4C06804"] was successfully loaded for encryption.”
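
If you prefer to search the error log from a query window rather than opening the log file, the xp_readerrorlog extended stored procedure can do it; a sketch (first parameter 0 = current log, second 1 = SQL Server error log, third a search string):

  -- Search the current SQL Server error log for the encryption message
  EXEC master.dbo.xp_readerrorlog 0, 1, N'successfully loaded for encryption';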

MSSQL – Creating an Oracle Linked Server

  • Install the Oracle client software by running D:\Install-Staging\Oracle-Client-sw-12102\000\DB_106896.msi
  • Copy the tnsnames.ora from D:\Install-Staging\Oracle-Client-sw-12102 to C:\APPS\ORACLE\PRODUCT\12.1.0\CLIENT_64\network\admin
  • Restart the SQL Server
  • Under Server Objects – Linked Servers, expand Providers, open the Properties of OraOLEDB.Oracle and tick Allow inprocess.
  • In SQL Server run the following:
    EXEC sp_addlinkedserver @server = 'DBIBUxx', @srvproduct = 'Oracle', @provider = 'OraOLEDB.Oracle', @datasrc = 'DBIBUxx'
    GO
    EXEC sp_addlinkedsrvlogin 'DBIBUxx', 'FALSE', NULL, 'DBIBxx_026_USER', 'passxxxx'
    GO
  • That should be it; you can verify by selecting Test connection, or by expanding the catalogs, viewing the tables and selecting from them (see the sketch below the list).
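
For a quick check from a query window instead of the GUI, something along these lines should work; the schema and table names below are placeholders for whatever exists on the Oracle side:

  -- Lightweight connectivity test for the linked server
  EXEC sp_testlinkedserver N'DBIBUxx';
  GO
  -- Pass-through query against a remote Oracle table (SOME_SCHEMA.SOME_TABLE is a placeholder)
  SELECT * FROM OPENQUERY(DBIBUxx, 'SELECT * FROM SOME_SCHEMA.SOME_TABLE WHERE ROWNUM <= 10');
  GO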

Setting up Log Shipping MSSQL 2008

– First, create the backup/log shipping directory on the primary server, share this folder, and add full permissions for the primary dataserver’s SQL Server AND SQL Agent service accounts and read permissions for the secondary dataserver’s SQL Server AND SQL Agent service accounts.

– Repeat this on the destination server, giving read access to the primary server’s SQL Server service account and read/write access to the secondary dataserver’s SQL Agent AND SQL Server service accounts.

– Next, create the secondary database; this obviously needs to be sized the same as the primary database, and set it to simple recovery mode (a sketch follows below).
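
A minimal sketch of that step, with placeholder names, paths and sizes (size the files to match the primary):

  -- Placeholder database name, file paths and sizes; match the primary's layout
  CREATE DATABASE MyAppDB
  ON (NAME = MyAppDB_data, FILENAME = 'D:\SQLDATA\MyAppDB.mdf', SIZE = 10GB)
  LOG ON (NAME = MyAppDB_log, FILENAME = 'E:\SQLLOGS\MyAppDB_log.ldf', SIZE = 2GB);
  GO
  ALTER DATABASE MyAppDB SET RECOVERY SIMPLE;
  GO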

– You can now fire up the log shipping wizard from the primary dataserver by ticking the log shipping radio button under properties for the database. NOTE: The wizard is a bit rubbish and should not be used to set up log shipping, but it is useful for generating the various stored procs which need to be run on the primary and secondary dataservers.

– The wizard is fairly self-explanatory. Select the primary dataserver’s log shipping directory as the backup folder (use the network share path). Also, and VERY IMPORTANT, make sure that you select “No, the secondary database is initialised” on the Initialize Secondary Database tab. Select the secondary log shipping directory (configured in step 2 above) as the “destination folder for copied files” on the Copy Files tab. The final step is to select Standby mode and “disconnect users” on the Restore Transaction Log tab.

– Click the “Script Configuration” button to output all this information to a new query window and save it; you can then close the wizard.

– First, make sure the primary database is set to the full recovery model in its properties, then back it up into the log shipping directory created earlier (see the sketch below).
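
A hedged example of that backup (server, share and database names are placeholders):

  -- Full backup of the primary database into the shared log shipping directory
  BACKUP DATABASE MyAppDB
  TO DISK = N'\\PRIMARYSERVER\LogShipping\MyAppDB_full.bak'
  WITH INIT, STATS = 10;
  GO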

– Copy this dump across to the secondary server log shipping directory.

– Now, from the secondary server, restore the previously created database from this backup. Make sure you leave this database in STANDBY mode! Place the standby file in the secondary server’s log shipping directory (see the sketch below).
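
A sketch of that restore (paths and names are placeholders; the key part is STANDBY rather than RECOVERY or NORECOVERY, and you may need MOVE clauses if the file layout differs from the primary):

  -- Restore on the secondary, leaving it read-only in standby mode
  RESTORE DATABASE MyAppDB
  FROM DISK = N'F:\LogShipping\MyAppDB_full.bak'
  WITH REPLACE,
       STANDBY = N'F:\LogShipping\MyAppDB_standby.tuf',
       STATS = 10;
  GO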

– On the primary server, run the following commands from the script you generated earlier: sp_add_log_shipping_primary_database to add the primary database, sp_add_schedule to add a schedule for the backup job, then sp_attach_schedule and sp_update_job.

– (Optional) On the monitor server, execute sp_add_log_shipping_alert_job to add the alert job.

– On the secondary server, execute the procedures from the script: sp_add_log_shipping_secondary_primary, then sp_add_schedule and sp_attach_schedule for the copy and restore jobs, and finally sp_add_log_shipping_secondary_database followed by two sp_update_job calls for copy and restore.

– On the primary server, execute sp_add_log_shipping_alert_job and sp_add_log_shipping_primary_secondary to add the required information about the new secondary database to the primary server and get things up and running.
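
Once the jobs are in place, a quick way to check the configuration and the last backup/copy/restore times is sp_help_log_shipping_monitor (run it on the monitor server, or on the primary/secondary for their own status):

  -- Summarise log shipping status and threshold breaches
  EXEC master.dbo.sp_help_log_shipping_monitor;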

Performance Tuning MSSQL


MSSQL has many excellent tools which can be used for discovering performance bottlenecks.



sp_monitor and DBCC PERFMON


To get a snapshot of what the SQL Server is doing, these commands are very useful.

Run sp_monitor twice with a gap of 30 seconds in between to get a high-level overview of how busy the SQL Server is.
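
For example (the WAITFOR is just one convenient way to space the two runs; the second run reports activity since the first):

  -- First snapshot
  EXEC sp_monitor;
  -- Second snapshot 30 seconds later; figures are reported since the previous run
  WAITFOR DELAY '00:00:30';
  EXEC sp_monitor;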

DBCC PERFMON will give you much more detailed information on the state of the SQL Server, but the command has been superseded by the various SQL Server counters which can be used in System Monitor.
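
Those counters are also exposed inside the server through the sys.dm_os_performance_counters DMV, so a few of them can be sampled from a query window, for example:

  -- A couple of the Buffer Manager counters that System Monitor also exposes
  SELECT object_name, counter_name, cntr_value
  FROM sys.dm_os_performance_counters
  WHERE object_name LIKE '%Buffer Manager%'
    AND counter_name IN ('Page life expectancy', 'Buffer cache hit ratio');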



SSMS Reports


SQL Server has a set of excellent reports which are generated from the Dynamic Management Views; these exist at both the server and database level. If you right-click on the server or an individual database in SQL Server Management Studio, you can select which report to run. You can even export the report output to a PDF. This will highlight any problems which you can then focus on.
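
If you prefer to go straight to the DMVs the reports are built on, a sketch of the kind of query involved (top cached statements by total CPU from sys.dm_exec_query_stats):

  -- Top 10 cached statements by total worker (CPU) time
  SELECT TOP 10
         qs.total_worker_time, qs.execution_count,
         SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
                   (CASE WHEN qs.statement_end_offset = -1
                         THEN DATALENGTH(st.text)
                         ELSE qs.statement_end_offset END
                    - qs.statement_start_offset) / 2 + 1) AS statement_text
  FROM sys.dm_exec_query_stats AS qs
  CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
  ORDER BY qs.total_worker_time DESC;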



System monitor


To drill down further, the first port of call should be System Monitor (perfmon), which is started by typing perfmon into the Run box.


One excellent feature is that you can record a counter log at the same time as running a SQL Profiler session (discussed later); you can then merge them, which lets you see how bad queries affect CPU, memory, etc., or vice versa.


The objects normally worth checking in System Monitor are:

Memory, Network Interface, Physical Disk, Processor, SQLServer:Buffermanager, SQLServer:Databases, SQLServer:Exec Statistics, SQLServer:General Statistics, SQLServer:Latches, SQLServer:Locks, SQLServer:Memory Manager, SQLServer:PlanCache, SQLServer:SQL Statistics and System.




SQL Profiler


System Monitor might highlight poor performance, but you will typically need to drill down further to find the bad queries causing the problems. To do this, use SQL Profiler.


The first step is to establish a performance baseline as follows:


– Start Profiler, select File > New Trace and connect to the relevant instance.

– Specify a trace name, a template (Standard) and a location to save the trace file to. Set the maximum file size to 50MB and enable file rollover.

– In the Events Selection tab untick the Audit Login, Audit Logout, ExistingConnection and SQL:BatchStarting events.

– Click Show all columns and select the TextData, NTUserName, LoginName, CPU, Reads, Writes, Duration, SPID, StartTime, EndTime, BinaryData, DatabaseName, ServerName and ObjectName columns.

– Click Column Filters and filter DatabaseName to the database you are interested in.

– Click Run to start the trace and simultaneously start the System Monitor counter log.





Correlating the Counter Log with the Profiler Trace


– Start Profiler, select File > Open and select the trace file you created from the performance baseline previously.

– Next go to File > Import Performance Data and select the counter log file you created simultaneously with the performance baseline trace.

– In the Performance Counters Limit dialog window select Network Interface: Output Queue Length, Processor: % Processor Time, System: Processor Queue Length and SQLServer:Buffer Manager: Page life expectancy. Click OK.

– You can then scroll through the trace and see what was happening at peak activity on the Windows machine, and also what effect certain commands had on overall server performance.