
SQL :- Many Flavors (PostgreSQL)

Recently I installed PostgreSQL on Windows so I can work on a little project that I’ve been wanting to do. The objective was to learn PostgreSQL, and since I’ve started to dig into the free software DAZ Studio (DAZ3D), I wanted a better understanding of the data in its tables and wanted to be able to learn various configurations of PostgreSQL.
By default DAZ Studio installs pgAdmin3 LTS as the graphical interface manager. There is a new pgAdmin4 1.4 that I’ll eventually be switching to for viewing data tables, but for now I’ll use pgAdmin3 for this article, mainly because that’s what DAZ Studio utilizes. After all items are connected I’ll show you how to create external connections via the ODBC manager for 32-bit and 64-bit connections. With those connections, I’ll show you how to connect to Excel and SSRS reporting. I’m keeping the examples short for now; in the future I will be posting more complex ideas and samples of how to work with the data for more control over content management.

Installed PostgreSQL By DAZ

It is not necessary to actually download PostgreSQL, as DAZ Studio installs a version into the default setup, as in the image below. I won’t go over DAZ Studio setup here.

And to run the PostgreSQL graphical interface program, you can select pgAdmin3.exe.

Download and Install PostgreSQL

If you want more power than the version DAZ Studio installs, you can download and install PostgreSQL from here.

DAZ STUDIO CMS Version

Once you get here with your installation or the DAZ defaults, you will need to connect PostgreSQL to your DAZ content database. From within DAZ Studio you can find the default port that DAZ Studio used for setup. I’ve kept things at the default, but you can change it to suit your environment.
To get the port number that PostgreSQL uses for DAZ Studio, select Edit > Preferences (or press F2). Select the CMS Settings tab and you’ll see the installation folder and the port number that DAZ Studio uses to communicate with the database.

To connect PostgreSQL to DAZ Studio I’ll keep the default port at 17237.
Although I’ve already connected to the DAZ3D CMS database in this image, the application should look like this.

Next we will want to add a new server, in this case 127.0.0.1, which is the local machine.

Fill out the values like this. If you want a password on the database you can add one here.
I named my server “MyDaz” and used the user name “dzcms”, a name I found on the DAZ3D forums.
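If you later want to reach the same database from a script rather than pgAdmin3, the same settings apply. Here is a minimal sketch (in Python, just to keep the example short) that builds a libpq-style connection string from the values above. The database name "ContentDB" is a placeholder of mine, not a DAZ default; the host, port, and user match what this article describes.

```python
# Sketch: build a libpq-style connection string from the DAZ CMS settings above.
# "ContentDB" is a placeholder database name, not a DAZ Studio default.
def libpq_conninfo(**params):
    """Render keyword/value pairs as a libpq connection string."""
    return " ".join(f"{key}={value}" for key, value in params.items())

conninfo = libpq_conninfo(host="127.0.0.1", port=17237, user="dzcms", dbname="ContentDB")
# A driver such as psycopg2 could then use it (not run here):
# import psycopg2; conn = psycopg2.connect(conninfo)
```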

After setup, if you’re not familiar with PostgreSQL, make sure you use the DAZ Studio interface to back up your database.

DAZ Studio Query Tables

To show connectivity is working, I’ll show you a couple of quick methods for displaying data from within tables without major coding. 🙂
From the PostgreSQL interface you can choose to view rows

“Selected Content Table Rows (100 rows)” is the default. On really large databases you’ll want to limit the number of results returned.

And if you want to select rows with a script, it’s similar, with a slightly different interface: choose “Select Script”.

Just a few ways to query the data in the tables.

PostgreSQL ODBC Driver Installation

To access PostgreSQL from other Windows applications you will need to install the ODBC drivers for PostgreSQL, in both 32-bit and 64-bit versions.

You can find them at https://odbc.postgresql.org/
Since I’m running 64-bit Windows I’ll just download the MSI version for now, and the 32-bit if required. I’ve also downloaded the code so I can see how it is constructed. From https://www.postgresql.org/ftp/odbc/versions/msi/ get both psqlodbc_09_06_0200-x86.zip and psqlodbc_09_06_0200-x64.zip, as I’ve noticed that some apps I’ve tried to connect with prefer the 32-bit version over the 64-bit one.

Downloaded, virus-checked, and unzipped into a folder. Always read the readme file.
PSQLODBC.DLL – A library to talk to the PostgreSQL DBMS using ODBC.

ODBC Setup Screens

Select Next

Select the check box, then select Next.

Select to install documentation or just select next.

Select Install

After installation complete select Finish.

Both installations for 32/64bit follow the same installation and setup style.

Create DSN for ODBC driver association

You can search for “What is a DSN (Data Source Name)?” or go to https://support.microsoft.com/en-us/help/966849/what-is-a-dsn-data-source-name for a further explanation of what a DSN is.

From the Windows “Run” dialog, type the following for 64-bit DSN creation.

I chose Unicode for my DSN.

From the Windows “Run” dialog, type the following for 32-bit DSN creation.

When you select Finish you can create the DSN name that will be used in other applications.

The name of the database is defined when you connect using pgAdmin3. The port, server name, and user name are the same as those used for the PostgreSQL database attachment.

You can create these with corresponding names based on 32/64-bit naming conventions.
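To keep the 32-bit and 64-bit names straight, it helps to generate them from one base name. This is only an illustrative sketch (in Python); the base name PostgreSQL30 matches the DSN used later in the SSRS example, and the `Dsn=` string is the standard ODBC connection-string form.

```python
# Sketch: generate matching 32/64-bit DSN names and the ODBC connection
# strings that applications such as Excel or SSRS would use.
def dsn_name(base, bits):
    return f"{base}_{bits}"

def odbc_connection_string(base, bits):
    return f"Dsn={dsn_name(base, bits)}"

cs32 = odbc_connection_string("PostgreSQL30", 32)
cs64 = odbc_connection_string("PostgreSQL30", 64)
```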

Using “DSN” created in Excel

After you open up a new sheet in Excel, select the “From Other Sources” item on the “Data” tab.

We can select the “From Data Connection Wizard” and from there select the ODBC DSN.

Select next after ODBC DSN

What is nice about this method is that we get a list of tables to work with.


Select a table

Choosing next gives us this screen.

Review, make adjustments if needed, and select Finish.

You can paste into an existing sheet or create a new sheet.

 

As this isn’t really a tutorial on using Excel, I thought I might add that you can sort, filter, and organize columns.

And by filtering you can see what purchases you made on a given date much faster than having to track this down by hand.

Column Filter By Date

Using “DSN” Created in SSRS

Not many people have SSRS, or the desire to create reports with it. This isn’t a full-blown tutorial on using SSRS, but on creating the data sources (either Shared Data Sources or Shared Data Sets) and using those to connect the DSN for reporting from DAZ Studio’s CMS database.

After opening Visual Studio (I’m using 2013 here), right-click on Shared Data Sources.

Create a name for your Shared Data Source with type as ODBC, then select edit.

Select the PostgreSQL30_32 item; after testing the connection, select OK.

Your connection string will be "Dsn=PostgreSQL30_32". Select OK on the Shared Data Source Properties screen.

I’ll create a new report.

Select the newly created Shared Data Source and select Next. If you have a query already created and syntax-checked, enter it as your query string. You can select Next, or enter the query builder, which will show you any errors.

Product ‘4422’ is one that I’ve been using in my postings about DAZ3D content management. Or you could just drop the WHERE clause and return all records. And if you enter the query editor you can select tables or views to work with for this report. This is the selection returned from the query editor:

SELECT store_id, name, guid, last_update
FROM dzcontent.product

From that, here is a simple report created from our DSN.

Create Simple Tabular Report

Select how items are grouped

Jump to the end and select finish

Simple report with adjusted column headings

Article Recap

Hopefully I’ve kept this article simple and easy to follow. I like the follow-along approach an article like this offers.

  • PostgreSQL up front information
  • Setup and installation of ODBC drivers
  • Simple Excel Report Process
  • Simple SSRS Report Process

Also, these steps for using PostgreSQL aren’t only meant for use with DAZ Studio. You can use these tips to install ODBC for other uses as well. You could also set this up with Microsoft Access.
My goal is to build a mini accounting system for my content. Knowing how much I paid in relation to the content in my database and the content on my machine will help me out if I ever get to that PAID ARTIST category. 🙂

If you see errors or omissions please let me know. Comments are always welcome.
Thank you,

Joseph

 

Categories: DAZ3D, PostgreSQL, Technology, TSQL

Certificate :- Test Website SSL Protocols

April 22, 2017 1 comment


NOTE: This turned out to be a long post.

The “S” in https:// means that the site has security and is supposed to be secure. There is a lot more to the “S” though, and that’s where the SSL part of the title comes from. Hypertext Transfer Protocol Secure (HTTPS) is a combination of the Hypertext Transfer Protocol (HTTP) with the Secure Sockets Layer (SSL)/Transport Layer Security (TLS) protocol. TLS is an authentication and security protocol widely implemented in browsers and web servers. So, as you surf around the web, you’ll encounter https:// and http:// sites. I would never enter personal information into an http:// site, as this isn’t secure. And supposedly, with https:// sites like banks and online shopping, there is some expectation that the site is secure enough for us to do our transactions. All these things you should be aware of.
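The core idea this post builds on (pin a connection to one protocol version and see whether the handshake succeeds) can be sketched in a few lines with Python's standard ssl module. This is only an illustration of the technique, not the PowerShell script used in this post, and modern ssl builds no longer expose SSLv2 at all.

```python
import socket
import ssl

# TLS versions worth probing: everything the local OpenSSL enumerates,
# minus the MINIMUM/MAXIMUM sentinels (mirrors filtering out Default/None).
VERSIONS = [v for v in ssl.TLSVersion
            if v not in (ssl.TLSVersion.MINIMUM_SUPPORTED,
                         ssl.TLSVersion.MAXIMUM_SUPPORTED)]

def supports_version(host, version, port=443, timeout=5):
    """Return True if the server completes a handshake pinned to one version."""
    try:
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE  # probing protocol support, not validating trust
        ctx.minimum_version = version
        ctx.maximum_version = version
        with socket.create_connection((host, port), timeout=timeout) as sock:
            with ctx.wrap_socket(sock, server_hostname=host):
                return True
    except (ssl.SSLError, ValueError, OSError):
        return False

# e.g. {v.name: supports_version("www.chase.com", v) for v in VERSIONS}
```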

So, as we take precautions for doing business online, I’ve recently noticed that many certificates are expiring on web sites I’m visiting, and it’s occurring on an assortment of sites. I use a tool that pops up when I encounter issues with certificates. You’d think that ensuring security certificates remain up to date and are monitored for expiration dates would be part of a company’s daily SOP.

After looking into the issue I decided I’d start by searching for something along the lines of this code by Chris Duck. I already had some simple code that I’d been using, but wanted something I could modify to suit my needs. In fact, I found many similar ideas while searching the web, and this is how I’ve put my research to use.

Loading and running Test-WebSiteSslProtocols

I made some simple modifications to the script at this location:

begin
{
$ProtocolNames = [System.Security.Authentication.SslProtocols] | gm -static -MemberType Property | ?{ $_.Name -notin @("Default", "None") } | %{ $_.Name }
$global:certinfo = @() #- ADD THIS
}
process

And at this place in the code:

[PSCustomObject]$ProtocolStatus
$global:certinfo = $ProtocolStatus #- ADD THIS

This is the modified function without all the comments.
function Test-WebSiteSslProtocols
{
<#
.SYNOPSIS
Check the validity of SSL on web sites visited and populate a collection to be used for updating into a database.
Ideas found searching the web.
#>
    param (
        [Parameter(Mandatory = $true, ValueFromPipelineByPropertyName = $true, ValueFromPipeline = $true)]
        $URLChecked,
        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [int]$Port = 443
    )
    begin
    {
        $ProtocolNames = [System.Security.Authentication.SslProtocols] | gm -static -MemberType Property | ?{ $_.Name -notin @("Default", "None") } | %{ $_.Name }
        $global:certinfo = @()
    }
    process
    {
        $ProtocolStatus = [Ordered]@{ }
        $ProtocolStatus.Add("URLChecked", $URLChecked)
        $ProtocolStatus.Add("Port", $Port)
        $ProtocolStatus.Add("KeyLength", $null)
        $ProtocolStatus.Add("SignatureAlgorithm", $null)

        $ProtocolNames | %{
            $ProtocolName = $_
            $Socket = New-Object System.Net.Sockets.Socket([System.Net.Sockets.SocketType]::Stream, [System.Net.Sockets.ProtocolType]::Tcp)
            $Socket.Connect($URLChecked, $Port)
            try
            {
                $NetStream = New-Object System.Net.Sockets.NetworkStream($Socket, $true)
                $SslStream = New-Object System.Net.Security.SslStream($NetStream, $true)
                $SslStream.AuthenticateAsClient($URLChecked, $null, $ProtocolName, $false)
                $RemoteCertificate = [System.Security.Cryptography.X509Certificates.X509Certificate2]$SslStream.RemoteCertificate
                $ProtocolStatus["KeyLength"] = $RemoteCertificate.PublicKey.Key.KeySize
                $ProtocolStatus["SignatureAlgorithm"] = $RemoteCertificate.SignatureAlgorithm.FriendlyName
                $ProtocolStatus["Certificate"] = $RemoteCertificate
                $ProtocolStatus.Add($ProtocolName, $true)
            }
            catch
            {
                $ProtocolStatus.Add($ProtocolName, $false)
            }
            finally
            {
                $SslStream.Close()
            }
        }
        [PSCustomObject]$ProtocolStatus
        $global:certinfo = $ProtocolStatus
    }
}

[PSCustomObject]$ProtocolStatus is local to the function. As I wanted to use this as input to a function that updates a database, I’ve added $global:certinfo = $ProtocolStatus; $global:certinfo can now be used as input to other functions.

I’ll be investigating running this same process in the cloud, where it would be cool to catalog information contained in certificates or to look for anomalies that might be found in a certificate.

Running the function above (Test-WebSiteSslProtocols www.chase.com) will return a data set:

I’ve also added the global collection, which allows you to drill down into the results.

Peering into the global collection

I’ve used 3 different methods to display information from the collection.

  1. $global:certinfo
  2. $global:certinfo.Certificate
  3. $global:certinfo.Certificate | format-list

As you see, each returns a different view that can be used for further analysis or for updating into a database. With method 1 ($global:certinfo) we have a collection of values. URLChecked, Port, KeyLength, SignatureAlgorithm, Ssl2, Ssl3, Tls, Tls11, and Tls12 are single values that don’t require a drill-down to populate values for updating into a database.

Certificate returns a collection. Since method 2 ($global:certinfo.Certificate) does not return enough visual information, I use method 3, $global:certinfo.Certificate | Format-List, as returned by our function, to get at the web records.

Extensions is returned as an object. As of yet I’ve not found a SQL data type, at least in SQL Server 2012, to use for importing it directly. So I’ve created this snippet that can extract and serialize it into XML.

To see all the elements available I’m using method 3 ($global:certinfo.certificate | Format-List), as this returns the Extensions information.

That area contains Subject, Issuer, Thumbprint, FriendlyName, NotBefore, NotAfter, and Extensions. Extensions contains another collection for the Oid values.

$global:certinfo.certificate.Extensions and $global:certinfo.certificate.Extensions.Oid (not very friendly)

The following code snippet I’ve wrapped into a function just for simplicity. Just straight code, without try/catch blocks, at least for now. 🙂

This function takes the $global:certinfo.certificate.Extensions data and generates a more friendly way to view the data.

function CompactExtensions # A version of this is also in the Insert-SslWebSite function (slightly different format)
{
    $global:MyExt = @()
    $global:ext = @()
    $Extensions = & {
        for ($i = 0; $i -le $global:certinfo.Certificate.Extensions.Count - 1; $i++)
        {
            Write-Output ([pscustomobject]@{
                Extensions = & {
                    Write-Output ([pscustomobject]@{
                        ValueU        = $global:certinfo.Certificate.Extensions.EnhancedKeyUsages[$i].Value
                        FriendlyNameU = $global:certinfo.Certificate.Extensions.EnhancedKeyUsages[$i].FriendlyName
                        Value         = $global:certinfo.Certificate.Extensions.oid[$i].Value
                        FriendlyName  = $global:certinfo.Certificate.Extensions.oid[$i].FriendlyName
                        RawData       = ([string]::Join(",", $global:certinfo.Certificate.Extensions[$i].RawData))
                    })
                } # End object collection
            })
        }
    }
    $global:ext = $Extensions
    $global:extensions = $Extensions # Referenced later when serializing to XML
    $global:MyExt = $global:ext.Extensions

    $EnhancedKeyUsageList = & {
        Write-Output ([pscustomobject]@{
            EnhancedKeyUsageList = & {
                foreach ($itm in ($global:certinfo.Certificate.Extensions.EnhancedKeyUsages))
                {
                    Write-Output ([pscustomobject]@{
                        Value        = $itm.Value
                        FriendlyName = $itm.FriendlyName
                    })
                } # End foreach object collection
            } # End object collection
        })
    }
}

When you run CompactExtensions the Oid collection is extracted and merged into $global:extensions, and when presented this way it is more readable.

A version of CompactExtensions is also in the Insert-SslWebSite function.

As I want to insert or update records in a database, I want to convert $global:extensions.extensions into XML for insertion into the database field Extensions.

So, using a small piece of .NET, $ExtensionsXml = [System.Management.Automation.PSSerializer]::Serialize($global:extensions.extensions, 3) creates an in-memory variable that will be used as input for the Extensions field in the table.
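For comparison, the same idea (flatten the extension records and serialize them into an XML string ready for an nvarchar(max) column) looks like this in Python. The element names here are my own illustration, not the CLIXML format PSSerializer actually emits.

```python
import xml.etree.ElementTree as ET

def extensions_to_xml(extensions):
    """Serialize a list of certificate-extension records (dicts) to an XML string."""
    root = ET.Element("Extensions")
    for ext in extensions:
        node = ET.SubElement(root, "Extension")
        for key, value in ext.items():
            ET.SubElement(node, key).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Sample record shaped like the Oid values pulled out above
xml_text = extensions_to_xml([
    {"Value": "2.5.29.15", "FriendlyName": "Key Usage", "Critical": True},
])
```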

Database Fields

Adding this information into a database will allow for further analysis. I’ve created (WIP) an API for back-end calls to assist with parsing and validating information in the certificate Subject field and the OID values from the Extensions XML field. My API tracks OIDs, serial numbers, and general information contained in the certificate. Still WIP, but soon to be published.

CertificateSubject
CN=www.chase.com, OU=GTI GNS, O=JPMorgan Chase and Co., STREET=270 Park Ave, L=New York, S=New York, PostalCode=10017, C=US, SERIALNUMBER=0691011, OID.2.5.4.15=Private Organization, OID.1.3.6.1.4.1.311.60.2.1.2=Delaware, OID.1.3.6.1.4.1.311.60.2.1.3=US
CN=www.umpquabank.com, OU=Umpqua Bank, O=Umpqua Bank, STREET=445 SE Main St., L=Roseburg, S=Oregon, PostalCode=97470, C=US, SERIALNUMBER=143662, OID.2.5.4.15=Private Organization, OID.1.3.6.1.4.1.311.60.2.1.3=US

My first pass at setting up the table that I needed used varchar and nvarchar on fields.

CREATE TABLE [dbo].[SSLByWebSite](
 [URLChecked] [varchar](255) NULL,
 [Port] INT NULL,
 [KeyLength] [varchar](255) NULL,
 [SignatureAlgorithm] [varchar](255) NULL,
 [Ssl2] BIT NULL,
 [Ssl3] BIT NULL,
 [CertificateSubject] [nvarchar](max) NULL,
 [Extensions] [nvarchar](max) NULL,
 [CertificateIssuer] [varchar](255) NULL,
 [CertificateSerialNumber] [varchar](255) NULL,
 [CertificateNotBefore] [varchar](255) NULL,
 [CertificateNotAfter] [varchar](255) NULL,
 [CertificateThumbprint] [varchar](255) NULL,
 [Tls] BIT NULL,
 [Tls11] BIT NULL,
 [Tls12] BIT NULL
 ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

GO

Function to upload into database.

The fields that are in the table are represented in the Insert-SslWebSite function.

This function builds up the parameter list that is passed into SQL to be inserted into the table.

Currently no data is checked for dups, nor indexed, in this release. I will add at least an index on URLChecked, as I do want to make delta comparison checks on future calls to the web sites.
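The pattern worth noting is named parameters instead of string concatenation. This sketch uses an in-memory SQLite table as a stand-in for the SQL Server one, trimmed to a few columns, purely to illustrate the parameterized insert that AddWithValue performs in the function below.

```python
import sqlite3

# In-memory stand-in for the SSLByWebSite table, trimmed to a few columns.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE SSLByWebSite (
    URLChecked TEXT, Port INTEGER, KeyLength TEXT,
    SignatureAlgorithm TEXT, Tls12 INTEGER)""")

row = {"URLChecked": "www.chase.com", "Port": 443, "KeyLength": "2048",
       "SignatureAlgorithm": "sha256RSA", "Tls12": 1}

# Named parameters keep values out of the SQL text, like AddWithValue does.
conn.execute(
    "INSERT INTO SSLByWebSite (URLChecked, Port, KeyLength, SignatureAlgorithm, Tls12) "
    "VALUES (:URLChecked, :Port, :KeyLength, :SignatureAlgorithm, :Tls12)",
    row,
)
count = conn.execute(
    "SELECT COUNT(*) FROM SSLByWebSite WHERE URLChecked = ?", ("www.chase.com",)
).fetchone()[0]
```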

function Insert-SslWebSite
{
    param (
        # Currently not mandatory
        [string]$URLChecked = $global:certinfo.URLChecked,
        [string]$ServerName = $env:servername,
        [int]$NewCompPKID
    )
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null

    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=$ServerName; Database=<DATABASENAME>; Integrated Security=true"
    $conn.Open()

    $cmd = New-Object System.Data.SqlClient.SqlCommand
    $cmd.CommandText = "INSERT INTO SSLByWebSite(URLChecked, Port `
,KeyLength, SignatureAlgorithm, Ssl2, Ssl3, CertificateSubject, Extensions, CertificateIssuer, CertificateSerialNumber `
,CertificateNotBefore, CertificateNotAfter, CertificateThumbprint, Tls, Tls11, Tls12) `
VALUES (@URLChecked, @Port, @KeyLength, @SignatureAlgorithm, @Ssl2, @Ssl3 `
,@CertificateSubject, @Extensions, @CertificateIssuer `
,@CertificateSerialNumber `
,@CertificateNotBefore `
,@CertificateNotAfter `
,@CertificateThumbprint `
,@Tls `
,@Tls11 `
,@Tls12);"

    if ($global:certinfo.KeyLength) { "has VALUE" }
    else { $global:certinfo.KeyLength = 0 }

    $cmd.Connection = $conn
    $CompInfoEntryDate = Get-Date

    # Split Extensions prior to insertion into the database
    $Extensions = & {
        for ($i = 0; $i -le $global:certinfo.Certificate.Extensions.Count - 1; $i++)
        {
            Write-Output ([pscustomobject]@{
                Extensions = & {
                    Write-Output ([pscustomobject]@{
                        EnhancedKeyUsageList = & {
                            try
                            {
                                foreach ($itm in ($global:certinfo.Certificate.Extensions[$i].EnhancedKeyUsages))
                                {
                                    Write-Output ([pscustomobject]@{
                                        Value        = $itm.Value
                                        FriendlyName = $itm.FriendlyName
                                    })
                                } # End foreach object collection
                            }
                            catch [System.IO.IOException] { }
                            # Catch all other exceptions thrown by one of those commands
                            catch
                            {
                                # Not really doing anything with these, but here in case
                            }
                            # Execute these commands even if there is an exception thrown from the try block
                            finally
                            {
                                # Not really doing anything with these, but here in case
                            }
                        }
                        Critical     = $global:certinfo.Certificate.Extensions[$i].Critical
                        Value        = $global:certinfo.Certificate.Extensions.oid[$i].Value
                        FriendlyName = $global:certinfo.Certificate.Extensions.oid[$i].FriendlyName
                        RawData      = ([string]::Join(",", $global:certinfo.Certificate.Extensions[$i].RawData))
                    })
                } # End object collection
            })
        }
    }

    # Quick export and import as a string for the db insert.
    # Shown as an example; in practice you can use different methods to generate it. Both work:
    # $Extensions.Extensions | Export-Clixml c:\temp\temp.txt
    # [string]$mdxml = Get-Content c:\temp\temp.txt # Value is inserted as a string
    $Depth = 3
    [string]$mdxml = [System.Management.Automation.PSSerializer]::Serialize($Extensions.Extensions, $Depth)

    $cmd.Parameters.AddWithValue("@URLChecked", $global:certinfo.URLChecked) | Out-Null
    $cmd.Parameters.AddWithValue("@Port", $global:certinfo.Port) | Out-Null
    $cmd.Parameters.AddWithValue("@KeyLength", $global:certinfo.KeyLength) | Out-Null
    $cmd.Parameters.AddWithValue("@SignatureAlgorithm", $global:certinfo.SignatureAlgorithm) | Out-Null
    $cmd.Parameters.AddWithValue("@Ssl2", $global:certinfo.Ssl2) | Out-Null
    $cmd.Parameters.AddWithValue("@Ssl3", $global:certinfo.Ssl3) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateSubject", $global:certinfo.Certificate.Subject) | Out-Null
    $cmd.Parameters.AddWithValue("@Extensions", $mdxml) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateIssuer", $global:certinfo.Certificate.Issuer) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateSerialNumber", $global:certinfo.Certificate.SerialNumber) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateNotBefore", $global:certinfo.Certificate.NotBefore) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateNotAfter", $global:certinfo.Certificate.NotAfter) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateThumbprint", $global:certinfo.Certificate.Thumbprint) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls", $global:certinfo.Tls) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls11", $global:certinfo.Tls11) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls12", $global:certinfo.Tls12) | Out-Null

    $cmd.ExecuteNonQuery() | Out-Null

    $conn.Close()
}

Once you have both functions implemented, it’s easy to run as follows:

Test-WebSiteSslProtocols www.chase.com | Insert-SslWebSite

It’s nice to be able to pipe this into the Insert-SslWebSite function. The results of these simple methods show XML for the extensions and an inserted record for URLChecked. I will add a key to URLChecked. 🙂

I intend on placing all code, with comments, on GitHub after I complete my testing and some minor updates to the functions to allow for parallel workflows.

So, in a future post I’ll lay out how I use PowerShell with workflows, running in parallel while reading in a list of websites I want to monitor. In larger shops this can be beneficial.
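The fan-out itself is straightforward; in Python terms it would look roughly like this, with a stub standing in for the real per-site check (the stub and the site list here are placeholders, not my monitoring list).

```python
from concurrent.futures import ThreadPoolExecutor

def check_site(url):
    """Stub for a per-site SSL check such as Test-WebSiteSslProtocols."""
    return {"URLChecked": url, "Tls12": True}

sites = ["www.chase.com", "www.umpquabank.com"]
with ThreadPoolExecutor(max_workers=8) as pool:
    # Each site is checked on its own worker thread, like a parallel workflow.
    results = list(pool.map(check_site, sites))
```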

Let me know what you think.

Joseph

Find Database Schema Users

February 26, 2017 Leave a comment

I’ve imported and migrated a tremendous number of databases into SQL Server versions 2012 and above. Finding and identifying old owners (which would be orphans now) or users that should not have access can be made easier by adding this script to the process, whether you run it by hand or fully automate an import workflow.

It is pretty basic in structure and uses #TempTable rather than @TempTable.

SELECT Mp.Type AS UserType, Mp.name AS UserName, Mp.default_Schema_name AS UserSchema
INTO #ModifySchemaTemp
FROM sys.database_role_members drm
JOIN sys.database_principals rp ON (drm.role_principal_id = rp.principal_id)
JOIN sys.database_principals mp ON (drm.member_principal_id = mp.principal_id)
WHERE Mp.Type = 'U'

I select from sys.database_role_members and join it twice against sys.database_principals, selecting user type 'U' into my #ModifySchemaTemp table.
A simple WHILE loop provides the core of the script. For this amount of data I chose not to use CURSORS.

SELECT @cnt = COUNT(*) FROM #ModifySchemaTemp

WHILE (@cnt > 0)
BEGIN
    -- Set loop keys
    SELECT @UserKey = UserName FROM #ModifySchemaTemp
    SELECT @SchemaKey = UserSchema FROM #ModifySchemaTemp

    -- REST OF PROCESSING

END

Modify the script to suit your needs. As you can see from the script, rather than take action immediately (which could be done), I’ve chosen to print to the screen for later documentation.

SET NOCOUNT ON

SELECT Mp.Type AS UserType, Mp.name AS UserName, Mp.default_Schema_name AS UserSchema
INTO #ModifySchemaTemp
FROM sys.database_role_members drm
JOIN sys.database_principals rp ON (drm.role_principal_id = rp.principal_id)
JOIN sys.database_principals mp ON (drm.member_principal_id = mp.principal_id)
WHERE Mp.Type = 'U'

-- Check results: if you want, uncomment the following
--SELECT * FROM #ModifySchemaTemp

DECLARE @UserKey VARCHAR(50)
    , @SchemaKey VARCHAR(80)
    , @cnt INT
    , @SchemaString VARCHAR(512)
    , @UserString VARCHAR(512)

SELECT @cnt = COUNT(*) FROM #ModifySchemaTemp

WHILE (@cnt > 0)
BEGIN
    SELECT @UserKey = UserName FROM #ModifySchemaTemp
    SELECT @SchemaKey = UserSchema FROM #ModifySchemaTemp

    -- Don't delete the default schema for DBO
    IF @SchemaKey <> 'DBO'
    BEGIN
        SELECT @SchemaString = 'IF EXISTS (SELECT * FROM sys.schemas WHERE name = N''' + @SchemaKey + ''')
DROP SCHEMA [' + @SchemaKey + ']
GO'
        PRINT @SchemaString

        -- DROP USER ACCOUNTS ONLY FROM NEWLY CREATED RESTORED DATABASE!
        SELECT @UserString = 'IF EXISTS (SELECT * FROM sys.database_principals WHERE name = N''' + @UserKey + ''')
DROP USER [' + @UserKey + ']
GO'
        PRINT @UserString
    END
    ELSE
    BEGIN
        -- DROP USER ACCOUNTS ONLY FROM NEWLY CREATED RESTORED DATABASE!
        SELECT @UserString = 'IF EXISTS (SELECT * FROM sys.database_principals WHERE name = N''' + @UserKey + ''')
DROP USER [' + @UserKey + ']
GO'
        PRINT @UserString
    END

    DELETE FROM #ModifySchemaTemp WHERE UserName = @UserKey
    SELECT @cnt = COUNT(*) FROM #ModifySchemaTemp

END
DROP TABLE #ModifySchemaTemp

Categories: TSQL