Archive for April, 2017

3D ART :- DAZ Studio Content Organization PART 1


Even before I started using DAZ Studio, content organization was an issue. Because of that, I've set up a workflow to download and virus-check content, organize purchased and free items, and make sure the items I do have appear in the content database(s) with appropriate metadata that describes them. Someday I'll be exporting from DAZ into formats for use within Unity and Unreal environments, and that management can be even more tedious. I follow the same or similar formatting rules for Lightroom, Photoshop, and processing my GoPro videos. So, it's good to identify and set up a workflow with a few well-thought-out steps, so that you'll actually keep using the method.

Another thing: content is expensive, and not protecting it like your other assets is crazy. The time you spend to set up, organize, and process it adds up as well. So, I've started to look at how I can organize my content the automated way. With this in mind, this posting covers simple hard drive organization first and foremost, and what to look for in the ZIP files you get from the various sources you'll be using in DAZ3D. From my usage of DAZ3D, and from reading the forums, I've seen the pattern and structure that exists for content downloads.

The best way to learn is to dig in on this issue. Hopefully you'll find the information useful as well. Before I get to content structure on my PC, I'll start with a breakdown of ZIP files and their organization for DAZ and Renderosity, the two sites I've purchased content and downloaded free content from.

ZIP File Structure and DAZ Content Downloads

I'm sure that not all sites follow the same format and organization when producing packaged ZIP files for import, whether through DIM (the DAZ Install Manager) or by manual installation. My examples cover content from after January 1, 2017, as that's the month I started using DAZ and I don't have much older content, but this posting should apply to multiple methods.

DAZ Studio names both manually downloaded and DIM-downloaded ZIP files IM99999999-01_NameOfProduct.zip. If there is more than one file to install for the same SKU#, they continue as IM99999999-02_NameOfProduct.zip and so on. This is nice, as both the product SKU# and the name are included in the file name.
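Since I've been recreating parts of this process in Python, here's a hypothetical sketch of parsing that naming convention. The function name is mine, and the assumption of an 8-digit SKU with a 2-digit part number is based only on the examples above:

```python
import re

# Matches DIM package names like IM00004422-01_MedievalFurniturePack.zip
# (assumes an 8-digit SKU and a 2-digit part number, per the examples above)
DIM_NAME = re.compile(r"^IM(?P<sku>\d{8})-(?P<part>\d{2})_(?P<name>.+)\.zip$",
                      re.IGNORECASE)

def parse_dim_name(filename):
    """Split a DIM ZIP name into (sku, part, product name), or None if it doesn't match."""
    m = DIM_NAME.match(filename)
    if not m:
        return None
    return int(m.group("sku")), int(m.group("part")), m.group("name")
```

With that in hand you can group downloads by SKU before doing anything else with them.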

For example, here is a file I manually downloaded. Searching for SKU 4422 in the product store and selecting the product image displays the page Medieval-Furniture-Pack, which is also part of the file name.

That makes it easy when you want to look up content pages, and it can be important when you want to search for something. If the dashes "-" are missing when trying to run the URL, you'll receive a 404 error, so just add the "-" back in.

By changing the number in a link like this, you will be taken to the product page, where you can download related items; I've found these to include templates or additional usage documentation.
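As a small sketch of that dash rule (in Python; the exact store URL layout here is my assumption, not something DAZ documents):

```python
def product_url(page_name):
    """Rebuild a store URL slug by joining the product page name with dashes."""
    slug = "-".join(page_name.lower().split())
    return "https://www.daz3d.com/" + slug
```

So a page name like "Medieval Furniture Pack" becomes the medieval-furniture-pack slug the site expects.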

XML Manifest

Included in the root of the ZIP file is Manifest.dsx, which is an XML text file organized much like the hierarchy of a folder structure (\Libraries and so on). The outer tag tells DAZ Install Manager what version the manifest is, and GlobalID is a GUID that identifies the product.
SIDE NOTE: PowerShell can generate one with $myGuid = [guid]::NewGuid()

 <DAZInstallManifest VERSION="0.1">
  <GlobalID VALUE="7449d014-cc18-e696-c9d6-a90a165153f6"/>
  <File TARGET="Content" ACTION="Install" VALUE="Content/ReadMe's/MedievalFurnitureREADME.htm"/>
  ...
 </DAZInstallManifest>
Some ZIP files have an additional file called Supplement.dsx; this particular ZIP doesn't include one. In its simple form it looks something like this:
 <ProductSupplement VERSION="0.1">
  <ProductName VALUE="Medieval Furniture Pack"/>
  <InstallTypes VALUE="Content"/>
  <ProductTags VALUE="DAZStudio4,DAZStudio3,DAZStudioLegacy,Poser9,PoserLegacy"/>
 </ProductSupplement>

I'll show why this could be helpful. If you look at the hierarchy, you can see that the manifest values tell DAZ Install Manager where to install content.

It's actually a pretty straightforward way to organize, and it gets you thinking along the lines of organizing and managing content yourself.
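To illustrate, here is a minimal Python sketch (assuming the Manifest.dsx layout shown above) that reads a manifest and lists where each file installs. The function name is mine:

```python
import xml.etree.ElementTree as ET

def read_manifest(manifest_xml):
    """Return the product GUID and the install paths from a Manifest.dsx document."""
    root = ET.fromstring(manifest_xml)
    guid = root.find("GlobalID").get("VALUE")
    # Only entries whose ACTION is Install are copied into the library
    files = [f.get("VALUE") for f in root.findall("File")
             if f.get("ACTION") == "Install"]
    return guid, files
```

Run against the example above, it would return the GlobalID GUID plus the Content/... paths.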

If you're inclined to 🙂 you can download a tool from Microsoft called XML Notepad. It's free, and that makes the tool even cooler. I'd post a link to the Manifest.dsx here, but you just never know whether a link will stay active.

Who knows, all of this could change in the future. But for now, this has helped me understand the structure of content and how to manage it.

I also bring up the manifest file now because I've built a tool that reads through content folders without manifests and generates manifests for that content, regardless of where you found freebies or made purchases. I've used PowerShell for this process, as I'm still new to Python. I've been recreating the same process in Python, so hopefully someday it can be run as a Python script via DAZ.
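A rough Python sketch of that idea follows. The element names mirror the manifest format above; the helper name, and the choice to generate a fresh GUID per manifest, are my assumptions:

```python
import os
import uuid
import xml.etree.ElementTree as ET

def build_manifest(content_root):
    """Walk a content folder and build a Manifest.dsx-style XML string for it."""
    root = ET.Element("DAZInstallManifest", VERSION="0.1")
    # A fresh GUID stands in for the product identifier
    ET.SubElement(root, "GlobalID", VALUE=str(uuid.uuid4()))
    for folder, _dirs, files in os.walk(content_root):
        for name in sorted(files):
            rel = os.path.relpath(os.path.join(folder, name), content_root)
            ET.SubElement(root, "File", TARGET="Content", ACTION="Install",
                          VALUE="Content/" + rel.replace(os.sep, "/"))
    return ET.tostring(root, encoding="unicode")
```

The output can be written next to the content as Manifest.dsx so the folder looks like any other DIM package.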

DAZ Install Manager-DIM

Using DIM, as it's called, installs the content into the default "Content Files Install To:" location.

If you downloaded to the default download folder and look in it for SKU# 4422 (the example I've been using all along in this article), you can see the same ZIP file structure from above along with its manifest. Again, there is no supplement file for content downloaded using DIM. Notice there is also an IM00004422-01_MedievalFurniturePack.dsx, which is essentially the *.XML file I mentioned.

Included in the ZIP (if one were to exist):
 <ProductSupplement VERSION="0.1">
  <ProductName VALUE="Medieval Furniture Pack"/>
  <InstallTypes VALUE="Content"/>
  <ProductTags VALUE="DAZStudio4,DAZStudio3,DAZStudioLegacy,Poser9,PoserLegacy"/>
 </ProductSupplement>
The product download had dramatically more information, and carries the same name as the ZIP download with a *.dsx extension:
 <?xml version="1.0" encoding="UTF-8"?>
 <ProductSupplement VERSION="0.1">
  <ProductName VALUE="Medieval Furniture Pack"/>
  <ProductStoreIDX VALUE="4422-1"/>
  <UserOrderId VALUE="24410813"/>
  <UserOrderDate VALUE="2017-04-05T20:19:08Z"/>
  <InstallerDate VALUE="2013-02-22T19:49:20Z"/>
  <ProductFileGuid VALUE="9857ba43-ad04-d6b1-a258-80799256b889"/>
  <InstallTypes VALUE="Content"/>
  <ProductTags VALUE="DAZStudio4_5,DAZStudio4,DAZStudio3,DAZStudioLegacy,Poser9,PoserLegacy,CloudAvailable"/>
 </ProductSupplement>

Having two essentially different supplement files is not good, unless that's how it's intended to be. From my short example above, there is a <GlobalID VALUE="7449d014-cc18-e696-c9d6-a90a165153f6"/> and a <ProductFileGuid VALUE="9857ba43-ad04-d6b1-a258-80799256b889"/>. The names are different, but I wonder whether they are used for the same purpose and should share the same name and GUID value. One could be a product ID and the other a vendor ID; I'll see if I can find out.

Content Organization Hard Drive

Organizing your hard drive is much like the ordering of content, free and otherwise. You go to a site and order. 🙂 After the purchase, you can download your content into a decently organized folder structure. I keep mine by order # if purchased, and free items by content creator, if there is a name to be found.

After my downloads are complete, I run PowerShell command-line tools to extract the archives, create manifests, and create DUF files from text files.

I start by reading through the zipped files using the passed-in variables:

$CopyFromFolder = "Drive:\Downloads\Purchase from DAZ"
$CopyToFolder   = "Drive:\Downloads\Purchase from DAZ\UnZipped"

# or, for Renderosity content:
$CopyFromFolder = "Drive:\Downloads\Purchase from Renderosity"
$CopyToFolder   = "Drive:\Downloads\Purchase from Renderosity\UnZipped"

Files are unzipped into folders based on ZIP file names.
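The unzip step itself can be sketched in Python like this (the function name is mine; the from/to paths are whatever you pass in, matching the variables above):

```python
import os
import zipfile

def unzip_all(copy_from, copy_to):
    """Extract every ZIP in copy_from into a copy_to subfolder named after the ZIP."""
    for name in os.listdir(copy_from):
        if name.lower().endswith(".zip"):
            # Folder name is the ZIP file name without its extension
            dest = os.path.join(copy_to, os.path.splitext(name)[0])
            with zipfile.ZipFile(os.path.join(copy_from, name)) as zf:
                zf.extractall(dest)
```

So IM00004422-01_MedievalFurniturePack.zip lands in an IM00004422-01_MedievalFurniturePack folder, keeping each product's files separate.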

That’s it for now. Next time I’ll go over the Powershell that does the work.

Comments are welcome.

Thank you,

TOPIC :- Fake News What If?

I had originally published this on LinkedIn.

According to ancient astronaut theorists 🙂 what if:

Facebook, Google, and the rest of the sites that allowed the display of said "FAKE NEWS" were most likely receiving tons of click dough: each time you clicked on one of those news stories, they received some form of payment. They only mentioned that they would investigate this fake news when some folks said it might have swayed the election. If the sites that they say originated the "FAKE NEWS" were receiving between $5K and $20K a month, the amount Facebook, Google, and others must have received could be in the millions. So, why give up the easy money?

If you think for a moment that companies with huge budgets and large analytics teams poring over the logs and the results of those clicks didn't take notice, you'd be mistaken. Facebook has already been called out for faking or tweaking its trending topics.

Here is another way to look at it. If you've ever played some type of video game, then you know that obstacles are created to enhance and challenge game play. They are placed at certain points in a game. Some obstacles are in the same place every time, allowing you to practice and get past the difficult parts. Others are random, and hopefully they will keep you on your toes.

Think of game play on your personal Facebook feed: in particular, you can adjust it between "Top Stories" and "Most Recent", and those settings organize the feed stream you see. In your feed many items appear, some sponsored, others from the games you've played or are playing. You might also see items appear that you just looked at on Amazon or eBay (click income).

How this could work: let's suppose that you have certain political views. You might get news that reflects left or right points of view. If your views were only left, you might get an assortment of right-leaning items to continue fanning the flame, the same obstacles you encountered in the games you played. Further adjusting your stream are the items you click on (Like, Love, Haha, Wow, Sad & Angry) to help identify a post and give nonverbal feelings about it. Those clicks also generate the metrics and data that shape how your Facebook will eventually be viewed by you.

So, regardless of the service, Twitter, Facebook, Google, etc., opinions and views can be adjusted in an instant, at the speed of a post. To intensify the scope, you can link Facebook with Twitter, so that what you post on Facebook automatically posts to Twitter; then have Twitter post to LinkedIn. With one post you have the potential to engage thousands through retweets and reposts by others.

Feeds from most services are tailored to users based on many factors. It would be nice to have better filters that we could tailor ourselves. While even a free service requires some form of income other than selling our metrics to others, the tailoring is getting better at predicting which types of things you click on where they will make the most money. And that's the part we don't have the ability to filter.

Yes, it could be malicious, or lucrative if used with that intent, so... what if?

Categories: #kravis, Thoughts and Ideas

3D ART :- Shaded Heaven

Environment settings for this render.

*Higher resolution images available. 



3D ART :- The Hidden Alley

The nice thing about 3D art is that I get a tremendous amount of practice with camera settings. Things that work in the real world can be translated into the 3D world. Designing digital sets is much like what I used to do in the real world, back when I was doing more out-in-the-field photography. Digital sets and placement are almost identical, but there are differences.

Getting just the right amount of DOF was challenging, as I had one main hair light and four surrounding lights with barn doors to help mold and direct the light. The lights from the right had a blue filter. Lighting with Iray, I'm still discovering the best way to keep an image from getting, or being, too grainy.

Here are my settings for this image. I arrived at these after reading and watching many videos on how to set up and establish lighting in the 3D world.

DAZ3D Environment Settings

Here are the environment settings I used with this image.

Game Ready Object

Most objects I work with can be exported and used in Unity, Unreal, or Blender. Learning all three at the same time is taxing on the brain. 🙂

Certificate :- Test Website SSL Protocols

April 22, 2017

NOTE: This turned out to be a long post.

The "S" in https:// means that the site has security and is supposed to be secure. There is a lot more to the "S", though, and that's where the SSL part of the title comes from. Hypertext Transfer Protocol Secure (HTTPS) is the combination of the Hypertext Transfer Protocol (HTTP) with the Secure Sockets Layer (SSL)/Transport Layer Security (TLS) protocol. TLS is an authentication and security protocol widely implemented in browsers and web servers. So, as you surf around the web, you'll encounter both https:// and http:// sites. I would never enter personal information into an http:// site, as it isn't secure. And supposedly, with https:// sites like banks and online shopping, there is some expectation that the site is secure enough for us to do our transactions. These are all things you should be aware of.

So, as we take precautions for doing business online, I've recently noticed that many certificates are expiring on websites I visit, across an assortment of sites. I use a tool that pops up when I encounter certificate issues. You'd think that keeping security certificates up to date, and monitoring their expiration dates, would be part of a company's daily SOP.

After looking into the issue, I decided to start by searching for something along the lines of this code by Chris Duck. I already had some simple code that I'd been using, but I wanted something I could modify to suit my needs. I found many similar ideas while searching the web, and this is how I've put my research to use.

Loading and running Test-WebSiteSslProtocols

I made two simple modifications to the script. At this location:

$ProtocolNames = [System.Security.Authentication.SslProtocols] | gm -static -MemberType Property | ?{ $_.Name -notin @("Default", "None") } | %{ $_.Name }
$global:certinfo = @() #- ADD THIS

And at this place in the code:

$global:certinfo = $ProtocolStatus #- ADD THIS

This is the modified function, without all the comments:

function Test-WebSiteSslProtocols {
    # Check validity of SSL on web sites visited and populate a collection
    # to be used for updating into a database. Ideas found searching the web.
    param (
        [Parameter(Mandatory = $true, ValueFromPipelineByPropertyName = $true, ValueFromPipeline = $true)]
        [string]$URLChecked,

        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [int]$Port = 443
    )

    $ProtocolNames = [System.Security.Authentication.SslProtocols] | gm -static -MemberType Property | ?{ $_.Name -notin @("Default", "None") } | %{ $_.Name }
    $global:certinfo = @()
    $ProtocolStatus = [Ordered]@{ }
    $ProtocolStatus.Add("URLChecked", $URLChecked)
    $ProtocolStatus.Add("Port", $Port)
    $ProtocolStatus.Add("KeyLength", $null)
    $ProtocolStatus.Add("SignatureAlgorithm", $null)

    $ProtocolNames | %{
        $ProtocolName = $_
        $Socket = New-Object System.Net.Sockets.Socket([System.Net.Sockets.SocketType]::Stream, [System.Net.Sockets.ProtocolType]::Tcp)
        try {
            $Socket.Connect($URLChecked, $Port)
            $NetStream = New-Object System.Net.Sockets.NetworkStream($Socket, $true)
            $SslStream = New-Object System.Net.Security.SslStream($NetStream, $true)
            $SslStream.AuthenticateAsClient($URLChecked, $null, $ProtocolName, $false)
            $RemoteCertificate = [System.Security.Cryptography.X509Certificates.X509Certificate2]$SslStream.RemoteCertificate
            $ProtocolStatus["KeyLength"] = $RemoteCertificate.PublicKey.Key.KeySize
            $ProtocolStatus["SignatureAlgorithm"] = $RemoteCertificate.SignatureAlgorithm.FriendlyName
            $ProtocolStatus["Certificate"] = $RemoteCertificate
            $ProtocolStatus.Add($ProtocolName, $true)   # handshake succeeded at this protocol
        }
        catch {
            $ProtocolStatus.Add($ProtocolName, $false)  # handshake failed at this protocol
        }
    }
    $global:certinfo = $ProtocolStatus #- ADD THIS
    [PSCustomObject]$ProtocolStatus
}

The [PSCustomObject]$ProtocolStatus is local to the function. Since I wanted to use it as input to a function that updates a database, I've added $global:certinfo = $ProtocolStatus, so $global:certinfo can now be used as input to other functions.

I'll be investigating running this same process in the cloud, where it would be cool to catalog the information contained in certificates, or to look for anomalies that might be found in a certificate.

Running the function Test-WebSiteSslProtocols (above) will return a data set.

I've also added the global collection, which will allow you to drill down into the results.

Peering into the global collection

I’ve used 3 different methods to display information from the collection.

  1. $global:certinfo
  2. $global:certinfo.Certificate
  3. $global:certinfo.Certificate | format-list

As you can see, each returns a different view that can be used for further analysis or for updating into a database. With method 1 ($global:certinfo) we have a collection of values: URLChecked, Port, KeyLength, SignatureAlgorithm, Ssl2, Ssl3, Tls, Tls11, and Tls12 are single values that don't require a drill-down to collect for updating into a database.

Certificate returns a collection. Since method 2 ($global:certinfo.Certificate) does not return enough visual information, I use method 3, $global:certinfo.Certificate | Format-List, as returned by our function to get web records.

Extensions is returned as an object. So far I've not found a SQL data type, at least in SQL Server 2012, to use for direct import. So, I've created a snippet that extracts it and serializes it into XML.

To see all the elements available, I'm using method 3 ($global:certinfo.Certificate | Format-List), as this returns the Extensions information: the area containing Subject, Issuer, Thumbprint, FriendlyName, NotBefore, NotAfter, and Extensions. Extensions contains another collection for the Oid values:

$global:certinfo.Certificate.Extensions and $global:certinfo.Certificate.Extensions.Oid (not very friendly)

I've wrapped the following code snippet into a function just for simplicity; it's straight code without try/catch blocks, at least for now. 🙂 This function takes the $global:certinfo.Certificate.Extensions data and generates a friendlier way to view it.

function CompactExtensions() {
    # Also in the Insert-SslWebSite function (slightly different format)
    $global:MyExt = @()
    $global:ext = @()
    $Extensions = & {
        for ($i = 0; $i -le $global:certinfo.Certificate.Extensions.Count - 1; $i++) {
            Write-Output ([pscustomobject]@{
                Extensions = & {
                    Write-Output ([pscustomobject]@{
                        Value        = $global:certinfo.Certificate.Extensions.Oid[$i].Value
                        FriendlyName = $global:certinfo.Certificate.Extensions.Oid[$i].FriendlyName
                        RawData      = ([string]::Join(",", $global:certinfo.Certificate.Extensions[$i].RawData))
                    }) # End object collection
                }
            })
        } # End for each extension
    }
    $global:ext = $Extensions
    $global:MyExt = $global:ext.Extensions

    # Flatten the EnhancedKeyUsageList collection the same way
    $EnhancedKeyUsageList = & {
        foreach ($itm in $global:certinfo.Certificate.Extensions.EnhancedKeyUsages) {
            Write-Output ([pscustomobject]@{
                Value        = $itm.Value
                FriendlyName = $itm.FriendlyName
            })
        } # End foreach
    }
}

When you run CompactExtensions, the Oid collection is extracted and merged into $global:ext, and presented this way it is more readable.

A version of CompactExtensions is also in Insert-SslWebSite.

Since I want to insert or update records in a database, I want to convert $global:ext.Extensions into XML for insertion into the database field Extensions.

So, using a small piece of .NET, $ExtensionsXml = [System.Management.Automation.PSSerializer]::Serialize($global:ext.Extensions, 3) creates an in-memory variable that will be used as input to the Extensions field in the table.
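For comparison, here's the same flatten-and-serialize idea in Python. The field names follow the PowerShell objects above; the function itself is a hypothetical helper, not part of any library:

```python
import xml.etree.ElementTree as ET

def extensions_to_xml(extensions):
    """Serialize a list of {Value, FriendlyName, RawData} dicts into one XML string."""
    root = ET.Element("Extensions")
    for ext in extensions:
        node = ET.SubElement(root, "Extension")
        for key, val in ext.items():
            # One child element per field, so the database XML stays queryable
            ET.SubElement(node, key).text = str(val)
    return ET.tostring(root, encoding="unicode")
```

The resulting string can go straight into an XML-capable text column, much like the serialized $ExtensionsXml.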

Database Fields.

Adding this information into a database will allow for further analysis. I've created (WIP) an API for backend calls to assist with parsing and validating the information in the certificate Subject field and the OID values from the Extensions XML field. My API tracks OIDs, serial numbers, and general information contained in the certificate. Still WIP, but soon to be published.

CertificateSubject examples (two sites; OID components truncated as returned):

 OU=GTI GNS, O=JPMorgan Chase and Co., STREET=270 Park Ave, L=New York, S=New York, PostalCode=10017, C=US, SERIALNUMBER=0691011, OID. Organization, OID., OID.

 OU=Umpqua Bank, O=Umpqua Bank, STREET=445 SE Main St., L=Roseburg, S=Oregon, PostalCode=97470, C=US, SERIALNUMBER=143662, OID. Organization, OID.

My first pass at setting up the table that I needed used varchar and nvarchar on fields.

CREATE TABLE [dbo].[SSLByWebSite](
 [URLChecked] [varchar](255) NULL,
 [Port] INT NULL,
 [KeyLength] [varchar](255) NULL,
 [SignatureAlgorithm] [varchar](255) NULL,
 [Ssl2] BIT NULL,
 [Ssl3] BIT NULL,
 [CertificateSubject] [nvarchar](max) NULL,
 [Extensions] [nvarchar](max) NULL,
 [CertificateIssuer] [varchar](255) NULL,
 [CertificateSerialNumber] [varchar](255) NULL,
 [CertificateNotBefore] [varchar](255) NULL,
 [CertificateNotAfter] [varchar](255) NULL,
 [CertificateThumbprint] [varchar](255) NULL,
 [Tls] BIT NULL,
 [Tls11] BIT NULL,
 [Tls12] BIT NULL
)


Function to upload into database.

The fields in the table are represented in the Insert-SslWebSite function, which builds up the parameter list that is passed into SQL to insert a row into the table.

Currently no data is checked for duplicates, and nothing is indexed in this release. I will add at least one index, on URLChecked, as I do want to make delta-comparison checks on future calls to the web sites.

function Insert-SslWebSite {
    # Parameters are currently not mandatory
    param (
        [string]$URLChecked = $global:certinfo.URLChecked,
        [string]$ServerName = $env:servername
    )
    [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null

    $conn = New-Object System.Data.SqlClient.SqlConnection
    $conn.ConnectionString = "Server=$ServerName; Database=<DATABASENAME>; Integrated Security=true"

    $cmd = New-Object System.Data.SqlClient.SqlCommand
    $cmd.CommandText = "INSERT INTO SSLByWebSite (URLChecked, Port, KeyLength, SignatureAlgorithm, Ssl2, Ssl3 `
 ,CertificateSubject, Extensions, CertificateIssuer `
 ,CertificateSerialNumber, CertificateNotBefore, CertificateNotAfter, CertificateThumbprint `
 ,Tls, Tls11, Tls12) `
 VALUES (@URLChecked, @Port, @KeyLength, @SignatureAlgorithm, @Ssl2, @Ssl3 `
 ,@CertificateSubject, @Extensions, @CertificateIssuer `
 ,@CertificateSerialNumber, @CertificateNotBefore, @CertificateNotAfter, @CertificateThumbprint `
 ,@Tls, @Tls11, @Tls12)"

    # Default KeyLength so the insert never receives a null
    if (-not $global:certinfo.KeyLength) { $global:certinfo.KeyLength = 0 }

    $cmd.Connection = $conn
    $CompInfoEntryDate = Get-Date

    # Split Extensions prior to insertion into the database
    $Extensions = & {
        for ($i = 0; $i -le $global:certinfo.Certificate.Extensions.Count - 1; $i++) {
            Write-Output ([pscustomobject]@{
                Extensions = & {
                    Write-Output ([pscustomobject]@{
                        EnhancedKeyUsageList = & {
                            try {
                                foreach ($itm in $global:certinfo.Certificate.Extensions[$i].EnhancedKeyUsages) {
                                    Write-Output ([pscustomobject]@{
                                        Value        = $itm.Value
                                        FriendlyName = $itm.FriendlyName
                                    })
                                } # End foreach objectcollection
                            }
                            catch [System.IO.IOException] {
                                # Not every extension exposes EnhancedKeyUsages;
                                # not really doing anything here, but in place in case
                            }
                        }
                        Critical     = $global:certinfo.Certificate.Extensions[$i].Critical
                        Value        = $global:certinfo.Certificate.Extensions.Oid[$i].Value
                        FriendlyName = $global:certinfo.Certificate.Extensions.Oid[$i].FriendlyName
                        RawData      = ([string]::Join(",", $global:certinfo.Certificate.Extensions[$i].RawData))
                    }) # End object collection
                }
            })
        } # End for each extension
    }

    # Quick export and import as a string for the db insert. Used as an example;
    # in practice either method works:
    # $Extensions.Extensions | Export-Clixml c:\temp\temp.txt
    # [string]$mdxml = Get-Content c:\temp\temp.txt   # Value is inserted as a string
    $Depth = 3
    [string]$mdxml = [System.Management.Automation.PSSerializer]::Serialize($Extensions.Extensions, $Depth)

    $cmd.Parameters.AddWithValue("@URLChecked", $global:certinfo.URLChecked) | Out-Null
    $cmd.Parameters.AddWithValue("@Port", $global:certinfo.Port) | Out-Null
    $cmd.Parameters.AddWithValue("@KeyLength", $global:certinfo.KeyLength) | Out-Null
    $cmd.Parameters.AddWithValue("@SignatureAlgorithm", $global:certinfo.SignatureAlgorithm) | Out-Null
    $cmd.Parameters.AddWithValue("@Ssl2", $global:certinfo.Ssl2) | Out-Null
    $cmd.Parameters.AddWithValue("@Ssl3", $global:certinfo.Ssl3) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateSubject", $global:certinfo.Certificate.Subject) | Out-Null
    $cmd.Parameters.AddWithValue("@Extensions", $mdxml) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateIssuer", $global:certinfo.Certificate.Issuer) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateSerialNumber", $global:certinfo.Certificate.SerialNumber) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateNotBefore", $global:certinfo.Certificate.NotBefore) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateNotAfter", $global:certinfo.Certificate.NotAfter) | Out-Null
    $cmd.Parameters.AddWithValue("@CertificateThumbprint", $global:certinfo.Certificate.Thumbprint) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls", $global:certinfo.Tls) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls11", $global:certinfo.Tls11) | Out-Null
    $cmd.Parameters.AddWithValue("@Tls12", $global:certinfo.Tls12) | Out-Null

    $conn.Open()
    $cmd.ExecuteNonQuery() | Out-Null
    $conn.Close()
}

Once you have both functions implemented, it's easy to run as follows:

Test-WebSiteSslProtocols | Insert-SslWebSite

It's nice to be able to pipe the results into the Insert-SslWebSite function. The results of these simple methods show the XML for extensions and an inserted record for URLChecked. I will add a key to URLChecked. 🙂
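The delta-comparison idea can be sketched like this, using SQLite via Python purely as an illustration (the real table is SQL Server; here it's trimmed to two columns, and the function name is mine):

```python
import sqlite3

def upsert_scan(conn, url, thumbprint):
    """Insert a site's first scan; on later scans, record only thumbprint changes."""
    row = conn.execute(
        "SELECT CertificateThumbprint FROM SSLByWebSite WHERE URLChecked = ?",
        (url,)).fetchone()
    if row is None:
        conn.execute(
            "INSERT INTO SSLByWebSite (URLChecked, CertificateThumbprint) VALUES (?, ?)",
            (url, thumbprint))
        return "inserted"
    if row[0] != thumbprint:
        # The certificate changed since the last scan: worth flagging
        conn.execute(
            "UPDATE SSLByWebSite SET CertificateThumbprint = ? WHERE URLChecked = ?",
            (thumbprint, url))
        return "updated"
    return "unchanged"
```

With URLChecked keyed, repeat scans only touch the database when the certificate actually changes, which is exactly the delta check I want on future calls.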

I intend to place all the code, with comments, on GitHub after I complete my testing and some minor updates to the functions to allow for parallel workflows.

So, in a future post I'll lay out how I use PowerShell workflows to run in parallel while reading in a list of websites I want to monitor. In larger shops this can be beneficial.

Let me know what you think.


3D Art :- Try to escape! :-)

I had an odd thought about what it would be like if we discovered we're just living in a big box and attempted to get out. I remember the Star Trek episodes where Captain Pike, having been captured by an alien race, lives out his life in a cage with a glass view. Lots of shows are like that: The Matrix, The Truman Show, and on and on.

That’s kind of what I had in mind. But, I thought more of being kept in an asylum and trying to escape.

Categories: #kravis, 3D, ART, MODELING

3D Art :- A Lovely Pose

I rendered this image at 10,000 x 10,000! Getting this far took about three hours before I cancelled the render. I pulled the image into Lightroom so I could reduce the 110MB file into something manageable that I could post. I will have to let this do an all-night render for comparison.

Let me know what you think. 🙂

Copyright Joseph Kravis 2017

A lovely pose.

Categories: #kravis, 3D, ART, MODELING